Tensorflow: Using Adam optimizer

Question:

I am experimenting with some simple models in TensorFlow, including one that looks very similar to the first MNIST for ML Beginners example, but with a somewhat larger dimensionality. I am able to use the gradient descent optimizer with no problems, getting good enough convergence. When I try to use the Adam optimizer, I get errors like this:

tensorflow.python.framework.errors.FailedPreconditionError: Attempting to use uninitialized value Variable_21/Adam
     [[Node: Adam_2/update_Variable_21/ApplyAdam = ApplyAdam[T=DT_FLOAT, use_locking=false, _device="/job:localhost/replica:0/task:0/cpu:0"](Variable_21, Variable_21/Adam, Variable_21/Adam_1, beta1_power_2, beta2_power_2, Adam_2/learning_rate, Adam_2/beta1, Adam_2/beta2, Adam_2/epsilon, gradients_11/add_10_grad/tuple/control_dependency_1)]]

where the specific variable reported as uninitialized changes depending on the run. What does this error mean, and what does it suggest is wrong? It seems to occur regardless of the learning rate I use.

Answer #1:

The AdamOptimizer class creates additional variables, called “slots”, to hold values for the “m” and “v” accumulators.

See the source here if you’re curious; it’s actually quite readable:
https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/training/adam.py#L39. Other optimizers, such as Momentum and Adagrad, use slots too.

These variables must be initialized before you can train a model.
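
You can inspect these slots on the optimizer object itself. A minimal sketch using the TF 1.x API (W and loss are made-up names, just for illustration):

import tensorflow as tf

W = tf.Variable(tf.zeros([784, 10]), name="W")
loss = tf.reduce_sum(tf.square(W))  # toy loss, just for illustration

optimizer = tf.train.AdamOptimizer(1e-4)
train_op = optimizer.minimize(loss)

print(optimizer.get_slot_names())    # ['m', 'v']
m_slot = optimizer.get_slot(W, "m")  # the "m" accumulator created for W
print(m_slot.name)                   # something like 'W/Adam:0', as in the error above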

The normal way to initialize variables is to call tf.initialize_all_variables(), which adds ops to initialize the variables that are present in the graph at the time it is called.

(Aside: despite what its name suggests, initialize_all_variables() does not initialize anything by itself; it only adds ops that will initialize the variables when run.)

What you must do is call initialize_all_variables() after you have added the optimizer:

...build your model...
# Add the optimizer.
train_op = tf.train.AdamOptimizer(1e-4).minimize(cross_entropy)
# Add the ops to initialize variables. These will include
# the optimizer slots added by AdamOptimizer().
init_op = tf.initialize_all_variables()

# Launch the graph in a session.
sess = tf.Session()
# Actually initialize the variables.
sess.run(init_op)
# Now train your model.
for ...:
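
(Note: in TensorFlow 0.12 and later, tf.initialize_all_variables() is deprecated in favor of the equivalent tf.global_variables_initializer(); the ordering requirement above applies to both.)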
Answered By: Touts

Answer #2:

FailedPreconditionError: Attempting to use uninitialized value is one of the most frequent errors in TensorFlow. From the official documentation of FailedPreconditionError:

This exception is most commonly raised when running an operation that
reads a tf.Variable before it has been initialized.

In your case the error even tells you which variable was not initialized: Attempting to use uninitialized value Variable_21/Adam. One of the TF tutorials explains a lot about variables and their creation/initialization/saving/loading.

Basically, to initialize the variables you have 3 options:

- initialize all global variables with tf.global_variables_initializer()
- initialize only the variables you care about with tf.variables_initializer(list_of_variables)
- initialize a single variable with var_name.initializer

I almost always use the first approach. Remember that you should run it inside a session. So you will get something like this (the other two options are sketched after this example):

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # ...train or evaluate here...
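
And the other two options, as a minimal sketch (w and b are made-up names, standing in for your own variables):

import tensorflow as tf

w = tf.Variable(tf.zeros([10]), name="w")
b = tf.Variable(tf.zeros([10]), name="b")

# Option 2: initialize only the variables you list.
init_some = tf.variables_initializer([w, b])

# Option 3: initialize a single variable via its own initializer op.
init_w = w.initializer

with tf.Session() as sess:
    sess.run(init_some)
    sess.run(init_w)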

If you are curious about variables, read this documentation to learn how to report_uninitialized_variables and check is_variable_initialized.
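
For example (a minimal sketch using the TF 1.x API; v is a made-up name standing in for one of your variables):

import tensorflow as tf

v = tf.Variable(tf.zeros([10]), name="v")

with tf.Session() as sess:
    # Names of the variables that are still uninitialized.
    print(sess.run(tf.report_uninitialized_variables()))
    # Whether one particular variable has been initialized.
    print(sess.run(tf.is_variable_initialized(v)))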

Answered By: Salvador Dali

Answer #3:

You need to run tf.global_variables_initializer() in your session, like this:

init = tf.global_variables_initializer()
sess.run(init)

A full example is available in this great tutorial.

Answered By: LYu

Answer #4:

Did you run init after adding the AdamOptimizer? Defining or running init before the optimizer is created leaves the optimizer’s slot variables uninitialized.
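
A minimal sketch of the wrong and the right ordering (loss and sess are made-up names, standing in for your own loss tensor and session):

# Wrong: the init op is created before the optimizer, so the Adam
# slot variables added afterwards are not covered by it.
init = tf.initialize_all_variables()
train_op = tf.train.AdamOptimizer(1e-4).minimize(loss)

# Right: add the optimizer first, then create and run the init op.
train_op = tf.train.AdamOptimizer(1e-4).minimize(loss)
init = tf.initialize_all_variables()
sess.run(init)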
Answered By: booyoungxu

Answer #5:

I was having a similar problem: no problems training with the GradientDescent optimizer, but an error was raised when switching to the Adam optimizer (or any other optimizer with its own variables).

Changing to an interactive session solved this problem for me: replace

sess = tf.Session()

with

sess = tf.InteractiveSession()
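
A likely reason this helps: tf.InteractiveSession installs itself as the default session when constructed, so init ops can be run without passing the session explicitly (a minimal sketch):

sess = tf.InteractiveSession()
# With a default session installed, the init op can be run directly.
tf.global_variables_initializer().run()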
Answered By: Cameron Fraser
