tensorflow - In tf.slim, do I need to add the dependency to the loss? -


In tf.slim, I have used batch_norm.

My question is: do I need to explicitly add the update dependency to the loss?

I would think slim knows that batch_norm has been used, so does it automatically add the dependency? I am confused.

Yes, you need to.

You could follow the instructions here:

Note: when training, moving_mean and moving_variance need to be updated. By default the update ops are placed in tf.GraphKeys.UPDATE_OPS, so they need to be added as a dependency of the train_op. For example:

    update_ops = tf.get_collection(tf.GraphKeys.UPDATE_OPS)
    with tf.control_dependencies(update_ops):
        train_op = optimizer.minimize(loss)
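To see why the control dependency matters, here is a minimal, self-contained sketch of the same pattern. It does not use slim's batch_norm itself; instead it registers a hypothetical update op (standing in for the moving_mean update that batch_norm creates internally) in tf.GraphKeys.UPDATE_OPS, then makes the train_op depend on it, so running the train step also runs the update. Written against the TF1-style API via tf.compat.v1 so it runs on TF 2.x as well; the variable names are illustrative, not from slim.

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()  # graph mode, as in TF1 / tf.slim

# Hypothetical non-trainable variable standing in for batch_norm's moving_mean.
moving_mean = tf.compat.v1.get_variable("moving_mean", initializer=0.0,
                                        trainable=False)
w = tf.compat.v1.get_variable("w", initializer=3.0)
loss = tf.square(w)

# Simulate what batch_norm does internally: register an update op
# in the UPDATE_OPS collection. It is NOT reached from `loss`, so
# nothing would run it unless we add an explicit dependency.
tf.compat.v1.add_to_collection(tf.compat.v1.GraphKeys.UPDATE_OPS,
                               tf.compat.v1.assign_add(moving_mean, 1.0))

# The pattern from the answer: fetch UPDATE_OPS and make the
# train_op depend on them.
update_ops = tf.compat.v1.get_collection(tf.compat.v1.GraphKeys.UPDATE_OPS)
with tf.control_dependencies(update_ops):
    train_op = tf.compat.v1.train.GradientDescentOptimizer(0.1).minimize(loss)

with tf.compat.v1.Session() as sess:
    sess.run(tf.compat.v1.global_variables_initializer())
    sess.run(train_op)            # also triggers the registered update op
    mm = sess.run(moving_mean)    # mm is now 1.0, not the initial 0.0
```

Without the tf.control_dependencies block, minimize(loss) would build a train_op that never touches the update op, and moving_mean would stay at its initial value no matter how many training steps you run.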
