Created June 6, 2019 00:29
import tensorflow as tf  # TF 1.x API

global_step = tf.Variable(0, trainable=False)    # Variable to store the number of iterations
starter_learning_rate = 0.1                      # Initial learning rate
learning_rate = tf.train.exponential_decay(
    starter_learning_rate, global_step,          # Function applied by TF on the variable (same formula as shown above)
    100000, 0.96, staircase=True)                # staircase=True forces an integer division, producing a step decay
# Passing global_step to minimize() will increment it at each step.
learning_step = (
    tf.train.GradientDescentOptimizer(learning_rate)   # Optimizer instance that reads the updated learning rate each step
    .minimize(...my loss..., global_step=global_step)  # global_step (# iterations) is incremented by minimize()
)
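Under the hood, `tf.train.exponential_decay` computes `lr = starter_lr * decay_rate ^ (global_step / decay_steps)`, flooring the exponent when `staircase=True`. A minimal pure-Python sketch of that schedule (the function name here is illustrative, not part of TensorFlow):

```python
def exponential_decay(starter_lr, global_step, decay_steps, decay_rate, staircase=False):
    """Decayed learning rate after `global_step` iterations.

    With staircase=True the exponent is an integer division, so the
    learning rate drops in discrete steps every `decay_steps` iterations.
    """
    exponent = global_step // decay_steps if staircase else global_step / decay_steps
    return starter_lr * decay_rate ** exponent

# Matching the values above: lr stays at 0.1 for the first 100000 steps,
# then is multiplied by 0.96 once per 100000-step interval.
print(exponential_decay(0.1, 50000, 100000, 0.96, staircase=True))   # still 0.1
print(exponential_decay(0.1, 100000, 100000, 0.96, staircase=True))  # 0.1 * 0.96
```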