Created June 25, 2017 14:07
learning rate decay in pytorch
# http://pytorch.org/tutorials/beginner/transfer_learning_tutorial.html
def exp_lr_scheduler(optimizer, epoch, init_lr=0.001, lr_decay_epoch=7):
    """Decay learning rate by a factor of 0.1 every lr_decay_epoch epochs."""
    lr = init_lr * (0.1 ** (epoch // lr_decay_epoch))

    if epoch % lr_decay_epoch == 0:
        print('LR is set to {}'.format(lr))

    for param_group in optimizer.param_groups:
        param_group['lr'] = lr

    return optimizer
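For context, the same step-decay schedule can also be expressed with PyTorch's built-in torch.optim.lr_scheduler.StepLR. The snippet below is a minimal sketch, not part of the original gist; the model, optimizer, and the placeholder training-loop comment are assumptions for illustration.

import torch.nn as nn
import torch.optim as optim
from torch.optim.lr_scheduler import StepLR

# Placeholder model and optimizer for the sketch.
model = nn.Linear(10, 2)
optimizer = optim.SGD(model.parameters(), lr=0.001)

# StepLR multiplies the lr by `gamma` every `step_size` epochs,
# matching exp_lr_scheduler(optimizer, epoch, init_lr=0.001, lr_decay_epoch=7).
scheduler = StepLR(optimizer, step_size=7, gamma=0.1)

for epoch in range(25):
    # ... forward / backward / optimizer.step() for one epoch goes here ...
    scheduler.step()  # advance the schedule once per epoch
    print(epoch, optimizer.param_groups[0]['lr'])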
Actually, you should pass the current learning rate, not the initial lr. Forgive me if you are already passing the updated lr. Thank you.
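Note that the function above does not need the current lr, because it recomputes the value from init_lr and epoch on every call and overwrites whatever is stored in the optimizer. If you instead want a multiplicative decay applied to whatever lr the optimizer currently holds, a minimal sketch of that variant (a hypothetical helper, not from the gist) could look like this:

def step_decay_current_lr(optimizer, epoch, lr_decay_epoch=7, decay_factor=0.1):
    """Multiply the optimizer's current lr by decay_factor every lr_decay_epoch epochs."""
    if epoch > 0 and epoch % lr_decay_epoch == 0:
        for param_group in optimizer.param_groups:
            param_group['lr'] *= decay_factor
        print('LR is set to {}'.format(optimizer.param_groups[0]['lr']))
    return optimizer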