Update Adam learning rate after 10 epochs #9853
Answered by rohitgr7
cmdrootaccess asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Hello, how can I update the learning rate of the Adam optimizer after 10 epochs?
Answered by rohitgr7, Oct 7, 2021
You can use LambdaLR, where the lambda function returns a multiplicative factor applied to the initial learning rate, for example: lambda epoch: 0.1 if epoch in self.lr_decay_epoch else 1
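A minimal sketch of how this could look in a LightningModule's configure_optimizers (the class name, the lr_decay_epoch=10 hyperparameter, and the 0.1 decay factor are illustrative assumptions, not from the thread). Note that LambdaLR rescales the initial learning rate by the returned factor at every epoch, so to keep the lower rate after epoch 10 the lambda returns the smaller factor from that epoch onward:

import torch
from torch import nn
from torch.optim.lr_scheduler import LambdaLR
import pytorch_lightning as pl


class LitModel(pl.LightningModule):
    def __init__(self, lr=1e-3, lr_decay_epoch=10, lr_decay_factor=0.1):
        super().__init__()
        self.save_hyperparameters()
        self.layer = nn.Linear(32, 2)

    def forward(self, x):
        return self.layer(x)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return nn.functional.cross_entropy(self(x), y)

    def configure_optimizers(self):
        optimizer = torch.optim.Adam(self.parameters(), lr=self.hparams.lr)
        # LambdaLR multiplies the *initial* LR by the factor returned for the
        # current epoch, so returning 0.1 from epoch 10 onward keeps the LR at
        # 10% of its starting value for the rest of training.
        scheduler = LambdaLR(
            optimizer,
            lr_lambda=lambda epoch: (
                self.hparams.lr_decay_factor
                if epoch >= self.hparams.lr_decay_epoch
                else 1.0
            ),
        )
        return {
            "optimizer": optimizer,
            "lr_scheduler": {"scheduler": scheduler, "interval": "epoch"},
        }

For a single decay point like this, torch.optim.lr_scheduler.MultiStepLR(optimizer, milestones=[10], gamma=0.1) is an equivalent, slightly more direct alternative.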
Answer selected by s-rog