Fix resetting epoch loop restarting flag in LearningRateFinder #19819
Conversation
Could you please check this fix, @awaelchli, @carmocca?
Thank you for the fix @clumsy, very nice!
Interesting, batch_size_finder actually worked for me, but I can check and make the change, sure. There was a more critical bug for the LR finder, I'm afraid, @awaelchli: #19814
This flag reset seems to be in place for batch_size_finder here, @awaelchli
What does this PR do?
`lr_finder` does not reset `fit_loop.epoch_loop.restarting` to `False`, which causes the validation check to trigger validation on the first `advance()` call because of this check.
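For context, here is a minimal sketch of the kind of reset described above, applied after the LR finder restores the checkpoint it saved before the sweep. The helper name and the exact restore call are assumptions for illustration, not the verbatim Lightning diff:

```python
# Sketch only: the helper name and restore call are assumptions, not the actual
# Lightning source. It illustrates clearing the restarting flag after the LR
# finder restores the state it saved before running the sweep.
def _restore_and_reset(trainer, ckpt_path: str) -> None:
    # Restoring the pre-sweep checkpoint marks the loops as "restarting".
    trainer._checkpoint_connector.restore(ckpt_path)

    # Without this reset, the epoch loop believes fit is resuming mid-epoch
    # and triggers validation on the first `advance()` call.
    trainer.fit_loop.epoch_loop.restarting = False
```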
Before submitting
PR review
Anyone in the community is welcome to review the PR.
Before you start reviewing, make sure you have read the review guidelines.
Reviewer checklist
📚 Documentation preview 📚: https://pytorch-lightning--19819.org.readthedocs.build/en/19819/