
Improved Documentation For Learning Rate Schedulers #5365

Merged
12 commits merged into allenai:main on Aug 23, 2021

Conversation

@gabeorlanski (Contributor) commented on Aug 18, 2021

Partially addresses #4835.

Changes proposed in this pull request:

  • Improved the documentation and added a basic config snippet example for the learning rate schedulers (a minimal sketch of such a snippet appears after this list)
  • Moved the PyTorch-wrapped learning rate schedulers into their own file, pytorch_lr_schedulers.py, so that they have their own docs page
  • Added links to the original PyTorch LR schedulers so that people can read their docs as well
  • Added the formula for the Noam scheduler to its docs page (reproduced after this list)
  • The only LR scheduler without an example is slanted_triangular.py, as I have not used it before and do not know of a valid config for it.
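
For reference, here is a minimal sketch of the kind of trainer config snippet these docs now include. The `"noam"` registration name and the `model_size`/`warmup_steps` parameters reflect AllenNLP's Noam scheduler as I understand it, but the surrounding fragment and the specific values are illustrative only and are not copied verbatim from the new docs:

```jsonnet
// Illustrative trainer fragment (not taken verbatim from this PR):
// the scheduler is configured under "learning_rate_scheduler" and
// wraps the optimizer's learning rate during training.
"trainer": {
    "optimizer": {
        "type": "adam",
        "lr": 0.001
    },
    "learning_rate_scheduler": {
        "type": "noam",
        "model_size": 512,
        "warmup_steps": 4000
    }
}
```

The Noam formula added to the docs is the schedule from "Attention Is All You Need"; writing `factor`, `d_model`, and `warmup_steps` for the scheduler's parameters, the learning rate at step `s` in the usual formulation is:

$$
\mathrm{lr}(s) = \text{factor} \cdot d_{\text{model}}^{-0.5} \cdot \min\!\left(s^{-0.5},\; s \cdot \text{warmup\_steps}^{-1.5}\right)
$$
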

Before submitting

  • I've read and followed all steps in the Making a pull request
    section of the CONTRIBUTING docs.
  • I've updated or added any relevant docstrings following the syntax described in the
    Writing docstrings section of the CONTRIBUTING docs.
  • If this PR fixes a bug, I've added a test that will fail without my fix.
  • If this PR adds a new feature, I've added tests that sufficiently cover my new functionality.

After submitting

  • All GitHub Actions jobs for my pull request have passed.
  • codecov/patch reports high test coverage (at least 90%).
    You can find this under the "Actions" tab of the pull request once the other checks have finished.

@AkshitaB self-requested a review on August 23, 2021 02:51
@AkshitaB (Contributor) left a comment


This is excellent! Thank you for the contribution.

@AkshitaB merged commit 01e8a35 into allenai:main on Aug 23, 2021