Improve pytorch examples for fp16 #9796
Conversation
Thanks for your PR, that's great! One last thing is to adapt the template example here if you don't mind.
Co-authored-by: Sylvain Gugger <[email protected]>
Thanks for addressing the comments, this is good to go IMO.
Very cool! LGTM!
What does this PR do?
When fp16 is enabled in the PyTorch training examples, pad sequences to a multiple of 8 (when the data collator in use supports it) to speed up training. If an example previously used no collator, add one configured with the padding option.
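The core idea can be sketched with a small, self-contained helper (an illustrative padding function only; the actual examples achieve this via the collator's `pad_to_multiple_of` argument, e.g. `DataCollatorWithPadding(tokenizer, pad_to_multiple_of=8)` — the exact wiring differs per example script):

```python
import math

def pad_to_multiple_of_8(input_ids, pad_token_id=0):
    """Pad a batch of token-id lists so every sequence length is a
    multiple of 8, which lets fp16 Tensor Core kernels run efficiently.
    Illustrative only; `pad_token_id=0` is an assumed placeholder."""
    longest = max(len(seq) for seq in input_ids)
    # Round the batch length up to the next multiple of 8.
    target = math.ceil(longest / 8) * 8
    return [seq + [pad_token_id] * (target - len(seq)) for seq in input_ids]

batch = [[101, 2054, 102], [101, 2054, 2003, 1037, 102]]
padded = pad_to_multiple_of_8(batch)
print([len(seq) for seq in padded])  # every sequence padded to length 8
```

Lengths that are multiples of 8 align with the tile sizes of fp16 Tensor Core matrix kernels, which is where the training speedup comes from.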
Fixes #9752
Who can review?
Trainer: @sgugger