[Trainer] Change num_train_epochs default value #8113
Conversation
Thanks for your contribution!
Force-pushed from 21f290a to 604e75e (compare).
lgtm
Codecov Report: All modified and coverable lines are covered by tests ✅

Additional details and impacted files:

@@           Coverage Diff            @@
##           develop    #8113   +/-   ##
=========================================
  Coverage    56.46%   56.46%
=========================================
  Files          596      596
  Lines        91583    91583
=========================================
  Hits         51711    51711
  Misses       39872    39872
=========================================

☔ View full report in Codecov by Sentry.
LGTM
@@ -127,7 +127,7 @@ class TrainingArguments:
            The epsilon hyperparameter for the [`AdamW`] optimizer.
        max_grad_norm (`float`, *optional*, defaults to 1.0):
            Maximum gradient norm (for gradient clipping).
        num_train_epochs(`float`, *optional*, defaults to 3.0):
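Since this PR changes the library default for num_train_epochs, scripts that relied on the old default will silently train for a different number of epochs after upgrading. A minimal sketch of pinning the value explicitly, assuming an HF-style TrainingArguments constructor as documented in the hunk above (the import path, output_dir value, and the pinned epoch count are illustrative assumptions, not part of this PR):

```python
# Sketch only: pin num_train_epochs explicitly so the run is unaffected
# by a change to the library default. Import path and field names are
# assumed from the HF-style docstring shown above.
from paddlenlp.trainer import TrainingArguments

training_args = TrainingArguments(
    output_dir="./checkpoints",   # output location (assumed placeholder)
    num_train_epochs=3.0,         # pin explicitly; do not rely on the default
    max_grad_norm=1.0,            # matches the documented default
)

print(training_args.num_train_epochs)  # 3.0 regardless of the library default
```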
The Chinese documentation has been updated to match.
LGTM
* change num_train_epochs default value
* update docs
PR types
Others
PR changes
Others
Description
Change num_train_epochs default value.
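For scripts that read training options from the command line, the same pinning can be done at parse time. This is a hedged sketch assuming a PdArgumentParser analogous to HfArgumentParser; the parser name, flag spelling, and output_dir value are assumptions, not confirmed by this PR:

```python
# Sketch: parse TrainingArguments from CLI-style flags and pass
# --num_train_epochs explicitly so a changed library default does not
# alter existing runs. Parser name and method are assumptions.
from paddlenlp.trainer import PdArgumentParser, TrainingArguments

parser = PdArgumentParser(TrainingArguments)
(training_args,) = parser.parse_args_into_dataclasses(
    args=["--output_dir", "./checkpoints", "--num_train_epochs", "3"]
)
print(training_args.num_train_epochs)
```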