Check max_time when setting defaults for min/max epochs (#9072)
Co-authored-by: tchaton <[email protected]>
2 people authored and lexierule committed Sep 1, 2021
1 parent 5dffe74 commit c0c3742
Showing 3 changed files with 11 additions and 3 deletions.
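
In short: before this fix, `Trainer` fell back to the defaults `min_epochs=1` and `max_epochs=1000` whenever the corresponding epoch/step arguments were left unset, even if the run was already bounded by `max_time`. A run configured only with `max_time` could therefore stop at epoch 1000 rather than at the requested time limit. The change below also checks `max_time` before applying the defaults. A minimal usage sketch, mirroring the test added in this commit:

```python
from pytorch_lightning import Trainer

# With only max_time given, no implicit epoch/step limits are applied;
# the time limit alone decides when training stops.
trainer = Trainer(max_time=dict(seconds=1))
assert trainer.max_epochs is None
assert trainer.max_steps is None
```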
5 changes: 4 additions & 1 deletion CHANGELOG.md
```diff
@@ -5,12 +5,15 @@ All notable changes to this project will be documented in this file.
 The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 
 
+## [1.4.5] - 2021-08-31
+
+- Fixed not setting a default value for `max_epochs` if `max_time` was specified on the `Trainer` constructor ([#9072](https://github.com/PyTorchLightning/pytorch-lightning/pull/9072))
+
 ## [1.4.4] - 2021-08-24
 
 - Fixed a bug in the binary search mode of auto batch size scaling where exception was raised if the first trainer run resulted in OOM ([#8954](https://github.com/PyTorchLightning/pytorch-lightning/pull/8954))
 - Fixed a bug causing logging with `log_gpu_memory='min_max'` not working ([#9013](https://github.com/PyTorchLightning/pytorch-lightning/pull/9013))
 
 
 ## [1.4.3] - 2021-08-17
 
 - Fixed plateau scheduler stepping on incomplete epoch ([#8861](https://github.com/PyTorchLightning/pytorch-lightning/pull/8861))
```
4 changes: 2 additions & 2 deletions pytorch_lightning/trainer/trainer.py
```diff
@@ -380,8 +380,8 @@ def __init__(
         self.tuner = Tuner(self)
 
         fit_loop = FitLoop(
-            min_epochs=(1 if (min_epochs is None and min_steps is None) else min_epochs),
-            max_epochs=(1000 if (max_epochs is None and max_steps is None) else max_epochs),
+            min_epochs=(1 if (min_epochs is None and min_steps is None and max_time is None) else min_epochs),
+            max_epochs=(1000 if (max_epochs is None and max_steps is None and max_time is None) else max_epochs),
         )
         training_epoch_loop = TrainingEpochLoop(min_steps, max_steps)
         training_batch_loop = TrainingBatchLoop()
```
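
Read as a standalone helper, the changed conditionals amount to the following (an illustrative restatement for clarity; `_resolve_loop_defaults` is a hypothetical name, not code from the repository):

```python
from typing import Optional, Tuple

def _resolve_loop_defaults(
    min_epochs: Optional[int],
    max_epochs: Optional[int],
    min_steps: Optional[int],
    max_steps: Optional[int],
    max_time,  # str, timedelta, or dict in the Trainer signature
) -> Tuple[Optional[int], Optional[int]]:
    # Fall back to the historical defaults (min_epochs=1, max_epochs=1000)
    # only when the user supplied no stopping condition at all; a step
    # limit or a time limit now counts as an explicit choice.
    if min_epochs is None and min_steps is None and max_time is None:
        min_epochs = 1
    if max_epochs is None and max_steps is None and max_time is None:
        max_epochs = 1000
    return min_epochs, max_epochs
```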
5 changes: 5 additions & 0 deletions tests/callbacks/test_timer.py
```diff
@@ -42,6 +42,11 @@ def on_fit_start(self):
         trainer.fit(TestModel())
     assert "callbacks list already contains a Timer" in caplog.text
 
+    seconds = 1
+    trainer = Trainer(max_time=dict(seconds=seconds))
+    assert trainer.max_epochs is None
+    assert trainer.max_steps is None
+
 
 @pytest.mark.parametrize(
     "duration,expected",
```
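
The context lines above come from the same test, which also checks that an explicit `Timer` callback takes precedence over `max_time`. A hedged sketch of that interaction (assuming the 1.4-era import path `pytorch_lightning.callbacks.Timer` and its `duration` argument):

```python
from pytorch_lightning import Trainer
from pytorch_lightning.callbacks import Timer

# With a Timer already in the callbacks list, Trainer logs that the
# callbacks list already contains a Timer and keeps the explicit one
# rather than creating a second Timer from max_time.
trainer = Trainer(callbacks=[Timer(duration=dict(seconds=1))], max_time=dict(seconds=2))
```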
