add types in bolts/callbacks #444
Conversation
Hello @ydcjeff! Thanks for updating this PR. There are currently no PEP 8 issues detected in this Pull Request. Cheers! 🍻 Comment last updated at 2021-01-04 11:08:33 UTC
I need help with the failing mypy check.
Codecov Report

```diff
@@            Coverage Diff             @@
##           master     #444      +/-   ##
==========================================
- Coverage   79.15%   79.10%   -0.05%
==========================================
  Files         102      102
  Lines        5906     5921      +15
==========================================
+ Hits         4675     4684       +9
- Misses       1231     1237       +6
```
matplotlib is needed for adding types in the confused logits callback. How should we handle it?
@ydcjeff can we resolve conflicts?
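One common pattern for this (a sketch only, not necessarily what this PR adopts) is to import matplotlib solely for the type checker via `typing.TYPE_CHECKING`, so annotations can reference its classes without making it a hard runtime dependency; `plot_histogram` below is a hypothetical helper, not a bolts function:

```python
from typing import TYPE_CHECKING, Sequence

if TYPE_CHECKING:
    # Only imported while type checking; matplotlib stays optional at runtime.
    from matplotlib.figure import Figure


def plot_histogram(values: Sequence[float]) -> "Figure":
    # Hypothetical helper: the real import is deferred until the function is
    # actually called, so importing this module never requires matplotlib.
    import matplotlib.pyplot as plt

    fig, ax = plt.subplots()
    ax.hist(values)
    return fig
```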
LGTM other than the below comment! Thank you for your work!
Co-authored-by: Akihiro Nitta <[email protected]>
The docs fail; probably we need to add …
I have tested locally; the docs build still fails even after installing matplotlib.
@ydcjeff Sorry for the delay! I looked through how other OSS projects deal with types from optional packages, and I found that Optuna assigns `object` to the type names (class names) when the optional packages are unavailable: https://github.com/optuna/optuna/blob/392088c25d661b997c4143fd18ccdb7480ffa825/optuna/integration/pytorch_lightning.py#L4-L12
Do you think following this convention sounds reasonable?
Co-authored-by: Akihiro Nitta <[email protected]>
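A minimal sketch of that Optuna-style convention, assuming a bolts-like module with an optional matplotlib dependency (the flag name `_MATPLOTLIB_AVAILABLE` is illustrative, not the actual bolts constant):

```python
try:
    from matplotlib.axes import Axes
    from matplotlib.figure import Figure

    _MATPLOTLIB_AVAILABLE = True
except ImportError:
    # Fall back to `object` so annotations like `-> Figure` still resolve
    # when matplotlib is not installed.
    Figure = object  # type: ignore
    Axes = object  # type: ignore
    _MATPLOTLIB_AVAILABLE = False
```

This keeps the module importable without the optional package while the annotations remain valid names for both the runtime and the type checker.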
@ydcjeff lgtm. Thank you!
```python
        self,
        trainer: Trainer,
        pl_module: LightningModule,
        batch: Sequence,
```
same here
```diff
     self._train_batch_idx = batch_idx

-    def log_histograms(self, batch, group="") -> None:
+    def log_histograms(self, batch: Sequence, group: str = "") -> None:
```
same here
```diff
     self._trainer = trainer

     def on_train_batch_start(
-        self, trainer, pl_module, batch, batch_idx, dataloader_idx
+        self, trainer: Trainer, pl_module: LightningModule, batch: Sequence, batch_idx: int, dataloader_idx: int
     ):
```
For `batch` it is too strict; this could be almost anything (`Any`).
@ydcjeff mind sending another PR addressing @awaelchli's comments?
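A sketch of the relaxed annotation suggested here, using a hypothetical `BatchIndexTracker` callback rather than the actual bolts class:

```python
from typing import Any

from pytorch_lightning import Callback, LightningModule, Trainer


class BatchIndexTracker(Callback):
    """Hypothetical callback showing `batch: Any` instead of `batch: Sequence`."""

    def on_train_batch_start(
        self, trainer: Trainer, pl_module: LightningModule, batch: Any, batch_idx: int, dataloader_idx: int
    ) -> None:
        # A batch can be a tensor, a tuple, a dict, or any custom structure the
        # DataLoader yields, so `Any` is the least restrictive honest annotation.
        self._train_batch_idx = batch_idx
```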
What does this PR do?
Part of #434
Before submitting
PR review
Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.
Did you have fun?
Make sure you had fun coding 🙃