
Add typing to the Trainer object within LightningModules #12337

Closed
johnhenning opened this issue Mar 15, 2022 · 2 comments · Fixed by #12345
johnhenning commented Mar 15, 2022

Proposed refactor

I want to add a type annotation for the trainer attribute within the LightningModule.

Motivation

This would improve IDE autocompletion and static type checking for code that accesses self.trainer.

Pitch

I would like to make the following addition:

[LightningModule](https://github.com/PyTorchLightning/pytorch-lightning/blob/master/pytorch_lightning/core/lightning.py#L97)

class LightningModule(...):
    ...
    def __init__(self, *args: Any, **kwargs: Any) -> None:
        ...
        self.trainer = None

changed to

class LightningModule(...):
    ...
    def __init__(self, *args: Any, **kwargs: Any) -> None:
        ...
        self.trainer: Optional[Trainer] = None
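
One practical wrinkle with this annotation: importing Trainer at module level inside the LightningModule file would typically create a circular import, since the trainer module already imports the module class. A common pattern is a TYPE_CHECKING-guarded import combined with a string forward reference. A minimal, self-contained sketch of that pattern (the class body here is a stand-in for illustration, not the real LightningModule):

```python
from typing import TYPE_CHECKING, Any, Optional

# Guarded import: evaluated only by static type checkers (mypy, IDEs),
# never at runtime, so no circular import occurs.
if TYPE_CHECKING:
    from pytorch_lightning import Trainer


class LightningModule:
    def __init__(self, *args: Any, **kwargs: Any) -> None:
        # Annotated as Optional because the trainer is attached later,
        # when a Trainer connects itself to the module before fitting.
        # The string annotation ("Trainer") is a forward reference that
        # type checkers resolve via the guarded import above.
        self.trainer: Optional["Trainer"] = None
```

With this in place, an IDE can resolve `self.trainer.` to Trainer's attributes while runtime behavior is unchanged: the attribute still starts as None.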

Additional context

See changes here



cc @Borda

@akihironitta (Contributor)

Hi @jlhbaseball15, thanks for your suggestion! We're trying to add annotations to the whole codebase in #7073. Would you be interested in submitting a PR?

@akihironitta added the "good first issue" label Mar 15, 2022
@johnhenning (Contributor, Author)

Yep, I already have a branch on a fork. I'd be happy to work on more typing overall as well!

@carmocca carmocca added this to the 1.6 milestone Mar 25, 2022