WIP: Accelerator refactor #5385
Conversation
from pytorch_lightning.accelerators.accelerator import Accelerator
from pytorch_lightning.accelerators.cpu import CPUAccelerator
from pytorch_lightning.accelerators.gpu import GPUAccelerator
from pytorch_lightning.accelerators.tpu import TPUAccelerator
👍 ... similar to the trainer's new attributes _distrib_type and _device_type
on the other hand, it will be a breaking change... :/
self.use_dp = False
self.use_ddp = False
self.use_ddp2 = False
self.use_horovod = False
self.use_single_gpu = False
this is refactored to _distrib_type, but we can rename it if you want...
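For readers following along, here is a minimal sketch of what the enum-based replacement looks like; the member names come from this thread, while everything else (the values and exact enum shape) is an assumption rather than the actual pytorch_lightning.utilities definitions:

```python
from enum import Enum


class DistributedType(str, Enum):
    # hypothetical members mirroring the old booleans; the real enum
    # in pytorch_lightning.utilities may define more/different values
    DP = "dp"
    DDP = "ddp"
    DDP2 = "ddp2"
    HOROVOD = "horovod"


class DeviceType(str, Enum):
    CPU = "cpu"
    GPU = "gpu"
    TPU = "tpu"


# the five use_* booleans collapse into two trainer attributes, e.g.:
# self._distrib_type = DistributedType.DDP
# self._device_type = DeviceType.GPU
```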
Yes, I know. We need to update the branch with these changes; it's a bit challenging to keep it up to date right now. But we will keep it in mind, thanks!
probably no need for an update as you will cut this work into multiple PRs, just for you to be aware of it... :]
Could we have some high-level summary? I see that accelerators are for HW (as in the description), but a summary of how the plugins are linked would help (I would move them to the same level as accelerators, since they are partners rather than a sub-category). And can you have "any" combination?
We made some slides (see the link in the description above) for a high-level overview. For some plugins and accelerators, multiple combinations are possible, but some configurations are obviously invalid. For our built-in modes we take care of producing a valid configuration, but users can define their own accelerator and compatible plugins, or re-use our built-in ones.
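For illustration, a sketch of how such a custom combination might be wired up; the plugin class names and the constructor signature are assumptions based on the direction of this refactor, not necessarily the API that finally merges:

```python
from pytorch_lightning import Trainer
from pytorch_lightning.accelerators.gpu import GPUAccelerator

# hypothetical plugin imports; the actual module paths may differ
from pytorch_lightning.plugins import DDPPlugin, NativeMixedPrecisionPlugin

# compose a hardware accelerator with a training-type and a precision plugin
accelerator = GPUAccelerator(
    precision_plugin=NativeMixedPrecisionPlugin(),
    training_type_plugin=DDPPlugin(),
)
trainer = Trainer(accelerator=accelerator, gpus=2)
```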
Hello @justusschock! Thanks for updating this PR.
Comment last updated at 2021-01-22 09:54:14 UTC
… + enable custom fp16 plugin automatically
What does this PR do?
Fixes #4510
This PR separates the Accelerator (the hardware part) from the different training routines.
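Conceptually, the separation might be sketched like this (a simplified illustration, not the exact classes in this PR): the accelerator owns the hardware concerns and delegates the training routine to a plugin:

```python
class TrainingTypePlugin:
    """Encapsulates one training routine (e.g. DP, DDP, Horovod)."""

    def setup(self, model):
        ...


class PrecisionPlugin:
    """Encapsulates precision handling (e.g. native AMP, Apex)."""


class Accelerator:
    """Owns the hardware side (CPU/GPU/TPU) and delegates the
    distributed routine to the training-type plugin."""

    def __init__(self, precision_plugin, training_type_plugin):
        self.precision_plugin = precision_plugin
        self.training_type_plugin = training_type_plugin

    def setup(self, trainer, model):
        # hardware-specific setup would live in subclasses such as
        # GPUAccelerator; the routine itself is delegated
        self.training_type_plugin.setup(model)
```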
Remaining TODOs:
So far, this PR has been co-authored with @awaelchli!
cc @awaelchli @tchaton @SeanNaren who will likely work on this!
Slides for motivation and high-level overview
List of PRs to look out for (when rebasing, code we need to manually copy over to new files):
#5221, #5195, #5300, #5388