remove legacy plugins #5950

Merged: 26 commits (Feb 16, 2021)

Commits
103533c  remove legacy plugins (Borda, Feb 12, 2021)
28b4089  imports (Borda, Feb 12, 2021)
9bb9d2e  formatting (Borda, Feb 13, 2021)
f6095d6  fix docs references (awaelchli, Feb 14, 2021)
6b4ad1f  fix cluster environment inheritance (awaelchli, Feb 14, 2021)
62e5ea4  remove precision connector (awaelchli, Feb 14, 2021)
cb477c2  sequential plugin (awaelchli, Feb 14, 2021)
38d9920  fix rpc example (awaelchli, Feb 14, 2021)
c54d878  pep8 (awaelchli, Feb 14, 2021)
00513ce  rename (awaelchli, Feb 14, 2021)
499a743  Merge branch 'master' into refactor/legacy-cleanup3 (mergify[bot], Feb 15, 2021)
453634f  Merge remote-tracking branch 'origin/refactor/legacy-cleanup3' into r… (awaelchli, Feb 15, 2021)
96927e8  Merge branch 'master' into refactor/legacy-cleanup3 (mergify[bot], Feb 15, 2021)
6028546  Merge branch 'master' into refactor/legacy-cleanup3 (mergify[bot], Feb 15, 2021)
d7a765d  Merge branch 'master' into refactor/legacy-cleanup3 (mergify[bot], Feb 15, 2021)
874f020  Merge branch 'master' into refactor/legacy-cleanup3 (mergify[bot], Feb 15, 2021)
b42daa5  Merge branch 'master' into refactor/legacy-cleanup3 (mergify[bot], Feb 15, 2021)
2aa8fe4  Merge branch 'master' into refactor/legacy-cleanup3 (mergify[bot], Feb 15, 2021)
c0abb89  Merge branch 'master' into refactor/legacy-cleanup3 (mergify[bot], Feb 15, 2021)
84716be  Merge branch 'master' into refactor/legacy-cleanup3 (mergify[bot], Feb 15, 2021)
4dda459  Merge branch 'master' into refactor/legacy-cleanup3 (mergify[bot], Feb 15, 2021)
713c85f  Merge branch 'master' into refactor/legacy-cleanup3 (mergify[bot], Feb 16, 2021)
612f266  Merge branch 'master' into refactor/legacy-cleanup3 (mergify[bot], Feb 16, 2021)
a41eeca  Merge branch 'master' into refactor/legacy-cleanup3 (mergify[bot], Feb 16, 2021)
7d6b39d  Merge branch 'master' into refactor/legacy-cleanup3 (mergify[bot], Feb 16, 2021)
b6c3575  Merge branch 'master' into refactor/legacy-cleanup3 (mergify[bot], Feb 16, 2021)
Files changed
4 changes: 0 additions & 4 deletions .yapfignore
@@ -1,5 +1 @@
.git/*
-
-
-# TODO
-pytorch_lightning/plugins/legacy/*
10 changes: 5 additions & 5 deletions docs/source/advanced/multi_gpu.rst
@@ -580,9 +580,9 @@ Below are the possible configurations we support.

Implement Your Own Distributed (DDP) training
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-If you need your own way to init PyTorch DDP you can override :meth:`pytorch_lightning.plugins.legacy.ddp_plugin.DDPPlugin.init_ddp_connection`.
+If you need your own way to init PyTorch DDP you can override :meth:`pytorch_lightning.plugins.training_type.ddp.DDPPlugin.init_ddp_connection`.

-If you also need to use your own DDP implementation, override :meth:`pytorch_lightning.plugins.legacy.ddp_plugin.DDPPlugin.configure_ddp`.
+If you also need to use your own DDP implementation, override :meth:`pytorch_lightning.plugins.training_type.ddp.DDPPlugin.configure_ddp`.
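
For context, a minimal sketch of such an override against the new import path might look as follows. This block is not part of the diff; the hook signature and backend choice are assumptions based on the 1.2-era API, and ``configure_ddp`` can be overridden in the same way.

import torch.distributed as dist

from pytorch_lightning.plugins.training_type.ddp import DDPPlugin


class MyDDPPlugin(DDPPlugin):

    def init_ddp_connection(self, global_rank, world_size):
        # hypothetical custom init: pick the backend ourselves; the default
        # env:// rendezvous assumes MASTER_ADDR/MASTER_PORT are set by the launcher
        if not dist.is_initialized():
            dist.init_process_group("gloo", rank=global_rank, world_size=world_size)

The subclass is then passed to the ``Trainer`` via its ``plugins`` argument, as in the other examples on this page; constructor arguments for ``DDPPlugin`` subclasses vary between releases, so treat this as a sketch rather than a drop-in.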


----------
@@ -679,7 +679,7 @@ In addition, we use Gradient Checkpointing to reduce GPU memory requirements fur

Reference: https://arxiv.org/abs/1811.06965

-.. note:: DDPSequentialPlugin is currently supported only for Pytorch 1.6.
+.. note:: RPCSequentialPlugin is currently supported only for Pytorch 1.6.

To get started, install FairScale using the command below. We install a specific branch which contains PyTorch related fixes for Sequential Parallelism.

@@ -692,7 +692,7 @@ This should be kept within the ``sequential_module`` variable within your ``Ligh

.. code-block:: python

-from pytorch_lightning.plugins.legacy.ddp_sequential_plugin import DDPSequentialPlugin
+from pytorch_lightning.plugins.training_type.rpc_sequential import RPCSequentialPlugin
from pytorch_lightning import LightningModule

class MyModel(LightningModule):
@@ -702,7 +702,7 @@ This should be kept within the ``sequential_module`` variable within your ``Ligh

# Split my module across 4 gpus, one layer each
model = MyModel()
-plugin = DDPSequentialPlugin(balance=[1, 1, 1, 1])
+plugin = RPCSequentialPlugin(balance=[1, 1, 1, 1])
trainer = Trainer(accelerator='ddp', gpus=4, plugins=[plugin])
trainer.fit(model)

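As an illustration of the ``sequential_module`` layout the snippet above expects, a hedged sketch follows. It is not part of the diff: the layer sizes and the ``MyModel`` body are invented, since the class body is collapsed in the hunk.

import torch.nn as nn
from pytorch_lightning import LightningModule


class MyModel(LightningModule):

    def __init__(self):
        super().__init__()
        # four child modules, matching balance=[1, 1, 1, 1] above (one per GPU);
        # the plugin partitions this nn.Sequential across devices
        self.sequential_module = nn.Sequential(
            nn.Linear(32, 64),
            nn.ReLU(),
            nn.Linear(64, 64),
            nn.Linear(64, 2),
        )

    def forward(self, x):
        # training_step / configure_optimizers omitted for brevity
        return self.sequential_module(x)
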
12 changes: 6 additions & 6 deletions pl_examples/basic_examples/conv_sequential_example.py
@@ -18,7 +18,7 @@
to balance across your GPUs.

To run:
-python conv_model_sequential_example.py --accelerator ddp --gpus 4 --max_epochs 1 --batch_size 256 --use_ddp_sequential
+python conv_model_sequential_example.py --accelerator ddp --gpus 4 --max_epochs 1 --batch_size 256 --use_rpc_sequential
"""
import math
from argparse import ArgumentParser
@@ -32,7 +32,7 @@
from pl_examples import cli_lightning_logo
from pytorch_lightning import Trainer
from pytorch_lightning.metrics.functional import accuracy
-from pytorch_lightning.plugins.legacy.ddp_sequential_plugin import DDPSequentialPlugin
+from pytorch_lightning.plugins import RPCSequentialPlugin
from pytorch_lightning.utilities import _BOLTS_AVAILABLE, _FAIRSCALE_PIPE_AVAILABLE

if _BOLTS_AVAILABLE:
@@ -201,7 +201,7 @@ def instantiate_datamodule(args):
if __name__ == "__main__":
cli_lightning_logo()
parser = ArgumentParser(description="Pipe Example")
parser.add_argument("--use_ddp_sequential", action="store_true")
parser.add_argument("--use_rpc_sequential", action="store_true")
parser = Trainer.add_argparse_args(parser)
parser = pl_bolts.datamodules.CIFAR10DataModule.add_argparse_args(parser)
args = parser.parse_args()
@@ -212,8 +212,8 @@ def instantiate_datamodule(args):
cifar10_dm = instantiate_datamodule(args)

plugins = None
-if args.use_ddp_sequential:
-    plugins = DDPSequentialPlugin()
+if args.use_rpc_sequential:
+    plugins = RPCSequentialPlugin()

model = LitResnet(batch_size=args.batch_size, manual_optimization=not args.automatic_optimization)

@@ -223,4 +223,4 @@

if trainer.accelerator_backend.rpc_enabled:
# Called at the end of trainer to ensure all processes are killed
-trainer.accelerator_backend.ddp_plugin.exit_rpc_process()
+trainer.training_type_plugin.exit_rpc_process()
@@ -12,10 +12,8 @@
# See the License for the specific language governing permissions and
# limitations under the License.

-from pytorch_lightning.plugins.legacy.plugin import LightningPlugin


-class ClusterEnvironment(LightningPlugin):
+class ClusterEnvironment:

def __init__(self):
self._world_size = None
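
With the ``LightningPlugin`` base removed, a custom environment is now a plain subclass. A hedged sketch follows; it is not part of this diff, it assumes the ``ClusterEnvironment`` class from the hunk above is importable (its file path is not visible in this excerpt), and the method names and ``MY_*`` environment variables are illustrative guesses based on the 1.2-era base class.

import os


class MyClusterEnvironment(ClusterEnvironment):

    def master_address(self):
        # read the rendezvous address from our (hypothetical) scheduler variables
        return os.environ.get("MY_MASTER_ADDR", "127.0.0.1")

    def master_port(self):
        return int(os.environ.get("MY_MASTER_PORT", 29500))

    def world_size(self):
        return int(os.environ.get("MY_WORLD_SIZE", 1))

    def local_rank(self):
        return int(os.environ.get("MY_LOCAL_RANK", 0))
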
144 changes: 0 additions & 144 deletions pytorch_lightning/plugins/legacy/apex.py

This file was deleted.
