Reroute profiler to profilers (#12308)
Co-authored-by: Carlos Mocholí <[email protected]>
Co-authored-by: Akihiro Nitta <[email protected]>
3 people committed Jun 22, 2022
1 parent 64d3ddf commit 7090ba9
Showing 20 changed files with 324 additions and 26 deletions.
4 changes: 2 additions & 2 deletions .github/CODEOWNERS
Original file line number Diff line number Diff line change
@@ -33,8 +33,8 @@
/src/pytorch_lightning/loops @tchaton @awaelchli @justusschock @carmocca
/src/pytorch_lightning/overrides @tchaton @SeanNaren @borda
/src/pytorch_lightning/plugins @tchaton @SeanNaren @awaelchli @justusschock
-/src/pytorch_lightning/profiler @williamfalcon @tchaton @borda @carmocca
-/src/pytorch_lightning/profiler/pytorch.py @nbcsm @guotuofeng
+/src/pytorch_lightning/profilers @williamfalcon @tchaton @borda @carmocca
+/src/pytorch_lightning/profilers/pytorch.py @nbcsm @guotuofeng
/src/pytorch_lightning/strategies @tchaton @SeanNaren @awaelchli @justusschock @kaushikb11
/src/pytorch_lightning/trainer @williamfalcon @borda @tchaton @SeanNaren @carmocca @awaelchli @justusschock @kaushikb11
/src/pytorch_lightning/trainer/connectors @tchaton @SeanNaren @carmocca @borda
4 changes: 4 additions & 0 deletions CHANGELOG.md
@@ -143,6 +143,10 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
- Deprecated LightningCLI's registries in favor of importing the respective package ([#13221](https://github.com/PyTorchLightning/pytorch-lightning/pull/13221))



+- Deprecated `pytorch_lightning.profiler` in favor of `pytorch_lightning.profilers` ([#12308](https://github.com/PyTorchLightning/pytorch-lightning/pull/12308))


### Removed

- Removed the deprecated `Logger.close` method ([#13149](https://github.com/PyTorchLightning/pytorch-lightning/pull/13149))
2 changes: 1 addition & 1 deletion dockers/tpu-tests/tpu_test_cases.jsonnet
@@ -36,7 +36,7 @@ local tputests = base.BaseTest {
# TODO (@kaushikb11): Add device stats tests here
coverage run --source=pytorch_lightning -m pytest -v --capture=no \
strategies/test_tpu_spawn.py \
-profiler/test_xla_profiler.py \
+profilers/test_xla_profiler.py \
accelerators/test_tpu.py \
models/test_tpu.py \
plugins/environments/test_xla_environment.py
2 changes: 1 addition & 1 deletion docs/source-pytorch/api_references.rst
@@ -232,7 +232,7 @@ others
profiler
--------

-.. currentmodule:: pytorch_lightning.profiler
+.. currentmodule:: pytorch_lightning.profilers

.. autosummary::
:toctree: api
2 changes: 1 addition & 1 deletion docs/source-pytorch/common/trainer.rst
@@ -1213,7 +1213,7 @@ See the :doc:`profiler documentation <../tuning/profiler>`. for more details.

.. testcode::

-from pytorch_lightning.profiler import SimpleProfiler, AdvancedProfiler
+from pytorch_lightning.profilers import SimpleProfiler, AdvancedProfiler

# default used by the Trainer
trainer = Trainer(profiler=None)
4 changes: 2 additions & 2 deletions docs/source-pytorch/tuning/profiler_advanced.rst
@@ -12,11 +12,11 @@ Find bottlenecks in your code (advanced)
************************
Profile cloud TPU models
************************
-To profile TPU models use the :class:`~pytorch_lightning.profiler.xla.XLAProfiler`
+To profile TPU models use the :class:`~pytorch_lightning.profilers.xla.XLAProfiler`

.. code-block:: python
-from pytorch_lightning.profiler import XLAProfiler
+from pytorch_lightning.profilers import XLAProfiler
profiler = XLAProfiler(port=9001)
trainer = Trainer(profiler=profiler)
4 changes: 2 additions & 2 deletions docs/source-pytorch/tuning/profiler_basic.rst
@@ -68,7 +68,7 @@ The simple profiler measures all the standard methods used in the training loop
**************************************
Profile the time within every function
**************************************
-To profile the time within every function, use the :class:`~pytorch_lightning.profiler.advanced.AdvancedProfiler` built on top of Python's `cProfile <https://docs.python.org/3/library/profile.html#module-cProfile>`_.
+To profile the time within every function, use the :class:`~pytorch_lightning.profilers.advanced.AdvancedProfiler` built on top of Python's `cProfile <https://docs.python.org/3/library/profile.html#module-cProfile>`_.


.. code-block:: python
@@ -101,7 +101,7 @@ If the profiler report becomes too long, you can stream the report to a file:

.. code-block:: python
-from pytorch_lightning.profiler import AdvancedProfiler
+from pytorch_lightning.profilers import AdvancedProfiler
profiler = AdvancedProfiler(dirpath=".", filename="perf_logs")
trainer = Trainer(profiler=profiler)
8 changes: 4 additions & 4 deletions docs/source-pytorch/tuning/profiler_expert.rst
@@ -12,12 +12,12 @@ Find bottlenecks in your code (expert)
***********************
Build your own profiler
***********************
-To build your own profiler, subclass :class:`~pytorch_lightning.profiler.base.Profiler`
+To build your own profiler, subclass :class:`~pytorch_lightning.profilers.profiler.Profiler`
and override some of its methods. Here is a simple example that profiles the first occurrence and total calls of each action:

.. code-block:: python
-from pytorch_lightning.profiler import Profiler
+from pytorch_lightning.profilers import Profiler
from collections import defaultdict
import time
@@ -69,7 +69,7 @@ To profile a specific action of interest, reference a profiler in the LightningM

.. code-block:: python
-from pytorch_lightning.profiler import SimpleProfiler, PassThroughProfiler
+from pytorch_lightning.profilers import SimpleProfiler, PassThroughProfiler
class MyModel(LightningModule):
@@ -90,7 +90,7 @@ Here's the full code:

.. code-block:: python
-from pytorch_lightning.profiler import SimpleProfiler, PassThroughProfiler
+from pytorch_lightning.profilers import SimpleProfiler, PassThroughProfiler
class MyModel(LightningModule):
12 changes: 6 additions & 6 deletions docs/source-pytorch/tuning/profiler_intermediate.rst
@@ -12,11 +12,11 @@ Find bottlenecks in your code (intermediate)
**************************
Profile pytorch operations
**************************
-To understand the cost of each PyTorch operation, use the :class:`~pytorch_lightning.profiler.pytorch.PyTorchProfiler` built on top of the `PyTorch profiler <https://pytorch.org/docs/master/profiler.html>`__.
+To understand the cost of each PyTorch operation, use the :class:`~pytorch_lightning.profilers.pytorch.PyTorchProfiler` built on top of the `PyTorch profiler <https://pytorch.org/docs/master/profiler.html>`__.

.. code-block:: python
-from pytorch_lightning.profiler import PyTorchProfiler
+from pytorch_lightning.profilers import PyTorchProfiler
profiler = PyTorchProfiler()
trainer = Trainer(profiler=profiler)
@@ -65,11 +65,11 @@ The profiler will generate an output like this:
***************************
Profile a distributed model
***************************
-To profile a distributed model, use the :class:`~pytorch_lightning.profiler.pytorch.PyTorchProfiler` with the *filename* argument which will save a report per rank.
+To profile a distributed model, use the :class:`~pytorch_lightning.profilers.pytorch.PyTorchProfiler` with the *filename* argument which will save a report per rank.

.. code-block:: python
-from pytorch_lightning.profiler import PyTorchProfiler
+from pytorch_lightning.profilers import PyTorchProfiler
profiler = PyTorchProfiler(filename="perf-logs")
trainer = Trainer(profiler=profiler)
@@ -153,11 +153,11 @@ to extend the scope of profiled functions.
*****************************
Visualize profiled operations
*****************************
-To visualize the profiled operations, enable **emit_nvtx** in the :class:`~pytorch_lightning.profiler.pytorch.PyTorchProfiler`.
+To visualize the profiled operations, enable **emit_nvtx** in the :class:`~pytorch_lightning.profilers.pytorch.PyTorchProfiler`.

.. code-block:: python
-from pytorch_lightning.profiler import PyTorchProfiler
+from pytorch_lightning.profilers import PyTorchProfiler
profiler = PyTorchProfiler(emit_nvtx=True)
trainer = Trainer(profiler=profiler)
8 changes: 4 additions & 4 deletions pyproject.toml
@@ -78,10 +78,10 @@ module = [
"pytorch_lightning.strategies.single_tpu",
"pytorch_lightning.strategies.tpu_spawn",
"pytorch_lightning.strategies.strategy",
-"pytorch_lightning.profiler.advanced",
-"pytorch_lightning.profiler.base",
-"pytorch_lightning.profiler.pytorch",
-"pytorch_lightning.profiler.simple",
+"pytorch_lightning.profilers.advanced",
+"pytorch_lightning.profilers.base",
+"pytorch_lightning.profilers.pytorch",
+"pytorch_lightning.profilers.simple",
"pytorch_lightning.trainer.callback_hook",
"pytorch_lightning.trainer.connectors.callback_connector",
"pytorch_lightning.trainer.connectors.data_connector",
31 changes: 31 additions & 0 deletions src/pytorch_lightning/profiler/__init__.py
@@ -0,0 +1,31 @@
# Copyright The PyTorch Lightning team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from pytorch_lightning.profiler.base import AbstractProfiler, BaseProfiler
from pytorch_lightning.profilers.advanced import AdvancedProfiler
from pytorch_lightning.profilers.base import PassThroughProfiler
from pytorch_lightning.profilers.profiler import Profiler
from pytorch_lightning.profilers.pytorch import PyTorchProfiler
from pytorch_lightning.profilers.simple import SimpleProfiler
from pytorch_lightning.profilers.xla import XLAProfiler

__all__ = [
"AbstractProfiler",
"BaseProfiler",
"Profiler",
"AdvancedProfiler",
"PassThroughProfiler",
"PyTorchProfiler",
"SimpleProfiler",
"XLAProfiler",
]
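The new `profiler/__init__.py` keeps the old package importable by re-exporting the classes that now live under `profilers`, so both import paths resolve to the very same objects. A toy sketch of that aliasing, using invented module names (`profilers_demo`, `profiler_demo`) rather than the real packages:

```python
import sys
import types


# Hypothetical "new" module holding the real class.
new_mod = types.ModuleType("profilers_demo")


class SimpleProfiler:
    """Stand-in for a relocated profiler class."""


new_mod.SimpleProfiler = SimpleProfiler
sys.modules["profilers_demo"] = new_mod

# Hypothetical "old" module that simply re-exports the same name.
old_mod = types.ModuleType("profiler_demo")
old_mod.SimpleProfiler = SimpleProfiler
old_mod.__all__ = ["SimpleProfiler"]
sys.modules["profiler_demo"] = old_mod

# Both import paths hand back the identical class object.
from profiler_demo import SimpleProfiler as OldPath
from profilers_demo import SimpleProfiler as NewPath

print(OldPath is NewPath)  # True
```

Because the re-export is an alias rather than a copy, `isinstance` checks and subclass relationships keep working no matter which path user code imported from.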
24 changes: 24 additions & 0 deletions src/pytorch_lightning/profiler/advanced.py
@@ -0,0 +1,24 @@
# Copyright The PyTorch Lightning team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from pytorch_lightning.profilers.advanced import AdvancedProfiler as NewAdvancedProfiler
from pytorch_lightning.utilities import rank_zero_deprecation


class AdvancedProfiler(NewAdvancedProfiler):
def __init__(self, *args, **kwargs) -> None: # type: ignore[no-untyped-def]
rank_zero_deprecation(
"`pytorch_lightning.profiler.AdvancedProfiler` is deprecated in v1.7 and will be removed in v1.9."
" Use the equivalent `pytorch_lightning.profilers.AdvancedProfiler` class instead."
)
super().__init__(*args, **kwargs)
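The shim above illustrates the backward-compatibility pattern used throughout this commit: the class at the old import path subclasses the relocated one and emits a deprecation warning on construction, so existing code keeps working unchanged. A minimal self-contained sketch of the same pattern, substituting the standard `warnings` module for Lightning's `rank_zero_deprecation` and using invented class names:

```python
import warnings


class AdvancedProfilerNew:
    """Stand-in for the class at its new location."""

    def __init__(self, dirpath=None, filename=None):
        self.dirpath = dirpath
        self.filename = filename


class AdvancedProfilerOld(AdvancedProfilerNew):
    """Old import path kept alive: identical behavior plus a warning."""

    def __init__(self, *args, **kwargs):
        warnings.warn(
            "`AdvancedProfilerOld` is deprecated; import from the new module instead.",
            DeprecationWarning,
            stacklevel=2,
        )
        super().__init__(*args, **kwargs)


# Constructing via the old path warns but still produces a fully working instance.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    profiler = AdvancedProfilerOld(dirpath=".", filename="perf_logs")

print(profiler.filename)  # perf_logs
```

Subclassing (rather than a plain alias) is what makes the warning fire exactly once per construction while `isinstance(profiler, AdvancedProfilerNew)` remains true for downstream checks.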
74 changes: 74 additions & 0 deletions src/pytorch_lightning/profiler/base.py
@@ -0,0 +1,74 @@
# Copyright The PyTorch Lightning team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Profiler to check if there are any bottlenecks in your code."""
from abc import ABC, abstractmethod
from typing import Any

from pytorch_lightning.profilers.base import PassThroughProfiler as NewPassThroughProfiler
from pytorch_lightning.profilers.profiler import Profiler
from pytorch_lightning.utilities.rank_zero import rank_zero_deprecation


class AbstractProfiler(ABC):
"""Specification of a profiler.
See deprecation warning below
.. deprecated:: v1.6
`AbstractProfiler` was deprecated in v1.6 and will be removed in v1.8.
Please use `Profiler` instead.
"""

@abstractmethod
def start(self, action_name: str) -> None:
"""Defines how to start recording an action."""

@abstractmethod
def stop(self, action_name: str) -> None:
"""Defines how to record the duration once an action is complete."""

@abstractmethod
def summary(self) -> str:
"""Create profiler summary in text format."""

@abstractmethod
def setup(self, **kwargs: Any) -> None:
"""Execute arbitrary pre-profiling set-up steps as defined by subclass."""

@abstractmethod
def teardown(self, **kwargs: Any) -> None:
"""Execute arbitrary post-profiling tear-down steps as defined by subclass."""


class BaseProfiler(Profiler):
"""
.. deprecated:: v1.6
`BaseProfiler` was deprecated in v1.6 and will be removed in v1.8.
Please use `Profiler` instead.
"""

def __init__(self, *args, **kwargs): # type: ignore[no-untyped-def]
rank_zero_deprecation(
"`BaseProfiler` was deprecated in v1.6 and will be removed in v1.8. Please use `Profiler` instead."
)
super().__init__(*args, **kwargs)


class PassThroughProfiler(NewPassThroughProfiler):
def __init__(self, *args, **kwargs) -> None: # type: ignore[no-untyped-def]
rank_zero_deprecation(
"`pytorch_lightning.profiler.PassThroughProfiler` is deprecated in v1.7 and will be removed in v1.9."
" Use the equivalent `pytorch_lightning.profilers.PassThroughProfiler` class instead."
)
super().__init__(*args, **kwargs)
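The deprecated `AbstractProfiler` above still documents the profiler contract: `start`/`stop` bracket an action, `summary` renders the results, `setup`/`teardown` handle lifecycle. As a hedged illustration of that contract (the interface is mirrored from the diff; `ActionCountProfiler` is a hypothetical implementation, not a Lightning class):

```python
import time
from abc import ABC, abstractmethod
from collections import defaultdict


class AbstractProfiler(ABC):
    """Mirror of the interface shown in the diff above."""

    @abstractmethod
    def start(self, action_name: str) -> None: ...

    @abstractmethod
    def stop(self, action_name: str) -> None: ...

    @abstractmethod
    def summary(self) -> str: ...

    @abstractmethod
    def setup(self, **kwargs) -> None: ...

    @abstractmethod
    def teardown(self, **kwargs) -> None: ...


class ActionCountProfiler(AbstractProfiler):
    """Hypothetical profiler: records call counts and total wall time per action."""

    def __init__(self):
        self.counts = defaultdict(int)
        self.durations = defaultdict(float)
        self._starts = {}

    def setup(self, **kwargs) -> None:
        pass

    def teardown(self, **kwargs) -> None:
        self._starts.clear()

    def start(self, action_name: str) -> None:
        self._starts[action_name] = time.monotonic()

    def stop(self, action_name: str) -> None:
        self.counts[action_name] += 1
        self.durations[action_name] += time.monotonic() - self._starts.pop(action_name)

    def summary(self) -> str:
        return "\n".join(
            f"{name}: {n} call(s), {self.durations[name]:.4f}s total"
            for name, n in sorted(self.counts.items())
        )


prof = ActionCountProfiler()
for _ in range(3):
    prof.start("training_step")
    prof.stop("training_step")
print(prof.counts["training_step"])  # 3
```

New user code should subclass `pytorch_lightning.profilers.profiler.Profiler` instead, which provides the same hook shape plus file handling; this sketch only shows why the abstract methods pair up the way they do.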
30 changes: 30 additions & 0 deletions src/pytorch_lightning/profiler/profiler.py
@@ -0,0 +1,30 @@
# Copyright The PyTorch Lightning team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from pytorch_lightning.profilers.profiler import Profiler as NewProfiler
from pytorch_lightning.utilities import rank_zero_deprecation


class Profiler(NewProfiler):
"""
.. deprecated:: v1.6
`pytorch_lightning.profiler.Profiler` is deprecated in v1.7 and will be removed in v1.9.
Use the equivalent `pytorch_lightning.profilers.Profiler` class instead.
"""

def __init__(self, *args, **kwargs) -> None: # type: ignore[no-untyped-def]
rank_zero_deprecation(
"`pytorch_lightning.profiler.Profiler` is deprecated in v1.7 and will be removed in v1.9."
" Use the equivalent `pytorch_lightning.profilers.Profiler` class instead."
)
super().__init__(*args, **kwargs)
44 changes: 44 additions & 0 deletions src/pytorch_lightning/profiler/pytorch.py
@@ -0,0 +1,44 @@
# Copyright The PyTorch Lightning team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from pytorch_lightning.profilers.pytorch import PyTorchProfiler as NewPyTorchProfiler
from pytorch_lightning.profilers.pytorch import RegisterRecordFunction as NewRegisterRecordFunction
from pytorch_lightning.profilers.pytorch import ScheduleWrapper as NewScheduleWrapper
from pytorch_lightning.utilities import rank_zero_deprecation


class RegisterRecordFunction(NewRegisterRecordFunction):
def __init__(self, *args, **kwargs) -> None: # type: ignore[no-untyped-def]
rank_zero_deprecation(
"`pytorch_lightning.profiler.pytorch.RegisterRecordFunction` is deprecated in v1.7 and will be removed"
" in v1.9. Use the equivalent `pytorch_lightning.profilers.pytorch.RegisterRecordFunction` class instead."
)
super().__init__(*args, **kwargs)


class ScheduleWrapper(NewScheduleWrapper):
def __init__(self, *args, **kwargs) -> None: # type: ignore[no-untyped-def]
rank_zero_deprecation(
"`pytorch_lightning.profiler.pytorch.ScheduleWrapper` is deprecated in v1.7 and will be removed in v1.9."
" Use the equivalent `pytorch_lightning.profilers.pytorch.ScheduleWrapper` class instead."
)
super().__init__(*args, **kwargs)


class PyTorchProfiler(NewPyTorchProfiler):
def __init__(self, *args, **kwargs) -> None: # type: ignore[no-untyped-def]
rank_zero_deprecation(
"`pytorch_lightning.profiler.PyTorchProfiler` is deprecated in v1.7 and will be removed in v1.9."
" Use the equivalent `pytorch_lightning.profilers.PyTorchProfiler` class instead."
)
super().__init__(*args, **kwargs)