
Function that prints each function's execution time #1174

Merged 1 commit into tensorflow:master on Feb 27, 2020

Conversation

Squadrick (Member)

This came in super handy for #1173 and would be helpful while tackling #1143.

I can close this PR if you don't think it needs to be included in the codebase.
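
For illustration, the helper is roughly of this shape — a minimal sketch with hypothetical names, not the actual code from this PR:

import functools
import time

def print_execution_time(fn):
    """Hypothetical decorator that prints how long each call to fn takes."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = fn(*args, **kwargs)
        print("%s took %.3fs" % (fn.__name__, time.perf_counter() - start))
        return result
    return wrapper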

@gabrieldemarmiesse (Member)

With pytest, you don't need any special code (that's partly why I'm advocating running the tests with pytest rather than bazel: it's far more future-proof). For example, to profile stochastic_weight_averaging_test.py, I just run:

pytest -v --durations=10 tensorflow_addons/optimizers/stochastic_weight_averaging_test.py

and I get:

================================================================== test session starts ==================================================================
platform linux -- Python 3.7.4, pytest-5.3.5, py-1.8.1, pluggy-0.13.1 -- /opt/conda/bin/python
cachedir: .pytest_cache
rootdir: /mnt/c/Users/yolo/Desktop/projects/addons
plugins: typeguard-2.7.1
collected 7 items

tensorflow_addons/optimizers/stochastic_weight_averaging_test.py::SWATest::test_assign_batchnorm PASSED                                           [ 14%]
tensorflow_addons/optimizers/stochastic_weight_averaging_test.py::SWATest::test_averaging PASSED                                                  [ 28%]
tensorflow_addons/optimizers/stochastic_weight_averaging_test.py::SWATest::test_fit_simple_linear_model PASSED                                    [ 42%]
tensorflow_addons/optimizers/stochastic_weight_averaging_test.py::SWATest::test_get_config PASSED                                                 [ 57%]
tensorflow_addons/optimizers/stochastic_weight_averaging_test.py::SWATest::test_optimizer_failure PASSED                                          [ 71%]
tensorflow_addons/optimizers/stochastic_weight_averaging_test.py::SWATest::test_optimizer_string PASSED                                           [ 85%]
tensorflow_addons/optimizers/stochastic_weight_averaging_test.py::SWATest::test_session SKIPPED                                                   [100%] 
=================================================================== warnings summary ====================================================================
/opt/conda/lib/python3.7/site-packages/tensorflow_core/python/pywrap_tensorflow_internal.py:15
  /opt/conda/lib/python3.7/site-packages/tensorflow_core/python/pywrap_tensorflow_internal.py:15: DeprecationWarning: the imp module is deprecated in favour of importlib; see the module's documentation for alternative uses
    import imp

tensorflow_addons/optimizers/stochastic_weight_averaging_test.py::SWATest::test_assign_batchnorm
  /opt/conda/lib/python3.7/site-packages/tensorflow_core/python/keras/engine/training_utils.py:1389: DeprecationWarning: Using or importing the ABCs from 'collections' instead of from 'collections.abc' is deprecated, and in 3.8 it will stop working
    if isinstance(sample_weight_mode, collections.Mapping):

tensorflow_addons/optimizers/stochastic_weight_averaging_test.py::SWATest::test_assign_batchnorm
  /opt/conda/lib/python3.7/site-packages/tensorflow_core/python/framework/indexed_slices.py:348: DeprecationWarning: Using or importing the ABCs from 'collections' instead of from 'collections.abc' is deprecated, and in 3.8 it will stop working
    if not isinstance(values, collections.Sequence):

-- Docs: https://docs.pytest.org/en/latest/warnings.html
=============================================================== slowest 10 test durations ===============================================================
11.52s call     tensorflow_addons/optimizers/stochastic_weight_averaging_test.py::SWATest::test_fit_simple_linear_model
1.74s call     tensorflow_addons/optimizers/stochastic_weight_averaging_test.py::SWATest::test_assign_batchnorm
0.13s call     tensorflow_addons/optimizers/stochastic_weight_averaging_test.py::SWATest::test_averaging
0.01s setup    tensorflow_addons/optimizers/stochastic_weight_averaging_test.py::SWATest::test_assign_batchnorm
0.01s call     tensorflow_addons/optimizers/stochastic_weight_averaging_test.py::SWATest::test_get_config

(0.00 durations hidden.  Use -vv to show these durations.)
======================================================= 6 passed, 1 skipped, 3 warnings in 16.05s =======================================================
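
The same run can also be launched programmatically through pytest's public entry point — a sketch, using the same path as the command above:

import sys
import pytest

# Equivalent to the command line above; pytest.main returns the exit code.
sys.exit(pytest.main([
    "-v", "--durations=10",
    "tensorflow_addons/optimizers/stochastic_weight_averaging_test.py",
]))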

But for the moment, since we're still going to be working with bazel for a while, it's useful to have tools that help us time the tests. Thanks for the pull request!

@gabrieldemarmiesse (Member) left a comment

Thanks! We're going to get a super fast CI at this rate! 😃

@gabrieldemarmiesse gabrieldemarmiesse merged commit 0f44e04 into tensorflow:master Feb 27, 2020
@Squadrick (Member, Author)

Woah, did not know you could do that with pytest, that's amazing.

@Squadrick deleted the time_utils branch on February 27, 2020 at 21:06
@gabrieldemarmiesse (Member)

And that's nothing, just wait until you see #677 😄

@Squadrick (Member, Author) commented Feb 27, 2020

Alright, so I have good news. I think we can get pytest working with bazel. We'll need to define our own bazel rule (say tfa_py_test) instead of using the default py_test. I'll take a crack at this tomorrow.
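
For context, such a rule would typically point py_test at a thin pytest entry point. A minimal sketch of what that runner could look like — every name here is an assumption, not code from this PR:

# pytest_runner.py -- hypothetical main module a tfa_py_test rule could wrap.
import sys
import pytest

if __name__ == "__main__":
    # Forward bazel's test arguments to pytest and propagate the exit
    # code so bazel reports pass/fail correctly.
    sys.exit(pytest.main(sys.argv[1:]))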

@gabrieldemarmiesse (Member) commented Feb 27, 2020

Or we could use bazel only to build the .so files and use pytest for everything else. That would require almost no changes to the repo, since pytest is already compatible with unittest. We could also get rid of the BUILD files for the Python code if we decide to go with the native Python wheel builder.

It would mean less work for you.
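
To illustrate the compatibility point: pytest collects plain unittest.TestCase classes as-is, so the existing tests would keep working. A generic sketch, not repo code:

import unittest

class ExampleTest(unittest.TestCase):
    def test_addition(self):
        self.assertEqual(1 + 1, 2)

# Running `pytest example_test.py` discovers and runs this TestCase
# unchanged; no rewrite to pytest-style test functions is needed.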

@Squadrick (Member, Author)

But what I really like about Bazel is the incremental, dependency-aware building and testing it offers. In a Python-only world, going with pytest would be A+, but since we make cross-language calls (Python -> C++), Bazel is very useful: it can figure out exactly what needs to be recompiled, which pytest can't do. If we ever decide to drop all C++/CUDA code entirely (which depends on the progress of XLA and MLIR), then we can drop bazel altogether.
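
For example, with standard bazel usage (nothing repo-specific here), after touching a single .cc file,

bazel test //tensorflow_addons/...

rebuilds only the targets that depend on that file before re-running the affected tests; everything else is served from cache.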

@gabrieldemarmiesse (Member)

Even if we keep the C++/CUDA code, I'm not sure running the tests with bazel is our best option. I believe we need some discussion about all of this before implementing anything. Let's open an issue and we can discuss it there :)

jrruijli pushed a commit to jrruijli/addons that referenced this pull request Dec 23, 2020