
Enable pytest and bazel tests. #1243

Merged
merged 10 commits into tensorflow:master from clean_pytest_branch on Mar 9, 2020

Conversation

@gabrieldemarmiesse (Member) commented on Mar 7, 2020:

Used Jason's and @Squadrick's idea to make bazel run pytest.

It's possible to launch the tests with the pytest command as well as with the bazel command. In CI, all tests are started by pytest, but I added a single new build (py35, linux) that runs them with bazel to ensure we're still compatible.

Some goodies:

The 25 slowest tests are printed in the CI logs.

It's just a command-line option: --durations=25.

The code coverage is printed in the CI logs.

Thanks to the --cov=tensorflow_addons command-line option (requires pip install pytest-cov).
Because we're now people who like fancy things, we can even get a beautiful HTML report just by adding another command-line option: --cov-report html.

We can use assert all the time.

No need to remember the specific assert* methods. Note that this only works if, at the bottom of the file, sys.exit(pytest.main([__file__])) is used instead of tf.test.main(). This is possible in all files, and we should enable it progressively by changing the last line. One example is in this pull request: see tensorflow_addons/seq2seq/basic_decoder_test.py.
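As a minimal sketch (the test name and values here are invented, not taken from the PR), a test file in this style looks like:

```python
# Hypothetical test in the plain-assert style: pytest rewrites bare
# `assert` statements to produce detailed failure messages, so the
# unittest-style assertEqual/assertAllClose helpers are unnecessary.
def test_addition():
    result = 1 + 1
    assert result == 2


# At the bottom of the file, instead of `tf.test.main()`:
#
#     import sys
#     import pytest
#     sys.exit(pytest.main([__file__]))
```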

We don't have to subclass tf.test.TestCase, but we can if we need self.

See tensorflow_addons/register_test.py, where simple functions are used.
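A sketch of that style (the registry contents here are hypothetical, not the real register_test.py):

```python
# Hypothetical sketch: a test written as a plain module-level function,
# with no tf.test.TestCase subclass. Subclassing remains an option when
# a test needs `self` helpers (e.g. self.evaluate, self.assertAllClose).
_REGISTERED_OPS = {"Addons>MyOp": object()}  # stand-in for a real op registry


def test_ops_are_registered():
    # Bare asserts, no TestCase machinery needed.
    assert len(_REGISTERED_OPS) > 0
    assert "Addons>MyOp" in _REGISTERED_OPS
```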

We can run individual methods instead of the whole file.

pytest ./tensorflow_addons/layers/wrappers_test.py::WeightNormalizationTest::test_removal_Conv2D
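The same node-id selection also works programmatically through pytest.main; a self-contained sketch (the file and test names below are invented for illustration):

```python
# Sketch: run a single test node via pytest.main, the programmatic
# equivalent of `pytest file.py::test_name` on the command line.
import pathlib
import tempfile

import pytest

# Create a throwaway test file so the example is self-contained.
test_file = pathlib.Path(tempfile.mkdtemp()) / "sample_test.py"
test_file.write_text("def test_ok():\n    assert 1 + 1 == 2\n")

# Equivalent to: pytest sample_test.py::test_ok -q
exit_code = pytest.main([f"{test_file}::test_ok", "-q"])
```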

Automatic post-mortem debugging

--pdb starts the interactive Python debugger on errors or KeyboardInterrupt (useful when tests hang).

Some baddies

We still need to remember to declare each test in the BUILD file.

Otherwise, pytest ./tensorflow_addons will run it, but bazel test won't.

We still need to remember to use the boilerplate at the bottom of the test file.

Otherwise, pytest ./tensorflow_addons will run it, but bazel test won't.
We should use sys.exit(pytest.main([__file__])) from now on (when this is present, bazel runs the file with pytest instead of unittest).

Pytest prevents us from using certain functions.

On Windows, for some unknown reason, tf.compat.v1.test.get_temp_dir() crashes with pytest.
Black magic 😨

If this pull request is too big, I can extract some smaller pull requests from it to reduce the size.

And thanks, everyone, for the great feedback in the meeting and on GitHub!

@bot-of-gabrieldemarmiesse

@rahulunair @pkan2 @RaphaelMeudec

You are owners of some files modified in this pull request.
Would you kindly review the changes whenever you have the time?
Thank you very much.

@seanpmorgan (Member) left a comment:

Thanks very much! Excited to unify the test setup

--test_output=errors --local_test_jobs=8 \
--crosstool_top=//build_deps/toolchains/gcc7_manylinux2010-nvcc-cuda10.1:toolchain \
//tensorflow_addons/...
bash tools/ci_testing/addons_cpu.sh
@seanpmorgan (Member) commented Mar 9, 2020:

I would think this is an issue. addons_cpu will compile the custom ops without any particular toolchain:
https://github.com/tensorflow/addons/blob/master/tools/ci_testing/addons_cpu.sh#L60-L64

I thought the compilation in the subsequent build would be skipped since the artifacts already exist, but they won't be compatible with TF from PyPI. However, the test-release-wheel check is passing, so I wanted to verify with you that this isn't reverting to the python_op fallback.

A member commented:

Could we add a bazel clean --expunge after this, just for sanity?

@gabrieldemarmiesse (Member, Author) replied:

Sure thing. I'll make sure the right toolchain is used.

@seanpmorgan (Member) left a comment:

LGTM thanks for all your work!

@seanpmorgan seanpmorgan merged commit 88ae5bb into tensorflow:master Mar 9, 2020
@seanpmorgan (Member) commented:

@gabrieldemarmiesse So this is failing only for py37 since the push:
https://github.com/tensorflow/addons/runs/495036979

ModuleNotFoundError: No module named '_sqlite3'. Does that make any sense to you?

@gabrieldemarmiesse (Member, Author) commented Mar 9, 2020:

The problem seems to come from the coverage module having problems with the custom_ops docker image.

You can reproduce it with:

docker run -it --rm tensorflow/tensorflow:2.1.0-custom-op-gpu-ubuntu16 bash
python3.7 -m pip install pytest~=5.3 pytest-xdist~=1.31 pytest-cov~=2.8
python3.7 -c "import coverage"

but in a normal python docker image:

docker run -it --rm python:3.7 bash
pip install pytest~=5.3 pytest-xdist~=1.31 pytest-cov~=2.8
python -c "import coverage"

The error doesn't happen.

So there is a problem with sqlite in the custom_ops docker image. The quickest fix right now is to remove code coverage from the CI; people can still use it locally. That's until we find out how to fix this.

@gabrieldemarmiesse gabrieldemarmiesse deleted the clean_pytest_branch branch March 9, 2020 13:11
jrruijli pushed a commit to jrruijli/addons that referenced this pull request Dec 23, 2020
* Enable pytest AND bazel tests.

* Added an example of plain test function.

* Use tf-cpu with bazel.

* Used OLDEST_PY_VERSION

* Fix oldest py vesion.

* Added a comment for python3

* Add bazel clean.
@rainwoodman (Member) commented:

Does the pytest support allow declaring tests with pytest.mark.parametrize?

A lot of our tests use nested loops for parametrization; they could probably read cleaner converted to parametrize.
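For reference, the pytest decorator is spelled pytest.mark.parametrize; a minimal sketch with invented parameters shows a nested loop collapsing into one decorator:

```python
# Sketch: replacing a nested parameter loop with pytest.mark.parametrize.
# pytest generates one independent test case per (dtype, axis) pair.
import itertools

import pytest


@pytest.mark.parametrize(
    "dtype,axis", itertools.product(["float32", "float64"], [0, 1])
)
def test_op_over_params(dtype, axis):
    # Stand-in body: a real test would build tensors of `dtype`
    # and exercise the op along `axis`.
    assert dtype in ("float32", "float64")
    assert axis in (0, 1)
```

Each combination then shows up (and fails) individually in the test report, instead of one monolithic looped test.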

5 participants