Add coverage report to pytest #471
Conversation
What does the coverage mean for the module itself, e.g. w.r.t. the percentages reported for test_distance.py vs distance.py?
I believe the meaning here is: when the tests run, 100% of the lines of code in test_distance.py are executed, while only 95% of the lines of code in distance.py are.
Yeah, we probably need to exclude test files from the report.
It's interesting that not all of the methods of the mock object are being used, though; those might be worth keeping an eye on.
Good point - we'd want to exclude test files but not the mocks, which are in a test directory.
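One way to do that (a sketch using coverage.py's `omit` setting; the glob pattern is an assumption about this repo's layout, matching test modules by filename rather than by directory so the mocks stay in the report):

```ini
# .coveragerc (sketch)
[run]
# Match test modules by filename rather than excluding whole directories,
# so mock modules living alongside the tests remain measured.
omit =
    */test_*.py
```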
Our overall test coverage is not bad, I would say. We also have some integration tests (SSH connection required) that are not accounted for in the report.
Unfortunately our coverage decreases after excluding the test files: with x covered lines out of y total in the source code and c fully covered test lines, (x + c)/(y + c) > x/y for all c > 0 and 0 < x < y.
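To spell this out (a quick check; both denominators are positive, so cross-multiplying preserves the inequality):

$$
\frac{x+c}{y+c} > \frac{x}{y}
\;\Longleftrightarrow\; y(x+c) > x(y+c)
\;\Longleftrightarrow\; cy > cx
\;\Longleftrightarrow\; y > x,
$$

which holds by assumption, so including fully covered test files always inflates the overall percentage.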
Indeed.
A lot of code is tested through integration tests (41 deselected). For example, I believe all types of transformation are tested, but to do that we need the profiles.
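For context, deselection like that is typically done with a pytest marker; a minimal sketch, assuming a hypothetical `integration` marker (the real suite may use a different marker or skip condition):

```python
import pytest

# "integration" is a hypothetical marker name -- the real suite may register
# something else for tests that need an SSH connection or downloaded profiles.
@pytest.mark.integration
def test_transformations_against_profiles():
    # Placeholder body; the actual tests exercise each transformation
    # type against the downloaded profiles.
    assert True
```

Running `pytest -m "not integration"` then reports these tests as deselected, which is where a count like "41 deselected" comes from.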
I don't understand all the CI stuff, but it is a good idea and it seems to work. Maybe we should also add the code coverage badge to the readme.
Now that we have profiles in blob storage, can we enable these tests in CI?
Learned something new. Thanks!
I thought about it, but the profiles are big (>100 MB for USA hydro). I don't know if GitHub will be happy about that, and/or whether we want to wait for the checks to finish on each push.
The test coverage badge would definitely be a huge motivation for writing more tests!
That would be nice. Looking at pandas, they seem to report coverage to an external service which provides the badge. Will look into this when I can.
We could limit it to the smallest profiles, use some mock profiles for this purpose, or maybe use the GitHub cache action so we only download them once in a while.
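For the cache idea, a sketch with `actions/cache` (the path, key, and download script are assumptions about this repo):

```yaml
- name: Cache test profiles
  id: cache-profiles
  uses: actions/cache@v3
  with:
    path: profiles/
    key: test-profiles-v1
- name: Download profiles on cache miss
  # Hypothetical script; whatever currently fetches the profiles would go here.
  if: steps.cache-profiles.outputs.cache-hit != 'true'
  run: ./scripts/download_profiles.sh
```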
Hopefully we can somehow get at least the profile tests included.
Purpose
Whenever we run pytest, include a coverage report in the output. This will help with awareness of test coverage, and enable future improvements such as analysis over time and across different areas of the code, or additional CI checks (e.g. we could fail the build if code coverage decreases).
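On that last point, pytest-cov already supports a hard threshold that fails the run (the 80 here is just an example number):

```sh
# Fail the test run if total coverage drops below the given percentage.
pytest --cov --cov-fail-under=80
```

Tracking an actual decrease over time would need a stored baseline or an external service; a fixed threshold is the simplest approximation.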
What the code is doing
New packages:
Add the necessary packages to `Pipfile` and `requirements.txt`. Consolidated them under the `dev-packages` section and removed version pinning for these only. This makes them available whether running `pytest` directly or via `tox`. Also added Python 3.9 to the test matrix.
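For illustration, the relevant `Pipfile` section would look something like this (the exact package list is an assumption; `pytest-cov` is the usual plugin behind this kind of report):

```toml
[dev-packages]
# Unpinned so that pytest and tox environments resolve compatible versions.
pytest = "*"
pytest-cov = "*"
tox = "*"
```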
Testing
✅
Usage Example/Visuals
The report looks like this (see the recent GitHub logs for the full output):
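For reference, a pytest-cov terminal report has this general shape; the file names echo the discussion above and the numbers are illustrative, not the actual output:

```text
---------- coverage: platform linux, python 3.9 -----------
Name                Stmts   Miss  Cover
---------------------------------------
distance.py           100      5    95%
test_distance.py       60      0   100%
---------------------------------------
TOTAL                 160      5    97%
```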
Time estimate
5 mins