Add tests and CIs, upgrade packaging and metafiles, drop Py2 and add Py3.10, standardize infra and config, and more #33
Conversation
Hey @martinRenou, this should be almost ready to go, so you can follow it up with more detailed checks on the content and hopefully a preview of some sort, except for the test failure: it seems that only on Python <3.7-3.8, the plot test is using
Thanks for starting this! I will look into it in the coming days. I'm hoping to add some tests with checks on the output HTML content, and maybe publish the outputs as build artifacts.
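As a rough illustration of that kind of content check (the helper below is hypothetical, not the API these tests ended up using):

```python
from pathlib import Path


def check_html_contains(page_path, expected_fragments):
    """Illustrative helper: assert that the generated HTML page contains
    each expected fragment (e.g. part of the object's signature or
    docstring text)."""
    html = Path(page_path).read_text(encoding='utf-8')
    for fragment in expected_fragments:
        assert fragment in html, f'{fragment!r} not found in {page_path}'
```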
Also, I forgot to mention: I'll open another issue to add our standard contributing guide and release checklist, and to update the various text files (Readme, Authors, license, init docstring, file headers, links, etc.). I didn't include that here either, nor fixes for some deprecated/bad practices in the source, to avoid delaying your further work on the tests. That should be all that's needed from my end for 0.2.0; for future releases we can consider employing a suite of pre-commit checks, fixers, formatters and linters, and running CodeQL, Semgrep, Mega-Linter etc. on CI, but I deferred that as well as more invasive and less vital changes like a
That would be a lot simpler to implement, if a modest amount more effort to view, than spinning up a whole Netlify build and supporting infra, and probably makes a lot more sense overall.
Thanks for that work! Do you want to merge this PR before I look into improving those tests? Or should I open a PR directly against this branch?
The former sounds good; I skipped that subtest for now on Windows + Python 3.6. I'd appreciate having someone formally review it before merging, though...
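For reference, a conditional guard of that kind could look roughly like the following (the condition and the assertion are illustrative, not the PR's actual code):

```python
import sys

# Skip only the plot-content check on the combination where it misbehaves
# (Windows + Python 3.6), while still running the rest of the test.
SKIP_PLOT_CHECK = sys.platform == 'win32' and sys.version_info < (3, 7)


def check_plot_in_output(html_text):
    """Illustrative subtest: verify the rendered page embeds the plot."""
    if SKIP_PLOT_CHECK:
        return
    assert '<img' in html_text
```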
That looks good to me, I just have a couple of comments :)
# ---- Constants

OPEN_BROWSER_OPTION = '--open-browser'
Passing this option results in tests failing with:
E File "/home/martin/miniconda3/envs/docrepr/lib/python3.9/subprocess.py", line 1052, in __del__
E _warn("subprocess %s is still running" % self.pid,
E ResourceWarning: subprocess 9964 is still running
I am not very familiar with Pytest; maybe there is a way to tell it to ignore that subprocesses are still running?
Ah, thanks. The issue isn't really with Pytest, but rather with webbrowser, as apparently on some platforms it isn't cleaning up properly after the subprocess it spawns the browser in, or at least handling the warnings it produces. See bpo-5995. As a note, it seems to be Linux- or web-browser-specific, since it doesn't occur for me with a presumably essentially identical clean Python 3.9 miniconda env on Windows.
Could you provide the full test output as well as your full Python, OS and default web browser versions? Also, is your web browser running when you run the test, and does that change the outcome? Based on that, I can determine how best to silence the warnings.
We could silence the warning with filterwarnings at the Pytest config or Python invocation level, but then we would miss all, or at least many, ResourceWarnings. If possible, it would be nice to silence the warning just around the offending line, but I don't believe that'll work, since it gets called by the destructor when the object is GC'ed; I'll need to see your full output to know exactly when that ends up being. As such, probably the best approach is adding as precise a warnings filter as possible to the fixture where we handle --open-browser, but I want to confirm your output first.
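For reference, a minimal sketch of the two kinds of filter being weighed here (the test name and helper are illustrative, not code from this PR):

```python
import warnings

import pytest


# Broad option: a pytest marker (or an equivalent filterwarnings entry in
# the pytest config) that suppresses every ResourceWarning for this test,
# including unrelated ones, which is the drawback noted above.
@pytest.mark.filterwarnings('ignore::ResourceWarning')
def test_sphinxify_opens_browser():
    ...


# Narrower option: a filter keyed to the exact message, so other
# ResourceWarnings still surface. Since the warning is emitted from
# Popen.__del__ at garbage-collection time, the filter must still be
# active at that point to have any effect.
def ignore_lingering_browser_warning():
    warnings.filterwarnings(
        'ignore',
        message=r'subprocess \d+ is still running',
        category=ResourceWarning,
    )
```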
This is the full output:
docrepr/tests/test_output.py::test_sphinxify[empty_oinfo] FAILED [ 10%]
docrepr/tests/test_output.py::test_sphinxify[basic] FAILED [ 20%]
docrepr/tests/test_output.py::test_sphinxify[render_math] FAILED [ 30%]
docrepr/tests/test_output.py::test_sphinxify[no_render_math] FAILED [ 40%]
docrepr/tests/test_output.py::test_sphinxify[numpy_sin] FAILED [ 50%]
========================================================== FAILURES ===========================================================
_________________________________________________ test_sphinxify[empty_oinfo] _________________________________________________
cls = <class '_pytest.runner.CallInfo'>, func = <function call_runtest_hook.<locals>.<lambda> at 0x7f2974cf2700>, when = 'call'
reraise = (<class '_pytest.outcomes.Exit'>, <class 'KeyboardInterrupt'>)
@classmethod
def from_call(
cls,
func: "Callable[[], TResult]",
when: "Literal['collect', 'setup', 'call', 'teardown']",
reraise: Optional[
Union[Type[BaseException], Tuple[Type[BaseException], ...]]
] = None,
) -> "CallInfo[TResult]":
excinfo = None
start = timing.time()
precise_start = timing.perf_counter()
try:
> result: Optional[TResult] = func()
../../miniconda3/envs/docrepr/lib/python3.9/site-packages/_pytest/runner.py:311:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../../miniconda3/envs/docrepr/lib/python3.9/site-packages/_pytest/runner.py:255: in <lambda>
lambda: ihook(item=item, **kwds), when=when, reraise=reraise
../../miniconda3/envs/docrepr/lib/python3.9/site-packages/pluggy/_hooks.py:265: in __call__
return self._hookexec(self.name, self.get_hookimpls(), kwargs, firstresult)
../../miniconda3/envs/docrepr/lib/python3.9/site-packages/pluggy/_manager.py:80: in _hookexec
return self._inner_hookexec(hook_name, methods, kwargs, firstresult)
../../miniconda3/envs/docrepr/lib/python3.9/site-packages/_pytest/unraisableexception.py:88: in pytest_runtest_call
yield from unraisable_exception_runtest_hook()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
def unraisable_exception_runtest_hook() -> Generator[None, None, None]:
with catch_unraisable_exception() as cm:
yield
if cm.unraisable:
if cm.unraisable.err_msg is not None:
err_msg = cm.unraisable.err_msg
else:
err_msg = "Exception ignored in"
msg = f"{err_msg}: {cm.unraisable.object!r}\n\n"
msg += "".join(
traceback.format_exception(
cm.unraisable.exc_type,
cm.unraisable.exc_value,
cm.unraisable.exc_traceback,
)
)
> warnings.warn(pytest.PytestUnraisableExceptionWarning(msg))
E pytest.PytestUnraisableExceptionWarning: Exception ignored in: <function Popen.__del__ at 0x7f29757a79d0>
E
E Traceback (most recent call last):
E File "/home/martin/miniconda3/envs/docrepr/lib/python3.9/subprocess.py", line 1052, in __del__
E _warn("subprocess %s is still running" % self.pid,
E ResourceWarning: subprocess 37968 is still running
../../miniconda3/envs/docrepr/lib/python3.9/site-packages/_pytest/unraisableexception.py:78: PytestUnraisableExceptionWarning
____________________________________________________ test_sphinxify[basic] ____________________________________________________
cls = <class '_pytest.runner.CallInfo'>, func = <function call_runtest_hook.<locals>.<lambda> at 0x7f2974d3b820>, when = 'call'
reraise = (<class '_pytest.outcomes.Exit'>, <class 'KeyboardInterrupt'>)
@classmethod
def from_call(
cls,
func: "Callable[[], TResult]",
when: "Literal['collect', 'setup', 'call', 'teardown']",
reraise: Optional[
Union[Type[BaseException], Tuple[Type[BaseException], ...]]
] = None,
) -> "CallInfo[TResult]":
excinfo = None
start = timing.time()
precise_start = timing.perf_counter()
try:
> result: Optional[TResult] = func()
../../miniconda3/envs/docrepr/lib/python3.9/site-packages/_pytest/runner.py:311:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../../miniconda3/envs/docrepr/lib/python3.9/site-packages/_pytest/runner.py:255: in <lambda>
lambda: ihook(item=item, **kwds), when=when, reraise=reraise
../../miniconda3/envs/docrepr/lib/python3.9/site-packages/pluggy/_hooks.py:265: in __call__
return self._hookexec(self.name, self.get_hookimpls(), kwargs, firstresult)
../../miniconda3/envs/docrepr/lib/python3.9/site-packages/pluggy/_manager.py:80: in _hookexec
return self._inner_hookexec(hook_name, methods, kwargs, firstresult)
../../miniconda3/envs/docrepr/lib/python3.9/site-packages/_pytest/unraisableexception.py:88: in pytest_runtest_call
yield from unraisable_exception_runtest_hook()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
def unraisable_exception_runtest_hook() -> Generator[None, None, None]:
with catch_unraisable_exception() as cm:
yield
if cm.unraisable:
if cm.unraisable.err_msg is not None:
err_msg = cm.unraisable.err_msg
else:
err_msg = "Exception ignored in"
msg = f"{err_msg}: {cm.unraisable.object!r}\n\n"
msg += "".join(
traceback.format_exception(
cm.unraisable.exc_type,
cm.unraisable.exc_value,
cm.unraisable.exc_traceback,
)
)
> warnings.warn(pytest.PytestUnraisableExceptionWarning(msg))
E pytest.PytestUnraisableExceptionWarning: Exception ignored in: <function Popen.__del__ at 0x7f29757a79d0>
E
E Traceback (most recent call last):
E File "/home/martin/miniconda3/envs/docrepr/lib/python3.9/subprocess.py", line 1052, in __del__
E _warn("subprocess %s is still running" % self.pid,
E ResourceWarning: subprocess 38018 is still running
../../miniconda3/envs/docrepr/lib/python3.9/site-packages/_pytest/unraisableexception.py:78: PytestUnraisableExceptionWarning
_________________________________________________ test_sphinxify[render_math] _________________________________________________
cls = <class '_pytest.runner.CallInfo'>, func = <function call_runtest_hook.<locals>.<lambda> at 0x7f29326f8f70>, when = 'call'
reraise = (<class '_pytest.outcomes.Exit'>, <class 'KeyboardInterrupt'>)
@classmethod
def from_call(
cls,
func: "Callable[[], TResult]",
when: "Literal['collect', 'setup', 'call', 'teardown']",
reraise: Optional[
Union[Type[BaseException], Tuple[Type[BaseException], ...]]
] = None,
) -> "CallInfo[TResult]":
excinfo = None
start = timing.time()
precise_start = timing.perf_counter()
try:
> result: Optional[TResult] = func()
../../miniconda3/envs/docrepr/lib/python3.9/site-packages/_pytest/runner.py:311:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../../miniconda3/envs/docrepr/lib/python3.9/site-packages/_pytest/runner.py:255: in <lambda>
lambda: ihook(item=item, **kwds), when=when, reraise=reraise
../../miniconda3/envs/docrepr/lib/python3.9/site-packages/pluggy/_hooks.py:265: in __call__
return self._hookexec(self.name, self.get_hookimpls(), kwargs, firstresult)
../../miniconda3/envs/docrepr/lib/python3.9/site-packages/pluggy/_manager.py:80: in _hookexec
return self._inner_hookexec(hook_name, methods, kwargs, firstresult)
../../miniconda3/envs/docrepr/lib/python3.9/site-packages/_pytest/unraisableexception.py:88: in pytest_runtest_call
yield from unraisable_exception_runtest_hook()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
def unraisable_exception_runtest_hook() -> Generator[None, None, None]:
with catch_unraisable_exception() as cm:
yield
if cm.unraisable:
if cm.unraisable.err_msg is not None:
err_msg = cm.unraisable.err_msg
else:
err_msg = "Exception ignored in"
msg = f"{err_msg}: {cm.unraisable.object!r}\n\n"
msg += "".join(
traceback.format_exception(
cm.unraisable.exc_type,
cm.unraisable.exc_value,
cm.unraisable.exc_traceback,
)
)
> warnings.warn(pytest.PytestUnraisableExceptionWarning(msg))
E pytest.PytestUnraisableExceptionWarning: Exception ignored in: <function Popen.__del__ at 0x7f29757a79d0>
E
E Traceback (most recent call last):
E File "/home/martin/miniconda3/envs/docrepr/lib/python3.9/subprocess.py", line 1052, in __del__
E _warn("subprocess %s is still running" % self.pid,
E ResourceWarning: subprocess 38103 is still running
../../miniconda3/envs/docrepr/lib/python3.9/site-packages/_pytest/unraisableexception.py:78: PytestUnraisableExceptionWarning
_______________________________________________ test_sphinxify[no_render_math] ________________________________________________
cls = <class '_pytest.runner.CallInfo'>, func = <function call_runtest_hook.<locals>.<lambda> at 0x7f2932640310>, when = 'call'
reraise = (<class '_pytest.outcomes.Exit'>, <class 'KeyboardInterrupt'>)
@classmethod
def from_call(
cls,
func: "Callable[[], TResult]",
when: "Literal['collect', 'setup', 'call', 'teardown']",
reraise: Optional[
Union[Type[BaseException], Tuple[Type[BaseException], ...]]
] = None,
) -> "CallInfo[TResult]":
excinfo = None
start = timing.time()
precise_start = timing.perf_counter()
try:
> result: Optional[TResult] = func()
../../miniconda3/envs/docrepr/lib/python3.9/site-packages/_pytest/runner.py:311:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../../miniconda3/envs/docrepr/lib/python3.9/site-packages/_pytest/runner.py:255: in <lambda>
lambda: ihook(item=item, **kwds), when=when, reraise=reraise
../../miniconda3/envs/docrepr/lib/python3.9/site-packages/pluggy/_hooks.py:265: in __call__
return self._hookexec(self.name, self.get_hookimpls(), kwargs, firstresult)
../../miniconda3/envs/docrepr/lib/python3.9/site-packages/pluggy/_manager.py:80: in _hookexec
return self._inner_hookexec(hook_name, methods, kwargs, firstresult)
../../miniconda3/envs/docrepr/lib/python3.9/site-packages/_pytest/unraisableexception.py:88: in pytest_runtest_call
yield from unraisable_exception_runtest_hook()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
def unraisable_exception_runtest_hook() -> Generator[None, None, None]:
with catch_unraisable_exception() as cm:
yield
if cm.unraisable:
if cm.unraisable.err_msg is not None:
err_msg = cm.unraisable.err_msg
else:
err_msg = "Exception ignored in"
msg = f"{err_msg}: {cm.unraisable.object!r}\n\n"
msg += "".join(
traceback.format_exception(
cm.unraisable.exc_type,
cm.unraisable.exc_value,
cm.unraisable.exc_traceback,
)
)
> warnings.warn(pytest.PytestUnraisableExceptionWarning(msg))
E pytest.PytestUnraisableExceptionWarning: Exception ignored in: <function Popen.__del__ at 0x7f29757a79d0>
E
E Traceback (most recent call last):
E File "/home/martin/miniconda3/envs/docrepr/lib/python3.9/subprocess.py", line 1052, in __del__
E _warn("subprocess %s is still running" % self.pid,
E ResourceWarning: subprocess 38142 is still running
../../miniconda3/envs/docrepr/lib/python3.9/site-packages/_pytest/unraisableexception.py:78: PytestUnraisableExceptionWarning
__________________________________________________ test_sphinxify[numpy_sin] __________________________________________________
cls = <class '_pytest.runner.CallInfo'>, func = <function call_runtest_hook.<locals>.<lambda> at 0x7f29725ff1f0>, when = 'call'
reraise = (<class '_pytest.outcomes.Exit'>, <class 'KeyboardInterrupt'>)
@classmethod
def from_call(
cls,
func: "Callable[[], TResult]",
when: "Literal['collect', 'setup', 'call', 'teardown']",
reraise: Optional[
Union[Type[BaseException], Tuple[Type[BaseException], ...]]
] = None,
) -> "CallInfo[TResult]":
excinfo = None
start = timing.time()
precise_start = timing.perf_counter()
try:
> result: Optional[TResult] = func()
../../miniconda3/envs/docrepr/lib/python3.9/site-packages/_pytest/runner.py:311:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../../miniconda3/envs/docrepr/lib/python3.9/site-packages/_pytest/runner.py:255: in <lambda>
lambda: ihook(item=item, **kwds), when=when, reraise=reraise
../../miniconda3/envs/docrepr/lib/python3.9/site-packages/pluggy/_hooks.py:265: in __call__
return self._hookexec(self.name, self.get_hookimpls(), kwargs, firstresult)
../../miniconda3/envs/docrepr/lib/python3.9/site-packages/pluggy/_manager.py:80: in _hookexec
return self._inner_hookexec(hook_name, methods, kwargs, firstresult)
../../miniconda3/envs/docrepr/lib/python3.9/site-packages/_pytest/unraisableexception.py:88: in pytest_runtest_call
yield from unraisable_exception_runtest_hook()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
def unraisable_exception_runtest_hook() -> Generator[None, None, None]:
with catch_unraisable_exception() as cm:
yield
if cm.unraisable:
if cm.unraisable.err_msg is not None:
err_msg = cm.unraisable.err_msg
else:
err_msg = "Exception ignored in"
msg = f"{err_msg}: {cm.unraisable.object!r}\n\n"
msg += "".join(
traceback.format_exception(
cm.unraisable.exc_type,
cm.unraisable.exc_value,
cm.unraisable.exc_traceback,
)
)
> warnings.warn(pytest.PytestUnraisableExceptionWarning(msg))
E pytest.PytestUnraisableExceptionWarning: Exception ignored in: <function Popen.__del__ at 0x7f29757a79d0>
E
E Traceback (most recent call last):
E File "/home/martin/miniconda3/envs/docrepr/lib/python3.9/subprocess.py", line 1052, in __del__
E _warn("subprocess %s is still running" % self.pid,
E ResourceWarning: subprocess 38237 is still running
../../miniconda3/envs/docrepr/lib/python3.9/site-packages/_pytest/unraisableexception.py:78: PytestUnraisableExceptionWarning
---------------------------------------------------- Captured stderr call -----------------------------------------------------
ATTENTION: default value of option mesa_glthread overridden by environment.
ATTENTION: default value of option mesa_glthread overridden by environment.
ATTENTION: default value of option mesa_glthread overridden by environment.
ATTENTION: default value of option mesa_glthread overridden by environment.
===================================================== slowest 5 durations =====================================================
0.50s call docrepr/tests/test_output.py::test_sphinxify[empty_oinfo]
0.25s call docrepr/tests/test_output.py::test_sphinxify[numpy_sin]
0.11s call docrepr/tests/test_output.py::test_sphinxify[no_render_math]
0.11s call docrepr/tests/test_output.py::test_sphinxify[basic]
0.11s call docrepr/tests/test_output.py::test_sphinxify[render_math]
=================================================== short test summary info ===================================================
FAILED docrepr/tests/test_output.py::test_sphinxify[empty_oinfo] - pytest.PytestUnraisableExceptionWarning: Exception ignore...
FAILED docrepr/tests/test_output.py::test_sphinxify[basic] - pytest.PytestUnraisableExceptionWarning: Exception ignored in: ...
FAILED docrepr/tests/test_output.py::test_sphinxify[render_math] - pytest.PytestUnraisableExceptionWarning: Exception ignore...
FAILED docrepr/tests/test_output.py::test_sphinxify[no_render_math] - pytest.PytestUnraisableExceptionWarning: Exception ign...
FAILED docrepr/tests/test_output.py::test_sphinxify[numpy_sin] - pytest.PytestUnraisableExceptionWarning: Exception ignored ...
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! stopping after 5 failures !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
====================================================== 5 failed in 1.97s ======================================================
In this env I am running Python 3.9.9 on Fedora Linux.
Also, is your web browser running when you run the test, and does that change the outcome?
Whether or not the browser is already open, the test will fail the same way.
Thanks for the detailed feedback! Based on that and per the above, I think the best solution is
adding as precise a warnings filter as possible to the fixture where we handle --open-browser
but that'll be a lot easier for you to add and test than for me to do flying blind when I can't repro, so I'll let you take care of that if you don't mind. You can add the warnings filter to _open_browser in the open_browser fixture in conftest.py.
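For concreteness, a rough sketch of what that could look like in conftest.py; the names OPEN_BROWSER_OPTION, open_browser and _open_browser come from the PR, but the option registration and fixture bodies below are guesses for illustration, not the actual implementation:

```python
# conftest.py (sketch)
import warnings
import webbrowser

import pytest

OPEN_BROWSER_OPTION = '--open-browser'


def pytest_addoption(parser):
    # Assumed registration of the flag, shown only for context.
    parser.addoption(OPEN_BROWSER_OPTION, action='store_true', default=False,
                     help='Open the generated HTML output in a web browser.')


@pytest.fixture
def open_browser(request):
    """Return a helper that opens a URL only when --open-browser is passed."""
    def _open_browser(url):
        if not request.config.getoption(OPEN_BROWSER_OPTION):
            return
        # Keep the filter as precise as practical; note the ResourceWarning
        # is actually emitted later, from Popen.__del__ at GC time, so a
        # scoped catch_warnings() block may not catch it and a process-wide
        # warnings.filterwarnings() call could be needed instead.
        with warnings.catch_warnings():
            warnings.filterwarnings(
                'ignore',
                message=r'subprocess \d+ is still running',
                category=ResourceWarning,
            )
            webbrowser.open_new_tab(url)
    return _open_browser
```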
It would be convenient to have the Py2 removal in a separate commit, just in case some folks are still using it.
@CAM-Gerlach if you're fine with this, we could probably ignore my comments for now, merge this PR, and open issues for them? I can try to push a follow-up PR to fix the Windows issue you've been seeing.
Really sorry for the long delay and thanks for your patience! I actually left for a trip to see family from Dec. 21 through a few days ago, and due to some family circumstances didn't have much spare time to try to catch up with everything until now, sorry. My fault! I'll try to avoid such long blocks in the future.
Sure; it already is in a separate atomic commit, e8ec355, which will stay that way when we merge.
Yeah; I'm pretty sure the best solution is
adding as precise a warnings filter as possible to the fixture where we handle --open-browser
but that'll be a lot easier for you to add and test than for me to do flying blind when I can't repro, so I'll go ahead with that. Thanks again for your understanding!
No worries at all! Hope you had a nice trip :) Thanks for merging :)
Fixes #22
Fixes #23
Fixes #24
Closes #17