
Write monitor-output of tests to console #46

Open
danielhuppmann opened this issue Sep 22, 2021 · 6 comments
Labels: enhancement (New feature or request)

danielhuppmann commented Sep 22, 2021

Is your feature request related to a problem? Please describe.
Inspecting the database after every test-run to get the output of pytest-monitor seems to be quite a hassle.

Describe the solution you'd like
Looking at pytest-benchmark: it writes a summary of the results to the console after the test run (screenshot omitted here).

Describe alternatives you've considered
None.

Additional context
None.

js-dieu (Collaborator) commented Sep 24, 2021

Hello,

Thanks for the suggestion! Indeed, extracting the data is a pain.
I see two points:

  1. metrics extraction for analysis
  2. metrics summary

Point 1 can be addressed through the monitor-server-api (see https://github.com/CFMTech/monitor-server-api.git).

Point 2, however, requires new development. Should we propose a report of the top consumers of a given resource, or just dump the full output? I believe a full report in someone's terminal is not a good idea... but we could definitely report a given number of tests along one axis (cpu, memory, or time).
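The "top consumers along one axis" idea could be sketched roughly as below. This is a hypothetical illustration, not pytest-monitor's actual code: the measurement records are assumed to be plain dicts with `test`, `cpu`, `memory`, and `time` keys.

```python
def top_consumers(measurements, axis, n=10):
    """Return the n tests that consumed the most of the given resource axis.

    measurements: list of dicts with 'test', 'cpu', 'memory', 'time' keys
    axis: one of 'cpu', 'memory', 'time'
    """
    if axis not in ("cpu", "memory", "time"):
        raise ValueError(f"unknown axis: {axis}")
    # Sort descending by the chosen resource and keep the top n entries.
    return sorted(measurements, key=lambda m: m[axis], reverse=True)[:n]
```

Printing only such a top-N slice per axis keeps the terminal output short even for large test suites.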

@js-dieu js-dieu self-assigned this Sep 24, 2021
@js-dieu js-dieu added the enhancement New feature or request label Sep 24, 2021
danielhuppmann (Author) commented

Thanks for the pointer to monitor-server-api, I will take a deeper look - though this seems like a more sophisticated stack than what I had in mind...

About the metrics summary: yes, this is tricky... One thought, though: the main advantage of pytest-monitor over pytest-benchmark is that it reports across multiple dimensions - so a summary output limited to one dimension (cpu/memory/time) would defeat its main advantage.

js-dieu (Collaborator) commented Sep 27, 2021

> One thought, though: the main advantage of pytest-monitor over pytest-benchmark is that it reports across multiple dimensions - so a summary output limited to one dimension (cpu/memory/time) would defeat its main advantage.

Got your point 👍🏻

joukewitteveen commented
Would something along the lines of `pytest_terminal_summary` be an option, where we list the tests along with their duration and peak memory usage?
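A minimal sketch of what such a hook could look like in a plugin's `conftest.py`. `pytest_terminal_summary`, `write_sep`, and `write_line` are pytest's real hook and `TerminalReporter` helpers; the `config._monitor_results` attribute and the tuple layout `(name, duration_s, peak_mem_bytes)` are assumptions for illustration, not pytest-monitor's actual storage.

```python
def format_resource_row(name, duration_s, peak_mem_bytes):
    """Render one summary line: test name, wall time, peak memory in MB."""
    return f"{name:<40} {duration_s:>8.3f}s {peak_mem_bytes / 2**20:>10.1f} MB"


def pytest_terminal_summary(terminalreporter, exitstatus, config):
    # Hypothetical: assume measurements were stashed on the config object
    # earlier in the run as (name, duration_s, peak_mem_bytes) tuples.
    measurements = getattr(config, "_monitor_results", [])
    if not measurements:
        return
    terminalreporter.write_sep("-", "resource usage")
    for name, duration_s, peak_mem in measurements:
        terminalreporter.write_line(format_resource_row(name, duration_s, peak_mem))
```

The hook runs after the test session, so the summary appears alongside pytest's own pass/fail recap rather than interleaved with test output.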

joukewitteveen commented
I've made a little proof-of-concept: pytest-resource-usage. Feel free to copy anything from that experiment.

js-dieu (Collaborator) commented Jun 26, 2022

Hello @joukewitteveen,

Thanks for the PoC. You are not the first to request this output. I've been short on time lately, but I'll work on the implementation next week. I've looked at the code you propose and will certainly draw inspiration from it. Thanks for that! 👍🏻

js-dieu added a commit to js-dieu/pytest-monitor that referenced this issue Dec 28, 2023
js-dieu added a commit to js-dieu/pytest-monitor that referenced this issue Dec 28, 2023
js-dieu added a commit to js-dieu/pytest-monitor that referenced this issue Dec 28, 2023
An item's setup from markers is now computed by the `PyTestMonitorMarkerProcessor`, which builds a `PyTestMonitorItemConfig` that is in turn stashed at the item level.
js-dieu added a commit to js-dieu/pytest-monitor that referenced this issue Dec 28, 2023
The `pytest_configure` hook now uses the `PyTestMonitorMarkerProcessor` to iterate over all known markers in order to generate the documentation.
3 participants