Upgrade to pyperf 2.6.0 #272

Merged (3 commits, Apr 22, 2023)

Conversation

@itamaro (Contributor) commented Apr 14, 2023

Needed for loop_factory parameter support in bench_async_func.
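For context, a loop_factory is a zero-argument callable that returns a fresh event loop; pyperf 2.6.0's Runner.bench_async_func forwards it so async benchmarks can run on an alternative loop (uvloop is the typical example). A minimal stdlib-only sketch of the mechanism — the names payload and loop_factory are illustrative, and the pyperf call itself is not shown:

```python
import asyncio

async def payload():
    # Stand-in for a coroutine being benchmarked.
    await asyncio.sleep(0)
    return 42

def loop_factory():
    # A loop_factory just builds the event loop the benchmark runs on;
    # swapping in e.g. uvloop.new_event_loop is the usual use case.
    return asyncio.new_event_loop()

loop = loop_factory()
try:
    result = loop.run_until_complete(payload())
finally:
    loop.close()
print(result)  # -> 42
```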

@hugovk (Member) commented Apr 15, 2023

The 3.12 failure is a known issue (#263), but could you please check the new PyPy failures?

@itamaro (Contributor, Author) commented Apr 16, 2023

Thanks @hugovk. I'm actually not sure what these failures are about; nothing jumps out at me from the logs.
Could you perhaps trigger retries for the PyPy jobs?

@hugovk (Member) commented Apr 16, 2023

Restarted, and they fail in the same way.

@itamaro (Contributor, Author) commented Apr 16, 2023

Hmm that's odd. I'll need to look closer and get a local repro.

@corona10 (Member) commented Apr 20, 2023

> Hmm that's odd. I'll need to look closer and get a local repro.

pyperf doesn't have CI for PyPy yet, so I will add PyPy CI to pyperf first.

@corona10 (Member) commented Apr 21, 2023

I ran the pyperformance benchmark, and I got different exit codes depending on the pyperf version:

  • command: $> pypy -m pyperformance run -b all --debug-single-value -o foo.json
  • pyperf 2.5.0: echo $? -> 0
  • pyperf 2.6.0: echo $? -> 1

There might be a regression in pyperf 2.6.0.

@corona10 (Member) commented Apr 21, 2023

@hugovk
Okay, PyPy fails to run several tests, and this pyperf change produces different output (I verified the issue by reverting the specific patch): psf/pyperf@d3f3e6a

There are two ways to solve this:

  1. Revert the pyperf change and release pyperf 2.6.1
  2. Mark the PyPy CI jobs as experimental

Which do you prefer? I think pyperf 2.6.0 is doing what it should.
(So I personally prefer option 2.)

@itamaro (Contributor, Author) commented Apr 21, 2023

If 2.6.0 is behaving correctly, could we update the test to match the output?

@corona10 (Member)

> If 2.6.0 is behaving correctly, could we update the test to match the output?

From that point of view: since PyPy fails to run the pyperformance benchmark suite, there are two ways to solve it.

  1. If the benchmark is wrong, the benchmark code should be updated.
  2. If it is a PyPy implementation issue, then PyPy should update their implementation (as we already did for CPython 3.11 and 3.12).

And I prefer to update the GHA workflow file instead of touching pyperformance code directly; otherwise, we would have to account for the implementation issue when releasing pyperformance.

- os: ubuntu-latest
  python: pypy-3.9
  experimental: false
- os: ubuntu-latest
  python: pypy-3.8
  experimental: false
- os: ubuntu-latest
  python: pypy-3.7
  experimental: false

Then we don't have to consider the implementation issue when we release pyperformance.
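A sketch of what option 2 (marking the PyPy jobs experimental) might look like in the workflow, assuming the matrix keeps the experimental flag shown in the snippet above; the job name is illustrative:

```yaml
jobs:
  test:
    runs-on: ${{ matrix.os }}
    # Experimental jobs may fail without failing the whole workflow.
    continue-on-error: ${{ matrix.experimental }}
    strategy:
      fail-fast: false
      matrix:
        include:
          - os: ubuntu-latest
            python: pypy-3.9
            experimental: true
```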

@corona10 (Member)

cc @cfbolz

@cfbolz commented Apr 22, 2023

OK, found it. It's not really a PyPy bug as such (i.e., it would affect all the Python implementations that are counted as having a JIT) but a weird interaction of things.

The direct cause is this pyperf change: psf/pyperf@d5349bb#diff-d1600cb0c5deea10c84136091897ee5086097ac9e35be850235c1f28b7a48316L74

It removes the warmups argument to Runner.__init__. This parameter has not actually done anything in a long time. However, it is still passed into the constructor in three benchmarks:

My suggestion is to simply remove the warmups use in these three benchmarks.
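To illustrate the failure mode with a hedged sketch (stand-in classes, not pyperf's actual code): a keyword parameter that was accepted but ignored becomes a hard TypeError once it is removed from the signature, which is why benchmarks still passing warmups started crashing under pyperf 2.6.0.

```python
class OldRunner:
    # pyperf < 2.6.0 (sketch): `warmups` was accepted but did nothing.
    def __init__(self, warmups=None):
        pass

class NewRunner:
    # pyperf 2.6.0 (sketch): the no-op `warmups` parameter was removed.
    def __init__(self):
        pass

OldRunner(warmups=5)      # fine: the argument is silently ignored
try:
    NewRunner(warmups=5)  # raises TypeError: unexpected keyword argument
except TypeError as exc:
    print("benchmark startup fails:", exc)
```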

@corona10 (Member)

> My suggestion is to simply remove the warmups use in these three benchmarks.

@cfbolz
Thanks for your investigation!!
In that case, it is the "if the benchmark is wrong, the benchmark code should be updated" case. :)

@itamaro
Would you like to follow what @cfbolz suggested? Let's see if it changes the status to green.

@itamaro (Contributor, Author) commented Apr 22, 2023

Thanks @cfbolz , all checks are green now!

@corona10 (Member) left a review:

lgtm

@corona10 corona10 merged commit 5e6d663 into python:main Apr 22, 2023
4 participants