Numpy 2.0 #689
base: main
Conversation
Force-pushed from 28bb96a to 2808634.
Removed the custom Complex C types... I have no context as to why they were necessary, so that definitely needs careful review. This is code that seems to have been there mostly from the beginning. Edit: these seem to be needed to define operations like …
Force-pushed from d83b37d to eca375b.
Also removes special case for old unsupported numpy 1.16
In what context? If it is in array indexing, then this syntax was only added in Python 3.11 (I found out about this recently).
No, this is C(++) code: multiplication of complex scalars. Maybe @michaelosthege finds this interesting xD? These are the breaking changes: https://numpy.org/devdocs/numpy_2_0_migration_guide.html#complex-types-underlying-type-changes
This will help with publishing a repodata patch for conda-forge as per <#688 (comment)>. This will no longer be necessary when we are finished with <#689>.
Friendly note that it will be necessary to revert 1235d08 on the next rebase in order to remove the pin.
Maybe the answer to this last problem and the overflow problem is in NEP 50: https://numpy.org/neps/nep-0050-scalar-promotion.html Another part of the test suite with many failures is the rewriting tests for ops involving random variables (e.g. …).
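For context, a minimal sketch of the overflow behaviour NEP 50 introduces, assuming NumPy >= 2.0 (the literals here are illustrative, not taken from the failing tests):

```python
import numpy as np

# Under NEP 50, a Python int operand is "weak": it is cast to the dtype of the
# NumPy operand (uint8 here), so the addition can wrap around.
val = np.uint8(100) + 200
print(val, val.dtype)  # 44 uint8, typically with a RuntimeWarning about overflow

# A Python int that does not fit the operand's dtype at all should raise.
try:
    np.uint8(1) + 1000
except OverflowError as exc:
    print("out-of-bounds Python int:", exc)
```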
Do you have a branch with the most recent changes? We can start triaging the failing tests and see what's problematic vs inconsequential.
Here is the most recent version: brendan-m-murphy#2
I'll make a list of failing tests here (as I look into them):
I think, based on how the new numpy scalar promotion rules work, … I'm not sure how much this matters. For the loop test, I just increased the precision from 16 to 32, since it didn't matter for that test. And for the autocaster tests, I've just changed the expected value in one case to match what happens with the new numpy scalar promotion rules (https://numpy.org/neps/nep-0050-scalar-promotion.html#impact-on-operators-involving-python-int-float-and-complex). I could revert these and change the autocaster logic to do what happened before numpy 2.0. UPDATE: this seems to be causing problems elsewhere, so I think I will change how the autocaster makes comparisons to match how …
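As a reference for the autocaster expectations, a small sketch of the promotion change under NEP 50 (assuming NumPy >= 2.0; these values are illustrative, not the actual test cases):

```python
import numpy as np

# Python floats are "weak" under NEP 50: they adopt the precision of the NumPy
# operand instead of forcing a float64 result.
x = np.float32(3.0) + 3.0
print(x.dtype)  # float32 with NumPy 2.x (this used to come out as float64)

# NumPy scalars now participate fully in promotion when mixed with arrays,
# so the old value-based downcasting of 0-d operands is gone.
a = np.array([1.0], dtype=np.float32)
print((a + np.float64(1.0)).dtype)  # float64 with NumPy 2.x (previously float32)
```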
Here is the status of the tests on my fork. For Python 3.10: …
For Python 3.12: …
I had to pin numpy to 2.0.2, since there are new failing tests with 2.1.
@brendan-m-murphy Thanks so much for the work and sorry for the delay. It sounds like some promising progress. To your last round of observations:
It doesn't make sense that the test would be passing before and failing now?
That sounds fine, let's set a nonzero atol
I think this was addressed in #972 (Have you rebased on top of it?)
There might have been a recent PR that fixed this?
These sort of changes are some of the trickiest, but tbh also unlikely to affect any users whatsoever. Feel free to mark it as xfail and open a separate issue for us to figure out what should be done.
Did the behavior of numpy.unique change? I think when you do unique of a matrix you get a vector back. I didn't find the test in your log (just skimmed). I'll look at it again when the noise is removed from addressing the previous points.
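For reference, a quick sketch of how np.unique treats a matrix (as far as I can tell this behaviour predates NumPy 2.0; the array below is illustrative):

```python
import numpy as np

m = np.array([[1, 2, 2],
              [3, 1, 3]])

# Without an axis argument, np.unique flattens its input, so a matrix comes
# back as a sorted vector of the distinct values.
print(np.unique(m))          # [1 2 3]

# Passing axis=0 keeps the dimensionality and returns the unique rows instead.
print(np.unique(m, axis=0))  # [[1 2 2]
                             #  [3 1 3]]
```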
@ricardoV94 Sorry for the slow response. This is still on my todo list, but I'm pretty busy at the moment. I'll try to rebase my branch onto main soon and try your suggestions.
Thanks @brendan-m-murphy!
@ricardoV94 I finally got some time to do the rebase this weekend. It didn't go perfectly, so I had to make some fixes afterwards. Are we still aiming for compatibility with numpy < 2? I think we'll need conditional import statements to do that (for instance, …). There are still a few failing tests (e.g. …). Here is the latest: https://github.com/brendan-m-murphy/pytensor/pull/2/checks. (Some of the mypy failures are related to the import statements I mentioned earlier.)
Yes, backward compat would be nice to have.
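A sketch of the kind of conditional import this would need; AxisError is used here only as an illustration (it has lived in numpy.exceptions since NumPy 1.25) and may not be the specific symbol meant above:

```python
# Prefer the new location, and fall back to the top-level name on older NumPy.
try:
    from numpy.exceptions import AxisError
except ImportError:  # NumPy < 1.25
    from numpy import AxisError
```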
@ricardoV94 Sorry for the delay. I started looking at this again last Friday. I'm not sure how I missed this before, but the failing test is …. On my branch, if I run this test with numpy 1.26, it passes, and if I run it with numpy 2.0, it fails (when I use pytest to run the test). If I run the code in a script, then it passes. It's a bit weirder, because for this test, …. Anyway... I will try to find the exact numpy commit where this happens. If you have any ideas, let me know.
An update: this is the numpy PR that first causes this behaviour to occur: numpy/numpy@44ba7ca. I don't know why this is happening, but using numpy after this PR causes …. Prior to this PR, this doesn't happen: the state of the bit generator when …. When running the same code from a script, it seems that …. In more detail, with …. Prior to the PR I linked above, the state of …. However, after that PR, if a test value is computed during …. It seems that replacing …. Hopefully there won't be too much left to fix now.
Re RNGs, we also had some issues with new numpy stuff on PyMC that @lucianopaz looked at. Not sure if it's the same, but it sounds like it might be.
Awesome. Let us know if you're blocked somewhere!
@ricardoV94 There are two numba tests related to RNGs that are failing. The two tests are …. Before, I changed …. I'm having trouble figuring out how to get numba to do a similar copy. The impression I got was that, by the time the function is compiled, it is too late to copy the RNG. I don't know much about numba, so maybe there is a way to make the copy happen as Python code. I tried to git bisect numpy using the …. If I run these tests with …. When running the first test with Jax, it raises a warning saying it will use a copy of the shared RNG, which presumably is what we want to happen with numba. Maybe the code in the Jax linker that replaces shared RNG inputs could be extracted into a helper function (in …). I might try this, but other ideas/advice would be appreciated. Besides that, the only failing tests are the numba benchmarks, due to a change in the latest numba release (the …). This is the latest run of the tests: https://github.com/brendan-m-murphy/pytensor/actions/runs/13055274796/job/36424753373?pr=2
That's incredibly close. Perhaps @aseyboldt or @kc611 know how we can get the deepcopy of the RNG state to work with numba.
Actually, we're just using obj mode for the copy in Numba? See pytensor/pytensor/link/numba/dispatch/random.py, lines 32 to 41 at 911c6a3.
Can't you swap that to use deepcopy instead?
@brendan-m-murphy just to see if I understand, the problem is that while this was true in numpy<2, it is no longer true now?

```python
from copy import copy

import numpy as np

rng = np.random.default_rng(123)
rng_copy = copy(rng)
assert np.isclose(rng.uniform(), rng_copy.uniform())
```
Yes, exactly. I'm not sure they intended this change, but it seems like …
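A minimal sketch of the suggested workaround, assuming the problem is limited to shallow copies: deepcopy still yields an independent Generator with an identical state.

```python
from copy import deepcopy

import numpy as np

rng = np.random.default_rng(123)
rng_copy = deepcopy(rng)  # independent Generator, same internal state

# Both produce the same next draw, and advancing one does not advance the other.
assert np.isclose(rng.uniform(), rng_copy.uniform())
```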
Awesome, that works!
Can we ignore the benchmarks for now? I can try to fix them, but it seems like it would be better for that to be a different PR, since I don't think it depends on the numpy version. I haven't implemented backwards compatibility everywhere, but I can work on it. I made some of the imports backwards compatible and added a warning if the old imports are being used (mostly for debugging, but it would be a hint to anyone who happens to upgrade pytensor without upgrading numpy). Logistically, should I force-push to this PR? Or open a new one? Also, the commit history is a bit of a mess on my branch; I can try to clean it up by rebasing.
A new PR sounds like the best option. We can definitely ignore benchmarks, but are they failing to run or just slow?
They're raising errors due to …
Ah, that's not related to your changes. We should see them in our CI. Unless, because of our numpy pin, we also end up pinning numba to a lower version?
@brendan-m-murphy the numba incompat should be fixed by #1186
Okay, PR is open here: #1194! @ricardoV94 I added numpy 1.26 and 2.1 to CI, so the workflow needs approval.
Description
Ruff suggested a few things to change, but I cannot test because I cannot install both numpy=2.0.0.rc1 and numba (any version, including the most recent 0.59.1) due to incompatibilities in the Python ABI.