Xformers wheel for PyTorch 2.0+cu117 on Windows #5962
Replies: 19 comments 75 replies
-
Ok, so I've edited my launch.py to run "pip3 install numpy --pre torch==2.0.0.dev20221223+cu117 torchvision torchaudio --force-reinstall --extra-index-url https://download.pytorch.org/whl/nightly/cu117", which seems to install correctly. I've extracted the files from the xformers whl and dumped them into "venv\Lib\site-packages", which isn't working out.
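Extracting a wheel by hand skips its metadata, which is likely why the drop-in copy isn't working; pip can install the downloaded .whl straight into the venv instead. A minimal sketch, assuming the wheel was downloaded into the webui folder (the filename below is an example, not the exact one):

```shell
# From the stable-diffusion-webui folder, with the venv active
# (on Windows: venv\Scripts\activate).
# Install the downloaded xformers wheel directly instead of unzipping it:
pip install xformers-0.0.15+e163309.d20221224-cp310-cp310-win_amd64.whl
```

pip then records the package properly in site-packages, so later `pip list` and dependency checks see it.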
-
Is this safe to use with Python 3.10.6?
-
Where can I find the required steps to build the latest PyTorch and xformers for Python 3.10.6 on Windows 10?
-
I'm a little off topic, but I decided to do this on Linux. I installed torch 2.0 with Python 3.10.8, built a new virtual environment, and started it (without the xformers argument yet). I was wondering: would this error go away if I build and install xformers?
-
Dear all, since many of you have requested a Python 3.10 wheel, here it is:
-
Constant 30-33 it/s on my 4090 using torch 2.0 and your Python 3.10 xformers whl. It was a varying 25-29 before, so it's great.
-
@aliencaocao I think you need to add an extra instruction to remove the dev string in torch's version.py after the update to 2.0; otherwise it breaks face restoration from CodeFormer entirely. See the triton error below.
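The dev-string problem comes from version parsers that choke on nightly strings like 2.0.0.dev20221223+cu117. A hedged sketch of the kind of edit being suggested; the exact file torch records its version in may vary between builds, so this only shows the string transformation itself:

```python
import re

def strip_dev_suffix(version: str) -> str:
    """Drop the '.devYYYYMMDD' and '+cuXXX' parts from a torch version
    string, leaving a plain release version that naive parsers accept."""
    return re.split(r"\.dev|\+", version, maxsplit=1)[0]

print(strip_dev_suffix("2.0.0.dev20221223+cu117"))  # -> 2.0.0
```

Applying this to the `__version__` line in torch's version.py is what makes downstream checks (like CodeFormer's) stop failing on the nightly suffix.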
-
I just noticed that the instructions for 3.10 have been added, and when I tried to build and install again, I received an error saying that the CUDA versions are different and should be matched. I downgraded to 11.7 as per the instructions, but I thought it a bit strange that I did not get this error yesterday, so I am writing it down here; apparently it is more complicated than I thought 🤔.
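That mismatch error usually means the CUDA toolkit's nvcc and the CUDA version torch was built against disagree. A small helper for comparing the two strings; the sample nvcc output below is illustrative, so check it against your local `nvcc --version` and `torch.version.cuda`:

```python
import re

def cuda_release(nvcc_output: str) -> str:
    """Extract the 'release X.Y' version from `nvcc --version` output."""
    m = re.search(r"release (\d+\.\d+)", nvcc_output)
    return m.group(1) if m else ""

# Illustrative output from a CUDA 11.7 toolkit install:
sample = "Cuda compilation tools, release 11.7, V11.7.99"
toolkit = cuda_release(sample)
torch_cuda = "11.7"  # what torch.version.cuda reports in a cu117 build
print(toolkit == torch_cuda)  # -> True when the versions match
```

If the two differ, the build error above is expected; downgrading (or upgrading) the toolkit until they agree is the fix the instructions describe.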
-
Ok, is there any chance we can install this on a git pull update from AUTOMATIC1111?
-
Awesome stuff; seeing about an 8 it/s improvement on my 4090. Don't forget to copy your updated cuDNN libs into the torch folder when you finish all the other steps above.
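The cuDNN step above amounts to overwriting the DLLs bundled inside torch's lib folder with newer ones. A hedged sketch of that copy; both example paths are assumptions and depend on where cuDNN was unpacked on your machine:

```python
from pathlib import Path
import shutil

def copy_cudnn_dlls(cudnn_bin: Path, torch_lib: Path) -> list:
    """Copy every cudnn*.dll from a cuDNN bin directory into torch's lib
    directory, overwriting the bundled copies. Returns the copied names."""
    copied = []
    for dll in sorted(cudnn_bin.glob("cudnn*.dll")):
        shutil.copy2(dll, torch_lib)
        copied.append(dll.name)
    return copied

# Example paths (hypothetical -- adjust to your machine):
# copy_cudnn_dlls(Path(r"C:\tools\cudnn\bin"),
#                 Path(r"venv\Lib\site-packages\torch\lib"))
```

Doing this after every torch reinstall matters, since a reinstall restores the older bundled DLLs.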
-
Everyone in this thread should check https://pytorch.org/blog/compromised-nightly-dependency/ |
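For anyone affected: the linked post concerns a malicious torchtriton dependency pulled in by nightly pip installs in late December 2022. Paraphrasing the remediation it describes (see the blog itself for the authoritative steps):

```shell
# If you pip-installed a torch nightly around Dec 25-30, 2022, remove the
# affected packages and clear pip's cache before reinstalling:
pip3 uninstall -y torch torchvision torchaudio torchtriton
pip3 cache purge
```

After that, reinstalling from the current nightly index picks up the cleaned packages.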
-
Is there a step-by-step guide for dummies on this?
-
I installed it last week and it worked well; I got around 7% better performance on a 3070. Now I tried to install it again, but that nightly build of torch seems to be missing from https://download.pytorch.org/whl/nightly/cu117 (related to the hacking incident?). Now I get this error:
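One way to see which nightly builds the index still serves is the impossible-pin trick: pinning an empty version makes pip fail with a "from versions: ..." message listing everything available. A sketch, assuming a reasonably recent pip:

```shell
# The deliberately unsatisfiable pin "torch==" makes pip print every
# version the nightly index actually serves in its error message:
pip3 install --pre torch== --extra-index-url https://download.pytorch.org/whl/nightly/cu117
```

If the dated build from the instructions is absent from that list, switching the pin to the nearest available date is the usual workaround.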
-
Here's the latest whl, compiled on xformers commit #6cd1b36 from today, with PyTorch 2 from yesterday (01062023), Python 3.10.6, CUDA 11.7, Windows 11 22H2, and an RTX 4090: Tested working on an auto1111 env with the same latest PyTorch nightly. Getting 33-35 it/s for a single 512*512 euler_a img on a 4090, as expected. And a bonus, VirusTotal scans for the paranoid: Original uploaded version by @aliencaocao: Both have the exact same generic PUA flag, but that's nothing.
-
Update: when I try to install PT2 now, it won't install with CUDA enabled. Anyone have any insights? I also tried to pip install the version I used a few weeks back, and still no luck.
-
Xformers wheel compiled on PT2 with cu118 from today (26-01-23): built on Win 11 22H2, Python 3.10.9, CUDA 11.8, and an RTX 4090.
-
Xformers v17, compiled on PT2 from 06-02-23, on Win 11, Python 3.10.9, CUDA 11.8, and a 4090:
-
24.) Automatic1111 Web UI - PC - Free (this method installs the latest CUDA DLL files too)
-
Built with:
Windows 10
Python 3.9.13 (Python 3.10.9 also available below)
CUDA 11.8
torch 2.0.0.dev20221223+cu117 (latest Torch 2.0 dev on 23 Dec)
I have built the latest xformers master (facebookresearch/xformers@e163309) on PyTorch 2.0. For anyone who would like to try it out and see if there are performance improvements (especially with torch 2.0), you can download it here:
Python 3.9: https://1drv.ms/u/s!AvJPuRJUdWx_8hbpWdFpr234H5e_?e=eScCID
Python 3.10: https://1drv.ms/u/s!AvJPuRJUdWx_8hi5p0ecOiV3yJtT?e=Nd6rE5
You will need to install this version of pytorch:
Python 3.9 wheel:
pip3 install numpy --pre torch==2.0.0.dev20221223+cu117 torchvision torchaudio --force-reinstall --extra-index-url https://download.pytorch.org/whl/nightly/cu117
Python 3.10 wheel:
pip3 install numpy --pre torch==2.0.0.dev20221224+cu117 torchvision torchaudio --force-reinstall --extra-index-url https://download.pytorch.org/whl/nightly/cu117
I am running an RTX 3080 Ti, and I am seeing a notable speed improvement just by drop-in replacing torch 2.0 and the new xformers built with it: from 7.5 it/s to 8.1 it/s (768x768, Euler).
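After the pip command and the wheel install, a quick import check confirms both packages actually landed in the venv. A generic helper (nothing torch-specific; the module names are just passed in):

```python
import importlib

def check_installed(names):
    """Try importing each module; return {name: version string or None}."""
    out = {}
    for name in names:
        try:
            mod = importlib.import_module(name)
            out[name] = getattr(mod, "__version__", "unknown")
        except ImportError:
            out[name] = None
    return out

# After the installs above, both of these should map to a version string:
print(check_installed(["torch", "xformers"]))
```

A None entry means the corresponding install step didn't take; the xformers version string is also how you can compare against the wheel's reported version below.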
NOTE: Although the version of my wheel displays as 0.0.15+e163309.d20221224, it is actually newer than all the current 0.0.16.dev wheels. The 0.0.16 ones are just the folks at xformers temporarily bumping the version ahead of the build, which I did not do.