VRAM does not get released #724
D:\Fooocus_webui>.\python_embeded\python.exe -s Fooocus\entry_with_update.py
To create a public link, set
After generating the first image, the VRAM usage is 21.5 GB. During the generation of the second image, the VRAM usage increases to 23.5 GB. After both images are generated, the VRAM usage remains at 23.5 GB.
will take a look soon
with --disable-smart-memory, |
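As a sketch of the workaround mentioned above: the flag would be appended to the launch command shown in the log (the path is the reporter's own install directory; whether this flag fully prevents the VRAM growth on this version is not confirmed in the thread):

```shell
:: Windows cmd sketch, not a confirmed fix: the launcher from the log above,
:: with the --disable-smart-memory flag from the thread appended.
.\python_embeded\python.exe -s Fooocus\entry_with_update.py --disable-smart-memory
```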
fixed in 2.1.699. |
Please remove |
2.1.699 is ok |
I don't know when this started.
After generating an image, the VRAM does not get released (21.5/24 GB on a 4090), and subsequent image generation becomes very slow because VRAM is full. I'm using the Windows compressed package "Fooocus_win64_2-1-60.7z" on Windows 10 with driver version 531.61.