I've got a 3090 and a 4070 Super hooked up. No matter which PCIe slot I put them in, Tortoise loads my 4070 Super: "Found GPU NVIDIA GeForce RTX 4070 Super". When I try to generate, I get:
"Possible latent mismatch: click the "(Re)Compute Voice Latents" button and then try again. Error: CUDA out of memory. Tried to allocate 92.00 MiB. GPU 0 has a total capacty of 11.99 GiB of which 0 bytes is free. Of the allocated memory 7.53 GiB is allocated by PyTorch, and 246.95 MiB is reserved by PyTorch but unallocated."
Is that normal? I can generate fine if I take out the 4070 Super and reboot my PC, but then I can't train DeepFaceLab or run Stable Diffusion in the background like I want.
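For reference, a minimal way to check which card PyTorch sees first and to mask it down to one GPU is `CUDA_VISIBLE_DEVICES` (this is a generic PyTorch sketch, not Tortoise-specific; the index `0` for the 3090 is an assumption, since enumeration order varies by machine and can be checked with `nvidia-smi`):

```python
# Must be set before torch initializes CUDA, so do it first.
import os
os.environ["CUDA_VISIBLE_DEVICES"] = "0"  # assumed index of the 3090; verify with `nvidia-smi`

import torch

print(torch.cuda.device_count())      # should report 1 with the mask above
print(torch.cuda.get_device_name(0))  # should print "NVIDIA GeForce RTX 3090"
```

Setting the same environment variable in the shell before launching Tortoise should have the same effect, since the process then only ever sees the one card.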