Which GPU should I buy for ComfyUI
This is a tier list of the consumer GPUs I would recommend for use with ComfyUI.
In AI the most important thing is the software stack, which is why the list is ranked this way.
Nvidia

All Nvidia GPUs from the last 10 years (since Maxwell) are supported in pytorch and work very well.
The 3000 series and above are recommended for best performance. More VRAM is always preferable.
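A quick way to confirm that pytorch actually sees your Nvidia GPU and to check how much VRAM it has is a minimal sketch like the following (the `describe_gpu` helper name is my own, not part of ComfyUI; on a ROCm build of pytorch, AMD cards also show up through the `torch.cuda` API):

```python
# Minimal sketch: check whether pytorch sees a CUDA-capable GPU
# and report its VRAM, since more VRAM is always preferable.
try:
    import torch
except ImportError:
    torch = None  # pytorch not installed; fall back gracefully


def describe_gpu():
    """Return a short description of the first CUDA device, or a fallback note."""
    if torch is None or not torch.cuda.is_available():
        return "no CUDA device available"
    props = torch.cuda.get_device_properties(0)
    vram_gb = props.total_memory / (1024 ** 3)
    return f"{props.name} ({vram_gb:.1f} GB VRAM)"


print(describe_gpu())
```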
AMD (Linux)

Officially supported in pytorch. Cards that are officially supported by ROCm work well, but they are slow compared to price-equivalent Nvidia GPUs, mainly because consumer GPUs lack an optimized implementation of torch.nn.functional.scaled_dot_product_attention.
Unsupported cards can be a real pain to get running.
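For reference, this is the attention call in question. It is a standard pytorch API; on Nvidia it dispatches to fused kernels, while on consumer AMD GPUs it falls back to a slower unoptimized path. The tensor sizes below are arbitrary illustration values, and the `sdpa_demo` wrapper is my own:

```python
def sdpa_demo():
    """Run one scaled_dot_product_attention call on small random tensors."""
    try:
        import torch
        import torch.nn.functional as F
    except ImportError:
        return None  # pytorch not installed

    # Layout expected by the API: (batch, heads, seq_len, head_dim)
    q = torch.randn(1, 8, 16, 64)
    k = torch.randn(1, 8, 16, 64)
    v = torch.randn(1, 8, 16, 64)
    out = F.scaled_dot_product_attention(q, k, v)
    return tuple(out.shape)  # output shape matches q


print(sdpa_demo())
```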
Apple Silicon (Mac)

Officially supported in pytorch. It works, but Apple loves randomly breaking things with OS updates.
Intel

It works, but it requires a custom pytorch extension, and there are sometimes weird issues.
I expect things to improve over time, especially once Intel GPUs are officially supported in pytorch.
AMD (Windows)

It requires a pytorch extension (pytorch-directml) or a custom ZLUDA pytorch build.
You will have a painful experience.
Things might improve in the future once pytorch ROCm works on Windows.
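The backends discussed in the sections above can be probed from one place. Here is a hedged sketch of picking a torch device string across them: CUDA (Nvidia, and AMD via ROCm, which pytorch also exposes as "cuda"), XPU (Intel), and MPS (Apple Silicon). DirectML needs the separate torch-directml package and is not covered here; the helper names and the preference order are my own choices, not ComfyUI's logic:

```python
def pick_device_name(cuda_ok, mps_ok, xpu_ok):
    """Pure helper: map availability flags to a torch device string."""
    if cuda_ok:
        return "cuda"  # Nvidia, or AMD via ROCm
    if xpu_ok:
        return "xpu"   # Intel
    if mps_ok:
        return "mps"   # Apple Silicon
    return "cpu"


def pick_device():
    """Probe the installed pytorch build and return a device string."""
    try:
        import torch
    except ImportError:
        return "cpu"  # pytorch not installed
    cuda_ok = torch.cuda.is_available()
    mps_ok = (getattr(torch.backends, "mps", None) is not None
              and torch.backends.mps.is_available())
    xpu_ok = hasattr(torch, "xpu") and torch.xpu.is_available()
    return pick_device_name(cuda_ok, mps_ok, xpu_ok)


print(pick_device())
```

The `hasattr`/`getattr` guards matter because older pytorch builds do not expose `torch.xpu` or `torch.backends.mps` at all.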
Qualcomm (Snapdragon)

Pytorch doesn't work at all.
Some quotes from someone with knowledge of the hardware and software stack: "Avoid", "Nothing works", "Worthless for any AI use".