Is Multiple-GPU support available now? #2467
Comments
An update on this would be really appreciated. I am trying to run inference on a dataset with more than 300k entries. Each instance takes about 1.05 seconds and uses only about 10% of one of the two P5000s. Please give an update on this.
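While waiting for official support, a possible workaround is to shard the dataset and run one inference process per GPU with plain `torch.multiprocessing`. This is only a minimal sketch; `load_model`, `split_dataset`, and `dataset` are hypothetical placeholders, not part of this library's API.

```python
import torch
import torch.multiprocessing as mp

def worker(rank, shards, return_dict):
    # Each process pins itself to one GPU and predicts its own shard.
    device = torch.device(f"cuda:{rank}")
    model = load_model().to(device)   # hypothetical loader for the trained model
    model.eval()
    outputs = []
    with torch.no_grad():
        for batch in shards[rank]:
            outputs.append(model(batch.to(device)).cpu())
    return_dict[rank] = outputs

if __name__ == "__main__":
    n_gpus = torch.cuda.device_count()        # e.g. 2 for the two P5000s
    shards = split_dataset(dataset, n_gpus)   # hypothetical: one list of batches per GPU
    manager = mp.Manager()
    return_dict = manager.dict()
    mp.spawn(worker, args=(shards, return_dict), nprocs=n_gpus, join=True)
    predictions = [p for rank in range(n_gpus) for p in return_dict[rank]]
```

This only parallelizes across independent inputs, so it helps throughput on large datasets but does nothing for a single example that is too big for one GPU.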
Yeah, any update on this would be really helpful.
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
Any updates on this issue? ^^
There is a PR for Lightning Lite integration (#2700) that will add this feature; it should be merged soon-ish!
Is your feature/enhancement request related to a problem? Please describe.
I have 2 GPUs, but only 1 is being used, and with that one I get a CUDA out-of-memory error. I am hoping that leveraging both GPUs will improve the situation.
Describe the solution you'd like
Allow more than 1 GPU to be specified.
Additional context
#389 was closed as a duplicate of #37, but #37 was closed without addressing #389, with only some hand-waving that GPU support is better now (circa 2020).
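For reference, in plain PyTorch the simplest way to spread work over both GPUs is `torch.nn.DataParallel`, which splits each batch across devices. A minimal sketch, assuming the out-of-memory error comes from batch size rather than model size, with `MyModel` as a hypothetical placeholder:

```python
import torch
import torch.nn as nn

model = MyModel()  # hypothetical model class
if torch.cuda.device_count() > 1:
    # Replicates the model on every visible GPU and splits each input
    # batch along dim 0, so each GPU only holds part of the activations.
    model = nn.DataParallel(model)
model = model.to("cuda")

inputs = torch.randn(64, 128).to("cuda")   # batch is scattered across the GPUs
with torch.no_grad():
    outputs = model(inputs)                # results are gathered back on cuda:0
```

Note that `DataParallel` still keeps a full copy of the model on each GPU, so it will not help if a single model replica already exceeds one GPU's memory.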