T5-base goes out of memory on 4 GPUs with a batch size as small as 4 #9311
Here are things you may try (they are unrelated to each other, so you can try in any order that resonates):
And please edit your Issue to show the command line you use, so we can see what command-line args and/or hyperparameters you're using.
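For illustration only (this is not the list of suggestions from the comment above, which isn't reproduced here), below is a minimal sketch of the memory-reducing `TrainingArguments` knobs that are commonly tried for CUDA OOM: a smaller per-device batch size, gradient accumulation, and fp16. The `DummySeq2SeqDataset`, output path, and hyperparameter values are placeholders, not details taken from this issue:

```python
from torch.utils.data import Dataset
from transformers import T5ForConditionalGeneration, T5Tokenizer, Trainer, TrainingArguments


class DummySeq2SeqDataset(Dataset):
    """Tiny placeholder dataset so the sketch runs end to end; replace with real data."""

    def __init__(self, tokenizer, size=16):
        enc = tokenizer("translate English to German: hello world", return_tensors="pt")
        lab = tokenizer("hallo welt", return_tensors="pt")
        self.example = {
            "input_ids": enc["input_ids"].squeeze(0),
            "attention_mask": enc["attention_mask"].squeeze(0),
            "labels": lab["input_ids"].squeeze(0),
        }
        self.size = size

    def __len__(self):
        return self.size

    def __getitem__(self, idx):
        return self.example


tokenizer = T5Tokenizer.from_pretrained("t5-base")
model = T5ForConditionalGeneration.from_pretrained("t5-base")

# Common memory-reducing knobs for CUDA OOM:
# - per_device_train_batch_size is per GPU; the effective batch is
#   this value * number of GPUs * gradient_accumulation_steps
# - gradient_accumulation_steps keeps the effective batch size while lowering per-step memory
# - fp16 (mixed precision, GPU only) roughly halves activation memory
training_args = TrainingArguments(
    output_dir="./t5_base_oom_debug",  # hypothetical path
    per_device_train_batch_size=1,     # start small and raise until memory allows
    gradient_accumulation_steps=8,
    fp16=True,
    num_train_epochs=1,
    logging_steps=1,
)

trainer = Trainer(model=model, args=training_args, train_dataset=DummySeq2SeqDataset(tokenizer))
trainer.train()
```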
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread. Please note that issues that do not follow the contributing guidelines are likely to be ignored.
Environment info
transformers version: 3.5.1
Who can help
Trainer: @sgugger
T5: @patrickvonplaten
examples/seq2seq: @patil-suraj
Information
Model I am using: T5-base, with a batch size of 8 on 4 GPUs. I always get out of memory, even with small batch sizes. This looks like a bug, as this model is not really big. I am under time pressure. Is there anyone who could help me with this bug? Thanks.
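For context, one rough way to see how much memory a single t5-base training step needs on one GPU, independent of the multi-GPU setup, is a single forward/backward pass. The sequence lengths, example text, and the check itself below are illustrative assumptions, not details from this issue:

```python
import torch
from transformers import T5ForConditionalGeneration, T5Tokenizer

# Purely illustrative: measure what one forward/backward step of t5-base needs
# on a single GPU, independent of the multi-GPU training script (not shown here).
device = torch.device("cuda:0")
tokenizer = T5Tokenizer.from_pretrained("t5-base")
model = T5ForConditionalGeneration.from_pretrained("t5-base").to(device)

batch_size, src_len, tgt_len = 4, 512, 128  # assumed lengths; long sequences dominate memory
enc = tokenizer(["translate English to German: hello world"] * batch_size,
                padding="max_length", max_length=src_len, truncation=True,
                return_tensors="pt")
dec = tokenizer(["hallo welt"] * batch_size,
                padding="max_length", max_length=tgt_len, truncation=True,
                return_tensors="pt")

# Note: for real training, pad token ids in the labels would normally be replaced
# with -100 so they are ignored by the loss; kept simple here for the memory check.
outputs = model(input_ids=enc["input_ids"].to(device),
                attention_mask=enc["attention_mask"].to(device),
                labels=dec["input_ids"].to(device),
                return_dict=True)
outputs.loss.backward()

print(f"peak GPU memory: {torch.cuda.max_memory_allocated(device) / 2**30:.2f} GiB")
```

If one step at realistic sequence lengths already approaches the card's capacity, the OOM on 4 GPUs is likely expected behaviour rather than a bug.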
The tasks I am working on are:
Error Stack