Uploading/Sharing large models to HuggingFace #7398
Hi,

I am trying to upload a t5-3b based model to HuggingFace. The folder to upload is 11 GB. When I upload it, it fails with 'Connection aborted.', BrokenPipeError(32, 'Broken pipe'). Is this because the model is too large and there is a size limit? How can I deal with that?

Thank you for your help!

Comments

There is no limit on file sizes on the model hub. However, for an upload that large, even a slightly unstable connection can indeed make it fail. If you have another host (an S3 bucket or whatever) you can upload the file to, I can handle it.

Actually, it aborts at the very beginning of the upload process for the large file every time. All my other, smaller models upload smoothly, so I don't think it is a network issue on my side.

By the way, we have a public Google Cloud Storage host. Does that work for you if I am still not able to upload the model?

I can indeed reproduce this. For now, can you upload to a GCS or S3 bucket and post the URL here, and I'll cp the file? I'll take a note to investigate/fix this in the future.

Could you help us cp these two models to our organization? Thank you very much for your help!

Here you go: https://huggingface.co/castorini

Will close this for now, but we are tracking the "large file upload" issue internally.
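For anyone hitting the same problem, below is a minimal sketch of pushing a large model folder with the standalone huggingface_hub client. That client postdates this thread and is an assumption here, not the upload path the reporter was using; the repo id and local directory are placeholders.

```python
# Sketch only: upload a large local model folder with huggingface_hub.
# Assumes `pip install huggingface_hub` and a prior `huggingface-cli login`.
from huggingface_hub import HfApi

api = HfApi()

repo_id = "your-org/your-t5-3b-model"   # hypothetical repo id
local_dir = "./t5-3b-finetuned"         # hypothetical local folder (~11 GB)

# Create the target repo if it does not exist yet.
api.create_repo(repo_id=repo_id, repo_type="model", exist_ok=True)

# Upload the whole folder; large files go through the Hub's LFS upload path,
# which sends them in parts instead of one long HTTP request.
api.upload_folder(
    folder_path=local_dir,
    repo_id=repo_id,
    repo_type="model",
)
```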
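And a sketch of the workaround agreed on above, i.e. staging the checkpoint in a Google Cloud Storage bucket and sharing the URL so someone else can cp it. The bucket and object names are placeholders, and the google-cloud-storage package is assumed to be installed and authenticated.

```python
# Sketch only: stage a large checkpoint in a GCS bucket and print its URL.
# Bucket/object names are placeholders; assumes `pip install google-cloud-storage`
# and that application-default credentials are configured.
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("your-public-bucket")            # hypothetical bucket
blob = bucket.blob("models/t5-3b/pytorch_model.bin")    # hypothetical object key

# Setting a chunk size makes the upload chunked and resumable, so a flaky
# connection retries a chunk instead of aborting the whole 11 GB transfer.
blob.chunk_size = 32 * 1024 * 1024  # 32 MB, a multiple of 256 KB as required

blob.upload_from_filename("./t5-3b-finetuned/pytorch_model.bin")

print(f"gs://{bucket.name}/{blob.name}")
```

From there, a public or signed URL to the object is enough for whoever copies the file over.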