
File size limit increase for vLLM #15922

Closed
pcmoritz opened this issue May 7, 2024 · 2 comments
pcmoritz commented May 7, 2024

Dear all,

We are developing a library for high-throughput LLM serving (https://github.com/vllm-project/vllm) and are running into the 100 MB limit. Could you increase it to 200 MB for us? Unfortunately, the wheels need to be somewhat larger than usual because they bundle GPU code and kernels that must be compiled for several different architectures.

Thanks a lot for your help!

cc @WoosukKwon @zhuohan123 @simon-mo
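
For context, a minimal sketch of checking built wheel sizes against PyPI's default 100 MB per-file limit and the requested 200 MB limit, assuming the wheels are written to a local dist/ directory (the directory name and byte base are assumptions, not taken from this thread):

```python
from pathlib import Path

# PyPI's default per-file limit as stated in this issue; the request is for 200 MB.
DEFAULT_LIMIT_MB = 100
REQUESTED_LIMIT_MB = 200


def check_wheel_sizes(dist_dir: str = "dist") -> None:
    """Print each built wheel's size and whether it fits under the limits."""
    for wheel in sorted(Path(dist_dir).glob("*.whl")):
        # Treat the limit as MiB (1024 * 1024 bytes); the exact base is not
        # specified in the thread, so adjust if PyPI counts decimal megabytes.
        size_mb = wheel.stat().st_size / (1024 * 1024)
        if size_mb <= DEFAULT_LIMIT_MB:
            status = "within the default limit"
        elif size_mb <= REQUESTED_LIMIT_MB:
            status = "needs the requested 200 MB limit"
        else:
            status = "exceeds even the requested limit"
        print(f"{wheel.name}: {size_mb:.1f} MB ({status})")


if __name__ == "__main__":
    check_wheel_sizes()
```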


simon-mo commented May 8, 2024

🙏 We also have a support ticket opened here: pypi/support#3792

di (Member) commented May 8, 2024

Closing as a duplicate of pypi/support#3792.

Please don't intentionally create duplicate issues.

di closed this as completed May 8, 2024