[Bug]: vllm.engine.async_llm_engine.AsyncEngineDeadError: Background loop has errored already. #6361
Comments
We have a tracking issue (#5901) for this. Please provide more details there so we can better troubleshoot the underlying cause.
Faced the same issue today. I was running a script calling the API concurrently, and the server produced the same error after about 15-20k requests.
Faced the same problem; the lowest number of requests before failure was 4,000, and the highest was 45,000.
This issue has been automatically marked as stale because it has not had any activity within 90 days. It will be automatically closed if no further activity occurs within 30 days. Leave a comment if you feel this issue should remain open. Thank you!
This issue has been automatically closed due to inactivity. Please feel free to reopen if you feel it is still relevant. Thank you!
Your current environment
🐛 Describe the bug
Running 32 concurrent, large inference requests triggers the error.
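The reports above all involve a client issuing many concurrent API calls until the engine's background loop dies. A minimal sketch of such a stress client is shown below; the endpoint URL, payload fields, and the `send` callable are illustrative assumptions, not details taken from this issue.

```python
import asyncio

# Hypothetical endpoint for a vLLM OpenAI-compatible server (assumption).
API_URL = "http://localhost:8000/v1/completions"

async def run_stress_test(send, total_requests=100, concurrency=32):
    """Fire `total_requests` calls via `send`, at most `concurrency` in flight.

    `send` is any async callable that takes a request payload dict; in a real
    test it would POST the payload to API_URL (e.g. with aiohttp). Returns a
    list of (request_index, exception) pairs for the calls that failed.
    """
    sem = asyncio.Semaphore(concurrency)  # cap in-flight requests
    errors = []

    async def one(i):
        async with sem:
            try:
                # Illustrative payload shape; adapt to the real API schema.
                await send({"prompt": f"request {i}", "max_tokens": 16})
            except Exception as exc:
                # Once AsyncEngineDeadError is raised server-side, every
                # subsequent request fails, so failures cluster at the tail.
                errors.append((i, exc))

    await asyncio.gather(*(one(i) for i in range(total_requests)))
    return errors
```

In the reports above, such a loop succeeds for thousands of requests and then every call starts failing once the background loop errors, which is why the failure counts (4k-45k) vary so widely between runs.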