Issues: BerriAI/litellm


Issues list

[Bug]: Model hub not showing customized costs (labels: bug, mlops user request)
#8573 opened Feb 16, 2025 by xmcp

[Bug]: x-litellm-cache-key header not being returned on cache hit (labels: bug)
#8570 opened Feb 16, 2025 by mirodrr2

[Bug]: Can't stream Deepseek on Vertex AI Model Garden (labels: bug)
#8564 opened Feb 15, 2025 by emorling

[Bug]: function_to_dict() fails enums (labels: bug)
#8539 opened Feb 14, 2025 by vlerenc

Cached / multimodal token are not passed through to Langfuse (labels: bug)
#8515 opened Feb 13, 2025 by hassiebp

[Feature]: Set currency by env variable
#8513 opened Feb 13, 2025 by Mte90

Ollama Server error '502 Bad Gateway'
#8510 opened Feb 13, 2025 by xvshiting

[Bug]: Message Redaction on OTEL impacts logging output on s3 (labels: bug)
#8489 opened Feb 12, 2025 by ishaan-jaff

Regional wildcard for AWS/GCP/Azure models
#8486 opened Feb 12, 2025 by tcyran

[Bug]: model health checks failed (labels: bug, mlops user request)
#8483 opened Feb 12, 2025 by saeya211