Releases: BerriAI/litellm

v1.66.3.dev1

18 Apr 02:27

What's Changed

  • [Feat] Unified Responses API - Add Azure Responses API support by @ishaan-jaff in #10116 (see the sketch after this list)
  • UI: Make columns resizable/hideable in Models table by @msabramo in #10119
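
The Azure Responses API support from #10116 routes OpenAI-style `/v1/responses` requests through the proxy to an Azure deployment. The sketch below is illustrative only: it assumes a proxy on `localhost:4000`, a hypothetical `azure-gpt-4o` model alias mapped to an Azure deployment in your proxy config, and `sk-1234` standing in for your key.

```shell
# Minimal sketch: call the unified Responses API through the LiteLLM proxy.
# "azure-gpt-4o" is a hypothetical model alias you would map to an Azure
# deployment in your proxy configuration; replace sk-1234 with your own key.
curl http://localhost:4000/v1/responses \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-1234" \
  -d '{
    "model": "azure-gpt-4o",
    "input": "Say hello from the proxy."
  }'
```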

Full Changelog: v1.66.2.dev1...v1.66.3.dev1

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.66.3.dev1
```
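
Once the container is running, you can smoke-test the proxy with an OpenAI-compatible chat completion request. This is a minimal sketch, not part of the release notes: `sk-1234` stands in for your master key and `gpt-4o` for a model alias configured on your proxy (with `STORE_MODEL_IN_DB=True` you will typically also point the proxy at a database via `DATABASE_URL`).

```shell
# Hypothetical smoke test against the proxy started above.
# Replace sk-1234 with your LITELLM_MASTER_KEY and gpt-4o with a model
# alias you have actually configured.
curl http://localhost:4000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-1234" \
  -d '{
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "ping"}]
  }'
```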

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|------|--------|---------------------------|----------------------------|------------|------------|---------------|---------------|------------------------|------------------------|
| /chat/completions | Passed ✅ | 180.0 | 210.74489628810068 | 6.401988824678471 | 0.003341330284278951 | 1916 | 1 | 38.52582800004711 | 5506.760536000002 |
| Aggregated | Passed ✅ | 180.0 | 210.74489628810068 | 6.401988824678471 | 0.003341330284278951 | 1916 | 1 | 38.52582800004711 | 5506.760536000002 |

v1.66.3-nightly

17 Apr 20:58

What's Changed

Full Changelog: v1.66.2-nightly...v1.66.3-nightly

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.66.3-nightly
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|------|--------|---------------------------|----------------------------|------------|------------|---------------|---------------|------------------------|------------------------|
| /chat/completions | Failed ❌ | 250.0 | 302.3290337319068 | 6.097097387542003 | 0.04679789661490572 | 1824 | 14 | 218.4401190000358 | 5459.562037000012 |
| Aggregated | Failed ❌ | 250.0 | 302.3290337319068 | 6.097097387542003 | 0.04679789661490572 | 1824 | 14 | 218.4401190000358 | 5459.562037000012 |

v1.66.2.dev1

17 Apr 20:36

What's Changed

Full Changelog: v1.66.2-nightly...v1.66.2.dev1

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.66.2.dev1
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|------|--------|---------------------------|----------------------------|------------|------------|---------------|---------------|------------------------|------------------------|
| /chat/completions | Passed ✅ | 200.0 | 242.7078390639904 | 6.1689738182726535 | 0.0 | 1844 | 0 | 181.44264199997906 | 6553.659710999966 |
| Aggregated | Passed ✅ | 200.0 | 242.7078390639904 | 6.1689738182726535 | 0.0 | 1844 | 0 | 181.44264199997906 | 6553.659710999966 |

v1.66.2-nightly

17 Apr 05:43
47e811d

What's Changed

New Contributors

Full Changelog: v1.66.1-nightly...v1.66.2-nightly

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.66.2-nightly
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|------|--------|---------------------------|----------------------------|------------|------------|---------------|---------------|------------------------|------------------------|
| /chat/completions | Passed ✅ | 190.0 | 244.45508035967268 | 6.136194497326665 | 0.0 | 1835 | 0 | 169.77143499997283 | 8723.871383000016 |
| Aggregated | Passed ✅ | 190.0 | 244.45508035967268 | 6.136194497326665 | 0.0 | 1835 | 0 | 169.77143499997283 | 8723.871383000016 |

v1.66.1-nightly

15 Apr 06:05

What's Changed

Full Changelog: v1.66.0-nightly...v1.66.1-nightly

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.66.1-nightly
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|------|--------|---------------------------|----------------------------|------------|------------|---------------|---------------|------------------------|------------------------|
| /chat/completions | Passed ✅ | 220.0 | 243.74385918230334 | 6.268015361621096 | 0.0 | 1876 | 0 | 197.45038600001408 | 3855.600032000012 |
| Aggregated | Passed ✅ | 220.0 | 243.74385918230334 | 6.268015361621096 | 0.0 | 1876 | 0 | 197.45038600001408 | 3855.600032000012 |

v1.66.0-stable

13 Apr 05:51

What's Changed

New Contributors

Full Changelog: v1.65.8-nightly...v1.66.0-stable

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:litellm_stable_release_branch-v1.66.0-stable
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|------|--------|---------------------------|----------------------------|------------|------------|---------------|---------------|------------------------|------------------------|
| /chat/completions | Passed ✅ | 250.0 | 282.9933559715544 | 5.995117478456652 | 0.0 | 1793 | 0 | 223.97943800001485 | 5176.803935999998 |
| Aggregated | Passed ✅ | 250.0 | 282.9933559715544 | 5.995117478456652 | 0.0 | 1793 | 0 | 223.97943800001485 | 5176.803935999998 |

v1.66.0-nightly

13 Apr 05:19

What's Changed

New Contributors

Full Changelog: v1.65.8-nightly...v1.66.0-nightly

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.66.0-nightly
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|------|--------|---------------------------|----------------------------|------------|------------|---------------|---------------|------------------------|------------------------|
| /chat/completions | Passed ✅ | 230.0 | 252.49209995793416 | 6.279241190720279 | 0.0 | 1878 | 0 | 200.85592700002053 | 5135.250711999987 |
| Aggregated | Passed ✅ | 230.0 | 252.49209995793416 | 6.279241190720279 | 0.0 | 1878 | 0 | 200.85592700002053 | 5135.250711999987 |

v1.65.8-nightly

12 Apr 17:13
069aee9

What's Changed

Full Changelog: v1.65.7-nightly...v1.65.8-nightly

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.65.8-nightly
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|------|--------|---------------------------|----------------------------|------------|------------|---------------|---------------|------------------------|------------------------|
| /chat/completions | Passed ✅ | 220.0 | 248.0753682003237 | 6.194614175051195 | 0.0 | 1852 | 0 | 194.34754100001328 | 4413.887686999999 |
| Aggregated | Passed ✅ | 220.0 | 248.0753682003237 | 6.194614175051195 | 0.0 | 1852 | 0 | 194.34754100001328 | 4413.887686999999 |

v1.65.7-nightly

11 Apr 05:34

What's Changed

  • [Feat SSO] - Allow admins to set default_team_params to have default params for when litellm SSO creates default teams by @ishaan-jaff in #9895
  • [Feat] Emit Key, Team Budget metrics on a cron job schedule by @ishaan-jaff in #9528 (see the sketch after this list)
  • [Bug Fix MSFT SSO] Use correct field for user email when using MSFT SSO by @ishaan-jaff in #9886
  • [Docs] Tutorial using MSFT auto team assignment with LiteLLM by @ishaan-jaff in #9898
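
The budget metrics from #9528 surface through the proxy's Prometheus endpoint, so once the scheduled job has run you can inspect them by scraping `/metrics`. This is a minimal sketch, assuming the Prometheus integration is enabled on a proxy at `localhost:4000`; the `budget` filter is illustrative rather than an exact metric name.

```shell
# Minimal sketch: inspect the key/team budget metrics emitted by the proxy.
# Assumes the Prometheus integration is enabled; the "budget" filter is
# illustrative, so check your /metrics output for the exact metric names.
curl -s http://localhost:4000/metrics | grep -i budget
```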

Full Changelog: v1.65.6-nightly...v1.65.7-nightly

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.65.7-nightly
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|------|--------|---------------------------|----------------------------|------------|------------|---------------|---------------|------------------------|------------------------|
| /chat/completions | Passed ✅ | 240.0 | 261.69840098746454 | 6.131387078558505 | 0.0 | 1835 | 0 | 214.285206999989 | 3626.6518760000395 |
| Aggregated | Passed ✅ | 240.0 | 261.69840098746454 | 6.131387078558505 | 0.0 | 1835 | 0 | 214.285206999989 | 3626.6518760000395 |

v1.65.6-nightly

10 Apr 23:04

What's Changed

Full Changelog: v1.65.5-nightly...v1.65.6-nightly

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.65.6-nightly
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|------|--------|---------------------------|----------------------------|------------|------------|---------------|---------------|------------------------|------------------------|
| /chat/completions | Passed ✅ | 190.0 | 209.99145276997868 | 6.188819872716192 | 0.0 | 1852 | 0 | 167.33176299999286 | 4428.401366999992 |
| Aggregated | Passed ✅ | 190.0 | 209.99145276997868 | 6.188819872716192 | 0.0 | 1852 | 0 | 167.33176299999286 | 4428.401366999992 |