✨ feat: add vLLM provider support #6154
Conversation
@hezhijie0327 is attempting to deploy a commit to the LobeHub Team on Vercel. A member of the Team first needs to authorize it.
Thank you for raising your pull request and contributing to our Community.
Codecov Report. Attention: Patch coverage is …

Additional details and impacted files:

```diff
@@           Coverage Diff            @@
##             main    #6154    +/-  ##
=========================================
- Coverage   91.75%   91.73%   -0.03%
=========================================
  Files         663      666       +3
  Lines       60736    60915     +179
  Branches     2842     2844       +2
=========================================
+ Hits        55731    55880     +149
- Misses       5005     5035      +30
```

Flags with carried forward coverage won't be shown. View full report in Codecov by Sentry.
vllm logo: lobehub/lobe-icons#75
❤️ Great PR @hezhijie0327 ❤️ The growth of the project is inseparable from user feedback and contributions; thank you for your contribution! If you are interested in the LobeHub developer community, please join our Discord and then DM @arvinxx or @canisminor1990. They will invite you to our private developer channel, where we discuss lobe-chat development and share AI newsletters from around the world.
## [Version 1.55.0](v1.54.0...v1.55.0)

<sup>Released on **2025-02-14**</sup>

#### ✨ Features

- **misc**: Add vLLM provider support.

<details>
<summary><kbd>Improvements and Fixes</kbd></summary>

#### What's improved

* **misc**: Add vLLM provider support, closes [#6154](#6154) ([1708e32](1708e32))

</details>

<div align="right">

[](#readme-top)

</div>
🎉 This PR is included in version 1.55.0 🎉 The release is available on: … Your semantic-release bot 📦🚀
* ✨ feat: add vLLM provider support
* 💄 style: update model list
#### 💻 Change Type

#### 🔀 Description of Change

#### 📝 Additional Information
close #6147
close #2267
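For context, vLLM serves an OpenAI-compatible REST API (started with `vllm serve <model>`), which is why a provider integration like this one largely amounts to pointing an OpenAI-style client at the vLLM server's base URL. A minimal sketch, assuming a local server on port 8000; the base URL and model name below are placeholder values, not taken from this PR:

```python
import json
from urllib import request

# vLLM's OpenAI-compatible server answers at /v1/chat/completions,
# mirroring the OpenAI Chat Completions API.
BASE_URL = "http://localhost:8000/v1"  # placeholder; point at your vLLM server


def build_chat_request(model: str, prompt: str) -> request.Request:
    """Build an OpenAI-style chat completion request for a vLLM server."""
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }
    return request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


# Model name is a placeholder; use whatever model your server is hosting.
req = build_chat_request("Qwen/Qwen2.5-7B-Instruct", "Hello")
print(req.full_url)  # http://localhost:8000/v1/chat/completions
```

Because the wire format matches OpenAI's, the same request body works against either backend; only the base URL (and API key handling) differs.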