Add support for Stable-LM and OpenAssistant #43
Closed
The two models are widely used. Since we already support LLaMA, it should not be difficult to support them as well.
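For reference, here is a minimal sketch of what running one of these models through vLLM's offline `LLM` API might look like once support lands. The Hugging Face model id is illustrative only; whether it loads depends on the checkpoint format and the model support being merged.

```python
from vllm import LLM, SamplingParams

# Hypothetical example: the StableLM checkpoint name below is an assumption,
# not a model that vLLM is confirmed to support at the time of this issue.
llm = LLM(model="stabilityai/stablelm-tuned-alpha-7b")

params = SamplingParams(temperature=0.8, top_p=0.95, max_tokens=128)
outputs = llm.generate(["Write a short poem about open-source LLMs."], params)

for out in outputs:
    print(out.outputs[0].text)
```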