Related to stop prompt #151
I agree that adding stop word/sequence support natively would be helpful for using models in a chatbot setting, especially ones that aren't specifically trained for that purpose. Perhaps we could add a list of strings to the …
Here is a naive implementation: https://github.com/danforbes/llama-rs/tree/dfo/feat/chat
@danforbes Sorry, I keep forgetting to get back to you on this! Yeah, that seems fine to me, want to add …
You need to provide more information than just the "reverse prompt" - among other things, you need to provide some kind of callback for receiving user input. There are also some changes to the way that returned tokens are handled (EOT handling as well as reverse prompt handling) that I wasn't sure how to implement cleanly in the existing …

Edit: Also, the existing …
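To make the token-handling part of this concrete, here is a rough sketch of how a stop-sequence check could be layered on top of a per-token callback. It is not llama-rs's actual API; `InferenceDecision` and `StopDetector` are hypothetical names used only for illustration:

```rust
// Hypothetical sketch only; these types are not part of llama-rs.

/// Signal returned from a per-token callback.
enum InferenceDecision {
    Continue,
    Halt,
}

/// Accumulates streamed token text and halts once any stop sequence appears.
struct StopDetector {
    stop_sequences: Vec<String>,
    tail: String,
}

impl StopDetector {
    fn new(stop_sequences: Vec<String>) -> Self {
        Self { stop_sequences, tail: String::new() }
    }

    /// Feed the decoded text of one token; returns `Halt` once a stop
    /// sequence has been fully generated.
    fn on_token(&mut self, token_text: &str) -> InferenceDecision {
        self.tail.push_str(token_text);
        // A real implementation would bound `tail` to roughly the length of
        // the longest stop sequence; it is left unbounded here for brevity.
        if self
            .stop_sequences
            .iter()
            .any(|s| self.tail.contains(s.as_str()))
        {
            InferenceDecision::Halt
        } else {
            InferenceDecision::Continue
        }
    }
}
```

The awkward part is exactly the returned-token handling mentioned above: tokens that might be the start of a stop sequence ideally shouldn't be passed on to the caller until the match is confirmed or ruled out.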
Hi, some models have very poor stopping logic (e.g. Vicuna in instruct mode).
llama-rs could provide a way (such as a stop prompt argument) to stop generation when a given text sequence is produced.
What I've investigated so far:
https://github.com/Atome-FE/llama-node/blob/main/packages/llama-cpp/src/llama.rs#L152
https://github.com/sobelio/llm-chain/blob/main/llm-chain-llama/src/executor.rs#L96
I think llama.cpp also provides something similar, called a reverse prompt or anti-prompt (but it is only used in interactive mode).
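As a sketch of the matching detail that makes this trickier than a plain substring check: the generator has to withhold any output that could still turn into a stop sequence. The helper below is hypothetical and not taken from llama-rs or either of the linked projects:

```rust
/// Hypothetical helper, not from llama-rs or the linked projects.
///
/// Returns the length (in bytes) of the longest suffix of `output` that is a
/// prefix of some stop sequence, or 0 if there is none. Text covered by that
/// suffix should be buffered rather than emitted, because the next tokens may
/// still complete the stop sequence.
fn pending_stop_len(output: &str, stop_sequences: &[String]) -> usize {
    stop_sequences
        .iter()
        .map(|stop| {
            (1..=stop.len().min(output.len()))
                .rev()
                .filter(|&n| stop.is_char_boundary(n))
                .find(|&n| output.ends_with(&stop[..n]))
                .unwrap_or(0)
        })
        .max()
        .unwrap_or(0)
}
```

With a list of stop strings, the generation loop would emit everything except the last `pending_stop_len` bytes of the output, and stop (discarding the held-back text) once a full stop sequence is matched.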