LlamaCPP server does not follow the SO specification #11797
Replies: 1 comment
- No issues. I solved it.
Python Code:
Llama Server in Docker:
Output: {"1": "6+5", "answer": 11}
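A minimal sketch of the kind of schema-constrained request involved, using llama-server's OpenAI-compatible `/v1/chat/completions` endpoint. The schema, prompt, port, and model name here are illustrative assumptions, not the original code, and the exact `response_format` shape accepted can vary by llama-server version:

```python
import json
import urllib.request

# Assumed schema: only an integer "answer" field should come back,
# so an output like {"1": "6+5", "answer": 11} would violate it.
schema = {
    "type": "object",
    "properties": {"answer": {"type": "integer"}},
    "required": ["answer"],
    "additionalProperties": False,
}

# OpenAI-style json_schema response_format; payload is only built here.
payload = {
    "model": "default",  # assumption; llama-server serves one loaded model
    "messages": [{"role": "user", "content": "What is 6+5? Reply in JSON."}],
    "response_format": {
        "type": "json_schema",
        "json_schema": {"name": "answer", "schema": schema},
    },
}

body = json.dumps(payload).encode("utf-8")

# Sending is commented out so the sketch runs without a live server:
# req = urllib.request.Request(
#     "http://localhost:8080/v1/chat/completions",  # assumed host/port
#     data=body, headers={"Content-Type": "application/json"})
# print(urllib.request.urlopen(req).read().decode())
print(len(body) > 0)
```

With `additionalProperties: false` in force, a conforming server should never emit the stray `"1": "6+5"` key reported above.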
Moreover, the outputs are never consistent across runs. However, llama-cpp-python produces consistent outputs when generation is constrained with a grammar.
llama-cpp-python working example: https://github.com/KanishkNavale/llamaso
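A grammar-constrained call of that kind can be sketched as follows. The GBNF grammar and model path below are illustrative assumptions, not the linked example's code; the actual model call is commented out because it needs a local GGUF file:

```python
# Hand-written GBNF grammar that only admits {"answer": <integer>}.
# llama-cpp-python can also derive a grammar from a JSON schema.
GRAMMAR = r'''
root   ::= "{" ws "\"answer\"" ws ":" ws number ws "}"
number ::= "-"? [0-9]+
ws     ::= [ \t\n]*
'''

# Usage with llama-cpp-python (commented out: requires a model file):
# from llama_cpp import Llama, LlamaGrammar
# llm = Llama(model_path="model.gguf")  # path is an assumption
# grammar = LlamaGrammar.from_string(GRAMMAR)
# out = llm("What is 6+5? Answer in JSON.", grammar=grammar, max_tokens=32)
# print(out["choices"][0]["text"])

print(GRAMMAR.strip().splitlines()[0])
```

Because the grammar forbids any key other than `answer`, the sampler simply cannot produce extra fields, which is why the llama-cpp-python route stays consistent.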
Could you please help me fix this?