Server: add support for "tool_calls" (MeetKai/functionary model) #5695

Draft: ngxson wants to merge 3 commits into master
1 change: 1 addition & 0 deletions .gitignore
@@ -92,3 +92,4 @@ examples/jeopardy/results.txt
 poetry.lock
 poetry.toml
 nppBackup
+functionary-test
2 changes: 1 addition & 1 deletion Makefile
@@ -719,7 +719,7 @@ save-load-state: examples/save-load-state/save-load-state.cpp ggml.o llama.o $(C
$(CXX) $(CXXFLAGS) -c $< -o $(call GET_OBJ_FILE, $<)
$(CXX) $(CXXFLAGS) $(filter-out %.h $<,$^) $(call GET_OBJ_FILE, $<) -o $@ $(LDFLAGS)

-server: examples/server/server.cpp examples/server/oai.hpp examples/server/utils.hpp examples/server/httplib.h examples/server/json.hpp examples/server/index.html.hpp examples/server/index.js.hpp examples/server/completion.js.hpp examples/llava/clip.cpp examples/llava/clip.h examples/llava/llava.h examples/llava/llava.cpp common/stb_image.h ggml.o llama.o $(COMMON_DEPS) grammar-parser.o $(OBJS)
+server: examples/server/server.cpp examples/server/oai.hpp examples/server/utils.hpp examples/server/httplib.h examples/server/json.hpp examples/server/functionary.hpp examples/server/index.html.hpp examples/server/index.js.hpp examples/server/completion.js.hpp examples/llava/clip.cpp examples/llava/clip.h examples/llava/llava.h examples/llava/llava.cpp common/stb_image.h ggml.o llama.o $(COMMON_DEPS) grammar-parser.o $(OBJS)
$(CXX) $(CXXFLAGS) -c $< -o $(call GET_OBJ_FILE, $<)
$(CXX) $(CXXFLAGS) -c examples/llava/clip.cpp -o $(call GET_OBJ_FILE, examples/llava/clip.cpp) -Wno-cast-qual
$(CXX) $(CXXFLAGS) -Iexamples/server $(filter-out %.h %.hpp $< examples/llava/clip.cpp,$^) $(call GET_OBJ_FILE, $<) $(call GET_OBJ_FILE, examples/llava/clip.cpp) -o $@ $(LDFLAGS) $(LWINSOCK2)
2 changes: 1 addition & 1 deletion examples/server/CMakeLists.txt
@@ -1,7 +1,7 @@
set(TARGET server)
option(LLAMA_SERVER_VERBOSE "Build verbose logging option for Server" ON)
include_directories(${CMAKE_CURRENT_SOURCE_DIR})
-add_executable(${TARGET} server.cpp oai.hpp utils.hpp json.hpp httplib.h)
+add_executable(${TARGET} server.cpp oai.hpp utils.hpp json.hpp functionary.hpp httplib.h)
install(TARGETS ${TARGET} RUNTIME)
target_compile_definitions(${TARGET} PRIVATE
SERVER_VERBOSE=$<BOOL:${LLAMA_SERVER_VERBOSE}>
110 changes: 110 additions & 0 deletions examples/server/functionary-test.cpp
@@ -0,0 +1,110 @@
#include <iostream>
#include <string>
#include <vector>
#include <sstream>

#include "json.hpp"
#include "functionary.hpp"

using json = nlohmann::json;

/**
 * A simple test program that allows testing functionary.hpp without using the server.

[Review thread on the line above]

phymbert (Collaborator) commented on Feb 27, 2024:
As it is a simple unit test, it would be better to go with the ctest approach, as in the root repo.

ngxson (Collaborator, Author) commented on Feb 27, 2024:
Yeah, you're right, it should be a ctest. The problem is that the ctest in the root CMakeLists is for the core library, not for the examples.
I believe I'll need to convert this file to a ctest anyway; maybe the ctest will run alongside behave. I'll see what the best approach is when I have time to continue working on this PR.

A collaborator replied:
In the CI server workflow, we can plug a ctest target only for the server. I will push you an example on a side branch.

(A response-parsing sketch and a ctest-style exit-code check follow the test file below.)

* TODO: how to add this test to CI?
*
* Compile command: clear && g++ functionary-test.cpp -o functionary-test && ./functionary-test
*/

// Example OpenAI-style request. Note that the declared tool and the
// tool_calls in the message history differ; the history exercises the
// multi-call path with two parallel get_car_price calls.
std::string test_oai_input_json = R"(
{
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_n_day_weather_forecast",
                "description": "Get an N-day weather forecast",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "location": {
                            "type": "string",
                            "description": "The city and state, e.g. San Francisco, CA"
                        },
                        "format": {
                            "type": "string",
                            "enum": ["celsius", "fahrenheit"],
                            "description": "The temperature unit to use. Infer this from the user's location."
                        },
                        "num_days": {
                            "type": "integer",
                            "description": "The number of days to forecast"
                        }
                    },
                    "required": ["location", "format", "num_days"]
                }
            }
        }
    ],
    "messages": [
        {
            "role": "system",
            "content": "You are a helpful assistant"
        },
        {
            "role": "user",
            "content": "What is the weather like in Boston?"
        },
        {
            "role": "assistant",
            "content": null,
            "tool_calls": [
                {
                    "type": "function",
                    "id": "get_car_price",
                    "function": {
                        "arguments": "{\"car_name\": \"Song\"}",
                        "name": "get_car_price"
                    }
                },
                {
                    "type": "function",
                    "id": "get_car_price",
                    "function": {
                        "arguments": "{\"car_name\": \"Tang\"}",
                        "name": "get_car_price"
                    }
                }
            ]
        },
        {
            "role": "tool",
            "tool_call_id": "get_car_price",
            "name": "get_car_price",
            "content": "{\"price\": {\"price\": \"$25000\"}}"
        },
        {
            "role": "tool",
            "tool_call_id": "get_car_price",
            "name": "get_car_price",
            "content": "{\"price\": {\"price\": \"$20000\"}}"
        }
    ]
}
)";


// A sample raw response in the functionary v2 format. It opens with a bare
// recipient name because the prompt is assumed to end with
// "<|from|>assistant\n<|recipient|>".
std::string test_response = R"(get_car_price
<|content|>{"car_name": "Song"}
<|from|>assistant
<|recipient|>get_car_price
<|content|>{"car_name": "Tang"}<|stop|>)";

int main() {
    // Parse the OpenAI-style request and render it as a functionary prompt
    auto test_oai_input = json::parse(test_oai_input_json);
    auto prompt = llama_functionary::convert_oai_to_prompt(test_oai_input, true);
    std::cout << "\n" << prompt << "\n";

    // Convert the raw model response back into OpenAI-style "choices"
    std::cout << "\n" << llama_functionary::convert_response_to_oai_choices(test_response) << "\n";

    return 0;
}
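
The functionary.hpp header itself is not part of this diff, but the two test strings above pin down the wire format: each call is addressed with "<|from|>assistant", then "<|recipient|>" plus the function name, then "<|content|>" plus the JSON arguments, and "<|stop|>" ends the turn. Below is a minimal, self-contained sketch of splitting such a response into (name, arguments) pairs. It is an illustration only, not the implementation inside functionary.hpp; the helper name parse_calls is invented here, and the assumption that the raw response opens with a bare recipient name (because the prompt already ends with "<|recipient|>") is taken from test_response above.

// parse-sketch.cpp -- NOT part of this PR; a minimal illustration of
// splitting a functionary v2 response into (name, arguments) pairs.
// Compile: g++ -std=c++11 parse-sketch.cpp -o parse-sketch && ./parse-sketch
#include <iostream>
#include <string>
#include <utility>
#include <vector>

static std::vector<std::pair<std::string, std::string>> parse_calls(std::string text) {
    std::vector<std::pair<std::string, std::string>> calls;
    const std::string RECIPIENT = "<|recipient|>";
    const std::string CONTENT   = "<|content|>";
    // Drop the stop marker, then restore the implicit leading recipient marker
    // (the prompt is assumed to end with "<|from|>assistant\n<|recipient|>").
    size_t stop = text.find("<|stop|>");
    if (stop != std::string::npos) text.erase(stop);
    text = RECIPIENT + text;
    size_t pos = 0;
    while ((pos = text.find(RECIPIENT, pos)) != std::string::npos) {
        size_t name_begin    = pos + RECIPIENT.size();
        size_t content_begin = text.find(CONTENT, name_begin);
        if (content_begin == std::string::npos) break;
        std::string name = text.substr(name_begin, content_begin - name_begin);
        size_t args_begin = content_begin + CONTENT.size();
        size_t args_end   = text.find("<|from|>", args_begin);
        if (args_end == std::string::npos) args_end = text.size();
        std::string args = text.substr(args_begin, args_end - args_begin);
        // Trim the newlines left over from the marker layout.
        while (!name.empty() && (name.back() == '\n' || name.back() == '\r')) name.pop_back();
        while (!args.empty() && (args.back() == '\n' || args.back() == '\r')) args.pop_back();
        calls.push_back(std::make_pair(name, args));
        pos = args_begin;
    }
    return calls;
}

int main() {
    const std::string response =
        "get_car_price\n<|content|>{\"car_name\": \"Song\"}\n"
        "<|from|>assistant\n<|recipient|>get_car_price\n"
        "<|content|>{\"car_name\": \"Tang\"}<|stop|>";
    // Prints two calls: get_car_price with Song, then with Tang.
    for (const auto & call : parse_calls(response)) {
        std::cout << call.first << " -> " << call.second << "\n";
    }
    return 0;
}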
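
On the ctest suggestion from the review thread: ctest treats a nonzero exit status as a test failure, so the smallest step toward a ctest is to assert on the output instead of only printing it, and then register the binary with add_test() in examples/server/CMakeLists.txt. The sketch below packs parsed (name, arguments) pairs into an OpenAI-style assistant message carrying tool_calls and exits nonzero when the result is not what it expects. The JSON shape mirrors the tool_calls objects visible in the request above; the actual output of convert_response_to_oai_choices() may differ, and make_choices is a name invented for this sketch.

// choices-sketch.cpp -- NOT part of this PR; an assumed OpenAI-style shape
// for tool calls, plus a ctest-friendly exit code.
// Compile (from examples/server, so json.hpp is found):
//   g++ -std=c++11 choices-sketch.cpp -o choices-sketch && ./choices-sketch
#include <iostream>
#include <string>
#include <utility>
#include <vector>

#include "json.hpp"

using json = nlohmann::json;

// Pack (name, arguments) pairs into an assistant message carrying tool_calls.
// The field layout mirrors the request JSON above (type/function/name/arguments).
static json make_choices(const std::vector<std::pair<std::string, std::string>> & calls) {
    json tool_calls = json::array();
    for (const auto & call : calls) {
        tool_calls.push_back({
            {"type", "function"},
            {"function", {
                {"name", call.first},
                {"arguments", call.second}, // kept as a JSON-encoded string
            }},
        });
    }
    return json::array({{
        {"index", 0},
        {"finish_reason", "tool_calls"},
        {"message", {
            {"role", "assistant"},
            {"content", nullptr},
            {"tool_calls", tool_calls},
        }},
    }});
}

int main() {
    json choices = make_choices({
        {"get_car_price", "{\"car_name\": \"Song\"}"},
        {"get_car_price", "{\"car_name\": \"Tang\"}"},
    });
    std::cout << choices.dump(2) << "\n";
    // Minimal self-check so the binary works as a ctest: exit 1 (failure)
    // if the two expected calls did not survive the round trip.
    bool ok = choices[0]["message"]["tool_calls"].size() == 2;
    return ok ? 0 : 1;
}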