This project provides a Flask-based API wrapper for Ollama, allowing you to interact with Ollama models using an OpenAI-like API interface. It includes features such as API key authentication, model access control, and JSON response formatting.
- OpenAI-like API for Ollama models
- API key authentication
- Model access control
- Streaming and non-streaming responses
- JSON mode for structured outputs
- Token counting and usage tracking
- Python 3.7+
- MySQL database
- Ollama running locally or on an accessible server
- Clone the repository:

  ```bash
  git clone https://github.com/KillCroop2/ollama-api-wrapper.git
  cd ollama-api-wrapper
  ```
- Install the required packages:

  ```bash
  pip install -r requirements.txt
  ```
- Set up the environment variables:
  - Copy the `.env.template` file to `.env`
  - Fill in the necessary database credentials in the `.env` file
- Set up the database:
  - Run the SQL commands in `database.sql` to create the necessary tables
- Update the `.env` file with your database credentials (a connection-check sketch follows this list):

  ```
  DB_HOST=<Your MySQL host>
  DB_PORT=<Your MySQL port>
  DB_USER=<Your MySQL username>
  DB_PASSWORD=<Your MySQL password>
  DB_NAME=<Your database name>
  ```
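Before starting the server, it can help to confirm that those credentials actually open a connection. The following is a minimal sketch of such a check; it assumes `python-dotenv` and `mysql-connector-python` are installed and is not part of the repository (the project may use a different driver internally):

```python
# check_db.py -- hypothetical helper, not part of the repository.
# Sanity-checks that the DB_* values in .env can open a MySQL connection.
import os

import mysql.connector          # assumes mysql-connector-python is installed
from dotenv import load_dotenv  # assumes python-dotenv is installed

load_dotenv()  # read DB_* values from the .env file in the current directory

conn = mysql.connector.connect(
    host=os.getenv("DB_HOST"),
    port=int(os.getenv("DB_PORT", "3306")),
    user=os.getenv("DB_USER"),
    password=os.getenv("DB_PASSWORD"),
    database=os.getenv("DB_NAME"),
)
print("Connected to MySQL:", conn.is_connected())
conn.close()
```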
- Start the Flask server:

  ```bash
  python app.py
  ```

- The API will be available at `http://localhost:5000`
- Use the following endpoints (a quick-start sketch follows this list):
  - `/v1/chat/completions`: For chat completions
  - `/v1/models`: To list available models
  - `/v1/api_keys`: To create a new API key
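Because the wrapper exposes an OpenAI-like interface, the official `openai` Python client can usually be pointed at it by overriding the base URL. The sketch below is a hedged quick start: it assumes the wrapper is compatible with the `openai>=1.0` client and that a key created via `/v1/api_keys` is accepted as the `api_key`; adjust if the wrapper expects a different authentication scheme.

```python
# Hypothetical quick start -- assumes OpenAI-client compatibility.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:5000/v1",  # the local wrapper instead of api.openai.com
    api_key="YOUR_API_KEY",               # a key created via POST /v1/api_keys
)

response = client.chat.completions.create(
    model="llama2",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello, how are you?"},
    ],
    temperature=0.7,
)
print(response.choices[0].message.content)
```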
`POST /v1/chat/completions`

Request body:

```json
{
  "messages": [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello, how are you?"}
  ],
  "model": "llama2",
  "stream": false,
  "temperature": 0.7
}
```
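For reference, the same request can be made with plain `requests`. This is a minimal sketch that assumes the API key is sent as an `Authorization: Bearer` header (the OpenAI convention) and that the response follows the OpenAI shape with `choices` and `usage`; both are assumptions about this wrapper.

```python
# Hypothetical example; header name and response shape are assumptions.
import requests

payload = {
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello, how are you?"},
    ],
    "model": "llama2",
    "stream": False,
    "temperature": 0.7,
}

resp = requests.post(
    "http://localhost:5000/v1/chat/completions",
    headers={"Authorization": "Bearer YOUR_API_KEY"},
    json=payload,
    timeout=120,
)
resp.raise_for_status()
print(resp.json())  # expected to include choices and token usage, per the features above
```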
`GET /v1/models`

`POST /v1/api_keys`
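A hedged sketch of calling these two endpoints with `requests`. The request body expected by `/v1/api_keys` is not documented here, so the example sends an empty JSON object and simply prints whatever comes back; whether these endpoints require an existing key is also an assumption.

```python
# Hypothetical example; authentication requirements and payloads are assumptions.
import requests

BASE = "http://localhost:5000/v1"
headers = {"Authorization": "Bearer YOUR_API_KEY"}

# List the models available to this key.
models = requests.get(f"{BASE}/models", headers=headers, timeout=30)
print(models.json())

# Create a new API key; the expected payload is undocumented here,
# so an empty JSON body is sent and the raw response is printed.
new_key = requests.post(f"{BASE}/api_keys", headers=headers, json={}, timeout=30)
print(new_key.json())
```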
- `app.py`: Main Flask application
- `db.py`: Database operations
- `llm_wrapper.py`: Ollama API wrapper (see the sketch below)
- `database.sql`: SQL schema
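To illustrate the architecture only, here is a minimal sketch of the kind of forwarding logic a wrapper like `llm_wrapper.py` typically contains: translating an OpenAI-style message list into a call to Ollama's `/api/chat` endpoint. This is not the repository's actual code; the function name, defaults, and Ollama address are assumptions.

```python
# Illustrative sketch only -- not the repository's implementation.
import requests

OLLAMA_URL = "http://localhost:11434"  # assumed default Ollama address


def chat(model, messages, temperature=0.7):
    """Send a non-streaming chat request to Ollama and return the reply text."""
    resp = requests.post(
        f"{OLLAMA_URL}/api/chat",
        json={
            "model": model,
            "messages": messages,  # OpenAI-style [{"role": ..., "content": ...}]
            "stream": False,
            "options": {"temperature": temperature},
        },
        timeout=300,
    )
    resp.raise_for_status()
    return resp.json()["message"]["content"]
```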
This project is licensed under the MIT License.