This repository contains a basic example of how to build an AI agent using the Model Context Protocol (MCP) with an open LLM (Meta Llama 3), OpenAI, or Google Gemini, and a SQLite database. It is designed as a simple, educational demonstration, not a production-ready framework.
OpenAI example: https://github.com/jalr4ever/Tiny-OAI-MCP-Agent
This code sets up a simple CLI agent that can interact with a SQLite database through an MCP server. It uses the official SQLite MCP server and demonstrates:
- Connecting to an MCP server
- Loading and using tools and resources from the MCP server
- Converting tools into LLM-compatible function calls
- Interacting with an LLM using the `openai` SDK or the `google-genai` SDK (see the sketch below)
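
In code, these steps look roughly like the sketch below. This is a minimal illustration assuming the official `mcp` Python SDK and the SQLite MCP server launched as a subprocess via `uvx mcp-server-sqlite`; the scripts in this repository may start the server differently (for example through Docker), and the database path is a placeholder.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Assumption: the SQLite MCP server is started via uvx with a placeholder database path.
server_params = StdioServerParameters(
    command="uvx",
    args=["mcp-server-sqlite", "--db-path", "./database.db"],
)

async def main():
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            # Connect to the MCP server.
            await session.initialize()

            # Load the tools exposed by the MCP server ...
            tools = await session.list_tools()

            # ... and convert them into the OpenAI-style function-calling format.
            llm_tools = [
                {
                    "type": "function",
                    "function": {
                        "name": tool.name,
                        "description": tool.description,
                        "parameters": tool.inputSchema,
                    },
                }
                for tool in tools.tools
            ]
            print([t["function"]["name"] for t in llm_tools])

asyncio.run(main())
```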
To run the examples, you will need:

- Docker installed and running.
- A Hugging Face account and an access token (for the Llama 3 model).
- A Google API key (for the Gemini model).
- Clone the repository:

  ```bash
  git clone https://github.com/philschmid/mcp-openai-gemini-llama-example
  cd mcp-openai-gemini-llama-example
  ```
- Install the required packages:

  ```bash
  pip install -r requirements.txt
  ```
- Log in to Hugging Face:

  ```bash
  huggingface-cli login --token YOUR_TOKEN
  ```
Run the Llama 3 agent with the following command:

```bash
python sqlite_llama_mcp_agent.py
```
The agent will start in interactive mode. You can type in prompts to interact with the database. Type "quit", "exit" or "q" to stop the agent.
Example conversation:

```
Enter your prompt (or 'quit' to exit): what tables are available?
Response: The available tables are: albums, artists, customers, employees, genres, invoice_items, invoices, media_types, playlists, playlist_track, tracks

Enter your prompt (or 'quit' to exit): how many artists are there
Response: There are 275 artists in the database.
```
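
Under the hood, each turn of such a conversation is one tool-calling round trip. The following is a hedged sketch of that loop using the `openai` SDK, not the repository's exact code: the model name is a placeholder, `session` and `llm_tools` refer to the MCP setup sketched earlier, and the interactive quit/exit loop and error handling are omitted.

```python
import json

from openai import OpenAI

async def answer(prompt, session, llm_tools):
    # Assumption: OPENAI_API_KEY is set; the Llama agent in this repo uses a different backend.
    client = OpenAI()
    messages = [{"role": "user", "content": prompt}]

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=messages,
        tools=llm_tools,
    )
    message = response.choices[0].message

    # If the model requested a tool, execute it through the MCP session
    # and send the result back so the model can produce a final answer.
    if message.tool_calls:
        messages.append(message)
        for call in message.tool_calls:
            result = await session.call_tool(
                call.function.name,
                arguments=json.loads(call.function.arguments),
            )
            messages.append({
                "role": "tool",
                "tool_call_id": call.id,
                # Assumes the first content block returned by the server is text.
                "content": result.content[0].text,
            })
        response = client.chat.completions.create(
            model="gpt-4o-mini",
            messages=messages,
            tools=llm_tools,
        )
        message = response.choices[0].message

    return message.content
```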
Run the Gemini agent with the following command:

```bash
GOOGLE_API_KEY=YOUR_API_KEY python sqlite_gemini_mcp_agent.py
```
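
For the Gemini variant, the API key exported on the command line is read from the environment and passed to the `google-genai` SDK. A minimal sketch of that call shape, with an assumed model name and prompt (the repository's actual wiring may differ):

```python
import os

from google import genai

# The GOOGLE_API_KEY exported on the command line above is read from the environment.
client = genai.Client(api_key=os.environ["GOOGLE_API_KEY"])

# Placeholder model name and prompt, just to show the call shape.
response = client.models.generate_content(
    model="gemini-2.0-flash",
    contents="What tables are available?",
)
print(response.text)
```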
I'm working on a toolkit to make implementing AI agents using MCP easier. Stay tuned for updates!