This is a demo Spring application showing how to use Spring AI, Redis as a vector store, and Ollama to build a chat application that uses information fed to it from PDF files, all running on a local machine. To run it you will need:
- Docker
- Docker Compose
- Java 17
- Ollama
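A quick way to verify these prerequisites is to check their versions from a terminal (the exact output will vary by machine):

```bash
# Check that Docker, Docker Compose, Java 17 and Ollama are available.
docker --version
docker compose version
java -version        # should report a 17.x runtime
ollama --version
```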
The application writes the content of the PDF files to Redis and uses this stored information when building the prompt.
The documents will be stored in the ./src/main/resources/docs folder.
Since this folder is empty by default, add at least one document before starting the application.
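For example, assuming you have a PDF somewhere on disk (the file name below is just a placeholder), copy it into the folder before the first start:

```bash
# "my-report.pdf" is a placeholder; use any PDF you want the bot to know about.
cp ~/Documents/my-report.pdf ./src/main/resources/docs/
```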
The Redis instance is available via Docker Compose. To start Redis, execute `docker compose up -d` from the command line.
This will start the Redis Stack. The configuration also includes the Redis dashboard, accessible at http://localhost:8001.
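To confirm the stack is up, you can list the Compose services and probe the dashboard (the service name depends on the docker-compose.yml in this repository, so treat it as an assumption):

```bash
# List the services started by Docker Compose and their status.
docker compose ps
# The dashboard should answer on port 8001 once the stack is healthy.
curl -I http://localhost:8001
```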
Import the project into your IDE and set a default configuration.
From the command line, execute `./gradlew bootRun`.
The application exposes a single endpoint, GET http://localhost/ollama/chat, which expects a `message` parameter containing the text that will be sent to the bot.
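Since the endpoint above is listed without a port, the exact URL depends on your server.port setting; assuming Spring Boot's default port 8080, a request could look like this:

```bash
# Hypothetical example question; -G with --data-urlencode sends "message" as a query parameter.
curl -G "http://localhost:8080/ollama/chat" \
  --data-urlencode "message=Summarize the documents in a few sentences."
```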
In ./src/main/resources/application.properties there are a few commented-out properties that can help tweak the bot's responses; refer to the documentation for more information. Please note that if you want to use different models, you will have to install them first using `ollama run modelname:version`. The available models are listed in the Ollama model library.
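For example, to see which models are installed locally and to pull a different one (the model tag below is only an illustration, not necessarily the one this project is configured for):

```bash
# Show locally installed models.
ollama list
# Download and start an example model; replace the tag with the one you configure.
ollama run mistral:latest
```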