A production-ready template for building conversational AI applications with Next.js, React, and Vercel AI SDK. This template provides a solid foundation for creating chat interfaces with LLMs, tool integration, and persistent conversations.
We were building multiple AI applications and found ourselves implementing the same features over and over: chat interfaces, tool integrations, conversation persistence, and more. This template is the result of that experience, giving you a head start on building production-ready AI applications.
While there are other chat UI libraries out there, Zentrale is specifically designed for modern AI applications:
- **vs LobeChat**: While LobeChat is feature-rich, it's often a case of "you want the banana but you get the gorilla". Zentrale is minimalist and focused, giving you exactly what you need for AI applications without unnecessary complexity.
- **vs chat-ui-kit-react**: Many existing chat UI kits are outdated, not designed for LLM applications, and lack modern features. Zentrale is built specifically for AI interactions with streaming, tool integration, and modern UI/UX in mind.
🎨 Modern UI/UX
- Clean, responsive chat interface
- Light/dark mode support
- Loading states and animations
- Mobile-friendly design
🤖 LLM Integration
- Built on Vercel AI SDK
- Streaming responses
- Support for multiple LLM providers
- Configurable system prompts
🛠️ Tool System
- Type-safe tool definitions
- Easy tool integration
- Built-in search tool
- Extensible tool architecture
💾 Data Persistence
- Conversation history
- Message storage
- User feedback tracking
- Database integration with geldata
Clone the repository:

```shell
git clone https://github.com/apto-space/zentrale.git
cd zentrale
```
Install dependencies:

```shell
bun install
```
Set up your environment variables:

```shell
cp .env.example .env.local
```

Edit `.env.local` with your API keys and configuration.
Start the development server:

```shell
bun dev
```
Define your app configuration:

```typescript
const myAppConfig: ChatAppConfig = {
  id: "my-chat-app",
  name: "My AI Assistant",
  prompts: {
    system: "You are a helpful assistant...",
  },
  options: {
    tools: [myTool],
    greeting: "Welcome to My AI Assistant!",
    examples: ["Example question 1", "Example question 2"],
  },
};
```
Create your tools:

```typescript
export const myTool: ToolExport = {
  myTool: {
    aiTool: {
      name: "myTool",
      description: "Description of my tool",
      parameters: z.object({
        // Your tool parameters
      }),
    },
    view: MyToolView,
  },
};
```
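To give a sense of the kind of logic a tool might wrap, here is a self-contained sketch of a ranking helper that a hypothetical search tool could call before handing results to its view. The `SearchResult` shape and `rankResults` name are illustrative assumptions, not part of the template; the built-in search tool's real implementation may differ.

```typescript
// Hypothetical result shape; a real tool would define this next to its
// zod parameters schema so the view and the tool share one type.
interface SearchResult {
  title: string;
  score: number;
}

// Rank results descending by score and keep the top `limit` entries.
// Keeping this pure (no rendering, no I/O) makes it easy to unit-test
// separately from the tool's view component.
export function rankResults(
  results: SearchResult[],
  limit = 3,
): SearchResult[] {
  return [...results].sort((a, b) => b.score - a.score).slice(0, limit);
}
```

Separating pure helpers like this from the `view` component keeps the tool's display logic thin and testable.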
Use the template:

```tsx
<ChatPageWrapper config={myAppConfig} />
```
The template uses Tailwind CSS for styling. Customize the theme in `tailwind.config.js`:

```javascript
module.exports = {
  theme: {
    extend: {
      colors: {
        // Your custom colors
      },
    },
  },
};
```
The template uses geldata for data persistence. Configure your database connection in `.env.local`:

```shell
DATABASE_URL=your_database_url
```
Configure your LLM provider in `app/core/api/chat/aiConfig.ts`:

```typescript
import { streamText, type CoreMessage } from "ai";
import { anthropic } from "@ai-sdk/anthropic";

export function createStream(messages: CoreMessage[], config: ChatAppConfig) {
  return streamText({
    messages,
    model: anthropic("claude-3-5-haiku-latest"),
    system: config.prompts.system,
    tools: config.options?.tools,
  });
}
```
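To show where `createStream` fits, here is a hedged sketch of a Next.js route handler that calls it. The route path, the config import location, and the `toDataStreamResponse()` helper are assumptions: the helper is the AI SDK v4 name, and the paths should be adjusted to match your app's layout.

```typescript
// app/api/chat/route.ts (path assumed; match it to your app's layout)
import { createStream } from "@/app/core/api/chat/aiConfig";
import { myAppConfig } from "@/app/config"; // hypothetical config module

export async function POST(req: Request) {
  // The client-side chat hook posts { messages } as JSON.
  const { messages } = await req.json();
  const result = createStream(messages, myAppConfig);
  // toDataStreamResponse() is the AI SDK v4 streaming helper;
  // check the helper name against your installed SDK version.
  return result.toDataStreamResponse();
}
```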
Contributions are welcome! Please feel free to submit a Pull Request.
MIT License - feel free to use this template for your own projects.