feat(cohere): add cohere chat bot guide (#870)

Co-authored-by: Khalil Najjar <[email protected]>
Co-authored-by: Max Leiter <[email protected]>

4 people authored Feb 20, 2024
1 parent 11048cd commit 4afa012
Showing 7 changed files with 298 additions and 22 deletions.
137 changes: 137 additions & 0 deletions docs/pages/docs/guides/providers/cohere.mdx
@@ -8,6 +8,143 @@ import { Steps, Callout } from 'nextra-theme-docs';

The Vercel AI SDK provides a set of utilities that make it easy to use Cohere's API. In this guide, we'll walk through how to use these utilities to build a chat bot and a text completion app.

## Guide: Chat Bot

<Steps>

### Create a Next.js app

Create a Next.js application and install `ai` and `cohere-ai` (Cohere's TypeScript SDK, which the Route Handler below uses):

```sh
pnpm dlx create-next-app my-ai-app
cd my-ai-app
pnpm install ai cohere-ai
```

### Add your Cohere API Key to `.env`

Create a `.env` file in your project root and add your Cohere API Key:

```env filename=".env"
COHERE_API_KEY=xxxxxxx
```

### Create a Route Handler

Create a Next.js Route Handler that uses the Edge Runtime to generate a response to a series of messages via Cohere's TypeScript SDK, and return the response as a streaming text response.

For this example, we'll create a route handler at `app/api/chat/route.ts` that accepts a `POST` request with a `messages` array of chat messages, each with a `role` and `content`:

```tsx filename="app/api/chat/route.ts" showLineNumbers
import { CohereStream, StreamingTextResponse } from 'ai';
import { CohereClient, Cohere } from 'cohere-ai';

export const runtime = 'edge';

// IMPORTANT! Set the dynamic to force-dynamic
// Prevent nextjs to cache this route
export const dynamic = 'force-dynamic';

if (!process.env.COHERE_API_KEY) {
throw new Error('Missing COHERE_API_KEY environment variable');
}

const cohere = new CohereClient({
token: process.env.COHERE_API_KEY,
});

const toCohereRole = (role: string): Cohere.ChatMessageRole => {
if (role === 'user') {
return Cohere.ChatMessageRole.User;
}
return Cohere.ChatMessageRole.Chatbot;
};

export async function POST(req: Request) {
// Extract the `prompt` from the body of the request
const { messages } = await req.json();
const chatHistory = messages.map((message: any) => ({
message: message.content,
role: toCohereRole(message.role),
}));
const lastMessage = chatHistory.pop();

const response = await cohere.chatStream({
message: lastMessage.message,
chatHistory,
});

const stream = new ReadableStream({
async start(controller) {
for await (const event of response) {
if (event.eventType === 'text-generation') {
controller.enqueue(event.text);
}
}
controller.close();
},
});

return new Response(stream);
}
```
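
Before wiring up the UI, you can exercise the Route Handler directly. Here's a minimal sketch (the file name and port are assumptions: it expects `pnpm dev` running on `localhost:3000` and a TypeScript runner such as `tsx` that supports top-level await):

```ts filename="scripts/test-chat.ts"
// Post a single user message to the route handler and print the streamed text.
const res = await fetch('http://localhost:3000/api/chat', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    messages: [{ role: 'user', content: 'Write a haiku about streaming.' }],
  }),
});

const reader = res.body!.getReader();
const decoder = new TextDecoder();
while (true) {
  const { done, value } = await reader.read();
  if (done) break;
  // Chunks arrive as they are generated; print them incrementally.
  process.stdout.write(decoder.decode(value));
}
```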

<Callout>
  The Vercel AI SDK also provides two utility helpers that can replace the
  manual `ReadableStream` and `Response` above: the streaming `response` we
  receive from Cohere's TypeScript SDK can be passed to
  [`CohereStream`](/docs/api-reference/cohere-stream), a utility that
  decodes/extracts the text tokens in the response and re-encodes them properly
  for simple consumption. That stream can then be passed directly to
  [`StreamingTextResponse`](/docs/api-reference/streaming-text-response),
  another utility class that extends the normal Node/Edge Runtime `Response`
  class with the default headers you probably want (hint: `'Content-Type':
  'text/plain; charset=utf-8'` is already set for you).
</Callout>
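
A minimal sketch of that refactor, assuming your installed version of `ai` accepts the async iterable returned by `chatStream` in `CohereStream` (older releases only accept a fetch `Response`; if yours does, keep the manual stream above):

```tsx
import { CohereStream, StreamingTextResponse } from 'ai';

export async function POST(req: Request) {
  // ...build `chatHistory` and `lastMessage` exactly as in the handler above...
  const response = await cohere.chatStream({
    message: lastMessage.message,
    chatHistory,
  });

  // CohereStream extracts the text tokens from the Cohere stream;
  // StreamingTextResponse wraps them with streaming-friendly default headers.
  return new StreamingTextResponse(CohereStream(response));
}
```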

### Wire up the UI

Create a Client component with a form that gathers a prompt from the user and streams the chat completion back.
By default, the [`useChat`](/docs/api-reference#usechat) hook will use the `POST` Route Handler we created above (it defaults to `/api/chat`). You can override this by passing an `api` option to `useChat({ api: '...' })`, as shown in the short sketch after the component below.

```tsx filename="app/page.tsx" showLineNumbers
'use client';

import { useChat } from 'ai/react';

export default function Chat() {
const { messages, input, handleInputChange, handleSubmit, data } = useChat();

return (
<div className="p-4">
<header className="text-center">
<h1 className="text-xl">Chat Example</h1>
</header>
<div className="flex flex-col justify-between w-full max-w-md mx-auto stretch">
<div className="flex-grow overflow-y-auto">
{messages.map(m => (
<div key={m.id} className="whitespace-pre-wrap">
{m.role === 'user' ? 'User: ' : 'AI: '}
{m.content}
</div>
))}
</div>
<form onSubmit={handleSubmit}>
<input
className="fixed bottom-0 w-full max-w-md p-2 mb-8 border border-gray-300 rounded shadow-xl"
value={input}
placeholder="Say something..."
onChange={handleInputChange}
/>
</form>
</div>
</div>
);
}
```
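
If your Route Handler lives at a path other than the default, pass it explicitly via the `api` option (the path below is hypothetical):

```tsx
// Hypothetical: a handler at app/api/cohere-chat/route.ts instead of app/api/chat/route.ts.
const { messages, input, handleInputChange, handleSubmit } = useChat({
  api: '/api/cohere-chat',
});
```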

</Steps>

## Guide: Text Completion

<Steps>
Expand Down
47 changes: 47 additions & 0 deletions examples/next-cohere/app/api/chat/route.ts
@@ -0,0 +1,47 @@
import { CohereClient, Cohere } from 'cohere-ai';

export const runtime = 'edge';

// IMPORTANT: set dynamic to 'force-dynamic' to prevent Next.js from caching this route
export const dynamic = 'force-dynamic';

const cohere = new CohereClient({
  token: process.env.COHERE_API_KEY || '',
});

const toCohereRole = (role: string): Cohere.ChatMessageRole => {
  if (role === 'user') {
    return Cohere.ChatMessageRole.User;
  }
  return Cohere.ChatMessageRole.Chatbot;
};

export async function POST(req: Request) {
  // Extract the `messages` from the body of the request
  const { messages } = await req.json();
  const chatHistory = messages.map((message: any) => ({
    message: message.content,
    role: toCohereRole(message.role),
  }));
  const lastMessage = chatHistory.pop();

  const response = await cohere.chatStream({
    message: lastMessage.message,
    chatHistory,
  });

  const stream = new ReadableStream({
    async start(controller) {
      for await (const event of response) {
        // Stream Events: https://docs.cohere.com/docs/streaming#stream-events
        if (event.eventType === 'text-generation') {
          controller.enqueue(event.text);
        }
      }
      controller.close();
    },
  });

  return new Response(stream);
}
58 changes: 58 additions & 0 deletions examples/next-cohere/app/completion/page.tsx
@@ -0,0 +1,58 @@
'use client';

import { useCompletion } from 'ai/react';
import { FormEventHandler, useState } from 'react';

export default function Chat() {
  const {
    completion,
    input,
    setInput,
    handleInputChange,
    handleSubmit,
    error,
  } = useCompletion();

  const [prompt, setPrompt] = useState('');

  // Submit the completion request, remember the prompt for display, then clear the input.
  const handleSend: FormEventHandler<HTMLFormElement> = async e => {
    handleSubmit(e);
    setPrompt(input);
    setInput('');
  };

  return (
    <div className="p-4">
      <header className="text-center">
        <h1 className="text-xl">Completion Example</h1>
      </header>
      <div className="flex flex-col w-full max-w-md py-24 mx-auto stretch">
        {error && (
          <div className="fixed top-0 left-0 w-full p-4 text-center bg-red-500 text-white">
            {error.message}
          </div>
        )}
        {completion && (
          <ol className="space-y-2">
            <li>
              <span className="font-medium">Prompt: </span>
              {prompt}
            </li>
            <li>
              <span className="font-medium">Cohere: </span>
              {completion}
            </li>
          </ol>
        )}
        <form onSubmit={handleSend}>
          <input
            className="fixed bottom-0 w-full max-w-md p-2 mb-8 border border-gray-300 rounded shadow-xl"
            value={input}
            placeholder="Ask something..."
            onChange={handleInputChange}
          />
        </form>
      </div>
    </div>
  );
}
18 changes: 16 additions & 2 deletions examples/next-cohere/app/layout.tsx
@@ -1,3 +1,4 @@
+import Link from 'next/link';
 import './globals.css';
 import { Inter } from 'next/font/google';

@@ -14,8 +15,21 @@ export default function RootLayout({
   children: React.ReactNode;
 }) {
   return (
-    <html lang="en">
-      <body className={inter.className}>{children}</body>
+    <html lang="en" className="w-full h-screen">
+      <body className={inter.className}>
+        <nav className="p-4 flex gap-x-4 w-full">
+          <Link href="/" className="text-blue-500 underline hover:no-underline">
+            Chat
+          </Link>
+          <Link
+            href="/completion"
+            className="text-blue-500 underline hover:no-underline"
+          >
+            Completion
+          </Link>
+        </nav>
+        <main>{children}</main>
+      </body>
     </html>
   );
 }
44 changes: 24 additions & 20 deletions examples/next-cohere/app/page.tsx
@@ -1,30 +1,34 @@
 'use client';

-import { useCompletion } from 'ai/react';
+import { useChat } from 'ai/react';

 export default function Chat() {
-  const { completion, input, handleInputChange, handleSubmit, error } =
-    useCompletion();
+  const { messages, input, handleInputChange, handleSubmit, data } = useChat();
+  console.log({ messages });

   return (
-    <div className="flex flex-col w-full max-w-md py-24 mx-auto stretch">
-      <h4 className="text-xl font-bold text-gray-900 md:text-xl pb-4">
-        useCompletion Example
-      </h4>
-      {error && (
-        <div className="fixed top-0 left-0 w-full p-4 text-center bg-red-500 text-white">
-          {error.message}
+    <div className="p-4">
+      <header className="text-center">
+        <h1 className="text-xl">Chat Example</h1>
+      </header>
+      <div className="flex flex-col justify-between w-full max-w-md mx-auto stretch">
+        <div className="flex-grow overflow-y-auto">
+          {messages.map(m => (
+            <div key={m.id} className="whitespace-pre-wrap">
+              {m.role === 'user' ? 'User: ' : 'AI: '}
+              {m.content}
+            </div>
+          ))}
         </div>
-      )}
-      {completion}
-      <form onSubmit={handleSubmit}>
-        <input
-          className="fixed bottom-0 w-full max-w-md p-2 mb-8 border border-gray-300 rounded shadow-xl"
-          value={input}
-          placeholder="Say something..."
-          onChange={handleInputChange}
-        />
-      </form>
+        <form onSubmit={handleSubmit}>
+          <input
+            className="fixed bottom-0 w-full max-w-md p-2 mb-8 border border-gray-300 rounded shadow-xl"
+            value={input}
+            placeholder="Say something..."
+            onChange={handleInputChange}
+          />
+        </form>
+      </div>
     </div>
   );
 }
1 change: 1 addition & 0 deletions examples/next-cohere/package.json
@@ -10,6 +10,7 @@
   },
   "dependencies": {
     "ai": "2.2.26",
+    "cohere-ai": "^7.6.1",
     "next": "14.0.3",
     "react": "18.2.0",
     "react-dom": "^18.2.0"
Expand Down
15 changes: 15 additions & 0 deletions pnpm-lock.yaml

Some generated files are not rendered by default.
