streamObject with @ai-sdk/anthropic only provides a full object once finished via .toTextStreamResponse() #2105

Comments
Thanks for the fast reply! For extra context, throwing in
I just checked and it is supported in the AI SDK: https://github.com/vercel/ai/blob/main/packages/anthropic/src/anthropic-messages-language-model.ts#L323

However, Anthropic waits and then streams all deltas really quickly, giving the impression that it only provides a full object.
Example with delay that shows the tool call is streamed:

```ts
import { anthropic } from '@ai-sdk/anthropic';
import { streamObject } from 'ai';
import dotenv from 'dotenv';
import { z } from 'zod';

dotenv.config();

async function main() {
  const result = await streamObject({
    model: anthropic('claude-3-5-sonnet-20240620'),
    maxTokens: 2000,
    schema: z.object({
      characters: z.array(
        z.object({
          name: z.string(),
          class: z
            .string()
            .describe('Character class, e.g. warrior, mage, or thief.'),
          description: z.string(),
        }),
      ),
    }),
    prompt:
      'Generate 3 character descriptions for a fantasy role playing game.',
  });

  for await (const partialObject of result.partialObjectStream) {
    // delay 50 ms to simulate processing time:
    await new Promise(resolve => setTimeout(resolve, 50));
    console.clear();
    console.log(partialObject);
  }
}

main().catch(console.error);
```
I found this issue while trying to diagnose long delays with Claude streaming + tool use. I did some testing and confirmed this is an issue with the Claude API, not the `ai` SDK. Some test code here, if helpful to anyone else who runs into this: anthropics/anthropic-sdk-typescript#529
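To reproduce that measurement independently of the AI SDK, a timing check against the raw Anthropic SDK can be used. This is a minimal sketch: the tool definition, prompt, and `tool_choice` here are illustrative assumptions, not taken from the linked issue.

```ts
// Timing check against the raw Anthropic SDK (not the AI SDK).
// Assumes @anthropic-ai/sdk is installed and ANTHROPIC_API_KEY is set;
// the tool definition below is a hypothetical example.
import Anthropic from '@anthropic-ai/sdk';

async function main() {
  const client = new Anthropic();
  const start = Date.now();

  // stream: true returns the raw server-sent events as an async iterable
  const stream = await client.messages.create({
    model: 'claude-3-5-sonnet-20240620',
    max_tokens: 1024,
    stream: true,
    tools: [
      {
        name: 'record_characters',
        description: 'Record generated RPG characters.',
        input_schema: {
          type: 'object',
          properties: {
            characters: { type: 'array', items: { type: 'object' } },
          },
          required: ['characters'],
        },
      },
    ],
    tool_choice: { type: 'tool', name: 'record_characters' },
    messages: [
      {
        role: 'user',
        content:
          'Generate 3 character descriptions for a fantasy role playing game.',
      },
    ],
  });

  // Log when each tool-input delta arrives relative to the request start.
  for await (const event of stream) {
    if (
      event.type === 'content_block_delta' &&
      event.delta.type === 'input_json_delta'
    ) {
      console.log(`${Date.now() - start}ms`, event.delta.partial_json);
    }
  }
}

main().catch(console.error);
```

If the logged timestamps all cluster near the end of the request, the deltas are arriving late from the API itself, which matches the behavior described above.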
Description

Following this tutorial, using @ai-sdk/openai as the provider functions as expected - the object response is streamed to the client. However, swapping it with @ai-sdk/anthropic seems to only return the object to the client once the generation has completed.

Code example
```ts
const result = await streamObject({
  model: anthropic("claude-3-5-sonnet-20240620"),
  schema: notificationSchema,
  ...
```
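For reference, a minimal sketch of the kind of route handler this description implies. The Next.js route shape, prompt handling, and the `notificationSchema` fields are assumptions here; `.toTextStreamResponse()` is the method named in the issue title.

```ts
// app/api/notifications/route.ts - assumed Next.js route handler shape;
// the schema fields below are illustrative, not taken from the issue.
import { anthropic } from '@ai-sdk/anthropic';
import { streamObject } from 'ai';
import { z } from 'zod';

const notificationSchema = z.object({
  notifications: z.array(
    z.object({
      name: z.string().describe('Name of a fictional person.'),
      message: z.string().describe('Notification message text.'),
    }),
  ),
});

export async function POST(req: Request) {
  const { prompt } = await req.json();

  const result = await streamObject({
    model: anthropic('claude-3-5-sonnet-20240620'),
    schema: notificationSchema,
    prompt,
  });

  // The partial-object text stream is forwarded to the client as it arrives;
  // with the Anthropic provider, the deltas may all show up near the end.
  return result.toTextStreamResponse();
}
```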
Additional context
No response