"Unexpected end of JSON input" when streaming on edge environments (Vercel Edge, Cloudflare Workers) #292
Thanks for reporting! cc @RobertCraigie, can you take a look at this?
Hi y'all! @rattrayalex @RobertCraigie We're experiencing this same issue in Node.js environments as well. We've confirmed it in both Node.js and Bun after migrating to the Messages API today to adopt Claude 3. We don't see it with the legacy Text Completions streaming API. Update: on closer inspection it was not exactly the same issue, but it is related to the new Messages streaming endpoint.
Thank you for the update @izuchukwu! We'll take a look at it shortly! Could you share a script, ideally including the prompt, that reproduces the problem you're seeing? EDIT: we were not able to reproduce this locally.
Hi, unfortunately, we're now running into this issue as well. It's hard for me to identify the prompt because it happens when we run multiple large prompts in parallel. I can confirm it is the same problem. We consistently see the error, but when we see it is very inconsistent, presumably because it's dependent on the server sending an empty string. We're able to run small prompts just fine; the prompts that trigger it write multiple paragraphs (~5) before tripping the error. Here's a snippet. It's embedded in a library, so most options are passed in as variables.

```ts
// `client`, `messages`, `model`, `options`, `system`, and `onComplete`
// are all supplied by the surrounding library code.
import _ from 'lodash'

const stream = await client.messages.create({
  messages,
  model,
  max_tokens: 4096,
  temperature: options?.temp,
  top_p: options?.topP,
  system,
  stream: true
})

for await (const event of stream) {
  if (event.type !== 'content_block_delta') continue
  const chunk = event.delta.text

  // Process stream
  const shouldContinue = await onComplete?.(chunk) // onComplete is an async callback
  if (!_.isNil(shouldContinue) && !shouldContinue) {
    stream.controller.abort()
  }
}
```
Very sorry, I also encountered the same problem here.
As venables said, patching the streaming.js file (see below) keeps the stream from aborting. EDIT: for a fix, see the response from dzhng below.
Any update on this issue? @rattrayalex
Hey, would it be possible for anyone to share a request ID for a request that failed in this way? We can't reproduce it, unfortunately. You can get a request ID by enabling the SDK's debug logging (for example, by setting the `ANTHROPIC_LOG` environment variable to `debug`) and checking the logged response headers.
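For anyone trying to grab one, here is a minimal sketch of reading the `request-id` response header with the SDK. The `.asResponse()` helper and the exact header name are assumptions about the SDK version in use, not something confirmed in this thread.

```ts
import Anthropic from '@anthropic-ai/sdk';

const client = new Anthropic();

// Request the raw Response object so the headers are accessible, then
// read the request-id header attached to the API response.
const response = await client.messages
  .create({
    model: 'claude-3-opus-20240229', // hypothetical model choice
    max_tokens: 16,
    messages: [{ role: 'user', content: 'ping' }],
  })
  .asResponse();

console.log(response.headers.get('request-id'));
```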
In this case the request ID was:
Unfortunately, when I turn on debug logging on Vercel the headers aren't printed 😞
Doing some debugging myself, it looks like a potential bug in the SDK's line decoding in `streaming.ts`. Here's an extract of the logs from a run where the error occurs; I'm printing the chunks as the decoder receives them. Note: logs go from bottom to top.

We get the output shown in the logs. Generally the chunks that the decoder receives contain complete lines with their terminators, and everything parses fine. However, an error occurs when a line terminator is split across chunks, so that the first characters of a chunk are just the leftover `\n`.
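To make the failure mode concrete, here is a standalone illustration (not the SDK's code) of how splitting each chunk on line endings independently produces spurious blank lines when a `\r\n` terminator straddles a chunk boundary. In SSE, a blank line terminates the current event, so the decoder can end up parsing an empty or truncated `data` payload.

```ts
// One SSE line whose \r\n terminator is split across two network chunks.
const chunk1 = 'data: {"type":"ping"}\r';
const chunk2 = '\n\r\n';

// Naive per-chunk splitting on \r\n, \r, or \n:
console.log(chunk1.split(/\r\n|\r|\n/)); // ['data: {"type":"ping"}', '']
console.log(chunk2.split(/\r\n|\r|\n/)); // ['', '', '']

// The trailing '' from chunk1 and the leading '' from chunk2 both look
// like blank lines (end-of-event markers), so the decoder can dispatch
// an event whose accumulated data is the empty string, and parsing ''
// as JSON is exactly what raises "Unexpected end of JSON input".
```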
OK, I fixed it. @nyacg was on the right track: the issue is in the `LineDecoder` in `streaming.ts`.

Specifically, it's because the decoder assumes a `\r\n` line ending always arrives within a single chunk. The error happens when a chunk ends with a bare `\r`, or when the next chunk begins with the leftover `\n`. In both of these cases, js's line splitting treats the stray character as its own line break and yields an extra empty line.

This is also why the previous patch doesn't work: it dropped tokens because the extra empty line caused whole data packets to be ignored.

It's got nothing to do with the edge env; I suspect some network config in edge environments causes SSE packets to be smaller, making this issue more noticeable.

For now, you can patch the package really easily. This will account for the different line-ending combinations. I already created a patch for my llm-api lib if anyone wants the patch files; the commit for the patch is in that repo.
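A rough sketch of the approach described above: treat `\r`, `\n`, and `\r\n` uniformly, buffer incomplete lines, and remember a chunk-final `\r` so a leading `\n` in the next chunk is not mistaken for a blank line. This is illustrative only, not the actual patched `streaming.js`.

```ts
// Illustrative line splitter tolerant of \r\n pairs split across chunks.
class PatchedLineSplitter {
  private buffer = '';
  private trailingCR = false; // previous chunk ended with a bare \r

  decode(chunk: string): string[] {
    // A leading \n after a chunk-final \r is the second half of a \r\n
    // terminator we already split on; drop it so it does not produce a
    // spurious blank line (blank lines end SSE events).
    if (this.trailingCR && chunk.startsWith('\n')) {
      chunk = chunk.slice(1);
    }
    this.trailingCR = chunk.endsWith('\r');
    this.buffer += chunk;

    const lines = this.buffer.split(/\r\n|\r|\n/);
    // The final element is an incomplete line; keep it buffered until
    // more data arrives.
    this.buffer = lines.pop() ?? '';
    return lines;
  }
}
```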
Ahh, thank you so much for the detailed investigation and proposed patch @dzhng! We'll test and port this over to our side ASAP.
Fixed in #312, which should be released shortly.
This fix has now been released.
Amazing turnaround time on this, thank you both @rattrayalex & @RobertCraigie
Thank you for the details, help, and patience @izuchukwu, @dzhng, @nyacg, @venables, and others!
The SDK seems to operate fine when running in a Node.js environment, but when running in an Edge runtime (a browser-like environment) such as Vercel Edge or Cloudflare Workers, streaming gets cut off with the following exception: `SyntaxError: Unexpected end of JSON input`.
The error is coming from this block: https://github.com/anthropics/anthropic-sdk-typescript/blob/main/src/streaming.ts#L69-L84
The line content at that point is an empty string. Since the data is an empty string, the JSON parsing blows up. I can bypass this error if I modify the code to ignore empty strings, but that does not seem ideal.
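For reference, the reported error can be reproduced independently of the SDK in one line:

```ts
JSON.parse(''); // SyntaxError: Unexpected end of JSON input
```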
Reproduction repos:
I put the Streaming example from the Anthropic SDK README into a Vercel Edge function and a Cloudflare Workers function with the same failing result.
Note: the error occurs whether we use `import "@anthropic-ai/sdk/shims/web";` or not.

Vercel Edge:
I've put together a sample repo using create-next-app and the example from your README: https://github.com/venables/anthropic-edge-stream-error
The file in question is `app/api/test/route.ts`. If you remove `export const runtime = "edge"`, it works as expected. This error will not occur locally, since the local environment is a Node.js environment, but when you deploy to Vercel (with `runtime = "edge"` still in the code), you will consistently get the error.

Cloudflare Workers:
If you want to reproduce this locally, you can do so using Wrangler, which spins up a real edge-like Cloudflare Workers environment on your machine.
I created a sample repository here, using Hono as the router: https://github.com/venables/anthropic-stream-error-cf
The file in question here is `src/index.ts`. Running that locally and hitting the endpoint will fail.
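For anyone who wants a starting point without cloning the repo, a minimal Hono route along these lines reproduces the setup; the route path, model name, and binding name are assumptions, not the actual contents of the linked repository.

```ts
import { Hono } from 'hono';
import Anthropic from '@anthropic-ai/sdk';

const app = new Hono<{ Bindings: { ANTHROPIC_API_KEY: string } }>();

app.get('/test', async (c) => {
  const client = new Anthropic({ apiKey: c.env.ANTHROPIC_API_KEY });

  const stream = await client.messages.create({
    model: 'claude-3-opus-20240229',
    max_tokens: 1024,
    messages: [{ role: 'user', content: 'Write five paragraphs about streams.' }],
    stream: true,
  });

  let text = '';
  for await (const event of stream) {
    // On the edge runtime, this loop is where "Unexpected end of JSON
    // input" surfaces once a long response gets chunked awkwardly.
    if (event.type === 'content_block_delta') {
      text += event.delta.text;
    }
  }
  return c.text(text);
});

export default app;
```

Run it with `npx wrangler dev` and hit the endpoint to trigger the failure.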