"Error: [object Object]" during message streaming when error is via an SSE (cause/detail not accessible) #346

Closed

paulcalcraft opened this issue Mar 20, 2024 · 19 comments

@paulcalcraft

paulcalcraft commented Mar 20, 2024

When an error occurs while iterating the async iterator from an anthropic.messages.create() call, the exception raised and its associated error object don't carry any detail; all they have is an e.cause.message set to "[object Object]".

The error SSE I'm seeing during streaming looks like this:

{event: 'error', data: '{"type":"error","error":{"type":"overloaded_error","message":"Overloaded"}              }', raw: Array(2)}

The error SSE is then turned into a thrown error via APIError.generate here:

throw APIError.generate(undefined, errJSON, errMessage, createResponseHeaders(response.headers));

The errJSON is correctly passed to generate, but because status isn't set (this is an SSE error event, not an HTTP error response), generate falls back to raising an APIConnectionError via castToError(), with no other info:

return new APIConnectionError({ cause: castToError(errorResponse) });

castToError then just wraps the object in a plain Error:

return new Error(err);

But errJSON is a plain object with no useful toString(), so our cause Error object ends up with the message "[object Object]" and no other properties. This means you can't inspect or handle the error cause properly when catching errors from the async iterator.
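
For clarity, that part is just standard Error behaviour rather than anything SDK-specific; a quick illustration (not SDK code):

// new Error() coerces a non-string argument with String(), so a plain
// object becomes "[object Object]" and all of its fields are lost.
const errJSON = { type: 'error', error: { type: 'overloaded_error', message: 'Overloaded' } };
const cause = new Error(errJSON);
console.log(cause.message); // "[object Object]"
console.log(cause.error);   // undefined – the detail is gone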

An example error:

anthropic stream error Error: Connection error.
    at APIError.generate (\node_modules\@anthropic-ai\sdk\error.mjs:32:20)
    at Stream.iterator (\node_modules\@anthropic-ai\sdk\streaming.mjs:73:40)
    at process.processTicksAndRejections (\lib\internal\process\task_queues.js:95:5)
    at async [Symbol.asyncIterator] (\src\routes\api\oracle\+server.js:205:38)
...
    at async initiateResponse (/src/routes/api/oracle/+server.js:1410:17) {status: undefined, headers: undefined, error: undefined, cause: Error: [object Object]
    at castToError (fil…/node_modules/@anthr…, stack: 'Error: Connection error.
    at APIError.gene…/src/routes/api/oracle/+server.js:1410:17)', …}

And where I'm catching it:

try {
    for await (const messageStreamEvent of response) {
        // ...
    }
} catch (e) {
    console.error("anthropic stream error", e) // e.cause.message == "[object Object]"
}

Would it be possible to format the error so that at least the error type can be identified by inspecting .cause on the APIConnectionError?

Thanks for any help. I'm also happy to submit a PR if there's agreement on the best way to surface the error detail in the APIConnectionError object.
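
For example, here's the kind of handling I'd hope to be able to write once the detail is surfaced (the shape of .cause and the property names below are hypothetical, just to illustrate the goal):

import Anthropic from '@anthropic-ai/sdk';

try {
    for await (const messageStreamEvent of response) {
        // ...
    }
} catch (e) {
    // Hypothetical: if the parsed SSE payload were preserved on the error
    // (e.g. on .cause), callers could branch on the error type instead of
    // string-matching "[object Object]".
    if (e instanceof Anthropic.APIConnectionError &&
        e.cause?.error?.type === 'overloaded_error') {
        // back off and retry
    } else {
        throw e;
    }
}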

@rattrayalex
Collaborator

Thanks for the report, we'll take a look!

@ryanblock

Just a heads up, we have also been able to reproduce this issue. This is running within a Lambda; the error occurs after a few hundred tokens. An example prompt that seems to reproduce it for us is: Please send the first 10 paragraphs of Alice's Adventures in Wonderland by Lewis Carroll (which is in the public domain).

APIConnectionError: Connection error.
    at Function.generate (file:///var/task/node_modules/@anthropic-ai/sdk/error.mjs:32:20)
    at Stream.iterator (file:///var/task/node_modules/@anthropic-ai/sdk/streaming.mjs:52:40)
    ... 2 lines matching cause stack trace ...
    at async MessageStream._createMessage (file:///var/task/node_modules/@anthropic-ai/sdk/lib/MessageStream.mjs:113:26) {
  status: undefined,
  headers: undefined,
  error: undefined,
  cause: Error: [object Object]
      at castToError (file:///var/task/node_modules/@anthropic-ai/sdk/core.mjs:682:12)
      at Function.generate (file:///var/task/node_modules/@anthropic-ai/sdk/error.mjs:32:52)
      at Stream.iterator (file:///var/task/node_modules/@anthropic-ai/sdk/streaming.mjs:52:40)
      at runMicrotasks (<anonymous>)
      at processTicksAndRejections (node:internal/process/task_queues:96:5)
      at async MessageStream._createMessage (file:///var/task/node_modules/@anthropic-ai/sdk/lib/MessageStream.mjs:113:26)
}

@beeirl

beeirl commented Apr 30, 2024

Running into the exact same issue here, running on Vercel with the Vercel AI SDK.

@rattrayalex
Collaborator

FWIW, my guess is that this is due to Vercel timing out your handler, but I agree the error message being hard to read makes this worse. @RobertCraigie care to ticket?

@paulcalcraft
Author

Thanks! I'm on a Pro plan with Vercel, which gives me 5-minute timeouts, so I don't think that's actually the case for me.

@ryanblock

@rattrayalex fwiw, as I mentioned above, we have seen this error in plain old AWS Lambda, and have observed that it is not related to Lambda timeouts. (Just for my own edification, what's the relationship here with @stainless-api?)

@rattrayalex
Collaborator

Gotcha, that's helpful. We'll try to look into this, but a repro script would be very helpful. Can anyone share one?

what's the relationship here with https://github.com/stainless-api?

I work at Stainless, which Anthropic uses to build their SDKs.

@greg84
Contributor

greg84 commented Aug 20, 2024

I am seeing this too.

I'm running a Next.js app locally. Just chatting with my app normally, it throws this error maybe every 5-10 requests. The app has been working fine with Together AI's API (via the OpenAI SDK) using Llama 3 and 3.1 over the last few months; since swapping over to Anthropic I'm seeing this intermittent issue.

This is the output when the error is thrown:

APIConnectionError: Connection error.
    at APIError.generate (file:///Users/path/to/app/node_modules/@anthropic-ai/sdk/error.mjs:33:20)
    at Stream.iterator (file:///Users/path/to/app/node_modules/@anthropic-ai/sdk/streaming.mjs:52:40)
    ... 11 lines matching cause stack trace ...
    at async handleRequest (/Users/path/to/app/node_modules/next/dist/server/lib/router-server.js:353:24)
    at async requestHandlerImpl (/Users/path/to/app/node_modules/next/dist/server/lib/router-server.js:377:13) {
  status: undefined,
  headers: undefined,
  request_id: undefined,
  error: undefined,
  cause: Error: [object Object]
      at castToError (file:///Users/path/to/app/node_modules/@anthropic-ai/sdk/core.mjs:695:12)
      at APIError.generate (file:///Users/path/to/app/node_modules/@anthropic-ai/sdk/error.mjs:33:52)
      at Stream.iterator (file:///Users/path/to/app/node_modules/@anthropic-ai/sdk/streaming.mjs:52:40)
      at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
      at async handler (webpack-internal:///(api)/./pages/api/chat.ts:269:30)
      at async K (/Users/path/to/app/node_modules/next/dist/compiled/next-server/pages-api.runtime.dev.js:21:2946)
      at async U.render (/Users/path/to/app/node_modules/next/dist/compiled/next-server/pages-api.runtime.dev.js:21:3827)
      at async DevServer.runApi (/Users/path/to/app/node_modules/next/dist/server/next-server.js:554:9)
      at async NextNodeServer.handleCatchallRenderRequest (/Users/path/to/app/node_modules/next/dist/server/next-server.js:266:37)
      at async DevServer.handleRequestImpl (/Users/path/to/app/node_modules/next/dist/server/base-server.js:791:17)
      at async /Users/path/to/app/node_modules/next/dist/server/dev/next-dev-server.js:331:20
      at async Span.traceAsyncFn (/Users/path/to/app/node_modules/next/dist/trace/trace.js:151:20)
      at async DevServer.handleRequest (/Users/path/to/app/node_modules/next/dist/server/dev/next-dev-server.js:328:24)
      at async invokeRender (/Users/path/to/app/node_modules/next/dist/server/lib/router-server.js:174:21)
    at async handleRequest (/Users/path/to/app/node_modules/next/dist/server/lib/router-server.js:353:24)
}

@greg84
Contributor

greg84 commented Aug 21, 2024

We could do something like the following to serialize the object as JSON and use that as the error message. Not an ideal fix, but at least we'd be able to see what the error is.
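
Roughly this shape, sketched against castToError in core.mjs (illustrative only, not the exact SDK code):

const castToError = (err) => {
  if (err instanceof Error) return err;
  if (typeof err === 'object' && err !== null) {
    // Serialize the payload so the message is readable instead of "[object Object]".
    try {
      return new Error(JSON.stringify(err));
    } catch {}
  }
  return new Error(err);
};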

@jbergs-dsit

We're also seeing this issue (using .messages.stream()) – is this still on the roadmap to be fixed?

@rattrayalex
Collaborator

@greg84 @jbergs-dsit (or anyone else on this thread) could you please provide a codesandbox or similar which reproduces the error?

@ryanblock

ryanblock commented Aug 25, 2024

It should replicate for you using this repo (see comment above): https://github.com/beginner-corp/claude-begin-demo

Note: you don't need to deploy to Begin to reproduce it; just run the local sandbox with npm start.

@rattrayalex
Collaborator

Thank you @ryanblock, we'll take a look soon!

@greg84
Contributor

greg84 commented Aug 28, 2024

I have not been able to consistently reproduce this. It happens when the API returns an error partway through a streaming response; we have seen it during times of instability, when the API was returning 500 or overloaded errors.

Please read the original comment from paulcalcraft; it describes exactly what is happening: we just need to extract some useful detail from errJSON before the error is thrown.

@rattrayalex
Collaborator

rattrayalex commented Sep 7, 2024

EDIT: we're working on a fix for this internally.

@ryanblock

@rattrayalex that appears to be a private repo?

@jbergs-dsit

jbergs-dsit commented Sep 9, 2024

@RobertCraigie can you elaborate a bit on how the error has been fixed?
Is, for example, the connection error no longer occurring, or is the error it throws now processable by the castToError function?

EDIT: answered by commit reference

@RobertCraigie
Collaborator

Sorry, it looks like this was closed prematurely, before this commit was pushed.

@RobertCraigie
Collaborator

This fix was released in v0.27.3. Really sorry for the delay here!
