Yes @99Gens, I agree there needs to be some kind of validation when the number of tokens exceeds the limit, but this can only be detected at runtime, so it would be better to show some kind of toast/warning message, since the limit depends entirely on the model being used. Raising an error earlier (at "compile time") could be done, but it would take a lot more effort and, with context windows growing, it isn't worth it.
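A rough sketch of what that runtime check could look like, using litellm's `token_counter` helper to estimate the prompt size before the call. The `max_input_tokens` argument and the warning output are placeholders for however Devon actually looks up the model limit and surfaces a toast:

```python
import litellm

def prompt_fits(model: str, messages: list[dict], max_input_tokens: int) -> bool:
    """Return True if the prompt should fit in the model's input window.

    max_input_tokens is the model's advertised input limit (e.g. ~200k for
    Claude 3); it could also be looked up from litellm's model metadata.
    """
    prompt_tokens = litellm.token_counter(model=model, messages=messages)
    if prompt_tokens > max_input_tokens:
        # Surface a toast/warning at runtime instead of letting the API call fail.
        print(f"Warning: prompt is {prompt_tokens} tokens, "
              f"over the {max_input_tokens} limit for {model}.")
        return False
    return True
```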
litellm.BadRequestError: litellm.ContextWindowExceededError: AnthropicError - {"type":"error","error":{"type":"invalid_request_error","message":"prompt is too long: 209353 tokens > 199999 maximum"}}
Devon should know not to submit a prompt which exceeds x tokens.
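For reference, litellm also surfaces the error above as its own exception type, so another option is to catch it and trim the oldest context before retrying instead of crashing. This is only a sketch; `trim_messages` is litellm's utility of that name, assumed to be available in the installed version:

```python
from litellm import completion, ContextWindowExceededError
from litellm.utils import trim_messages  # assumed available; trims messages to fit the model

def safe_completion(model: str, messages: list[dict]):
    try:
        return completion(model=model, messages=messages)
    except ContextWindowExceededError:
        # Fallback: trim older context to the model's window and retry once.
        trimmed = trim_messages(messages, model=model)
        return completion(model=model, messages=trimmed)
```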