System Info
Using Inference Endpoint here: https://endpoints.huggingface.co/m-ric/endpoints/qwen2-72b-instruct-psj
ghcr.io/huggingface/text-generation-inference:3.0.1
Information
Tasks
Reproduction
Here's what I'm trying to run:
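Roughly, the call looks like this (a minimal sketch using huggingface_hub's InferenceClient against the endpoint above; the prompt text and `image.png` path are placeholders):

```python
import base64
from huggingface_hub import InferenceClient

# Point the client at the dedicated Inference Endpoint (TGI 3.0.1).
client = InferenceClient(
    "https://lmqbs8965pj40e01.us-east-1.aws.endpoints.huggingface.cloud"
)

# Placeholder local image, roughly 1000x1000 pixels.
with open("image.png", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode()

# OpenAI-compatible chat completion with the image passed inline as base64.
response = client.chat_completion(
    messages=[
        {
            "role": "user",
            "content": [
                {
                    "type": "image_url",
                    "image_url": {"url": f"data:image/png;base64,{image_b64}"},
                },
                {"type": "text", "text": "Describe this image."},
            ],
        }
    ],
)
print(response.choices[0].message.content)
```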
The image is not big; here it is:

[attached image, roughly 1000×1000 pixels]

I get this error:

huggingface_hub.errors.HfHubHTTPError: 422 Client Error: Unprocessable Entity for url: https://lmqbs8965pj40e01.us-east-1.aws.endpoints.huggingface.cloud/v1/chat/completions (Request ID: 9kQ8on)
Input validation error: `inputs` tokens + `max_new_tokens` must be <= 32768. Given: 96721 `inputs` tokens and 0 `max_new_tokens`
It seems like my image was blown up into a very large input during preprocessing, even though the original is only roughly 1000×1000 pixels.
Expected behavior
I'd expect the uploaded image to be <1k tokens instead of ~100k tokens.
Other APIs (OpenAI, Anthropic) handle the same image fine, so I'm wondering: do they apply some image-size-reduction preprocessing, or is this a bug on TGI's side?
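In the meantime, one possible workaround would be to downscale the image client-side before sending it, which may be similar to what those other APIs do. A minimal sketch assuming Pillow (the 768 px long-side target is an arbitrary choice, not something TGI requires):

```python
import base64
from io import BytesIO

from PIL import Image


def downscale_to_data_url(path: str, max_side: int = 768) -> str:
    """Resize an image so its longest side is at most `max_side` pixels,
    then return it as a base64 data URL usable in the chat payload."""
    img = Image.open(path).convert("RGB")
    scale = max_side / max(img.size)
    if scale < 1:
        img = img.resize(
            (int(img.width * scale), int(img.height * scale)), Image.LANCZOS
        )
    buf = BytesIO()
    img.save(buf, format="PNG")
    return "data:image/png;base64," + base64.b64encode(buf.getvalue()).decode()
```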