
[Bug]: Model hub not showing customized costs #8573

Open
xmcp opened this issue Feb 16, 2025 · 0 comments
Labels
bug Something isn't working mlops user request

Comments


xmcp commented Feb 16, 2025

What happened?

The "model hub" page in the UI shows costs from the built-in cost map instead of the values set in litellm_params. As a result, the displayed costs do not match the actual costs charged when calling the API.

In the model hub page:

[Screenshot: model hub page showing built-in costs]

In config.yaml:

  - model_name: azure/gpt-4o-mini
    litellm_params:
      model: azure/gpt-4o-mini
      input_cost_per_token: 0.00000015
      output_cost_per_token: 0.0000006
      ...

  - model_name: ali/qwen-turbo
    litellm_params:
      model: openai/qwen-turbo
      input_cost_per_token: 0.000000042
      output_cost_per_token: 0.000000084
      ...
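To illustrate the mismatch, here is a minimal sketch of how the custom per-token costs above translate into an expected request cost. The token counts are hypothetical, chosen only for the example; the per-token values are taken from the config.yaml excerpt.

```python
# Per-token costs from the azure/gpt-4o-mini entry in config.yaml above
input_cost_per_token = 0.00000015
output_cost_per_token = 0.0000006

# Hypothetical request sizes (not from the report)
prompt_tokens = 1000
completion_tokens = 500

# Expected cost if litellm_params were honored
expected_cost = (prompt_tokens * input_cost_per_token
                 + completion_tokens * output_cost_per_token)
print(f"{expected_cost:.8f}")  # 0.00045000
```

If the model hub instead reads the built-in cost map for gpt-4o-mini, the displayed per-token rates (and hence any cost a user computes from them) will differ from what the proxy actually charges.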

Relevant log output

N/A

Are you a ML Ops Team?

Yes

What LiteLLM version are you on ?

v1.61.3-stable

Twitter / LinkedIn details

No response

@xmcp xmcp added the bug Something isn't working label Feb 16, 2025