
(feat) add IBM watsonx.ai as an llm provider #3270

Merged
merged 15 commits into from
Apr 27, 2024

Conversation

simonsanvil
Contributor

Integrated IBM's watsonx.ai API as a provider, enabling calls to the text-generation and embedding models available on the watsonx.ai platform.


vercel bot commented Apr 24, 2024

The latest updates on your projects. Learn more about Vercel for Git ↗︎

litellm — ✅ Ready — Updated Apr 24, 2024 3:34pm (UTC)

@simonsanvil simonsanvil changed the title (feat) added IBM watsonx.ai as an llm provider (feat) add IBM watsonx.ai as an llm provider Apr 25, 2024
@h0rv h0rv mentioned this pull request Apr 26, 2024
@h0rv

h0rv commented Apr 26, 2024

Issue: #3204

@simonsanvil
Contributor Author

Hey @h0rv, I just checked your PR in instructor and can confirm that this PR adds the support you were looking for.

The only caveat is that, since watsonx.ai doesn't support function calling natively, you need to have instructor inject the schema into the prompt by setting `mode=instructor.Mode.JSON`:

import os
import instructor, litellm
from instructor import Mode
from litellm import completion
from pydantic import BaseModel

litellm.drop_params=True # since watsonx.ai doesn't support `json_mode`

os.environ["WATSONX_URL"] = ""  # your watsonx.ai endpoint URL
os.environ["WATSONX_APIKEY"] = ""  # your watsonx.ai API key

class User(BaseModel):
    name: str
    age: int

client = instructor.from_litellm(completion, project_id="<your-project-id>", mode=Mode.JSON)

resp = client.chat.completions.create(
    model="watsonx/meta-llama/llama-3-8b-instruct",
    max_tokens=1024,
    messages=[
        {
            "role": "user",
            "content": "Extract Jason is 25 years old.",
        }
    ],
    response_model=User,
)

assert isinstance(resp, User)
assert resp.name == "Jason"
assert resp.age == 25

@simonsanvil
Contributor Author

cc: @krrishdholakia

@h0rv

h0rv commented Apr 26, 2024

Awesome thanks and good work on this!

@krrishdholakia krrishdholakia merged commit 2d976cf into BerriAI:main Apr 27, 2024
2 checks passed
@krrishdholakia
Contributor

Great PR! Thank you for the work on this @simonsanvil

@krrishdholakia
Contributor

Just merged. I'll take care of any issues that come up in CI/CD.

@ishaan-jaff
Contributor

This failed a bunch of linting tests:

litellm/llms/watsonx.py:16: note: Use https://github.com/hauntsaninja/no_implicit_optional to automatically upgrade your codebase
litellm/llms/watsonx.py:77: error: Name "return_options" already defined on line 76  [no-redef]
litellm/llms/watsonx.py:193: error: Incompatible default for argument "print_verbose" (default has type "None", argument has type "Callable[..., Any]")  [assignment]
litellm/llms/watsonx.py:193: note: PEP 484 prohibits implicit Optional. Accordingly, mypy has changed its default to no_implicit_optional=True
litellm/llms/watsonx.py:193: note: Use https://github.com/hauntsaninja/no_implicit_optional to automatically upgrade your codebase
litellm/llms/watsonx.py:231: error: Incompatible types in assignment (expression has type "str", variable has type "WatsonXAIEndpoint")  [assignment]
litellm/llms/watsonx.py:246: error: Incompatible default for argument "print_verbose" (default has type "None", argument has type "Callable[..., Any]")  [assignment]
litellm/llms/watsonx.py:246: note: PEP 484 prohibits implicit Optional. Accordingly, mypy has changed its default to no_implicit_optional=True
litellm/llms/watsonx.py:246: note: Use https://github.com/hauntsaninja/no_implicit_optional to automatically upgrade your codebase
litellm/llms/watsonx.py:310: error: Function "print_verbose" could always be true in boolean context  [truthy-function]
litellm/llms/watsonx.py:348: error: Incompatible default for argument "timeout" (default has type "None", argument has type "float")  [assignment]
litellm/llms/watsonx.py:348: note: PEP 484 prohibits implicit Optional. Accordingly, mypy has changed its default to no_implicit_optional=True
litellm/llms/watsonx.py:348: note: Use https://github.com/hauntsaninja/no_implicit_optional to automatically upgrade your codebase
litellm/llms/watsonx.py:354: error: Item "None" of "dict[Any, Any] | None" has no attribute "pop"  [union-attr]
litellm/llms/watsonx.py:359: error: Unsupported right operand type for in ("dict[Any, Any] | None")  [operator]
litellm/llms/watsonx.py:360: error: Unsupported target for indexed assignment ("dict[Any, Any] | None")  [index]
litellm/llms/watsonx.py:382: error: "ModelResponse" has no attribute "usage"  [attr-defined]
litellm/llms/watsonx.py:415: error: Argument "optional_params" to "_prepare_text_generation_req" of "IBMWatsonXAI" has incompatible type "dict[Any, Any] | None"; expected "dict[Any, Any]"  [arg-type]
litellm/llms/watsonx.py:528: error: Incompatible default for argument "timeout" (default has type "None", argument has type "float")  [assignment]
litellm/llms/watsonx.py:528: note: PEP 484 prohibits implicit Optional. Accordingly, mypy has changed its default to no_implicit_optional=True
litellm/llms/watsonx.py:528: note: Use https://github.com/hauntsaninja/no_implicit_optional to automatically upgrade your codebase
litellm/router_strategy/lowest_latency.py:84: note: By default the bodies of untyped functions are not checked, consider using --check-untyped-defs  [annotation-unchecked]
litellm/router_strategy/lowest_latency.py:223: note: By default the bodies of untyped functions are not checked, consider using --check-untyped-defs  [annotation-unchecked]
litellm/main.py:1875: error: Argument "litellm_params" to "completion" of "IBMWatsonXAI" has incompatible type "list[str]"; expected "dict[Any, Any] | None"  [arg-type]
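Most of these mypy errors are instances of the PEP 484 implicit-Optional pattern: a parameter is annotated with a concrete type (e.g. `Callable` or `float`) but defaulted to `None`. A minimal sketch of the offending pattern and its fix (hypothetical function, not the actual litellm code):

```python
from typing import Callable, Optional

# Offending pattern (rejected under mypy's no_implicit_optional default):
#     def generate(prompt: str, print_verbose: Callable[..., None] = None) -> str: ...
# Fix: make the Optional explicit and guard the call with an explicit None
# check (a bare `if print_verbose:` on a non-Optional callable is what
# triggers the [truthy-function] error above).

def generate(prompt: str, print_verbose: Optional[Callable[..., None]] = None) -> str:
    if print_verbose is not None:
        print_verbose(f"generating for: {prompt!r}")
    return prompt.upper()
```

The same change applies to the `timeout: float = None` signatures flagged at lines 348 and 528: annotate them as `Optional[float]` and handle the `None` case before use.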
