feat: add watsonx support #620
Conversation
❌ Changes requested.
- Reviewed the entire pull request up to 52131d0
- Looked at 173 lines of code in 5 files
- Took 1 minute and 20 seconds to review
- Skipped 1 file when reviewing.
- Skipped posting 9 additional comments because they didn't meet the confidence threshold of 85%.
1. examples/watsonx/watsonx.py:9:
- Assessed confidence: 80%
- Comment: Consider using a configuration file or a secrets manager to handle sensitive data like API keys instead of fetching them directly from environment variables.
- Reasoning: The API key and project ID are fetched directly from environment variables in the code, which is a potential security risk if those variables are not properly managed; a configuration file or a secrets manager is a safer way to handle sensitive data such as API keys.
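For illustration, a minimal sketch of this suggestion, assuming a hypothetical local secrets file (the path and key names below are not part of this PR):

```python
import json
from pathlib import Path

# Hypothetical secrets file kept out of version control, e.g.:
# {"watsonx_api_key": "...", "watsonx_project_id": "..."}
SECRETS_PATH = Path.home() / ".config" / "watsonx" / "secrets.json"


def load_watsonx_credentials() -> tuple[str, str]:
    """Read the API key and project ID from the local secrets file."""
    secrets = json.loads(SECRETS_PATH.read_text())
    return secrets["watsonx_api_key"], secrets["watsonx_project_id"]


api_key, project_id = load_watsonx_credentials()
```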
2. examples/watsonx/watsonx.py:14:
- Assessed confidence: 80%
- Comment: Consider fetching the model_id from a configuration file or an environment variable instead of hardcoding it.
- Reasoning: The Watsonx client is instantiated with a hardcoded model_id, which limits the flexibility of the code; reading the model_id from a configuration file or an environment variable would be better.
3. examples/watsonx/watsonx.py:18:
- Assessed confidence: 80%
- Comment: Consider fetching the parameters from a configuration file or an environment variable instead of hardcoding them.
- Reasoning: The Watsonx client is instantiated with hardcoded parameters, which limits the flexibility of the code; these values are better read from a configuration file or an environment variable.
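In the same spirit, a sketch of reading the model ID and generation parameters from the environment with fallbacks instead of hardcoding them (the variable names and defaults are illustrative only):

```python
import json
import os

# Fall back to a default model when WATSONX_MODEL_ID is not set (illustrative default).
model_id = os.environ.get("WATSONX_MODEL_ID", "ibm/granite-13b-chat-v2")

# Generation parameters can likewise come from a JSON blob in an env var.
default_params = {"max_new_tokens": 200, "temperature": 0.0}
params = json.loads(os.environ.get("WATSONX_PARAMS", json.dumps(default_params)))
```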
4. instructor/__init__.py:70:
- Assessed confidence: 80%
- Comment: Consider handling the import failure and providing a clear error message.
- Reasoning: The Watsonx client is imported conditionally, which can cause problems if the import fails; handling the import failure and raising a clear error message would be better.
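One common way to handle this, sketched below with assumed names and message (not the exact code in this PR), is to keep the package importable and raise a clear, actionable error only when the optional client is actually used:

```python
# Sketch of a guarded optional import for instructor/__init__.py.
try:
    from .client_watsonx import from_watsonx  # noqa: F401
except ImportError:  # the optional ibm-watsonx-ai dependency is not installed
    def from_watsonx(*args, **kwargs):
        raise ImportError(
            "watsonx support requires the optional `ibm-watsonx-ai` package. "
            "Install it, e.g. `pip install ibm-watsonx-ai`, and try again."
        )
```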
5. instructor/utils.py:28:
- Assessed confidence: 0%
- Comment: Good practice of adding Watsonx to the Provider enum for easy addition of new providers in the future.
- Reasoning: The Watsonx provider is added to the Provider enum, which makes it easy to add new providers in the future.
6. instructor/utils.py:47:
- Assessed confidence: 0%
- Comment: Good practice of identifying the Watsonx provider based on the base_url.
- Reasoning: The Watsonx provider is identified from the base_url, which makes it straightforward to determine which provider is in use.
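For reference, the pattern described by these two comments looks roughly like the simplified sketch below (not the exact contents of utils.py):

```python
from enum import Enum


class Provider(Enum):
    OPENAI = "openai"
    ANTHROPIC = "anthropic"
    WATSONX = "watsonx"
    UNKNOWN = "unknown"


def get_provider(base_url: str) -> Provider:
    """Identify the provider from the client's base URL."""
    if "watsonx" in str(base_url):
        return Provider.WATSONX
    if "anthropic" in str(base_url):
        return Provider.ANTHROPIC
    if "openai" in str(base_url):
        return Provider.OPENAI
    return Provider.UNKNOWN
```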
7. pyproject.toml:35:
- Assessed confidence: 0%
- Comment: Good practice of adding the Watsonx dependency to the poetry configuration file for easy management of dependencies.
- Reasoning: The Watsonx dependency is added to the poetry configuration file, which keeps dependency management simple.
8. instructor/client_watsonx.py:10:
- Assessed confidence: 100%
- Grade: 0%
- Comment: Consider giving the overloaded functions different signatures to avoid confusion and potential bugs.
- Reasoning: The from_watsonx function is overloaded, but both versions have the same signature, which can cause confusion and potential bugs; overloads should differ in their signatures.
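A conventional way to make the overloads meaningful is to vary a parameter's type between them, for example with typing.overload and a use_async flag. The sketch below uses illustrative parameter names and assumed return types rather than the exact signatures in client_watsonx.py:

```python
from typing import Any, Literal, overload

import instructor


@overload
def from_watsonx(client: Any, use_async: Literal[False] = False, **kwargs: Any) -> instructor.Instructor: ...


@overload
def from_watsonx(client: Any, use_async: Literal[True], **kwargs: Any) -> instructor.AsyncInstructor: ...


def from_watsonx(client: Any, use_async: bool = False, **kwargs: Any):
    # Dispatch on use_async so the two overloads differ in signature and return type.
    ...
```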
9. instructor/client_watsonx.py:39:
- Assessed confidence: 100%
- Grade: 0%
- Comment: Consider implementing the missing functionality or providing a clear explanation of why it is not implemented.
- Reasoning: The from_watsonx function has a TODO comment about adding Watsonx async support, which indicates the function is not fully implemented; either implement the async path or explain clearly why it is missing.
Workflow ID: wflow_rIKCcN8i1GIEOnbo
Want Ellipsis to fix these issues? Tag @ellipsis-dev
in a comment. We'll respond in a few minutes. Learn more here.
Yeah I think the easiest approach for this is to add LiteLLM support.
Would be keen on the LiteLLM support. Then people who use Amazon Bedrock can get involved.
Branch force-pushed from 72d1921 to 8853244.
👍 Looks good to me!
- Reviewed the entire pull request up to 8853244
- Looked at 137 lines of code in 3 files
- Took 46 seconds to review
- Skipped 1 file when reviewing.
- Skipped posting 1 additional comment because it didn't meet the confidence threshold of 85%.
1. pyproject.toml:80:
- Assessed confidence: 100%
- Grade: 0%
- Comment: The PR description mentions adding ibm-watsonx-ai as an optional dependency, but it is not present in pyproject.toml. Please add it to ensure the new functionality works as expected.
  ibm-watsonx-ai = { version = "^0.1.0", optional = true }
- Reasoning: The PR description mentions that ibm-watsonx-ai is added as an optional dependency in pyproject.toml, but the diff does not show this change. This could be an oversight or an error in updating the file. It is crucial for the functionality described in the PR to work, as the new client_watsonx.py likely depends on this package.
Workflow ID: wflow_Tom4hYW1GX5PGwfP
Not what you expected? You can customize the content of the reviews using rules. Learn more here.
LiteLLM support has been added, so I updated the docs to show how to use watsonx via LiteLLM; no new code changes. Should I create a
It sounds like you have made updates to the documentation to include information on how to use WatsonX with LiteLLM. Since there are no new code changes, we can focus on reviewing the existing code and documentation. I can assist with various types of reviews on this pull request, such as:
If you would like, I can review specific files or sections based on your preference. Just let me know which type of review you would like to prioritize or if you have any specific files in mind for review. Regarding the decision to create a Please let me know your preferences for the review or if you have any specific files you would like me to focus on.
Was waiting to test since my watsonx API key is on my work computer, but just tried it. I updated LiteLLM to the latest version, but the change is still not released, although it is already in the docs. It will likely go out today, so I will bump the version when that happens.
Understood! It's great that you were able to test with your WatsonX API key and that you have updated LiteLLM to the latest version. Since the LiteLLM release is not yet out but the documentation reflects the upcoming changes, it's a good idea to bump the version once LiteLLM is officially released. If you need any assistance with updating the version or any other related tasks once LiteLLM is released, feel free to let me know. Additionally, if you would like to proceed with a review of the pull request before the version bump, just give me the go-ahead and I can start the review process. Feel free to reach out if you have any further questions or need assistance with anything else!
add files and start implementation
x
x
Revert "add/update dependencies" This reverts commit 9787dd4.
Add example
Revert "add/update dependencies" This reverts commit 9787dd4.
fix: bullet list in README.md (instructor-ai#621)
fix: Update README.md (instructor-ai#625)
fix typo about Enum 'and' → 'an' (instructor-ai#626)
Fix anthropic usage and tools (instructor-ai#622)
Revert init changes
Updated LiteLLM - examples tested and working
Review of the Content
We classified this document as a tutorial and reviewed it accordingly. We rate this content quality as a 6/10.
To improve the writing quality to 8 or better, the document should focus on enhancing clarity and conciseness. Sentences should be restructured to use active voice consistently, and technical terms should be clearly defined. Additionally, the document should provide smoother transitions between sections to enhance the flow and coherence of the content.
Overall outline and structure feedback
- In "Structured Outputs with IBM watsonx.ai" (medium severity): The title could be more descriptive and engaging. Try replacing it with "Implementing Structured Outputs with IBM Watsonx.ai".
- In "watsonx.ai" (medium severity): The heading is too generic and does not reflect the content beneath it. Try replacing it with "Setting Up Your Watsonx.ai Credentials".
- In "Example" (medium severity): The heading could be more specific to reflect the content of the code example. Try replacing it with "Python Example: Using Watsonx.ai for Model Inference".
Feedback with respect to the content
In the section "watsonx.ai", we identified 2 issues.
- Unclear sentence structure (medium severity): Clarify the sentence to specify where or how to obtain these credentials.
  - You will need an API key and the project ID for your watsonx.ai.
  + To use watsonx.ai, you will need an API key and a project ID, which can be obtained from your account settings on the IBM Watsonx website.
- Missing context or link (medium severity): Provide a brief description of what information can be found at the link.
  - More information at <https://www.ibm.com/products/watsonx-ai>.
  + For more detailed information about watsonx.ai features and setup, visit the IBM product page at <https://www.ibm.com/products/watsonx-ai>.
In the section "Example", we identified 2 issues.
- Unclear code comments (medium severity): Explain why this setting is necessary in the context of the example.
  - litellm.drop_params = True # watsonx.ai doesn't support `json_mode`
  + litellm.drop_params = True # Set to True because watsonx.ai does not support `json_mode`, which affects how parameters are handled.
- Grammatical error (medium severity): Correct the grammatical structure in the comment.
  - assert resp.name == "IBM"
  + assert resp.name == "IBM" # Ensure the 'name' attribute of the response is 'IBM'
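For context, the documented pattern is roughly the sketch below; the model slug and the environment variables it relies on (e.g. WATSONX_URL, WATSONX_APIKEY, WATSONX_PROJECT_ID) are assumptions based on LiteLLM's conventions, not taken verbatim from this PR:

```python
import instructor
import litellm
from pydantic import BaseModel

litellm.drop_params = True  # watsonx.ai doesn't support `json_mode`, so drop unsupported params


class Company(BaseModel):
    name: str


# Assumes the watsonx credentials are exported as environment variables beforehand.
client = instructor.from_litellm(litellm.completion)

resp = client.chat.completions.create(
    model="watsonx/meta-llama/llama-3-8b-instruct",  # illustrative model slug
    max_tokens=1024,
    messages=[{"role": "user", "content": "Which company develops watsonx.ai?"}],
    response_model=Company,
)
assert resp.name == "IBM"  # Ensure the 'name' attribute of the response is 'IBM'
```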
Review of the content in docs/examples/watsonx.md (Outdated)
messages=[
    {
        "role": "user",
        "content": task,
Just set the string here; no need to add indirection for no reason.
fixed
Made those fixes
Great job on making those fixes! It's important to ensure that the content is clear and accurate for users. Since you have addressed the highlighted issues in the documentation, I can perform another content review to verify the changes and ensure that everything is in order, if you would like. Just let me know if you would like to proceed with another review or if there are any specific areas you would like me to focus on. Feel free to provide further instructions or requests for assistance!
#619
Summary:
This PR introduces IBM Watson AI support in the instructor library, including a new example, updates to the instructor package and utils, and a new optional dependency.
Key points:
- Added a new example watsonx.py in the examples/watsonx directory.
- Updated the instructor package to include Watson AI support if ibm_watsonx_ai is installed.
- Added client_watsonx.py in the instructor package, which contains the function from_watsonx.
- Updated the Provider enum in utils.py to include WATSONX.
- Updated pyproject.toml to include ibm-watsonx-ai as an optional dependency.
- Added a new .md file in docs.

Generated with ❤️ by ellipsis.dev