
Added example for using unions of models, useful for agents #299

Merged: 4 commits merged on Dec 23, 2023
Changes from 1 commit
50 changes: 50 additions & 0 deletions examples/let_the_llm_choose_a_tool.py
@@ -0,0 +1,50 @@
from pydantic import BaseModel, Field
from typing import Union
import instructor
from openai import OpenAI


class Search(BaseModel):
    """Search action class with a 'query' field and a process method."""

    query: str = Field(description="The search query")

    def process(self):
        """Process the search action."""
        return f"Search method called for query: {self.query}"


class Lookup(BaseModel):
    """Lookup action class with a 'keyword' field and a process method."""

    keyword: str = Field(description="The lookup keyword")

    def process(self):
        """Process the lookup action."""
        return f"Lookup method called for keyword: {self.keyword}"


class Finish(BaseModel):
    """Finish action class with an 'answer' field and a process method."""

    answer: str = Field(description="The answer for finishing the process")

    def process(self):
        """Process the finish action."""
        return f"Finish method called with answer: {self.answer}"


# Union of Search, Lookup, and Finish
class TakeAction(BaseModel):
    action: Union[Search, Lookup, Finish]

    def process(self):
        """Process the chosen action by delegating to it."""
        return self.action.process()


# Patching the client enables the `response_model` keyword
client = instructor.patch(OpenAI())
action = client.chat.completions.create(
    model="gpt-3.5-turbo",
    response_model=TakeAction,
    messages=[
        {"role": "user", "content": "Please choose one action"},
    ],
)
assert isinstance(action, TakeAction)
print(action.process())
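The core of the example is that `TakeAction.process()` simply forwards to whichever variant of the union the model selected. That dispatch works independently of the API call; a minimal, dependency-free sketch of the same pattern (using plain dataclasses instead of pydantic, with the same class and field names) looks like this:

```python
# A pydantic-free sketch of the union-dispatch pattern from the example:
# each action class exposes a process() method, and TakeAction forwards
# to whichever variant it holds.
from dataclasses import dataclass
from typing import Union


@dataclass
class Search:
    query: str

    def process(self) -> str:
        return f"Search method called for query: {self.query}"


@dataclass
class Finish:
    answer: str

    def process(self) -> str:
        return f"Finish method called with answer: {self.answer}"


@dataclass
class TakeAction:
    action: Union[Search, Finish]

    def process(self) -> str:
        # Delegate to the concrete action, whichever variant it is.
        return self.action.process()


print(TakeAction(action=Search(query="weather in SF")).process())
# Search method called for query: weather in SF
```

In the full example, instructor validates the LLM's structured output into one of the union members, so the caller never needs to branch on the action type itself.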
Contributor
Consider adding error handling around the API call and the type assertion to gracefully handle any exceptions that may occur, ensuring the robustness of the example.

+ try:
      client = instructor.patch(OpenAI())
      action = client.chat.completions.create(
          model="gpt-3.5-turbo",
          response_model=TakeAction,
          messages=[
              {"role": "user", "content": "Please choose one action"},
          ]
      )
      assert isinstance(action, TakeAction)
      print(action.process())
+ except (SomeSpecificException, AssertionError) as e:
+     print(f"An error occurred: {e}")


