
O3-mini model - BadRequestError (400) - Extra Inputs Not Permitted in Azure Search Parameters #1260

Open
aasedek opened this issue Feb 7, 2025 · 2 comments
Labels
bug Something isn't working

Comments

aasedek commented Feb 7, 2025

Describe the Bug
When adding the endpoint and API key for an o3-mini model, the app produces the following error message:

openai.BadRequestError: Error code: 400 - {'error': {'requestid': '54dd7ee5-5912-4bcb-9f34-7d27f', 'code': 400, 'message': 'Validation error at #/data_sources/0/azure_search/parameters/role_information: Extra inputs are not permitted'}}

To Reproduce
Steps to reproduce the behavior:

  1. Deploy the o3-mini model.
  2. Add the endpoint, API key, resource, and other required values in the .env file.
  3. Start the application.
  4. Enter any input in the chat box.

Expected Behavior
The application should successfully process chat inputs without errors.

Error Logs
ERROR:root:Exception in send_chat_request
Traceback (most recent call last):
  File "app.py", line 474, in send_chat_request
    await azure_openai_client.chat.completions.with_raw_response.create(
  File "openai/_legacy_response.py", line 381, in wrapped
    return cast(LegacyAPIResponse[R], await func(*args, **kwargs))
  ...
  File "openai/_base_client.py", line 1644, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'error': {'requestid': '54dd7ee5-5912-4bcb-9f34-7db7654db27f', 'code': 400, 'message': 'Validation error at #/data_sources/0/azure_search/parameters/role_information: Extra inputs are not permitted'}}

Possible Cause
It appears that an unsupported parameter (role_information) is being included in the Azure Search data-source configuration, and the o3-mini model's request validation rejects it as an extra input.

Environment:
OS: Windows
Python Version: 3.11
OpenAI SDK Version: 1.61.0
Deployment Environment: Local Machine

Requirements.txt:
azure-identity==1.15.0
openai==1.61.0
azure-search-documents==11.4.0b6
azure-storage-blob==12.17.0
python-dotenv==1.0.0
azure-cosmos==4.5.0
quart==0.19.4
uvicorn==0.24.0
aiohttp==3.9.2
gunicorn==20.1.0
pydantic-settings==2.2.1

Additional Context
The issue started occurring after integrating the o3-mini model.
The same setup worked previously with other models.

@aasedek aasedek added the bug Something isn't working label Feb 7, 2025

aasedek commented Feb 8, 2025

I solved the issue, but it requires writing a new model argument for the O* family of models. This also includes the data source configuration.
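The commenter did not share their code, but a minimal sketch of such a model-specific argument builder might look like the following. Everything here is an assumption: the function name, the prefix check, and the example defaults are illustrative. The one documented fact it leans on is that o-series models take max_completion_tokens rather than max_tokens and do not accept a temperature setting:

```python
# Hypothetical sketch: assemble chat-completion arguments that differ
# between o-series models (o1, o3-mini, ...) and other deployments.
def build_request_args(model: str, messages: list, max_tokens: int) -> dict:
    args = {"model": model, "messages": messages}
    if model.lower().startswith("o"):  # simplified o-series detection (assumption)
        # o-series models use max_completion_tokens and reject temperature.
        args["max_completion_tokens"] = max_tokens
    else:
        args["max_tokens"] = max_tokens
        args["temperature"] = 0.7  # example default, not from the repo
    return args
```

The returned dict would then be unpacked into the SDK call, e.g. `await azure_openai_client.chat.completions.create(**args)`, alongside a data_sources payload adjusted as described above.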

@patelha-ccm

I solved the issue, but it requires writing a new model argument for the O* family of models. This also includes the data source configuration.

Hey, I was also planning to integrate the O* models with the current app, but found that the API is a bit different for the O family. If possible, could you share the new API/function call? I would really appreciate it :)
