
[Bug]: Can't add parallel_tool_calls in LLMConfig #1695


Open
marklysze opened this issue Apr 22, 2025 · 0 comments
Assignees
Labels
bug Something isn't working

Comments

@marklysze
Collaborator

Describe the bug

The following both work:

basic_llm_config = LLMConfig(config_list=[{"model": "gpt-4.1-mini", "api_type": "openai"}])

basic_llm_config = LLMConfig(model="gpt-4.1-mini", api_type="openai", parallel_tool_calls=False)

However, the following doesn't work:
basic_llm_config = LLMConfig(config_list=[{"model": "gpt-4.1-mini", "api_type": "openai", "parallel_tool_calls": False}])

This raises a Pydantic ValidationError:

pydantic_core._pydantic_core.ValidationError: 1 validation error for _LLMConfig
config_list.0.openai.parallel_tool_calls
  Extra inputs are not permitted [type=extra_forbidden, input_value=False, input_type=bool]
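The error type `extra_forbidden` indicates that the per-entry schema used to validate each `config_list` item forbids keys it does not declare. A minimal sketch of that behavior, assuming a hypothetical stand-in model (`OpenAIEntryishModel` is not ag2's actual class, just an illustration of Pydantic's `extra="forbid"` setting):

```python
from pydantic import BaseModel, ConfigDict, ValidationError


class OpenAIEntryishModel(BaseModel):
    """Hypothetical stand-in for a per-provider config_list entry schema."""

    # extra="forbid" makes Pydantic reject any key not declared on the model,
    # which produces the extra_forbidden error shown in the traceback above.
    # protected_namespaces=() silences the warning about a field named "model".
    model_config = ConfigDict(extra="forbid", protected_namespaces=())

    model: str
    api_type: str


try:
    # Same shape as the failing config_list entry in the report.
    OpenAIEntryishModel(model="gpt-4.1-mini", api_type="openai",
                        parallel_tool_calls=False)
except ValidationError as exc:
    print(exc.errors()[0]["type"])  # extra_forbidden
```

Under this reading, fixing the bug would mean adding `parallel_tool_calls` as a declared field on the relevant entry schema (it is already accepted by the flat `LLMConfig(...)` form, per the working examples above).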

Steps to reproduce

No response

Model Used

No response

Expected Behavior

No response

Screenshots and logs

No response

Additional Information

No response

@marklysze marklysze added the bug Something isn't working label Apr 22, 2025
@marklysze marklysze added this to ag2 Apr 22, 2025
@kumaranvpl kumaranvpl moved this to Todo in ag2 Apr 23, 2025
Development

No branches or pull requests

2 participants