
[FEATURE] Provide Custom LLM Support #2276

Open
mvenkatsriram opened this issue Mar 4, 2025 · 2 comments · May be fixed by #2277
Labels
feature-request New feature or request

Comments


mvenkatsriram commented Mar 4, 2025

Feature Area

Core functionality

Is your feature request related to an existing bug? Please link it here.

NA

Describe the solution you'd like

We have an Enterprise LLM Gateway that encapsulates the external endpoints of several LLMs. Instead of API-key-based authentication, it uses JWT-based authentication.

I tried inheriting from the crewai.LLM class, overriding the call method, and removing the litellm dependency. When we ran the sample research agent, we got the following error:

Error:
AuthenticationError: litellm.AuthenticationError: AuthenticationError: OpenAIException - The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable
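For reference, here is a minimal sketch of the approach described above (not the actual code): it assumes the call signature used in recent crewai versions, and GATEWAY_URL / GATEWAY_JWT are hypothetical environment variables for the gateway endpoint and JWT token.

    import os
    import requests
    from crewai import LLM

    class GatewayLLM(LLM):
        """Subclass of crewai.LLM that calls a JWT-authenticated gateway directly."""

        def call(self, messages, tools=None, callbacks=None, available_functions=None):
            # Normalize the prompt into an OpenAI-style message list.
            if isinstance(messages, str):
                messages = [{"role": "user", "content": messages}]
            # Authenticate with a Bearer JWT instead of an API key.
            response = requests.post(
                os.environ["GATEWAY_URL"],
                headers={"Authorization": f"Bearer {os.environ['GATEWAY_JWT']}"},
                json={"model": self.model, "messages": messages},
                timeout=60,
            )
            response.raise_for_status()
            return response.json()["choices"][0]["message"]["content"]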

Is there a way for CrewAI to provide functionality similar to langchain_core.language_models.chat_models.BaseChatModel?
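For comparison, this is roughly the extension point langchain_core exposes: a BaseChatModel subclass only has to implement _generate and the _llm_type property, so any transport or auth scheme can live inside _generate. A minimal sketch (the gateway call itself is elided):

    from typing import Any, List, Optional

    from langchain_core.language_models.chat_models import BaseChatModel
    from langchain_core.messages import AIMessage, BaseMessage
    from langchain_core.outputs import ChatGeneration, ChatResult

    class GatewayChatModel(BaseChatModel):
        def _generate(
            self,
            messages: List[BaseMessage],
            stop: Optional[List[str]] = None,
            run_manager: Optional[Any] = None,
            **kwargs: Any,
        ) -> ChatResult:
            # A real implementation would call the JWT-authenticated gateway here.
            reply = AIMessage(content="...")
            return ChatResult(generations=[ChatGeneration(message=reply)])

        @property
        def _llm_type(self) -> str:
            return "gateway-chat-model"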

Describe alternatives you've considered

NA

Additional context

No response

Willingness to Contribute

I could provide more detailed specifications

@mvenkatsriram mvenkatsriram added the feature-request New feature or request label Mar 4, 2025
@devin-ai-integration devin-ai-integration bot linked a pull request Mar 4, 2025 that will close this issue
@lorenzejay
Collaborator

cc: @bhancockio


TaylorPzreal commented Mar 7, 2025

The provider option did not work; it seems that provider is not passed through to litellm.

I got the error "Error during LLM call: litellm.AuthenticationError: AuthenticationError: OpenAIException - The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable"

LLM(
    provider="custom",
    model="custom-gpt-4",
    base_url="https://xxx",
    api_key=os.getenv("API_KEY"),
    temperature=0.5,
    timeout=480,
    context_window=4096,
    max_tokens=4000,
)
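One thing worth checking (an assumption about the cause, not a confirmed fix): litellm normally infers the provider from a prefix on the model string rather than from a separate provider argument, so an OpenAI-compatible endpoint is usually addressed like this:

    import os
    from crewai import LLM

    llm = LLM(
        model="openai/custom-gpt-4",  # "openai/" prefix tells litellm which client to route through
        base_url="https://xxx",
        api_key=os.getenv("API_KEY"),
        temperature=0.5,
        timeout=480,
        max_tokens=4000,
    )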
