
docs: update readme and installation guide for litellm backend addition #703

Merged: 7 commits, Mar 21, 2025

Conversation

@TPLin22 (Collaborator) commented Mar 19, 2025

Description

We support using LiteLLM as an LLM backend so that various models can be used more flexibly.

Motivation and Context

How Has This Been Tested?

  • If you are adding a new feature, test on your own test scripts.

Screenshots of Test Results (if appropriate):

  1. Your own tests:

Types of changes

  • Fix bugs
  • Add new feature
  • Update documentation

📚 Documentation preview 📚: https://RDAgent--703.org.readthedocs.build/en/703/

README.md Outdated
```bash
cat << EOF > .env
BACKEND=rdagent.oai.backend.LiteLLMAPIBackend
LITELLM_CHAT_MODEL=gpt-4o
```
A Contributor commented:
remove the prefix

```properties
BACKEND=rdagent.oai.backend.LiteLLMAPIBackend
LITELLM_CHAT_MODEL=gpt-4o
LITELLM_EMBEDDING_MODEL=text-embedding-3-small
OPENAI_API_KEY=<replace_with_your_openai_api_key>
```
A Contributor commented:

Suggested change

```diff
-OPENAI_API_KEY=<replace_with_your_openai_api_key>
+# The backend api_key and api_base fully follow the convention of litellm.
+OPENAI_API_KEY=<replace_with_your_openai_api_key>
```


.. code-block:: Properties

LITELLM_CHAT_MODEL=deepseek/deepseek-reasoner
A Contributor commented:

This parameter is passed into litellm's completion function; litellm uses its router to decide which backend will be used.

@TPLin22 TPLin22 changed the title docs: Update README and installation guide for LiteLLM backend addition docs: update readme and installation guide for litellm backend addition Mar 20, 2025
@TPLin22 TPLin22 mentioned this pull request Mar 20, 2025
@TPLin22 TPLin22 merged commit bdfe82e into main Mar 21, 2025
7 of 8 checks passed
@TPLin22 TPLin22 deleted the llm-docs branch March 21, 2025 06:55