
No module named 'letta.llm_api.google_ai' When Using Gemini Models #2525

Closed
lemorage opened this issue Mar 27, 2025 · 1 comment

@lemorage
Contributor

Describe the bug
I ran the following commands in the terminal and got this unexpected error. The commands are from the tutorial: https://docs.letta.com/guides/server/providers/google#cli-pypi-only.

export GEMINI_API_KEY="..."
letta run
Letta.letta.server.db - INFO - Creating sqlite engine sqlite:////Users/clannad/.letta/sqlite.db

Would you like to select an existing agent? No

🧬 Creating new agent...
/Users/pypoetry/virtualenvs/yanara-BBYGsHMK-py3.12/lib/python3.12/site-packages/letta/server/server.py:1059:
UserWarning: An error occurred while listing LLM models for provider id=None name='google_ai' api_key='...' organization_id=None updated_at=None base_url='https://generativelanguage.googleapis.com': No module named 'letta.llm_api.google_ai'
  warnings.warn(f"An error occurred while listing LLM models for provider {provider}: {e}")
? Select LLM model: (Use arrow keys)
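
For reference, the failure can be reproduced outside the CLI by attempting the import directly (a minimal sketch, run in the same virtualenv that letta is installed into):

# On this install, this raises ModuleNotFoundError: No module named 'letta.llm_api.google_ai',
# which matches the warning that the server swallows while listing LLM models.
import letta.llm_api.google_ai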

Please describe your setup

  • How did you install letta?
    • pip install letta
  • Describe your setup
    • macOS
    • Terminal
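
As a sanity check, the installed package version can be read from Python directly and should match letta_version in the config below (a minimal sketch):

# Sketch: print the installed letta version (should be 0.6.45 per the config below).
from importlib.metadata import version
print(version("letta"))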

Letta Config
Please attach your ~/.letta/config file or copy paste it below.

[defaults]
preset = memgpt_chat
persona = sam_pov
human = basic

[archival_storage]
type = postgres
path = /Users/clannad/.letta
uri = postgresql://postgres:@/postgres?host=/Users/clannad/.letta/desktop_data

[recall_storage]
type = postgres
path = /Users/clannad/.letta
uri = postgresql://postgres:@/postgres?host=/Users/clannad/.letta/desktop_data

[metadata_storage]
type = sqlite
path = /Users/clannad/.letta

[version]
letta_version = 0.6.45

If you're not using OpenAI, please provide additional information on your local LLM setup:

I was trying to use the gemini-2.0-flash model here.

@lemorage
Contributor Author

Well, I noticed that the google_ai.py file was removed in commit f9a8aad, and a new file, google_ai_client.py, was added under llm_api/ in commit ee61a74.
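
For anyone else hitting this, a quick probe of both module names mentioned above (just an illustrative sketch, not an official letta API) shows which one the installed version actually ships:

# Sketch: check which of the two modules exists in the installed letta package.
import importlib.util

for name in ("letta.llm_api.google_ai", "letta.llm_api.google_ai_client"):
    print(name, "->", "present" if importlib.util.find_spec(name) else "missing")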
