
[Docs]: Reduce onboarding friction in AG2 examples by supporting multiple LLM providers #1622

Open
shrsv opened this issue Apr 12, 2025 · 0 comments

Comments

@shrsv (Contributor) commented Apr 12, 2025

Describe the issue

While trying out one of the AG2 examples, I had a few thoughts on improving the initial developer experience, specifically on reducing the friction in reaching a first successful run.

Current State

  • The README and examples currently default to OpenAI (OAI) as the LLM provider.
  • However, the OpenAI API requires a paid account, which can be a barrier for first-time users.

Observations & Suggestions

  • Consider alternative defaults: Google’s Gemini provides an AI Studio with a generous free tier, which allows users to get started immediately.
  • Provider switch mechanism: It may help to provide an easy way for users to switch between providers (e.g., Claude, Gemini, OAI). Even simple code snippets with minimal copy-paste/setup friction could go a long way; a rough sketch of what this could look like follows the example below.

Example config I used to get Gemini working with AG2 for the first time:

[
  {
    "model": "gemini-2.0-flash-001",
    "api_key": "<from ai studio>",
    "api_type": "google"
  }
]

And the corresponding setup in code:

from autogen import LLMConfig  # AG2's LLM configuration helper

seed = 42  # for caching; use None for no caching
llm_config = LLMConfig.from_json(
    path="OAI_CONFIG_LIST",
    seed=seed,
).where(model="gemini-2.0-flash-001")  # select the Gemini entry from the config list

This took me 5-7 minutes of trial and error as a first-time user. Ideally, the quick start would be specific enough that there is essentially no friction in getting it to work.
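To make the provider-switch idea concrete, here is a rough sketch of what a single multi-provider setup could look like. It assumes OAI_CONFIG_LIST holds one entry per provider and that AG2's .where() filter can select on api_type the same way it selects on model above; the AG2_PROVIDER environment variable and the non-Gemini model names are placeholders I made up for illustration, not part of the current docs.

import os

from autogen import LLMConfig

# Hypothetical OAI_CONFIG_LIST contents, one entry per provider:
# [
#   {"model": "gpt-4o-mini", "api_key": "<openai key>", "api_type": "openai"},
#   {"model": "gemini-2.0-flash-001", "api_key": "<from ai studio>", "api_type": "google"},
#   {"model": "claude-3-5-sonnet-latest", "api_key": "<anthropic key>", "api_type": "anthropic"}
# ]

# Pick the provider once (e.g. via an environment variable) and reuse the same example code.
provider = os.environ.get("AG2_PROVIDER", "google")  # "openai" | "google" | "anthropic"

llm_config = LLMConfig.from_json(
    path="OAI_CONFIG_LIST",
    seed=42,  # caching, as above
).where(api_type=provider)

Switching providers then means changing one environment variable (or one .where() argument) instead of editing the example code.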

  • Documentation/UI enhancement possibilities:

    • A step-by-step guide for Gemini (and possibly Claude) would help; a minimal sketch of the final step appears below these bullets:
      1. Go to Google AI Studio (or Claude settings)
      2. Get API key
      3. Create config file
      4. Select exact LLM config
      5. Run example
    • A tabbed UI for examples (OAI | Gemini | Claude) might reduce onboarding friction significantly.
  • Rationale:

    • Reaching a "first success" quickly is key to adoption.
    • Many developers (especially personal/non-enterprise users) are increasingly diversifying away from OAI.
    • Anecdotally, in my circles, people use Claude or Gemini more for personal use.
    • Diversity in provider preference should be acknowledged in examples/documentation.
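For the final "Run example" step above, the quick start could end with something this small. This is only a sketch under a few assumptions: it reuses the LLMConfig shown earlier, uses AG2's ConversableAgent with human_input_mode="NEVER" as the minimal single-agent example, and the prompt text is arbitrary.

from autogen import ConversableAgent, LLMConfig

# Load whichever provider entry was configured (Gemini here, as in the example above).
llm_config = LLMConfig.from_json(path="OAI_CONFIG_LIST").where(model="gemini-2.0-flash-001")

# A single assistant agent that never waits for human input.
assistant = ConversableAgent(
    name="assistant",
    llm_config=llm_config,
    human_input_mode="NEVER",
)

# One round-trip to the configured LLM: the "first success" moment.
reply = assistant.generate_reply(messages=[{"role": "user", "content": "Say hello in one sentence."}])
print(reply)

With a tabbed UI, only the config entry (and possibly the .where() filter) would differ between the OAI, Gemini, and Claude tabs; the agent code stays identical.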

Thanks again for considering this. Happy to iterate or help out further!

Steps to reproduce

No response

Screenshots and logs

No response

Additional Information

No response
