Allow other LLMs #5

Open
21 of 33 tasks
jkomoros opened this issue Dec 20, 2023 · 0 comments
jkomoros commented Dec 20, 2023

  • Plug in claude provider
  • Add a converter for the stream format so computePromptStreamAnthropic and the OpenAI one can be used interchangeably.
  • Resolve the bug that has existed for at least a little bit where setting the openai key leads to an infinite loop (introduced in bac7ad3)
  • setAPIKey -> setAPIKeys
  • Once anthropics/anthropic-sdk-typescript#248 (add a dangerouslyAllowBrowser option to allow running in the browser) is fixed, set KEY_NAMES.anthropic.com.include to true, once extractStreamChunk supports it
  • Make extractStreamChunk work for anthropic (and make the caller continue if content = '', which will happen more often now)
  • Support Palm/Gemini
  • Make it so the api dialog has a _firstRun (set to whether it was auto-opened)
  • If _firstRun, then keep the other keys behind a closed zippy
  • Add a "view password" eye icon to see the key
  • For each provider add a help string pointing to directions
  • the zippy on conversation turn should say Advanced and print out state within
  • AIProvider methods should return an object with the return value inside
  • AIProvider returns resolved model in the object
  • Print out the AI model in use in the advanced zippy
  • Add a plugin architecture so that llm.ts doesn't have to be aware of all of the LLMs that are plugged in
  • Add a promptOptions.preferred that is the same as the shape of modelRequirements. First, filter down to only the models meeting the required criteria. Then look for the item that has the highest preferenceMatch score, and use that one (the first one if there's no clear winner)
  • state has a preferredModelProvider that is put into the promptOptions automatically.
  • Render a red alert next to model in the zippy if it's not the default requested, and a gray one next to Advanced.
  • in the api dialog allow changing the preferred provider if more than one apiKey exists.
  • Two bugs: 1) if you cancel a signaller with finish before streamWillStart, then it will start the stream even though it shouldn't. 2) apiKeys set one at a time lead to tickling this
  • There's a weird race in SproutView where if sprout changes twice before the setTimeout is called, it will have a lastSprout that was actually never called in sproutChanged.
  • Have the first-run dialog only talk about openai (or have a zippy for claude / other providers)
  • Allow showing the API key dialog if necessary (with a little key icon button to forceOpen dialog)
  • The modelForPrompt (or whatever) grows a modelProvider matching criteria. The machinery automatically sends only the models that the user has affixed an API key for
  • Add a claude model to the default stack.
  • Hide the API key by default in the api key dialog (maybe just expand the default provider out of a zippy)
  • api dialog allows adding a claude API key too
  • If multiple API keys, then have a dropdown to let a user pick the modelProvider to use
  • Clean up the maxTokens handling, because it currently conflates the input context limit with the output limit, which can differ.
  • Document in README (and update the motivation section)
  • Update node logic about setting anthropic key
  • Keep track of the provider in the URL if it's not openai (the default) so people you share the URL with will try it if they don't have a strong opinion (preferProvider)
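The stream-format converter item above could be sketched roughly like this. The event shapes here are simplified assumptions, not the actual OpenAI/Anthropic SDK types (the real stream events carry many more fields), and `extractStreamChunk`'s signature is a guess at what would let computePromptStreamAnthropic and the OpenAI path share a caller:

```typescript
// Simplified, assumed shapes of one streamed chunk from each provider.
type OpenAIChunk = {
  choices: { delta: { content?: string } }[];
};

type AnthropicEvent =
  | { type: 'content_block_delta'; delta: { type: 'text_delta'; text: string } }
  | { type: 'message_start' | 'message_stop' | 'ping' };

// Returns the text carried by this chunk, or '' for events that carry
// none (Anthropic streams emit many such events, so callers should
// `continue` on '' rather than treating it as the end of the stream).
function extractStreamChunk(event: OpenAIChunk | AnthropicEvent): string {
  if ('choices' in event) {
    return event.choices[0]?.delta.content ?? '';
  }
  if (event.type === 'content_block_delta') {
    return event.delta.text;
  }
  return '';
}
```

With a normalizer like this, the streaming loop never needs to know which provider produced the event.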
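The required-vs-preferred selection described in the list could look like the following. The type names, fields, and scoring (count of satisfied preference criteria) are illustrative assumptions, not the repo's actual modelRequirements shape:

```typescript
// Assumed shapes; the real modelRequirements likely has more criteria.
type ModelRequirements = {
  contextSizeAtLeast?: number;
  jsonResponse?: boolean;
};

type ModelInfo = {
  name: string;
  contextSize: number;
  supportsJSON: boolean;
};

// Hard filter: a model must satisfy every required criterion.
const meetsRequirements = (m: ModelInfo, req: ModelRequirements): boolean =>
  (req.contextSizeAtLeast === undefined || m.contextSize >= req.contextSizeAtLeast) &&
  (!req.jsonResponse || m.supportsJSON);

// Soft score: how many preference criteria this model satisfies.
const preferenceMatch = (m: ModelInfo, pref: ModelRequirements): number => {
  let score = 0;
  if (pref.contextSizeAtLeast !== undefined && m.contextSize >= pref.contextSizeAtLeast) score++;
  if (pref.jsonResponse && m.supportsJSON) score++;
  return score;
};

// First clear to only models meeting the requirements, then take the
// highest preferenceMatch score, falling back to the first candidate.
function pickModel(
  models: ModelInfo[],
  required: ModelRequirements,
  preferred: ModelRequirements = {}
): ModelInfo | null {
  const candidates = models.filter((m) => meetsRequirements(m, required));
  if (candidates.length === 0) return null;
  let best = candidates[0];
  for (const m of candidates) {
    if (preferenceMatch(m, preferred) > preferenceMatch(best, preferred)) best = m;
  }
  return best;
}
```

A state-level preferredModelProvider could then be folded into `preferred` before calling `pickModel`.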
jkomoros added a commit that referenced this issue Jan 6, 2024
It _almost_ works, but computePromptStreamAnthropic just flat out lies about the type of the stream to get it to compile.

The model isn't activated yet because it's not in the default stack.

Part of #5.
jkomoros added a commit that referenced this issue Jan 6, 2024
…owser yet, and was breaking the app from running.

But also, the input token limit is so high that it doesn't really matter.

Part of #5.
jkomoros added a commit that referenced this issue Jan 6, 2024
… will just throw if you use anthropic.

Part of #5.
jkomoros added a commit that referenced this issue Jan 6, 2024
…on creators.

A lot of surrounding machinery doesn't work for anthropic yet.

Part of #5.
jkomoros added a commit that referenced this issue Jan 6, 2024
jkomoros added a commit that referenced this issue Jan 6, 2024
Make AIProvider take APIKeys.

Part of #5.
jkomoros added a commit that referenced this issue Jan 6, 2024
jkomoros added a commit that referenced this issue Jan 6, 2024
jkomoros added a commit that referenced this issue Jan 6, 2024
jkomoros added a commit that referenced this issue Jan 6, 2024
…local storage.

This is also what caused the infinite loop of calling the dispatcher instead of calling datamanager.setAPIKey.

Part of #5.
jkomoros added a commit that referenced this issue Jan 6, 2024
jkomoros added a commit that referenced this issue Jan 6, 2024
… of models we actually have keys for.

Part of #5.
jkomoros added a commit that referenced this issue Jan 6, 2024
…vider that has explicitly or implicitly already been set.

Part of #5.
jkomoros added a commit that referenced this issue Jan 6, 2024
jkomoros added a commit that referenced this issue Jan 6, 2024
…l call stopStreaming immediately.

This doesn't fix the double-streaming bug uncovered by having two provider keys, but it does seem to help with correctness.

Part of #5.
jkomoros added a commit that referenced this issue Jan 6, 2024
…double run a sprout.

Made it so sprouts will only run themselves once.

This feels a little wrong still, but 🤷

Part of #5.
jkomoros added a commit that referenced this issue Jan 6, 2024
Before, sproutChanged only took the lastSprout, and whatever the currentSprout was now.

But this led to weird issues if _currentSprout changed multiple times before the next timer tick... as happens on first run when multiple API keys are set (successive calls to setAPIKey). This could lead to sproutChanged with a lastSprout that was never actually seen in sproutChanged.

Now we snapshot both the current sprout and the last sprout, which is at least a little more sane.

Realistically we should probably batch up the setAPIKeys updates into one as well.

Part of #5.
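The snapshot fix this commit describes can be sketched as below. The class and its API are invented for illustration (the real SproutView is a component with more going on, and it defers via setTimeout; here an explicit `flush()` stands in for the timer tick so the ordering is visible):

```typescript
type Sprout = { name: string };

class SproutScheduler {
  private _lastSeenSprout: Sprout | null = null;
  private _pending: { current: Sprout; last: Sprout | null }[] = [];

  constructor(private onChange: (current: Sprout, last: Sprout | null) => void) {}

  setSprout(sprout: Sprout): void {
    // Snapshot BOTH the current and the last sprout at the moment of
    // the change. If the deferred callback read instance fields instead,
    // a second setSprout before the tick would overwrite them, and
    // onChange would fire with a lastSprout it never actually observed.
    this._pending.push({ current: sprout, last: this._lastSeenSprout });
    this._lastSeenSprout = sprout;
  }

  // Stand-in for the deferred timer tick firing.
  flush(): void {
    for (const { current, last } of this._pending) this.onChange(current, last);
    this._pending = [];
  }
}
```

Batching successive setAPIKeys updates into one, as the commit suggests, would make the multiple-ticks case rarer but the snapshot is still the safe pattern.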
jkomoros added a commit that referenced this issue Jan 6, 2024
This avoids getting multiple sprouts in quick succession at boot when you have multiple keys.

Part of #5.
jkomoros added a commit that referenced this issue Jan 6, 2024
We'll be getting a lot more soon.

Part of #5.
jkomoros added a commit that referenced this issue Jan 6, 2024
jkomoros added a commit that referenced this issue Jan 6, 2024
jkomoros added a commit that referenced this issue Jan 6, 2024
jkomoros added a commit that referenced this issue Jan 6, 2024
Now that it might show up more often, it's good to keep it at least harder to see casually.

Part of #5.
jkomoros added a commit that referenced this issue Jan 6, 2024
jkomoros added a commit that referenced this issue Jan 6, 2024
jkomoros added a commit that referenced this issue Jan 6, 2024
…. Anthropic has a massive input and a normal sized output.

For OpenAI, I just assumed the input and output limits are the same, but 🤷?

Part of #5.
jkomoros added a commit that referenced this issue Jan 7, 2024
jkomoros added a commit that referenced this issue Jan 7, 2024
Action creators and reducers, not yet used for anything.

Part of #5.
jkomoros added a commit that referenced this issue Jan 7, 2024
jkomoros added a commit that referenced this issue Jan 7, 2024
jkomoros added a commit that referenced this issue Jan 7, 2024
jkomoros added a commit that referenced this issue Jan 7, 2024
jkomoros added a commit that referenced this issue Jan 9, 2024
jkomoros added a commit that referenced this issue May 15, 2024
Part of #5 (I think?)