SuperModels is an open-source project for running LLM agents in a user-friendly desktop app.
Give it a try! Download the executable here!
A self-reflecting machine:
- Think Wittgenstein not Frankenstein
- Built-in reflection mechanism to steer thinking
- Fully customisable
__In short__
A lightweight, simple, and easy-to-adapt agent.
Stack: Python, Electronjs, and TailwindCSS.
- I integrated a series of powerful LLM APIs for you to play with.
- Most notably Groq, which runs LLMs faster than any of the other APIs here (a minimal call sketch follows below).
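For illustration, here is a minimal sketch of calling Groq from Python with the official `groq` SDK. The model name and the prompt are placeholders of my choosing, not SuperModels' defaults.

```python
# Minimal sketch: one chat completion via the official Groq SDK (`pip install groq`).
# The model name below is an example; pick any model Groq currently serves.
import os

from groq import Groq

client = Groq(api_key=os.environ["GROQ_API_KEY"])

response = client.chat.completions.create(
    model="llama-3.1-8b-instant",
    messages=[{"role": "user", "content": "Break this task into steps: tidy my downloads folder."}],
)
print(response.choices[0].message.content)
```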
The tools are basic but performant:
- The `python_wizzard` tool generates and executes Python on your system. Note that running generated code is not without risk (a hypothetical sketch of such a tool follows after this list).
- Be prepared for SuperModels to take control over your OS and start operating your home automation.
- A second tool, `online_llm`, uses Perplexity AI under the hood for natural-language web results.
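To make the risk concrete, here is a hypothetical sketch of what a `python_wizzard`-style tool could look like: it writes the generated code to a temporary file and runs it in a subprocess. The function name, signature, and timeout are my assumptions, not the project's actual implementation.

```python
# Hypothetical python_wizzard-style tool: run LLM-generated code in a subprocess.
# This is an illustrative sketch, not SuperModels' actual tool interface.
import subprocess
import sys
import tempfile


def run_generated_python(code: str, timeout: int = 30) -> str:
    """Execute generated Python and return its combined stdout/stderr."""
    with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
        f.write(code)
        path = f.name
    # WARNING: this executes arbitrary code with your user's privileges.
    # Sandbox it (container, VM, restricted account) before letting it near
    # your OS or home automation.
    result = subprocess.run(
        [sys.executable, path], capture_output=True, text=True, timeout=timeout
    )
    return result.stdout + result.stderr
```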
The real magic happens here (click for source code):
- This is where the agent breaks down complex problems into steps:
break_down = BreakdownReflection(prompt)
- Every step is then passed to the tooluser:
for step in break_down.steps:
answer = tooluser.call(step)
- The agent reflects on the tooluser's answer and on whether it should be broken down further (the full loop is pieced together below):
is_satisfying = YesNoReflection(prompt, answer)
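Pieced together, the loop looks roughly like the sketch below. `BreakdownReflection`, `YesNoReflection`, and `tooluser` are the project's own names taken from the snippets above; the recursion and stopping logic around them are my illustrative guess, not the exact control flow in the source.

```python
# Rough paraphrase of the break-down / tool-use / reflect loop shown above.
# BreakdownReflection, YesNoReflection and tooluser come from SuperModels;
# the surrounding control flow is an illustrative assumption.
def solve(prompt, tooluser, max_depth=3):
    break_down = BreakdownReflection(prompt)      # split the problem into steps
    answers = []
    for step in break_down.steps:
        answer = tooluser.call(step)              # let the tooluser handle the step
        is_satisfying = YesNoReflection(prompt, answer)
        if not is_satisfying and max_depth > 0:
            # Not good enough: break the step down further and recurse.
            answers.extend(solve(step, tooluser, max_depth - 1))
        else:
            answers.append(answer)
    return answers
```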
In short: the combination of a super-fast tooluser and nominally typed answers allows for quite impressive reflection and reasoning.
Although I still haven't got it to turn the lights off.
Give it a try! Download the executable here!
Then:
- Join the free beta at groq.com to get a free API key
- Enter the API key when the app prompts you