Welcome to ezAGI #1
Professor-Codephreak
announced in Announcements
Replies: 3 comments
-
Started the ezAGI project yesterday. Added Groq, OpenAI, and LLaMA (via Ollama) to the admin tab so far.
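For readers following along: the later comments use `ui.notify` and container contexts, which suggests the UI is built with NiceGUI. A minimal sketch of what an admin tab with the three providers could look like, assuming NiceGUI; the tab names mirror the comment above, but the panel contents are illustrative placeholders, not ezAGI's actual code:

```python
from nicegui import ui

# Hypothetical admin tab layout for the three providers.
with ui.tabs() as tabs:
    ui.tab('Groq')
    ui.tab('OpenAI')
    ui.tab('Ollama')

with ui.tab_panels(tabs, value='Groq'):
    with ui.tab_panel('Groq'):
        ui.input('GROQ_API_KEY')    # placeholder credential field
    with ui.tab_panel('OpenAI'):
        ui.input('OPENAI_API_KEY')  # placeholder credential field
    with ui.tab_panel('Ollama'):
        ui.label('Local models served on localhost:11434')

ui.run()
```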
-
I am writing code for Ollama handler support on Windows. I am getting an error while integrating the model info on the frontend side, which I attached below.
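For context, listing models from a handler typically means querying Ollama's local REST API, whose `GET /api/tags` endpoint returns the locally available models. A minimal sketch, assuming the handler wraps that endpoint; the `OllamaHandler` class and `list_models` method names mirror the snippet in the next comment but are otherwise illustrative:

```python
import requests

class OllamaHandler:
    """Minimal wrapper around a local Ollama server."""

    def __init__(self, base_url: str = 'http://localhost:11434') -> None:
        self.base_url = base_url

    def list_models(self) -> list[str]:
        # Ollama's /api/tags endpoint returns {"models": [{"name": ...}, ...]}
        try:
            response = requests.get(f'{self.base_url}/api/tags', timeout=5)
            response.raise_for_status()
            return [m['name'] for m in response.json().get('models', [])]
        except requests.RequestException:
            # Server not running or unreachable; report no models.
            return []

handler = OllamaHandler()
print(handler.list_models())
```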
-
Yes, thank you. What was required was adding the `with` statement in numerous locations within the `OpenMind` class to maintain the correct `with container_element:` context, in this case `with self.message_container:`:

```python
elif llama_running:
    # Call ollama_handler to list models when LLaMA is found running
    models = self.ollama_handler.list_models()
    if models:
        model_list = ", ".join(models)
        with self.message_container:
            ui.notify(f'LLaMA found running. Models available: {model_list}')
            logging.debug(f"LLaMA running on localhost:11434. Models available: {model_list}")
    else:
        with self.message_container:
            ui.notify('LLaMA found running, but no models are available.')
            logging.debug("LLaMA running on localhost:11434, but no models are available.")
```
-
👋 Welcome!
We’re using Discussions as a place to connect with other members of our community. We hope that you build together 💪.
To get started, comment below with an introduction of yourself and tell us about what you do with this community.