Conversation
* Add ollama example to README — I just spent a while trying to understand how to use a locally-running ollama instance, so this adds some documentation to the README that would have saved me some time.
* Be less attached to llama
* Specify ollama for the 11434 port and merge the note about local AI into the local AI section
* space
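As context for the README addition above: a locally-running ollama instance listens on port 11434 by default and exposes a REST endpoint at `/api/generate`. A minimal sketch of talking to it from Python (the model name `llama3` is only an illustrative assumption; substitute whatever model you have pulled):

```python
import json
from urllib import request

# ollama's default local endpoint (port 11434, as noted in the commit above)
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> request.Request:
    """Build a POST request for a locally-running ollama instance.

    "stream": False asks ollama for a single JSON response instead of
    a stream of partial chunks.
    """
    payload = json.dumps(
        {"model": model, "prompt": prompt, "stream": False}
    ).encode("utf-8")
    return request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )

# Usage (requires a running ollama instance, so left commented out):
# req = build_request("llama3", "Say hello in one sentence.")
# with request.urlopen(req) as resp:
#     print(json.load(resp)["response"])
```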
* Translate skill.json via GitLocalize
* Translate skill.json via GitLocalize

---------

Co-authored-by: OpenVoiceOS <[email protected]>
Co-authored-by: mt-gitlocalize <[email protected]>
Human review requested!