
Configuring local Ollama instance for use with LLM honeypots. #1741

Answered by JeffMcGeoff
JeffMcGeoff asked this question in Q&A

Thank you for the reply @t3chn0m4g3. After reading Ollama's faq.md, particularly the section "Setting environment variables on Linux", and adding the line Environment="OLLAMA_HOST=0.0.0.0" to the ollama.service file, I got both honeypots communicating with Ollama.
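
For anyone following along, here is a minimal sketch of those steps on a systemd-based install, as described in Ollama's faq.md (this assumes the default systemd service set up by the Ollama installer):

sudo systemctl edit ollama.service
# in the editor that opens, add:
#   [Service]
#   Environment="OLLAMA_HOST=0.0.0.0"
sudo systemctl daemon-reload
sudo systemctl restart ollama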

I have shared the relevant snippets of my .env file below, in case anyone else has a similar issue running Ollama locally.

BEELZEBUB_LLM_MODEL: "ollama"
BEELZEBUB_LLM_HOST: "http://192.168.0.1:11434/api/chat"
BEELZEBUB_OLLAMA_MODEL: "phi4"
......
GALAH_LLM_PROVIDER: "ollama"
GALAH_LLM_SERVER_URL: "http://192.168.0.1:11434"
GALAH_LLM_MODEL: "phi4"

Obviously, replace 192.168.0.1 with your host's inet address as shown in the output of ip addr …
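
As a quick sanity check before restarting the honeypots, you can confirm Ollama is reachable over the network. A hedged example: 192.168.0.1 stands in for your own inet address, and /api/tags is the standard Ollama endpoint that lists installed models.

ip addr                                  # note your host's inet address
curl http://192.168.0.1:11434/api/tags   # should return your installed models as JSON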
