Clarify usage with non-default server addresses
taketwo committed Dec 16, 2024
1 parent dd616e7 commit e5d981b
Showing 1 changed file with 6 additions and 2 deletions.
README.md: 6 additions, 2 deletions

```diff
@@ -5,7 +5,7 @@
 [![Tests](https://github.com/taketwo/llm-ollama/actions/workflows/test.yml/badge.svg)](https://github.com/taketwo/llm-ollama/actions/workflows/test.yml)
 [![License](https://img.shields.io/badge/license-Apache%202.0-blue.svg)](https://github.com/taketwo/llm-ollama/blob/main/LICENSE)
 
-[LLM](https://llm.datasette.io/) plugin providing access to models running on local [Ollama](https://ollama.ai) server.
+[LLM](https://llm.datasette.io/) plugin providing access to models running on an [Ollama](https://ollama.ai) server.
 
 ## Installation
 
@@ -114,7 +114,11 @@ Additionally, the `-o json_object 1` option can be used to force the model to re
 
 ## Ollama server address
 
-If your Ollama server is not hosted at the default `localhost:11434` address, you can use `OLLAMA_HOST` environment variable to point the plugin to it.
+`llm-ollama` will try to connect to a server at the default `localhost:11434` address. If your Ollama server is remote or runs on a non-default port, you can use `OLLAMA_HOST` environment variable to point the plugin to it, e.g.:
+
+```bash
+export OLLAMA_HOST=https://192.168.1.13:11434
+```
 
 ## Development
 
```
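
The variable added in this commit can also be set inline for a single invocation rather than exported. A minimal sketch, assuming a model named `llama3.2` has already been pulled on the server at that address:

```bash
# Inline override for one call; "llama3.2" is illustrative and assumes
# that model is available on the Ollama server at the given address.
OLLAMA_HOST=https://192.168.1.13:11434 llm -m llama3.2 'Three facts about llamas'
```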
