Operator

An experimental client for the web from Unternet.

Setup

  • Run npm install
  • Copy .env.example to .env and fill in the required environment variables (see the sketch below)
  • Start the native client with npm run dev
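
A minimal .env sketch, assuming the OpenAI key follows Vite's VITE_-prefixed convention; the variable name here is an assumption for illustration, and .env.example remains the authoritative list of keys:

# .env (placeholder values; copy from .env.example)
VITE_OPENAI_API_KEY=sk-placeholder   # leave unset to fall back to local Ollama inference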

Local Models

Operator supports running LLM inference locally using Ollama. By default, local inference is used if no OpenAI API key has been provided as a Vite environment variable.
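
As a rough sketch of that fallback, the selection could look like the TypeScript below; the variable name VITE_OPENAI_API_KEY, the config shape, and the provider labels are illustrative assumptions, not the client's actual internals:

// Hedged sketch: pick a provider based on the Vite env var.
// VITE_OPENAI_API_KEY and the config object shape are assumed names.
const openAIKey = import.meta.env.VITE_OPENAI_API_KEY;

const inference = openAIKey
  ? { provider: 'openai', apiKey: openAIKey } // hosted inference
  : {
      provider: 'ollama',
      baseUrl: 'http://localhost:11434', // Ollama's default local endpoint
      model: 'qwen2.5-coder:3b', // the default model (see below)
    };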

To set this up on Linux, download and install Ollama with:

curl -fsSL https://ollama.com/install.sh | sh

Or download the binary from their website.

Once installed, pull the default model, qwen2.5-coder:3b:

ollama run qwen2.5-coder:3b
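
To confirm the model is downloaded and the local server is reachable, you can send a one-off request to Ollama's generate endpoint, which listens on port 11434 by default:

curl http://localhost:11434/api/generate -d '{
  "model": "qwen2.5-coder:3b",
  "prompt": "Say hello",
  "stream": false
}'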

Builds

Windows

Windows builds are currently unsigned. To enable signing, add windows-certs.pfx, set the WINDOWS_CERTS_PASSWORD secret in GitHub Actions, and update the certificateFile and certificatePassword fields in the "win" section of package.json.
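
For reference, the fields mentioned above sit under the "win" key in package.json; this sketch assumes an electron-builder-style layout, and the password value is a placeholder to be supplied from the WINDOWS_CERTS_PASSWORD secret at build time:

"win": {
  "certificateFile": "windows-certs.pfx",
  "certificatePassword": "<injected from WINDOWS_CERTS_PASSWORD>"
}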