OpenOrch

A language-agnostic, distributed backend platform for AI, microservices, and beyond.

Originally developed as a local ChatGPT alternative, OpenOrch quickly evolved into a robust, language-agnostic microservices platform. This evolution was a natural progression, driven by its authors’ career-spanning expertise in microservices and their need for a comprehensive platform to build on.

At its core, OpenOrch serves as a shared backend—but it goes far beyond that. It functions as an orchestrator, reverse proxy, ORM, AI platform, user management tool, and more. By unifying a suite of essential tools, OpenOrch streamlines backend development, enabling you to build and deploy powerful applications with ease.

Whether managing AI models, creating microservices, handling user authentication, or leveraging a wide range of other capabilities, OpenOrch provides a unified, developer-friendly foundation. By eliminating the need to reimplement common functionalities and reducing reliance on complex infrastructure components, OpenOrch simplifies setups and accelerates development—allowing you to focus on building, not managing.

Starting

The easiest way to run OpenOrch is with Docker. Install Docker if you don't have it, then step into the repo root and run:

docker compose up

to run the platform in the foreground. It stops when you press Ctrl+C. If you want to run it in the background:

docker compose up -d
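
When running detached, you can follow the logs and stop the platform again with the standard Docker Compose commands:

docker compose logs -f

docker compose down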

Using

Now that OpenOrch is running, you have a few options for interacting with it.

UI

Go to http://127.0.0.1:3901, log in with the username openorch and password changeme, and start using it just like you would use ChatGPT.

Click on the big "AI" button and download a model first. Don't worry, this model will be persisted across restarts (see volumes in the docker-compose.yaml).

Clients

For brevity, the example below assumes you have already gone to the UI and downloaded a model. (That could also be done with the clients, but it would make the example longer.)

Let's do a synchronous prompt in JS. In your project, run

npm i @openorch/client

Make sure your package.json contains "type": "module", then put the following snippet into index.js:

import { UserSvcApi, PromptSvcApi, Configuration } from "@openorch/client";

async function testDrive() {
  // Log in with the default credentials to get a session token
  let userService = new UserSvcApi();
  let loginResponse = await userService.login({
    request: {
      slug: "openorch",
      password: "changeme",
    },
  });

  // Authenticate the prompt service client with the token from the login response
  const promptSvc = new PromptSvcApi(
    new Configuration({
      apiKey: loginResponse.token?.token,
    })
  );

  // Send a synchronous prompt: the call returns once the model has answered
  let promptRsp = await promptSvc.addPrompt({
    request: {
      sync: true,
      prompt: "Is a cat an animal? Just answer with yes or no please.",
    },
  });

  console.log(promptRsp);
}

testDrive();

and run

$ node index.js
{
  answer: ' Yes, a cat is an animal.\n' +
    '\n' +
    'But if you meant to ask whether cats are domesticated animals or pets, then the answer is also yes. Cats belong to the Felidae family and are common household pets around the world. They are often kept for companionship and to control rodent populations.',
  prompt: undefined
}

Depending on your system, it might take a while for the AI to respond. If it takes long, check the backend logs to see whether it's processing; you should see something like this:

openorch-backend-1   | {"time":"2024-11-27T17:27:14.602762664Z","level":"DEBUG","msg":"LLM is streaming","promptId":"prom_e3SA9bJV5u","responsesPerSecond":1,"totalResponses":1}
openorch-backend-1   | {"time":"2024-11-27T17:27:15.602328634Z","level":"DEBUG","msg":"LLM is streaming","promptId":"prom_e3SA9bJV5u","responsesPerSecond":4,"totalResponses":9}
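
The generated client is a thin wrapper over plain HTTP endpoints, so you can also talk to the daemon directly. Below is a rough sketch of the same prompt done with fetch. The daemon address and the /prompt-svc/prompt path are the ones used by the CLI example in the next section; the /user-svc/login path and the Bearer Authorization header are assumptions, so verify them against the OpenOrch API docs.

// Hypothetical raw-HTTP version of the prompt above (Node 18+ for built-in fetch).
// Assumed, not taken from this README: the /user-svc/login path and the Bearer header.
const addr = "http://127.0.0.1:58231";

// Log in to obtain a token
const loginRsp = await fetch(`${addr}/user-svc/login`, {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ slug: "openorch", password: "changeme" }),
}).then((r) => r.json());

// Send a synchronous prompt with the token attached
const promptRsp = await fetch(`${addr}/prompt-svc/prompt`, {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    Authorization: `Bearer ${loginRsp.token?.token}`,
  },
  body: JSON.stringify({
    sync: true,
    prompt: "Is a cat an animal? Just answer with yes or no please.",
  }),
}).then((r) => r.json());

console.log(promptRsp.answer);

Top-level await works here because package.json already sets "type": "module". For most projects the typed client above is the more convenient option; the raw calls are mainly useful from languages without a generated client.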

CLI

Install oo to get started (at the moment you need Go to install it):

go install github.com/openorch/openorch/cli/oo@latest
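
go install places the oo binary in your Go bin directory; if that directory is not on your PATH yet, add it (assuming a default Go setup):

export PATH="$PATH:$(go env GOPATH)/bin"

Then point oo at the running daemon and log in: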
$ oo env add local http://127.0.0.1:58231

$ oo env ls
ENV NAME   SELECTED   URL                           DESCRIPTION
local      *          http://127.0.0.1:58231
$ oo login openorch changeme

$ oo whoami
slug: openorch
id: usr_e9WSQYiJc9
roles:
- user-svc:admin
$ oo post /prompt-svc/prompt --sync=true --prompt="Is a cat an animal? Just answer with yes or no please."
{
  "prompt": null,
  "answer": " Yes. A cat is an animal.\n\nTable of Contents\n\n## What is considered an animal in science?\n\nIn science, an animal is a multicellular, eukaryotic organism of the kingdom Animalia. Its body plan is characterized by a segmented body and a nervous system with a centralized brain, which coordinates all the actions of the organism’s body. Animals are multicellular organisms that are characterized by having a complex nervous system and sense organs for perceiving their environment. They are also characterized by having a digestive system that breaks down food externally and internally, and by having a circulatory system that transports nutrients and waste products throughout their body.\n\nCats are animals that belong to the phylum Chordata and the class Mammalia. They have a backbone and a notochord, which are characteristics of chordates, and they are mammals because they have mammary glands that produce milk to feed their young. So, a cat is an animal that belongs to the kingdom Animalia and specifically to the phylum Chordata and the class Mammalia."
}

Context

OpenOrch is a microservices platform that started taking shape in 2013 while I was at Hailo, an Uber competitor. The idea stuck with me and kept evolving over the years – including during my time at Micro, a microservices framework company. I assumed someone else would eventually build it, but with the AI boom and the wave of AI apps we’re rolling out, I’ve realized it’s time to build it myself.

Run On Your Servers

See the Running the daemon page to get started.

Services

For articles about the built-in services see the Built-in services page. For comprehensive API docs see the OpenOrch API page.

Run On Your Laptop/PC

We have temporarily discontinued the distribution of the desktop version. Please refer to this page for alternative methods to run the software.

License

OpenOrch is licensed under AGPL-3.0.
