Dagu is a powerful Cron alternative that comes with a Web UI. It allows you to define dependencies between commands as a Directed Acyclic Graph (DAG) in a declarative YAML format. Additionally, Dagu natively supports running Docker containers, making HTTP requests, and executing commands over SSH.
- Single binary file installation
- Declarative YAML format for defining DAGs
- Web UI for visually managing, rerunning, and monitoring pipelines
- Use existing programs without any modification
- Self-contained, with no need for a DBMS
- Highlights
- Contents
- Features
- Usecase
- Web User Interface
- Installation
- Quick Start Guide
- Documentation
- Example Workflow
- Motivation
- Why Not Use an Existing Workflow Scheduler Like Airflow?
- How It Works
- Roadmap
- Contributors
- License
- Support and Community
- Web User Interface
- Command Line Interface (CLI) with several commands for running and managing DAGs
- YAML format for defining DAGs, with support for various features (sketched after this list), including:
- Execution of custom code snippets
- Parameters
- Command substitution
- Conditional logic
- Redirection of stdout and stderr
- Lifecycle hooks
- Repeating tasks
- Automatic retry
- Executors for running different types of tasks:
- Running arbitrary Docker containers
- Making HTTP requests
- Sending emails
- Running the jq command
- Executing remote commands via SSH
- Email notifications
- Scheduling with Cron expressions
- REST API Interface
- Basic Authentication
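As a rough illustration of how these features combine, here is a hypothetical DAG that schedules a daily run, retries a flaky step, gates a step on a condition, and runs one step in a Docker container. The field names (schedule, retryPolicy, preconditions, executor) follow Dagu's YAML format as described in its documentation, while the commands, image, and values are made up for illustration:

```yaml
schedule: "0 1 * * *"              # run every day at 01:00 (cron expression)
steps:
  - name: fetch data
    command: curl -fsSL https://example.com/data.csv -o /tmp/data.csv   # illustrative URL
    retryPolicy:
      limit: 3                     # retry automatically, up to 3 times
      intervalSec: 60
  - name: monthly report
    executor:
      type: docker
      config:
        image: alpine              # run this step in an arbitrary Docker container
        autoRemove: true
    command: echo "building the monthly report"
    depends:
      - fetch data
    preconditions:                 # conditional logic: only run on the 1st of the month
      - condition: "`date '+%d'`"
        expected: "01"
```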
- Data Pipeline Automation: Schedule ETL tasks for data processing and centralization.
- Infrastructure Monitoring: Periodically check infrastructure components with HTTP requests or SSH commands.
- Automated Reporting: Generate and send periodic reports via email.
- Batch Processing: Schedule batch jobs for tasks like data cleansing or model training.
- Task Dependency Management: Manage complex workflows with interdependent tasks.
- Microservices Orchestration: Define and manage dependencies between microservices.
- CI/CD Integration: Automate code deployment, testing, and environment updates.
- Alerting System: Create notifications based on specific triggers or conditions.
- Custom Task Automation: Define and schedule custom tasks using code snippets.
- DAG Details: Shows the real-time status, logs, and DAG configuration. You can edit the DAG configuration in a browser and switch to a vertical graph with the button in the top-right corner.
- DAGs: Shows all DAGs and their real-time status.
- Execution History: Shows past execution results and logs.
- DAG Execution Log: Shows the detailed log and standard output of each execution and step.
You can install Dagu quickly using Homebrew or by downloading the latest binary from the Releases page on GitHub.
brew install yohamta/tap/dagu
Upgrade to the latest version:
brew upgrade yohamta/tap/dagu
Alternatively, install via the installer script:
curl -L https://raw.githubusercontent.com/yohamta/dagu/main/scripts/downloader.sh | bash
Or run the latest image with Docker:
docker run \
  --rm \
  -p 8080:8080 \
  -v $HOME/.dagu/dags:/home/dagu/.dagu/dags \
  -v $HOME/.dagu/data:/home/dagu/.dagu/data \
  -v $HOME/.dagu/logs:/home/dagu/.dagu/logs \
  yohamta/dagu:latest
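If you prefer Docker Compose, the docker run command above can be expressed as a compose file. This is an illustrative sketch, not the project's official compose configuration:

```yaml
# docker-compose.yml (illustrative; mirrors the docker run command above)
services:
  dagu:
    image: yohamta/dagu:latest
    ports:
      - "8080:8080"
    volumes:
      - $HOME/.dagu/dags:/home/dagu/.dagu/dags
      - $HOME/.dagu/data:/home/dagu/.dagu/data
      - $HOME/.dagu/logs:/home/dagu/.dagu/logs
```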
Download the latest binary from the Releases page and place it in your $PATH (e.g. /usr/local/bin).
Start the server with the command dagu server and browse to http://127.0.0.1:8080 to explore the Web UI.
Navigate to the DAG List page by clicking the menu in the left panel of the Web UI. Then create a DAG by clicking the New DAG button at the top of the page. Enter example in the dialog.
Note: DAG (YAML) files will be placed in ~/.dagu/dags by default. See Configuration Options for more details.
Go to the SPEC Tab and hit the Edit button. Copy & Paste the following example and click the Save button.
Example:
steps:
  - name: s1
    command: echo Hello Dagu
  - name: s2
    command: echo done!
    depends:
      - s1
You can execute the example by pressing the Start button. You can see "Hello Dagu" in the log page in the Web UI.
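To experiment further, the same example can be extended with a few more of the YAML features listed above: capturing a step's output into a variable, redirecting stdout to a file, and adding a lifecycle hook. The field names (output, stdout, handlerOn) follow Dagu's YAML format; the paths and messages are illustrative:

```yaml
steps:
  - name: s1
    command: echo Hello Dagu
    output: GREETING                 # capture stdout into the variable GREETING
  - name: s2
    command: echo "$GREETING, done!"
    stdout: /tmp/s2.out              # redirect this step's stdout to a file (illustrative path)
    depends:
      - s1

handlerOn:
  failure:
    command: echo "the DAG failed"   # lifecycle hook that runs if the DAG fails
```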
- Installation Instructions
- Quick Start Guide
- Command Line Interface
- Web User Interface
- YAML Format
- Minimal DAG Definition
- Running Arbitrary Code Snippets
- Defining Environment Variables
- Defining and Using Parameters
- Using Command Substitution
- Adding Conditional Logic
- Setting Environment Variables with Standard Output
- Redirecting Stdout and Stderr
- Adding Lifecycle Hooks
- Repeating a Task at Regular Intervals
- All Available Fields for DAGs
- All Available Fields for Steps
- Example DAGs
- Configurations
- Scheduler
- Docker Compose
This example workflow calls the ChatGPT API and sends the result to your email address.
Create a new DAG in the Web UI and copy-paste the following YAML in the editor.
Replace OPEN_API_KEY, YOUR_EMAIL_ADDRESS, MAILGUN_USERNAME, and MAILGUN_PASSWORD in the env and smtp sections with your own information. Mailgun offers a free tier for testing.
You can then run the DAG.
env:
  - OPENAI_API_KEY: "$OPEN_API_KEY"
  - MY_EMAIL: "$YOUR_EMAIL_ADDRESS"
smtp:
  host: "smtp.mailgun.org"
  port: "587"
  username: "$MAILGUN_USERNAME"
  password: "$MAILGUN_PASSWORD"
params: QUESTION="Can you explain your philosophy of Stoicism?"
steps:
  - name: ask chatgpt
    executor:
      type: http
      config:
        timeout: 180
        headers:
          Authorization: "Bearer $OPENAI_API_KEY"
          Content-Type: "application/json"
        silent: true
        body: |
          { "model": "gpt-3.5-turbo", "messages": [
            {"role": "system", "content": "Respond as a philosopher of the Roman Imperial Period"},
            {"role": "user", "content": "$QUESTION"}
          ]
          }
    command: POST https://api.openai.com/v1/chat/completions
    output: API_RESPONSE
  - name: get result
    executor:
      type: jq
      config:
        raw: true
    command: ".choices[0].message.content"
    script: "$API_RESPONSE"
    output: MESSAGE_CONTENT
    depends:
      - ask chatgpt
  - name: send mail
    executor:
      type: mail
      config:
        to: "$MY_EMAIL"
        from: "$MY_EMAIL"
        subject: "[dagu-auto] philosopher's reply"
        message: |
          <html>
          <body>
          <h1>$QUESTION</h1>
          <p>$MESSAGE_CONTENT</p>
          </body>
          </html>
    depends:
      - get result
You can input the ChatGPT prompt in the Web UI.
Legacy systems often have complex and implicit dependencies between jobs. When there are hundreds of cron jobs on a server, it can be difficult to keep track of these dependencies and to determine which job to rerun if one fails. It can also be a hassle to SSH into a server to view logs and manually rerun shell scripts one by one. Dagu aims to solve these problems by allowing you to explicitly visualize and manage pipeline dependencies as a DAG, and by providing a Web UI for checking dependencies, execution status, and logs, and for rerunning or stopping jobs with a simple mouse click.
There are many existing tools such as Airflow, but many of them require you to write code in a programming language like Python to define your DAG. For systems that have been in operation for a long time, there may already be complex jobs with hundreds of thousands of lines of code written in languages like Perl or Shell Script. Adding another layer of complexity on top of this existing code can reduce maintainability. Dagu was designed to be easy to use, self-contained, and require no coding, making it ideal for small projects.
Dagu is a single command line tool that uses the local file system to store data, so no database management system or cloud service is required. DAGs are defined in a declarative YAML format, and existing programs can be used without modification.
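For instance, a pair of existing legacy scripts can be chained together without modifying them. In the sketch below the script paths are hypothetical; the structure (steps, command, depends) is the same as in the quick start example:

```yaml
steps:
  - name: extract
    command: bash /opt/legacy/extract.sh     # existing shell script, used as-is (hypothetical path)
  - name: transform
    command: perl /opt/legacy/transform.pl   # existing Perl job, also unmodified (hypothetical path)
    depends:
      - extract
```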
- Writing DAGs in the Starlark language
- Writing DAGs in Cue
- AWS Lambda Execution
- Slack Integration
- User Defined Function
- Observability
- Project concept for grouping DAGs
- Simple user management and RBAC
- Keycloak Integration Option
- HA Cluster Mode
- Database Option
- DAG Versioning
- Built-in TLS
Feel free to contribute in any way you want! Share ideas, ask questions, submit issues, and create pull requests. Check out our Contribution Guide for help getting started.
We welcome any and all contributions!
This project is licensed under the GNU GPLv3.
Join our Discord community to ask questions, request features, and share your ideas.