About

Chaos is given, but order must be earned.

o7 is a workflow DSL and local runner for orchestrating AI-assisted engineering.

The problem it solves is specific: you have powerful AI coding tools — Claude Code, Codex, others — but the automation around them is a pile of bash scripts, ad hoc Python, and institutional memory. It works until it doesn't, and when it breaks, nobody knows why.

o7 gives that automation a spine. Workflows are explicit, readable, and repeatable. State is persisted locally. Any step can be paused, inspected, and resumed. The tool you're using to run steps is a pluggable harness — switch from one AI tool to another without rewriting your workflow.
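The harness idea can be sketched as a trait behind which the runner hides every AI tool. This is an illustrative sketch only, assuming a trait-based design; the trait, type, and function names here are invented for illustration and are not o7's actual API.

```rust
// Hypothetical harness interface; names are illustrative, not o7's API.
trait Harness {
    fn name(&self) -> &str;
    fn exec(&self, prompt: &str) -> Result<String, String>;
}

// Stand-ins for adapters around real AI tools.
struct ClaudeCode;
struct Codex;

impl Harness for ClaudeCode {
    fn name(&self) -> &str { "claude-code" }
    fn exec(&self, prompt: &str) -> Result<String, String> {
        Ok(format!("[claude-code] {prompt}"))
    }
}

impl Harness for Codex {
    fn name(&self) -> &str { "codex" }
    fn exec(&self, prompt: &str) -> Result<String, String> {
        Ok(format!("[codex] {prompt}"))
    }
}

// The runner only ever sees the trait, so switching tools means
// passing a different harness, not rewriting the workflow.
fn run_step(h: &dyn Harness, prompt: &str) -> Result<String, String> {
    h.exec(prompt)
}

fn main() {
    let harness: Box<dyn Harness> = Box::new(ClaudeCode);
    let out = run_step(harness.as_ref(), "refactor the parser").unwrap();
    println!("{out}");
}
```

The design choice being illustrated: workflow logic depends on the trait, and only the adapter behind it changes when you swap tools.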

Design decisions

No cloud. State lives on your machine. No accounts, no API keys pointed at our servers, no telemetry.

No magic. Workflows are declared in a minimal DSL. The parser is strict. The runner does exactly what you wrote.

One canonical form. The language has one way to do each thing. run invokes a workflow. exec calls a harness. if branches. match dispatches on a variant. No aliases, no syntactic sugar that hides what's happening.
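Put together, a workflow in this style might read like the sketch below. The four keywords are the ones listed above; everything else — the workflow names, the condition, the variant labels, and the surface syntax itself — is invented for illustration and is not o7's actual grammar.

```
workflow review
  exec claude-code "summarize the diff"
  if tests_failed
    run fix-tests                                  # invoke another workflow
  match verdict
    approve -> exec claude-code "write the approval comment"
    reject  -> run request-changes
```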

Resumability as a first-class feature. Long AI workflows break. Networks time out. Models fail. o7 treats interruption as a normal case, not an error. Every run is a persistent object you can inspect, resume, or roll back.
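One minimal way to make interruption a normal case is to persist a log of completed steps and skip them on the next invocation. The sketch below assumes run state is an append-only log of step names on disk; o7's real persistence format is not specified here, and the function names are invented.

```rust
use std::fs::{self, OpenOptions};
use std::io::Write;

// Read the names of steps already recorded as done, if any.
fn completed_steps(log_path: &str) -> Vec<String> {
    fs::read_to_string(log_path)
        .map(|s| s.lines().map(String::from).collect())
        .unwrap_or_default()
}

// Execute steps in order, skipping any already recorded. If the
// process dies mid-run, calling this again resumes where the log
// left off. Returns the steps executed this invocation.
fn run(log_path: &str, steps: &[&str]) -> Vec<String> {
    let done = completed_steps(log_path);
    let mut log = OpenOptions::new()
        .create(true)
        .append(true)
        .open(log_path)
        .expect("open run log");
    let mut executed = Vec::new();
    for step in steps {
        if done.iter().any(|d| d.as_str() == *step) {
            continue; // finished before the interruption
        }
        // a real runner would call the harness here
        writeln!(log, "{step}").expect("record step");
        executed.push(step.to_string());
    }
    executed
}

fn main() {
    let path = "o7_demo_run.log";
    let _ = fs::remove_file(path);
    let first = run(path, &["plan", "edit"]);
    // pretend the run was interrupted, then resumed
    let resumed = run(path, &["plan", "edit", "test"]);
    println!("first: {first:?}, resumed: {resumed:?}");
    let _ = fs::remove_file(path);
}
```

Because the log survives the process, the resumed invocation executes only what the first one never finished.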

The name

o7 is a salute — the emoticon kind, where the o is the head and the 7 is the raised arm. It felt right for a tool whose job is to report for duty, execute reliably, and get out of the way.

Status

o7 is under active development. The Rust implementation is the production target; the TypeScript prototype was the proving ground. Both are in this repository.