Open source · Free forever

Your terminal.
Now autonomous.

No Python glue. No bloated daemons. No API lock-in.
Pipe data in, get results back — one binary that runs fully local or against any cloud model.

🏠 Fully local with Ollama
🔌 OpenAI / Gemini / vLLM
📋 Plans survive reboots
🔍 Review before every step

Install

$ curl -fsSL https://contenox.com/install.sh | sh

Or build from source

Stop typing. Start piping.

Drop these proven AI workflows into your terminal and turn hours of manual work into seconds.

Automate release notes

$ git log --oneline | contenox run release-notes
## v0.9.1 — Hook policies, SQLite fixes, CLI cleanup
Full recipe

Generate contextual commits

$ git diff --staged | contenox run commit-msg
feat(hooks): add chain-scoped security policies
Full recipe

Autonomous project scaffolding

$ contenox plan new "scaffold a Go HTTP server with tests" && contenox plan next --auto
Step 1/5: Create project structure... ✓
Full recipe

How it works

Four steps. No surprises.

1. Describe — Write a goal in plain English, or pipe stdin directly.

2. Plan — Contenox turns it into a persistent, step-by-step plan saved to SQLite.

3. Execute — Each step runs real shell commands on your machine, one at a time.

4. Review — Approve sensitive steps, run fully auto in CI, or watch live in the TUI.
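Concretely, the scaffolding example above exercises all four steps in two commands (the goal string is just an example — any plain-English goal works):

```
$ contenox plan new "scaffold a Go HTTP server with tests"   # Describe → Plan, persisted to SQLite
$ contenox plan next --auto                                  # Execute each step; Review as it prints
```

Because the plan lives in SQLite rather than in memory, you can stop after any step and pick up where you left off — even after a reboot.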

Give your agent superpowers

MCP is USB-C for AI — one standard, unlimited tools.
Connect Notion, Postgres, GitHub, or any OpenAPI service in one command.

Universal MCP Client

Plug into Notion, GitHub, Postgres, and the entire Model Context Protocol ecosystem. One standard wire format — unlimited tools.

Bring your own API

Point Contenox at any OpenAPI v3 spec and the model gains those endpoints as tools automatically. No SDK. No wrapper code.
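For example, a minimal OpenAPI v3 document like the one below is all it takes — this is a generic illustration (hypothetical service and URL, not tied to Contenox), and the endpoints it declares are what the model gains as tools:

```yaml
openapi: "3.0.3"
info:
  title: Widget API          # hypothetical service, for illustration only
  version: "1.0.0"
servers:
  - url: https://api.example.com/v1
paths:
  /widgets:
    get:
      operationId: listWidgets   # the operation the model can now invoke
      summary: List all widgets
      responses:
        "200":
          description: A JSON array of widgets
```

Any spec your existing services already publish works the same way — no extra annotation required.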