Built for engineers who ship
Everything in the CLI is open source and free. No hidden limits.
contenox plan — Persistent AI plans
Describe a goal in plain English. Contenox turns it into a multi-step plan stored in SQLite — plans survive reboots, you can run them one step at a time, or auto-execute the whole plan with --auto.
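A minimal session sketch of the flow above. Only `contenox plan` and `--auto` appear in the text; the example goal and the flag's exact placement are assumptions:

```shell
# Store a multi-step plan in SQLite (it survives reboots).
contenox plan "add retry logic to the HTTP client and cover it with tests"

# Or auto-execute the plan without pausing between steps
# (flag from the text; its position here is an assumption).
contenox plan --auto "add retry logic to the HTTP client and cover it with tests"
```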
Quickstart
contenox vibe — Full-screen TUI
A Bubble Tea TUI that lets you watch autonomous plan execution live, approve sensitive steps interactively, and chat directly with the model — without leaving your terminal.
Learn more
contenox run — Stateless pipe-friendly execution
Feed stdin into a chain and get output back. Works with grep, awk, jq, and any shell script. Perfect for CI hooks, git aliases, and batch automation.
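The pipe pattern can be sketched as below. The stdin-to-stdout behavior is what the text describes; the chain names (`summarize`, `triage`) and the positional-argument form are assumptions:

```shell
# Summarize recent commits; "summarize" is a hypothetical chain name.
git log --oneline -20 | contenox run summarize

# Compose with jq like any other filter; "triage" is also hypothetical.
jq -r '.[].title' issues.json | contenox run triage | tee triage.txt
```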
Git & DevOps cookbook
Chains — Workflows as JSON
Define tasks, handlers, branching logic, and tool hooks as JSON files you own and version. No proprietary DSL. No lock-in.
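A sketch of what such a file might contain. The field names below are illustrative assumptions, not the documented schema — the chain reference is authoritative:

```json
{
  "tasks": [
    { "id": "lint",      "handler": "shell", "command": "golangci-lint run ./..." },
    { "id": "summarize", "handler": "model", "prompt": "Summarize the lint findings" }
  ],
  "branches": [
    { "when": "lint.exit_code != 0", "goto": "summarize" }
  ]
}
```

Because the file is plain JSON you own, it diffs, reviews, and versions like any other artifact in your repo.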
Chain reference
Remote hooks — Connect any HTTP API
Point Contenox at any OpenAPI service and the model gains those tools automatically. The hook feature is open source — bring your own, or use our hosted services (coming soon).
Remote hooks docs
MCP client — Model Context Protocol
Session-scoped connections to any MCP server — filesystem, databases, GitHub, internal tools. State survives across tool calls and plan steps.
MCP integration
Your models, your infra
Ollama, OpenAI, Gemini, vLLM — swap providers per task. Fully offline with Ollama. No lock-in. No cloud dependency.
Configure backends
Chain-scoped security policies
Each chain defines its own allow/deny lists for shell commands. Fine-grained security without global flags — safe by default, powerful when unlocked.
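Sketched as part of a chain file. The per-chain allow/deny idea is from the text; the `security` key and its shape are assumptions:

```json
{
  "security": {
    "allow": ["git status", "git diff", "go test"],
    "deny":  ["rm", "curl"]
  }
}
```

Scoping the policy to the chain means a read-only review chain and a deploy chain can carry different privileges without any global configuration.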
Local hooks & policy