# Configuration
Contenox stores all configuration in SQLite (`.contenox/local.db` in the project, or `~/.contenox/local.db` globally). There is no YAML file; backends and defaults are managed entirely through CLI commands.
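Because the store is an ordinary SQLite file, it can be inspected with standard tooling. The snippet below is only a sketch: the `settings` table and its columns are invented stand-ins, since the actual schema is not documented here.

```bash
# Create a throwaway database shaped like a plausible key-value settings
# store (the real local.db schema may differ entirely).
db=$(mktemp -d)/local.db
sqlite3 "$db" 'CREATE TABLE settings (key TEXT PRIMARY KEY, value TEXT);'
sqlite3 "$db" "INSERT INTO settings VALUES ('default-model', 'qwen2.5:7b');"
sqlite3 "$db" "SELECT value FROM settings WHERE key = 'default-model';"
```

The same `sqlite3` CLI works read-only against a real `local.db` if you want to see what contenox actually persisted.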
## Register a backend

```bash
# Local Ollama (base URL inferred automatically)
contenox backend add local --type ollama

# Ollama Cloud
contenox backend add ollama-cloud --type ollama --url https://ollama.com/api --api-key-env OLLAMA_API_KEY

# OpenAI (base URL inferred)
contenox backend add openai --type openai --api-key-env OPENAI_API_KEY

# Google Gemini
contenox backend add gemini --type gemini --api-key-env GEMINI_API_KEY

# Self-hosted vLLM or compatible endpoint
contenox backend add myvllm --type vllm --url http://gpu-host:8000
```
## Set persistent defaults

```bash
contenox config set default-model qwen2.5:7b
contenox config set default-provider ollama
contenox config set default-chain .contenox/default-chain.json
contenox config list   # review current settings
```
## Manage backends

```bash
contenox backend list
contenox backend show openai
contenox backend remove myvllm
```
## Supported providers

| `--type` | Notes |
|---|---|
| `ollama` | Local: run `ollama serve` first. Hosted: use `--url https://ollama.com/api --api-key-env OLLAMA_API_KEY`. |
| `openai` | Use `--api-key-env OPENAI_API_KEY`. Base URL inferred. |
| `gemini` | Use `--api-key-env GEMINI_API_KEY`. Base URL inferred. |
| `vllm` | Self-hosted OpenAI-compatible endpoint. Requires `--url`. |
## Database location

Contenox resolves the database path in this order:

1. `--db <path>` flag
2. `--data-dir <path>` — if set, uses `<path>/local.db`
3. `.contenox/local.db` found by walking up from the current working directory
4. `~/.contenox/local.db` (global fallback)
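The walk-up lookup can be sketched in plain shell. This is an illustration of the documented order, not the actual implementation: `resolve_db` is a hypothetical helper name.

```bash
# Sketch: prefer a .contenox/local.db found by walking up from the given
# directory, falling back to the global ~/.contenox/local.db.
resolve_db() {
  dir=$1
  while [ -n "$dir" ] && [ "$dir" != "/" ]; do
    if [ -f "$dir/.contenox/local.db" ]; then
      echo "$dir/.contenox/local.db"
      return 0
    fi
    dir=${dir%/*}   # strip one path component, i.e. move up a directory
  done
  echo "$HOME/.contenox/local.db"
}

# Demo: a project tree with .contenox/ at its root.
root=$(mktemp -d)
mkdir -p "$root/.contenox" "$root/sub/dir"
touch "$root/.contenox/local.db"
resolve_db "$root/sub/dir"   # prints $root/.contenox/local.db
```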
The `--data-dir` flag also controls where chain files, plans, and VFS data are stored. This is useful for running isolated instances (e.g. integration tests) without affecting your real `.contenox/` directory.
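For example, a test script could point every command at a throwaway directory. The commands here combine flags documented above; the cleanup step is an assumption about how you would tear the instance down.

```bash
# Keep all state in a throwaway directory instead of the real .contenox/.
tmp=$(mktemp -d)

# local.db, chain files, plans, and VFS data all land under $tmp.
contenox --data-dir "$tmp" backend add local --type ollama
contenox --data-dir "$tmp" config set default-model qwen2.5:7b
contenox --data-dir "$tmp" config list

# Discard the instance; ~/.contenox/local.db is untouched.
rm -rf "$tmp"
```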