# Configuration
Contenox stores all configuration in SQLite (`.contenox/local.db` in the project, or `~/.contenox/local.db` globally).
There is no YAML file — register backends and set defaults using CLI commands.
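Because the configuration lives in a plain SQLite file, you can inspect it with the stock `sqlite3` CLI. The table layout is whatever contenox creates, so the snippet below only demonstrates the mechanics on a throwaway file — point `sqlite3` at your real `~/.contenox/local.db` instead:

```bash
# Demonstrated on a throwaway database; substitute your real
# ~/.contenox/local.db (or the project-local .contenox/local.db).
db=$(mktemp)
sqlite3 "$db" "CREATE TABLE example(k TEXT, v TEXT);"
# Open read-only so inspection can never corrupt the config.
tables=$(sqlite3 -readonly "$db" ".tables")
echo "$tables"
rm -f "$db"
```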
## Register a backend
```bash
# Local Ollama (base URL inferred automatically)
contenox backend add local --type ollama

# Ollama Cloud
contenox backend add ollama-cloud --type ollama --url https://ollama.com/api --api-key-env OLLAMA_API_KEY

# OpenAI (base URL inferred)
contenox backend add openai --type openai --api-key-env OPENAI_API_KEY

# Google Gemini
contenox backend add gemini --type gemini --api-key-env GEMINI_API_KEY

# Self-hosted vLLM or compatible endpoint
contenox backend add myvllm --type vllm --url http://gpu-host:8000

# Vertex AI: --url is required (include project and region)
# Option A: service account JSON (works everywhere, including the workspace)
export VERTEX_SA_JSON=$(cat /path/to/service-account.json)
contenox backend add vertex --type vertex-google \
  --url "https://us-central1-aiplatform.googleapis.com/v1/projects/YOUR_PROJECT_ID/locations/us-central1" \
  --api-key-env VERTEX_SA_JSON

# Option B: Application Default Credentials (CLI only)
# Both gcloud steps are required; set-quota-project is not optional
gcloud auth application-default login
gcloud auth application-default set-quota-project YOUR_PROJECT_ID
contenox backend add vertex --type vertex-google \
  --url "https://us-central1-aiplatform.googleapis.com/v1/projects/YOUR_PROJECT_ID/locations/us-central1"
```
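The `--api-key-env` flag names an environment variable rather than taking the secret directly, so that variable must actually be set in the shell where contenox runs. A small hypothetical pre-flight guard (not a contenox feature) can fail fast when it is missing:

```bash
# Hypothetical guard: verify the variable named by --api-key-env is
# non-empty before registering a backend. POSIX sh has no ${!var},
# so the lookup is done with eval.
require_env() {
  eval "val=\${$1:-}"
  if [ -z "$val" ]; then
    echo "error: \$$1 is not set" >&2
    return 1
  fi
}

# Usage (illustrative):
#   require_env OPENAI_API_KEY && \
#     contenox backend add openai --type openai --api-key-env OPENAI_API_KEY
```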
## Set persistent defaults
```bash
contenox config set default-model qwen2.5:7b
contenox config set default-provider ollama
contenox config set default-chain .contenox/default-chain.json
contenox config set hitl-policy-name hitl-policy-strict.json
contenox config list   # review current settings
```
| Key | Description |
|---|---|
| `default-model` | Model name used when `--model` is not passed |
| `default-provider` | Provider type used when `--provider` is not passed |
| `default-chain` | Path to the default chain file |
| `hitl-policy-name` | Active HITL policy file name (e.g. `hitl-policy-strict.json`). Empty = `hitl-policy-default.json`. |
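The table implies a simple precedence: an explicit flag beats the stored default. A minimal sketch of that lookup, with the stored default modeled as a shell variable (contenox reads it from SQLite instead):

```bash
# Sketch of flag-vs-default precedence for --model. Illustrative only;
# the real lookup happens inside contenox.
pick_model() {
  default_model="qwen2.5:7b"   # what `contenox config set default-model` stored
  if [ -n "$1" ]; then
    echo "$1"                  # explicit --model flag wins
  else
    echo "$default_model"      # fall back to the persisted default
  fi
}
```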
## Manage backends
```bash
contenox backend list
contenox backend show openai
contenox backend remove myvllm
```
## Supported providers
| `--type` | Notes |
|---|---|
| `ollama` | Local: run `ollama serve` first. Hosted: use `--url https://ollama.com/api --api-key-env OLLAMA_API_KEY`. |
| `openai` | Use `--api-key-env OPENAI_API_KEY`. Base URL inferred. |
| `gemini` | Use `--api-key-env GEMINI_API_KEY`. Base URL inferred. |
| `vllm` | Self-hosted OpenAI-compatible endpoint. Requires `--url`. |
| `vertex-google` | Vertex AI, Google models (Gemini). Requires `--url` with project and region. Auth: service account JSON via `--api-key-env`, or ADC (no flag needed if `gcloud auth application-default login` is configured). |
| `vertex-anthropic` | Vertex AI, Anthropic models (Claude). Same auth and URL pattern as `vertex-google`. |
| `vertex-meta` | Vertex AI, Meta models (Llama). Same auth and URL pattern as `vertex-google`. |
| `vertex-mistralai` | Vertex AI, Mistral models. Same auth and URL pattern as `vertex-google`. |
## Database location
Contenox resolves the database path in this order:

1. `--db <path>` flag
2. `--data-dir <path>` — if set, uses `<path>/local.db`
3. `.contenox/local.db` found by walking up from the current working directory
4. `~/.contenox/local.db` (global fallback)
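The lookup order can be sketched as a shell function. This only mirrors the documented behavior — contenox implements it internally, and the argument names here are illustrative:

```bash
# Sketch of the database-path resolution order. $1 models --db,
# $2 models --data-dir; both may be empty.
resolve_db() {
  # 1. An explicit --db flag wins.
  if [ -n "$1" ]; then echo "$1"; return; fi
  # 2. --data-dir implies <data-dir>/local.db.
  if [ -n "$2" ]; then echo "$2/local.db"; return; fi
  # 3. Walk up from the current directory looking for .contenox/local.db.
  dir=$(pwd)
  while [ "$dir" != "/" ]; do
    if [ -f "$dir/.contenox/local.db" ]; then
      echo "$dir/.contenox/local.db"; return
    fi
    dir=$(dirname "$dir")
  done
  # 4. Global fallback.
  echo "$HOME/.contenox/local.db"
}
```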
The `--data-dir` flag also controls where chain files, plans, and VFS data are stored. This is useful for running isolated instances (e.g. integration tests) without affecting your real `.contenox/` directory.