Remote Tools
Remote tools turn any external HTTP service into a set of callable tools for your AI agent. Contenox fetches the service's OpenAPI v3 spec, discovers every operation, and makes them available to the model as named tools — no client code required.
Register a tool
```shell
contenox tools add <name> --url <endpoint>

# Example: US National Weather Service (free, public API)
contenox tools add nws --url https://api.weather.gov --timeout 15000

# Example: internal API with auth and hidden tenant context
contenox tools add myapi --url https://api.example.com \
  --header "Authorization: Bearer $MY_TOKEN" \
  --inject "tenant_id=acme" \
  --inject "env=production"
```
Contenox probes the endpoint at registration time to count available tools. If the service is unreachable at that moment, it is still registered and re-probed at chain execution time.
Flags
| Flag | Description |
|---|---|
| `--url` | Base URL of the service (required) |
| `--header` | HTTP header to inject on every call, e.g. `"Authorization: Bearer $TOKEN"` (repeatable) |
| `--inject` | Tool call argument to inject and hide from the model, e.g. `"tenant_id=acme"` (repeatable) |
| `--timeout` | Request timeout in milliseconds (default: 10000) |
Inspect tools
```shell
contenox tools show nws
```
Lists the tool's URL, timeout, registered headers (keys only — values are never shown), injected params (keys only — values hidden), and all tools discovered from its OpenAPI spec.
Manage tools
```shell
contenox tools list                                      # show all registered tools
contenox tools update nws --timeout 30000                # update timeout
contenox tools update nws --header "X-App: v2"           # replace ALL headers
contenox tools update nws --inject "tenant_id=newvalue"  # replace ALL inject params
contenox tools remove nws
```
Important
`tools update --header` replaces the entire header set for the tool. `tools update --inject` replaces the entire inject param map. Pass all required values in a single update call.
Authentication and secret injection
Pass authentication headers at registration time:
```shell
contenox tools add myapi --url https://api.example.com \
  --header "Authorization: Bearer $MY_TOKEN" \
  --header "X-Tenant: acme"
```
These headers are stored in SQLite and injected transparently into every HTTP call made to that service. The model never sees them — they are stripped from the tool schema before it reaches the LLM.
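The effect can be sketched in a few lines of Python using only the standard library (illustrative only — `call_tool` is not a Contenox API; the URL and header values are placeholders):

```python
import urllib.request

def call_tool(base_url: str, path: str, stored_headers: dict) -> urllib.request.Request:
    """Build an outgoing request with the service's stored headers injected.

    The model only produces the path and arguments; the stored headers are
    attached on the way out and never appear in the tool schema it sees.
    """
    req = urllib.request.Request(base_url + path)
    for name, value in stored_headers.items():
        req.add_header(name, value)
    return req

# Stored at registration time, applied to every call:
req = call_tool("https://api.example.com", "/v1/items",
                {"Authorization": "Bearer secret", "X-Tenant": "acme"})
```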
Injecting tool call arguments (hidden from model)
Beyond HTTP headers, you can also inject named parameters directly into every tool call — completely hidden from the model's tool schema:
```shell
contenox tools add myapi --url https://api.example.com \
  --inject "tenant_id=acme" \
  --inject "correlation_id=trace-123"
```
Specifically, the engine:
- Removes injected parameter names from the tool manifest the model sees (`properties` + `required`)
- Merges them back into every tool call after the model-provided args (injected values always win)
This is the right pattern for: tenant IDs, correlation/trace IDs, session context, environment tags, and any other infrastructure concern that the model shouldn't reason about.
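The two steps above amount to filtering the schema and then overlaying a dict — a minimal Python sketch (function names and the example schema are illustrative, not Contenox internals):

```python
def strip_injected(schema: dict, injected: dict) -> dict:
    """Remove injected parameter names from the schema the model sees."""
    props = {k: v for k, v in schema.get("properties", {}).items() if k not in injected}
    required = [r for r in schema.get("required", []) if r not in injected]
    return {**schema, "properties": props, "required": required}

def merge_args(model_args: dict, injected: dict) -> dict:
    """Merge injected params into a tool call; injected values always win."""
    return {**model_args, **injected}

schema = {"properties": {"query": {"type": "string"}, "tenant_id": {"type": "string"}},
          "required": ["query", "tenant_id"]}
injected = {"tenant_id": "acme"}

visible = strip_injected(schema, injected)  # model never learns tenant_id exists
call = merge_args({"query": "alerts", "tenant_id": "evil"}, injected)  # override loses
```

Because the injected dict is spread last, even a model that guesses the parameter name cannot override the configured value.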
Tool naming
Contenox derives a tool name for each API operation in this priority order:
1. `operationId` from the OpenAPI spec (recommended)
2. `x-tool-name` extension on the operation
3. Fallback: `<last_path_segment>_<method>` (e.g. `alerts_get`)

For the best experience, set `operationId` on every operation in your OpenAPI spec.
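The fallback rule is simple enough to sketch — a hypothetical Python helper, not Contenox source:

```python
def fallback_tool_name(path: str, method: str) -> str:
    """Derive <last_path_segment>_<method> when no operationId is set."""
    segment = path.rstrip("/").split("/")[-1]
    return f"{segment}_{method.lower()}"

print(fallback_tool_name("/alerts", "GET"))  # alerts_get
```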
Excluded paths
The following paths are automatically excluded from tool discovery:
- `/health`, `/healthz` — health checks
- `/ready`, `/readyz` — readiness probes
- `/metrics` — Prometheus metrics
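Discovery simply skips these paths — a sketch of the filter (the exclusion set is from the list above; the function itself is illustrative):

```python
EXCLUDED_PATHS = {"/health", "/healthz", "/ready", "/readyz", "/metrics"}

def discoverable(paths: list[str]) -> list[str]:
    """Keep only paths that should become tools."""
    return [p for p in paths if p not in EXCLUDED_PATHS]

print(discoverable(["/alerts", "/health", "/metrics"]))  # ['/alerts']
```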
Use in a chain
Add the tool's name to `execute_config.tools`:
```json
{
  "id": "weather_task",
  "handler": "chat_completion",
  "system_instruction": "You are a weather assistant. Available tools: {{toolsservice:list}}.",
  "execute_config": {
    "model": "qwen2.5:7b",
    "provider": "ollama",
    "tools": ["nws"]
  },
  "transition": {
    "branches": [
      { "operator": "equals", "when": "tool-call", "goto": "run_tools" },
      { "operator": "default", "when": "", "goto": "end" }
    ]
  }
},
{
  "id": "run_tools",
  "handler": "execute_tool_calls",
  "input_var": "weather_task",
  "transition": {
    "branches": [
      { "operator": "default", "when": "", "goto": "weather_task" }
    ]
  }
}
```
Building your own tools with FastAPI
FastAPI serves an /openapi.json spec automatically, making it a perfect fit. Every endpoint becomes a tool the moment you register the service.
```python
from fastapi import FastAPI

app = FastAPI()

@app.get("/summarize", operation_id="summarize_text")
def summarize(text: str) -> dict:
    """Return a short summary of the provided text."""
    return {"summary": text[:100] + "..."}
```

```shell
# Start the service
uvicorn main:app --port 8080

# Register it
contenox tools add myapp --url http://localhost:8080
contenox tools show myapp  # → 1 tool: summarize_text
```
The model can now call `summarize_text` directly from any chain that includes `myapp` in its tools list.