Build AI Agents & Intelligent Workflows with Low Code
The contenox platform empowers enterprises to build semantic search systems, contextual chat interactions, and LLM-driven automation — all based on your data.
Core Capabilities
Semantic Search
Search across your documents using natural language queries powered by embeddings and vector databases.
Contextual Chat
Engage in dynamic conversations with AI personas created from your internal knowledge base.
LLM Workflows
Automate complex processes using modular chains of prompts, reasoning, and external hooks.
Open Core
We believe in transparency and extensibility. Our core platform is open-source under Apache 2.0.
View on GitHub
Why contenox?
Self-Hosted Infrastructure
Maintain full control over your data and operations with local deployment options.
Bring-Your-Own-Models Flexibility
Use your own models or integrate with supported LLM providers seamlessly.
Modular Architecture
Scale effortlessly with plug-and-play components and microservices.
European-First Compliance Focus
Designed with GDPR, AI Act, and enterprise compliance standards in mind.
Example Task Templates
In contenox, task templates define how AI agents process input, make decisions, and interact with systems.
These templates can chain prompts, hooks, and conditional logic to build chatbots, automation flows, or contextual assistants.
Input → Mutations → Decisions → Actions → Output
The Output then becomes the Input for the next step, or is returned as the final response.
The platform handles the necessary plumbing: type safety between steps, plus cross-cutting context such as user sessions, access control, and observability.
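The step loop above can be sketched in a few lines. This is an illustrative model only, not the contenox engine: each task mutates the value and then decides which task runs next.

```python
# Minimal sketch of a task-chain loop: Input -> Mutations -> Decisions -> Output,
# where each step's output feeds the next step. Hypothetical structure for
# illustration; the real platform adds type checks, sessions, and observability.
def run_chain(tasks, value):
    """Run tasks in order; each task's output becomes the next task's input."""
    index = {t["id"]: t for t in tasks}
    task_id = tasks[0]["id"]
    while task_id is not None:
        task = index[task_id]
        value = task["run"](value)      # Mutations / Actions
        task_id = task["next"](value)   # Decisions: pick the next step (or stop)
    return value

chain = [
    {"id": "upper", "run": str.upper, "next": lambda v: "exclaim"},
    {"id": "exclaim", "run": lambda v: v + "!", "next": lambda v: None},
]
print(run_chain(chain, "hello"))  # HELLO!
```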
💬 AI Chat
- id: load_chat_history
  type: hook
  description: "Loads previous chat history for the current session"
  hook:
    name: get_user_chat_history
  transition:
    next:
      - id: append_user_input
- id: append_user_input
  type: hook
  description: "Adds latest input to chat history"
  hook:
    name: append_to_chat
    args:
      chat: "{{ input }}"  # From previous task (loaded chat history)
      new_input: "{{ user_input }}"
  transition:
    next:
      - id: run_chat_model
- id: run_chat_model
  type: string
  description: "Runs LLM with full chat context"
  template: |
    You are an assistant. Here's the conversation so far:
    {{ input }}
    Assistant:
  transition:
    next:
      - id: append_model_response
- id: append_model_response
  type: hook
  description: "Adds model response to chat history"
  hook:
    name: append_to_chat
    args:
      chat: "{{ input }}"  # From model output
      new_input: "{{ output }}"
  transition:
    next:
      - id: save_chat_history
- id: save_chat_history
  type: hook
  description: "Saves updated chat back to storage"
  hook:
    name: save_user_chat_history
    args:
      chat: "{{ input }}"  # Updated chat from last step
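To make the hook semantics concrete, here is a guess at what an `append_to_chat`-style hook might do internally. The function name matches the template above, but the implementation and message shape are assumptions; the real hook lives inside the platform.

```python
# Hypothetical sketch of an append_to_chat-style hook: it takes the running
# transcript plus a new message and returns the extended transcript.
def append_to_chat(chat, new_input, role="user"):
    """Return the chat transcript with a new message appended."""
    return chat + [{"role": role, "content": new_input}]

history = []
history = append_to_chat(history, "What's our refund policy?")
history = append_to_chat(history, "Refunds are processed within 14 days.",
                         role="assistant")
print(len(history))  # 2
```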
🎯 Intent Detection with Routing
- id: detect_intent
  type: string
  template: |
    What is the intent of this message?
    "{{ input }}"
    Possible intents: booking_flight, checking_weather, other
  transition:
    next:
      - value: "booking_flight"
        id: handle_flight_booking
      - value: "checking_weather"
        id: handle_weather_check
      - id: default_response
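The value-based transition above amounts to a lookup: the model's output string selects the next task, with a fallback when nothing matches. A minimal sketch of that routing logic, under the assumption that unmatched intents fall through to the last branch:

```python
# Sketch of value-based routing: the detected intent string picks the next
# task id, falling back to a default branch for unrecognized intents.
def route(intent, branches, default):
    return branches.get(intent, default)

branches = {
    "booking_flight": "handle_flight_booking",
    "checking_weather": "handle_weather_check",
}
print(route("booking_flight", branches, "default_response"))  # handle_flight_booking
print(route("telling_jokes", branches, "default_response"))   # default_response
```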