LLM Endpoint Configuration
Dynamixs.AI connects to language models through a central endpoint registry file: imixs-llm.xml. This file defines all LLM endpoints available to the platform — for chat completions, AI conditions, document analysis, and RAG embeddings.
The registry file is mounted into the container at startup and does not require a rebuild when endpoints change.
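How the file reaches the container depends on your deployment. A minimal docker-compose sketch might look like the following; the service name and the path inside the container are assumptions, not fixed by the platform:

```yaml
# Hypothetical compose fragment: mount the endpoint registry
# read-only into the container (container path is an assumption)
services:
  dynamixs-ai:
    volumes:
      - ./keys/imixs-llm.xml:/opt/dynamixs/imixs-llm.xml:ro
```

Because the file is read at startup, updating endpoints only requires editing the file and restarting the container, not rebuilding the image.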
The Endpoint Registry File
Create a file named imixs-llm.xml and place it in a location accessible to the Docker container (e.g. ./keys/imixs-llm.xml).
<?xml version="1.0" encoding="UTF-8"?>
<imixs-llm>
    <!--
        Completion endpoint — used for chat completions, conditions,
        and document analysis. Connects to a local llama.cpp server
        or any OpenAI-compatible API.
    -->
    <endpoint id="my-llm">
        <url>http://localhost:8080/</url>
        <apikey>${env.LLM_API_KEY}</apikey>
        <options>
            <temperature>0.2</temperature>
            <max_tokens>1024</max_tokens>
        </options>
    </endpoint>

    <!--
        Embedding endpoint — used for RAG indexing and retrieval.
        Can be hosted separately; no API key needed for local instances.
    -->
    <endpoint id="my-embeddings">
        <url>http://localhost:8081/</url>
        <options>
            <max_tokens>512</max_tokens>
        </options>
    </endpoint>
</imixs-llm>
Each <endpoint> is identified by its id attribute. This id is referenced in your BPMN process configuration to select the correct endpoint for each task.
Referencing Endpoints in BPMN Models
In your BPMN Workflow Result configuration, use the <endpoint> tag to reference an endpoint by its id:
<!-- Standard completion task -->
<imixs-ai name="PROMPT">
    <endpoint>my-llm</endpoint>
</imixs-ai>

<!-- RAG task using separate completion and embedding endpoints -->
<imixs-ai name="RAG_INDEX">
    <endpoint-completion>my-llm</endpoint-completion>
    <endpoint-embeddings>my-embeddings</endpoint-embeddings>
</imixs-ai>
Environment Variable Substitution
Sensitive values such as API keys should not be hardcoded in imixs-llm.xml. Use environment variable placeholders instead; they are resolved at runtime. The same syntax applies to other values, for example supplying the endpoint URL via ${env.LLM_API_ENDPOINT}:
<apikey>${env.LLM_API_KEY}</apikey>
Define the corresponding variables in your Docker environment file (e.g. ./docker/.env):
LLM_API_ENDPOINT=https://api.your-llm-provider.com/
LLM_API_KEY=your-api-key-here
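For these variables to be visible inside the container, they have to be passed through by your deployment. With docker-compose, one common way is an env_file entry; this fragment is illustrative and the service name is an assumption:

```yaml
# Hypothetical compose fragment: load ./docker/.env so that
# placeholders like ${env.LLM_API_KEY} can resolve at runtime
services:
  dynamixs-ai:
    env_file:
      - ./docker/.env
```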
Security: API Token vs. Basic Authentication
API Token (recommended)
Set the <apikey> element in the endpoint definition. Dynamixs.AI will automatically use Bearer Token authentication for all requests to that endpoint.
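To illustrate what Bearer token authentication means on the wire, the sketch below prints the shape of the resulting HTTP header. The token value is a placeholder, and Dynamixs.AI constructs this header itself; you never build it by hand:

```shell
# Illustrative only: this is the header added to each request
# when <apikey> is set (token value is an example, not a real key)
LLM_API_KEY="sk-example-token"
printf 'Authorization: Bearer %s\n' "$LLM_API_KEY"
# prints: Authorization: Bearer sk-example-token
```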
Basic Authentication
If your LLM endpoint requires Basic authentication instead, use the <username> and <password> elements:
<endpoint id="my-llm">
    <url>http://localhost:8080/</url>
    <username>${env.LLM_USER}</username>
    <password>${env.LLM_PASSWORD}</password>
</endpoint>
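For reference, Basic authentication transmits base64(username:password) in the Authorization header. A quick sketch with placeholder credentials shows the encoding; the platform derives this header from <username> and <password> automatically:

```shell
# Illustrative only: Basic auth encodes "user:password" as base64
# (credentials here are placeholders, not real values)
printf '%s' 'user:pass' | base64
# prints: dXNlcjpwYXNz   (sent as: Authorization: Basic dXNlcjpwYXNz)
```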