Create a new directory and save this file as docker-compose.yml:
```yaml
services:
  ebs-assistant:
    image: interpretos/local:latest
    ports:
      - "${CDDI_PORT:-8080}:5000"
    volumes:
      - ebs-data:/app/data
    environment:
      - CDDI_INTEGRATION=oracle_ebs
      - CDDI_TELEMETRY=false
      - CDDI_LLM_PROXY_URL=https://llm.interpretos.com/v1
    restart: unless-stopped

volumes:
  ebs-data:
```
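A minimal shell sketch of that first step, assuming a project directory named `ebs-assistant` (the name is arbitrary):

```bash
# Create a working directory for the stack
mkdir ebs-assistant && cd ebs-assistant

# Save the YAML above as docker-compose.yml (any editor works)
${EDITOR:-nano} docker-compose.yml
```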
Default port: 8080
Change the port with `CDDI_PORT=9090 docker compose up -d`
Docker pulls the image and starts the assistant. This takes 1–2 minutes on first run.
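For reference, the full launch sequence uses standard Docker Compose commands (the `ebs-assistant` service name comes from the compose file above):

```bash
# Start the assistant in the background on the default port (8080)
docker compose up -d

# Or pick a different host port for this launch
CDDI_PORT=9090 docker compose up -d

# Follow startup logs until the web UI is ready
docker compose logs -f ebs-assistant

# Confirm the container is up and the port mapping is in place
docker compose ps
```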
Open your browser to http://localhost:8080 (or whatever port you set with CDDI_PORT).
The setup wizard walks you through:
- Admin account — Username, password, EULA acceptance
- AI provider — Interpretos Cloud (free, default) or bring your own key
- Telemetry — Opt in or out (disabled by default)
- Database connection — SSH host + credentials, or direct Oracle connection
- Review & complete
No API key needed
Interpretos Cloud is selected by default — 100 free queries per day, no credit card. Your questions go through our LLM proxy; no EBS data is stored.
Once the wizard completes, you land on the chat UI. Try:
- What is the status of PO 1001?
- Show me recent AP invoices
- What inventory items are on hand in org 204?
- How many open purchase orders do we have?
The assistant connects to your EBS database and returns live data. Typical response time: 9–15 seconds.
For fully air-gapped operation with zero external network calls:
- Run a local LLM (e.g. Ollama, vLLM, or any OpenAI-compatible server)
- During setup, select Custom / Self-hosted and enter your local endpoint (e.g. http://your-server:11434/v1); a minimal Ollama sketch follows this list
- Set `CDDI_TELEMETRY=false` in your docker-compose.yml (already the default)
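The following is a minimal sketch of the self-hosted option using Ollama, whose OpenAI-compatible API listens on port 11434 by default. The model name (llama3.1) and the assumption that Ollama is not already running as a system service are illustrative; adapt them to whatever server and model you actually use:

```bash
# Start the Ollama API server if it is not already running as a service
# (serves an OpenAI-compatible API at http://localhost:11434/v1)
ollama serve &

# Pull a model for the assistant to use
ollama pull llama3.1

# Sanity check: list models through the OpenAI-compatible endpoint
# that the setup wizard will point at
curl http://localhost:11434/v1/models
```

In the setup wizard, you would then enter http://your-server:11434/v1 as the custom endpoint, replacing your-server with the host running Ollama.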
Complete isolation
In this configuration, the container never contacts any Interpretos server. Your data, your LLM, your network.