
Docker


Quick start with Docker

Terminal window
git clone https://github.com/pepetox/osint-ai-one.git
cd osint-ai-one
cp .env.example .env
# Edit .env with your API keys
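As a sketch, the `.env` file holds plain `KEY=VALUE` pairs. The variable names below are hypothetical placeholders; check `.env.example` for the keys the project actually expects:

```
# Illustrative only — the real variable names are listed in .env.example
VIRUSTOTAL_API_KEY=your-key-here
SHODAN_API_KEY=your-key-here
OLLAMA_BASE_URL=http://host.docker.internal:11434
```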

Available profiles

Docker Compose uses profiles to start different services:

| Profile | Service | Command |
| --- | --- | --- |
| `cli` | Interactive agent | `docker compose run --rm --profile cli osint-agent` |
| `server` | MCP Server (port 8080) | `docker compose --profile server up osint-mcp` |
| `server` | A2A Server (port 9000) | `docker compose --profile server up osint-a2a` |
| | Ollama (port 11434) | Automatically started as a dependency |
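Profiles are assigned per service in `docker-compose.yml`. A minimal sketch of how this typically looks (the project's actual file may differ):

```yaml
services:
  osint-mcp:
    profiles: ["server"]   # only included when --profile server is given
  osint-a2a:
    profiles: ["server"]
  osint-agent:
    profiles: ["cli"]
    depends_on:
      - ollama             # pulls in ollama automatically as a dependency
  ollama:
    image: ollama/ollama
```

A service without a `profiles` key is always eligible to start, which is why Ollama comes up whenever a dependent service needs it.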

Interactive CLI

Terminal window
docker compose run --rm --profile cli osint-agent

Opens the interactive prompt like the local installation:

osint> Investigate IP 185.220.101.34
osint> /dashboard
osint> /report save

Single query

Terminal window
docker compose run --rm --profile cli osint-agent -q "Investigate IP 8.8.8.8"
docker compose run --rm --profile cli osint-agent -q "Analyze domain evil.com" -- --json

MCP and A2A servers

Terminal window
# Start both servers in background
docker compose --profile server up -d
# View logs
docker compose logs -f osint-mcp
docker compose logs -f osint-a2a
# Stop
docker compose --profile server down

The MCP server will be available at http://localhost:8080/mcp and the A2A server at http://localhost:9000.

Volumes

Data persists between runs thanks to the following volume mounts:

| Host path | Container path | Purpose |
| --- | --- | --- |
| `./data/` | `/app/data/` | SQLite databases + ChromaDB |
| `./investigaciones/` | `/app/investigaciones/` | Investigation folder structure |
| `./reports/` | `/app/reports/` | Generated threat intelligence reports |
| `./logs/` | `/app/logs/` | Application logs |
| `./.env` | `/app/.env` | API keys and configuration |
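In `docker-compose.yml` these are ordinary bind mounts. A sketch of the corresponding `volumes` block (the real file may differ):

```yaml
services:
  osint-agent:
    volumes:
      - ./data:/app/data
      - ./investigaciones:/app/investigaciones
      - ./reports:/app/reports
      - ./logs:/app/logs
      - ./.env:/app/.env
```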

Pre-installed embeddings model

The Dockerfile includes:

RUN python -c "from sentence_transformers import SentenceTransformer; \
SentenceTransformer('sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2')"

This avoids download on first use and allows operation without HuggingFace Hub access.
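To guarantee the cached model is used even when the Hub is unreachable, the image could additionally set Hugging Face's offline-mode variables. This is an optional hardening step, not necessarily present in the project's Dockerfile:

```dockerfile
# Force huggingface_hub/transformers to use only the locally cached model
ENV HF_HUB_OFFLINE=1 \
    TRANSFORMERS_OFFLINE=1
```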

Ollama in Docker

By default, the Docker image tries to connect to Ollama at http://host.docker.internal:11434 (Ollama running on your host machine).

If you want Ollama inside the container:

docker-compose.yml
services:
  ollama:
    image: ollama/ollama
    volumes:
      - ollama_data:/root/.ollama
    ports:
      - "11434:11434"

volumes:
  ollama_data:
Terminal window
# After starting
docker compose exec ollama ollama pull qwen3:14b
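The agent container then needs to point at the `ollama` service instead of the host. A sketch, assuming the connection URL is configurable through an environment variable (the actual variable name depends on the project's configuration):

```yaml
services:
  osint-agent:
    depends_on:
      - ollama
    environment:
      # Hypothetical variable name — use whichever key the project reads
      - OLLAMA_BASE_URL=http://ollama:11434
```

Inside the Compose network, the service name `ollama` resolves to the container, so `host.docker.internal` is no longer needed.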

Environment variables in Docker

Same as the local installation: configuration comes from the .env file, which Docker Compose loads automatically.
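The format is plain `KEY=VALUE` lines, which both Docker Compose and a POSIX shell can consume. A small self-contained demonstration (the variable name is illustrative, not one the project necessarily uses):

```shell
# Write a throwaway .env-style file (KEY=VALUE, no quoting needed)
cat > /tmp/example.env <<'EOF'
OLLAMA_BASE_URL=http://host.docker.internal:11434
EOF

# `set -a` exports every variable assigned while sourcing the file,
# mimicking how Docker Compose injects .env values into containers
set -a
. /tmp/example.env
set +a

echo "$OLLAMA_BASE_URL"
```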

Terminal window
# Verify .env is loaded
docker compose run --rm --profile cli osint-agent -- osint-agent --help

Manual image build

Terminal window
docker build -t osint-ai-one .
docker run -it --env-file .env osint-ai-one