# Docker
import { Aside } from '@astrojs/starlight/components';
## Quick start with Docker
```bash
git clone https://github.com/pepetox/osint-ai-one.git
cd osint-ai-one
cp .env.example .env
# Edit .env with your API keys
```

## Available profiles
Docker Compose uses profiles to start different services:
| Profile | Service | Command |
|---|---|---|
| `cli` | Interactive agent | `docker compose run --rm --profile cli osint-agent` |
| `server` | MCP Server (port 8080) | `docker compose --profile server up osint-mcp` |
| `server` | A2A Server (port 9000) | `docker compose --profile server up osint-a2a` |
| — | Ollama (port 11434) | Started automatically as a dependency |
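In `docker-compose.yml`, these profiles are declared per service. A minimal sketch (service names are taken from the commands above; the rest of each service definition is omitted and assumed):

```yaml
services:
  osint-agent:
    profiles: ["cli"]      # only started with --profile cli
  osint-mcp:
    profiles: ["server"]   # started together via --profile server
  osint-a2a:
    profiles: ["server"]
  ollama:
    # no profiles key: always enabled, so it also starts
    # whenever a dependent service needs it via depends_on
```

Services without a `profiles` key are always enabled, which is why Ollama starts regardless of which profile you select.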
## Interactive CLI
```bash
docker compose run --rm --profile cli osint-agent
```

This opens the same interactive prompt as the local installation:
```
osint> Investigate IP 185.220.101.34
osint> /dashboard
osint> /report save
```

## Single query
```bash
docker compose run --rm --profile cli osint-agent -q "Investigate IP 8.8.8.8"
docker compose run --rm --profile cli osint-agent -q "Analyze domain evil.com" -- --json
```

## MCP and A2A servers
```bash
# Start both servers in the background
docker compose --profile server up -d

# View logs
docker compose logs -f osint-mcp
docker compose logs -f osint-a2a

# Stop
docker compose --profile server down
```

The MCP server will be available at `http://localhost:8080/mcp` and the A2A server at `http://localhost:9000`.
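If you want Compose to monitor the long-running servers, you could attach a healthcheck. The sketch below is an assumption, not part of the shipped `docker-compose.yml`: it presumes `curl` is present in the image and that the `/mcp` endpoint answers plain HTTP requests, so adjust the command to whatever the endpoint actually returns:

```yaml
services:
  osint-mcp:
    healthcheck:
      test: ["CMD", "curl", "-fsS", "http://localhost:8080/mcp"]
      interval: 30s
      timeout: 5s
      retries: 3
```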
## Volumes
Data persists between executions thanks to volumes:
| Host path | Container path | Purpose |
|---|---|---|
| `./data/` | `/app/data/` | SQLite databases + ChromaDB |
| `./investigaciones/` | `/app/investigaciones/` | Investigation folder structure |
| `./reports/` | `/app/reports/` | Generated threat intelligence reports |
| `./logs/` | `/app/logs/` | Application logs |
| `./.env` | `/app/.env` | API keys and configuration |
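Because these are bind mounts, the Docker engine creates any missing host directory as root-owned, which can lead to permission errors when the container writes to it. A small precaution is to create the directories yourself before the first run:

```shell
# Pre-create the bind-mounted host directories (matching the table above)
# so they are owned by your user rather than root.
mkdir -p data investigaciones reports logs
ls -d data investigaciones reports logs
```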
## Pre-installed embeddings model
The Dockerfile includes:
```dockerfile
RUN python -c "from sentence_transformers import SentenceTransformer; \
    SentenceTransformer('sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2')"
```

This avoids downloading the model on first use and allows operation without access to the HuggingFace Hub.
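To guarantee the container never contacts the Hub at runtime, you can additionally set the standard Hugging Face offline switches in Compose. This is a sketch under the assumption that the application honors these standard environment variables:

```yaml
services:
  osint-agent:
    environment:
      - HF_HUB_OFFLINE=1        # huggingface_hub: never hit the network
      - TRANSFORMERS_OFFLINE=1  # transformers: load only from the local cache
```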
## Ollama in Docker
By default, the Docker image tries to connect to Ollama at http://host.docker.internal:11434 (Ollama running on your host machine).
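Note that `host.docker.internal` resolves out of the box on Docker Desktop (macOS/Windows) but not on Linux. There you can map it to the host gateway in Compose (shown for the `osint-agent` service; add it to any service that needs to reach the host):

```yaml
services:
  osint-agent:
    extra_hosts:
      - "host.docker.internal:host-gateway"
```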
If you want Ollama inside the container:
```yaml
services:
  ollama:
    image: ollama/ollama
    volumes:
      - ollama_data:/root/.ollama
    ports:
      - "11434:11434"
```

```bash
# After starting
docker compose exec ollama ollama pull qwen3:14b
```

## Environment variables in Docker
Same as local installation, via the .env file. Docker Compose loads it automatically.
```bash
# Verify .env is loaded
docker compose run --rm --profile cli osint-agent -- osint-agent --help
```

## Manual image build
```bash
docker build -t osint-ai-one .
docker run -it --env-file .env osint-ai-one
```