# Installation
import { Aside, Steps, Tabs, TabItem } from '@astrojs/starlight/components';
## Option A: Local Installation
1. **Clone the repository**

   ```bash
   git clone https://github.com/pepetox/osint-ai-one.git
   cd osint-ai-one
   ```
2. **Create and activate a virtual environment**

   ```bash
   python -m venv .venv
   source .venv/bin/activate    # Linux/macOS
   # .venv\Scripts\activate     # Windows (PowerShell)
   ```
3. **Install the package**

   ```bash
   pip install -e .
   ```

   This installs the package in editable mode and registers the entry points `osint-agent`, `osint-mcp`, and `osint-a2a`.
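For context, console scripts like these are typically declared under `[project.scripts]` in `pyproject.toml`. The fragment below is a hypothetical sketch: only the three script names come from this page, and the `module:function` paths are illustrative assumptions, not taken from the repository.

```toml
# Hypothetical pyproject.toml fragment; the module:function targets
# are illustrative assumptions, not the repository's actual paths.
[project.scripts]
osint-agent = "osint_ai_one.cli:main"
osint-mcp = "osint_ai_one.mcp_server:main"
osint-a2a = "osint_ai_one.a2a_server:main"
```

With this in place, `pip install -e .` generates the three executables on your `PATH`.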
4. **Configure environment variables**

   ```bash
   cp .env.example .env
   ```

   Edit `.env` with your preferred editor. At minimum, add one API key:

   ```bash
   # .env — minimum to get started
   VIRUSTOTAL_API_KEY=your_key_here

   # Local LLM (default)
   LLM_PROVIDER=ollama
   OLLAMA_MODEL=qwen3:14b
   ```
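These variables end up in the process environment; how the agent actually consumes them is internal to the project, but a minimal stdlib sketch of reading them with the defaults documented above could look like this (the function name and dict shape are illustrative assumptions):

```python
import os

def load_llm_config() -> dict:
    # Illustrative only: read the variables documented above,
    # falling back to the documented defaults where one exists.
    return {
        "virustotal_key": os.environ.get("VIRUSTOTAL_API_KEY", ""),
        "provider": os.environ.get("LLM_PROVIDER", "ollama"),   # default per this page
        "model": os.environ.get("OLLAMA_MODEL", "qwen3:14b"),
    }

print(sorted(load_llm_config()))  # → ['model', 'provider', 'virustotal_key']
```

If a variable is missing from `.env`, the `os.environ.get` fallback keeps the sketch running with the local-Ollama defaults.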
5. **Pull the Ollama model**

   ```bash
   ollama pull qwen3:14b
   ```
6. **Verify the installation**

   ```bash
   osint-agent --help
   ```

   You should see the CLI help menu.
## Option B: Docker
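The command below relies on a Compose file that defines an `osint-agent` service behind a `cli` profile. A hypothetical sketch of the relevant fragment follows; only the service and profile names come from the command itself, and everything else (build context, env file, TTY settings) is an illustrative assumption:

```yaml
# Hypothetical docker-compose.yml fragment; only the service and
# profile names are taken from this page, the rest is illustrative.
services:
  osint-agent:
    build: .
    profiles: ["cli"]
    env_file: .env
    stdin_open: true   # keep the interactive CLI usable
    tty: true
```

A service with a `profiles:` key is only started when that profile is explicitly enabled, which is why the run command selects `cli`.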
```bash
docker compose --profile cli run --rm osint-agent
```

## Verify the installation
```bash
# The agent should start without errors
osint-agent
```

```
# In the interactive prompt, try a simple query
osint> Investigate IP 8.8.8.8
```

If you see Ollama connection errors, make sure the service is running:

```bash
ollama serve   # In another terminal, if not running as a service
ollama list    # Verify the model is downloaded
```