
# Installation


## Option A: Local Installation

1. Clone the repository

   ```bash
   git clone https://github.com/pepetox/osint-ai-one.git
   cd osint-ai-one
   ```

2. Create and activate a virtual environment

   ```bash
   python -m venv .venv
   source .venv/bin/activate   # Linux/macOS
   # .venv\Scripts\activate    # Windows (PowerShell)
   ```

3. Install the package

   ```bash
   pip install -e .
   ```

   This installs the package in editable mode and registers the entry points: `osint-agent`, `osint-mcp`, and `osint-a2a`.

4. Configure environment variables

   ```bash
   cp .env.example .env
   ```

   Edit `.env` with your preferred editor. At minimum, add one API key:

   ```bash
   # .env — minimum to get started
   VIRUSTOTAL_API_KEY=your_key_here
   # Local LLM (default)
   LLM_PROVIDER=ollama
   OLLAMA_MODEL=qwen3:14b
   ```

5. Pull the Ollama model

   ```bash
   ollama pull qwen3:14b
   ```

6. Verify the installation

   ```bash
   osint-agent --help
   ```

   You should see the CLI help menu.

## Option B: Docker

To run the CLI agent:

```bash
git clone https://github.com/pepetox/osint-ai-one.git
cd osint-ai-one
cp .env.example .env   # Edit .env with your API keys
docker compose run --rm --profile cli osint-agent
```

To start the servers in the background:

```bash
docker compose --profile server up -d
```

- MCP Server: `http://localhost:8080`
- A2A Server: `http://localhost:9000`

To run a one-off query:

```bash
docker compose run --rm --profile cli osint-agent -q "Investigate IP 8.8.8.8"
```
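Once the server profile is up, both published ports should accept TCP connections. A small stdlib sketch to check this from Python; note it only verifies that something is listening on each port, not that the MCP or A2A protocol itself works:

```python
import socket

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within `timeout`."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# After `docker compose --profile server up -d`:
# port_open("localhost", 8080)  -> MCP server reachable
# port_open("localhost", 9000)  -> A2A server reachable
```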

## Verify the installation

```bash
# The agent should start without errors
osint-agent
# In the interactive prompt, try a simple query
osint> Investigate IP 8.8.8.8
```
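If the agent exits at startup instead, a common cause is a missing key in `.env`. A minimal sketch of the check you can run by hand, using the variable names from the `.env` example above (how the agent itself validates its configuration is not shown here and may differ):

```python
import os

# Names taken from the .env example above; the agent's own validation may differ.
REQUIRED = ("VIRUSTOTAL_API_KEY",)
DEFAULTS = {"LLM_PROVIDER": "ollama", "OLLAMA_MODEL": "qwen3:14b"}

def missing_vars(env: dict[str, str]) -> list[str]:
    """Return required variables that are unset or empty in `env`."""
    return [name for name in REQUIRED if not env.get(name)]

def effective_config(env: dict[str, str]) -> dict[str, str]:
    """Merge the documented defaults with whatever is set in `env`."""
    return {**DEFAULTS, **{k: v for k, v in env.items() if k in DEFAULTS}}

# Example: missing_vars(dict(os.environ)) lists keys still to be filled in.
```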

If you see Ollama connection errors, make sure the service is running:

```bash
ollama serve   # In another terminal if not running as a service
ollama list    # Verify the model is downloaded
```
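`ollama list` also has an HTTP equivalent: the Ollama daemon serves `GET /api/tags` on port 11434 by default, listing the locally pulled models. A sketch that checks for the model programmatically; the parsing helper is split out so it can be exercised without a running daemon:

```python
import json
from urllib.request import urlopen

OLLAMA_URL = "http://localhost:11434"  # Ollama's default listen address

def model_names(tags_payload: dict) -> list[str]:
    """Extract model names from an /api/tags response payload."""
    return [m["name"] for m in tags_payload.get("models", [])]

def model_available(name: str) -> bool:
    """Return True if `name` is among the locally pulled models."""
    with urlopen(f"{OLLAMA_URL}/api/tags", timeout=5) as resp:
        return name in model_names(json.load(resp))

# model_available("qwen3:14b") should be True after `ollama pull qwen3:14b`.
```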