
Requirements


System

| Requirement | Minimum | Notes |
| --- | --- | --- |
| Python | 3.11+ | Tested on 3.11 and 3.12 |
| Operating system | macOS, Linux, Windows (WSL2) | Native Windows not recommended |
| RAM | 8 GB | 16 GB recommended for Qwen3:14b |
| Storage | 10 GB free | For Ollama models and data |
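You can verify that your interpreter meets the Python minimum before continuing (a quick sanity check, assuming `python3` is on your PATH):

```sh
# Exits with an error message if Python is older than 3.11
python3 -c 'import sys; assert sys.version_info >= (3, 11), f"Need 3.11+, found {sys.version}"'
```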

Ollama (for local mode)

Ollama is required if you want to run the ReAct agent completely locally, without depending on external LLM APIs.

```sh
# macOS
brew install ollama

# Linux
curl -fsSL https://ollama.ai/install.sh | sh
```

After installing, pull a model with tool calling support:

```sh
# Recommended (16 GB RAM): best quality/speed balance
ollama pull qwen3:14b

# If you have only 8-16 GB RAM
ollama pull qwen3:8b

# If you have 48 GB RAM: maximum quality
ollama pull qwen3:32b
```
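Once the pull finishes, you can confirm the model is available locally (assumes the Ollama daemon is running):

```sh
# Lists locally available models; the qwen3 tag you pulled should appear
ollama list
```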

LLM Alternatives

If you don’t want to install Ollama, you can use:

- OpenAI (`gpt-4o`): requires `OPENAI_API_KEY`
- Anthropic (`claude-3-5-sonnet-20241022`): requires `ANTHROPIC_API_KEY`
With these options the MCP Server works without Ollama.
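For example, to point the server at OpenAI instead of Ollama (a sketch, assuming the server reads the key from the environment; the key value below is a placeholder):

```sh
# Placeholder value: substitute your real API key
export OPENAI_API_KEY="sk-your-key-here"
```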

OSINT API Keys (optional)

Most of these services offer a free tier, and you only need one key to get started. VirusTotal is recommended.

| Service | Variable | Free tier | Register |
| --- | --- | --- | --- |
| VirusTotal | `VIRUSTOTAL_API_KEY` | 500 req/day | virustotal.com |
| AbuseIPDB | `ABUSEIPDB_API_KEY` | 1,000 req/day | abuseipdb.com |
| AlienVault OTX | `ALIENVAULT_OTX_API_KEY` | Unlimited | otx.alienvault.com |
| Shodan | `SHODAN_API_KEY` | 100 req/month | shodan.io |
| IPInfo | `IPINFO_TOKEN` | 50,000 req/month | ipinfo.io |
| URLScan.io | `URLSCAN_API_KEY` | 5,000 req/day | urlscan.io |
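The variables above can be kept in a `.env` file at the project root (a sketch, assuming the project loads its configuration from the environment; only include the keys you actually have):

```sh
# .env: one key is enough to start; values below are placeholders
VIRUSTOTAL_API_KEY=your-virustotal-key
ABUSEIPDB_API_KEY=your-abuseipdb-key
```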