# Requirements
import { Aside } from '@astrojs/starlight/components';
## System
| Requirement | Minimum version | Notes |
|---|---|---|
| Python | 3.11+ | Tested on 3.11 and 3.12 |
| Operating system | macOS, Linux, Windows (WSL2) | Native Windows not recommended |
| RAM | 8 GB minimum | 16 GB recommended for Qwen3:14b |
| Storage | 10 GB free | For Ollama models and data |
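You can quickly confirm that your interpreter meets the Python 3.11+ minimum. This sketch assumes the interpreter is available as `python3`; adjust the command name if yours differs:

```shell
# Print the detected Python version and whether it meets the 3.11+ minimum
python3 -c 'import sys; print("Python %d.%d" % sys.version_info[:2]); print("meets 3.11+:", sys.version_info >= (3, 11))'
```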
## Ollama (for local mode)
Ollama is required if you want to run the ReAct agent completely locally, without depending on external LLM APIs.
```sh
# macOS
brew install ollama

# Linux
curl -fsSL https://ollama.ai/install.sh | sh
```

After installing, pull a model with tool calling support:
```sh
# Recommended (16 GB RAM): best quality/speed balance
ollama pull qwen3:14b

# If you have only 8-16 GB RAM
ollama pull qwen3:8b

# If you have 48 GB RAM: maximum quality
ollama pull qwen3:32b
```

## LLM Alternatives
If you don’t want to install Ollama, you can use:
- OpenAI (`gpt-4o`): requires `OPENAI_API_KEY`
- Anthropic (`claude-3-5-sonnet-20241022`): requires `ANTHROPIC_API_KEY`
With these options the MCP Server works without Ollama.
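How the server reads these variables is project-specific; a minimal sketch of configuring one provider via environment variables (key values below are placeholders):

```shell
# Configure exactly one provider; placeholder values shown.
export OPENAI_API_KEY="sk-placeholder"           # for gpt-4o
# export ANTHROPIC_API_KEY="sk-ant-placeholder"  # for claude-3-5-sonnet-20241022

# Sanity check: confirm the key is visible to child processes
python3 -c 'import os; print("OPENAI_API_KEY set:", bool(os.environ.get("OPENAI_API_KEY")))'
# → OPENAI_API_KEY set: True
```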
## OSINT API Keys (optional)
Most of these services have a free tier, and you only need one key to get started. VirusTotal is recommended as a first choice.
| Service | Variable | Free tier | Register |
|---|---|---|---|
| VirusTotal | VIRUSTOTAL_API_KEY | 500 req/day | virustotal.com |
| AbuseIPDB | ABUSEIPDB_API_KEY | 1,000 req/day | abuseipdb.com |
| AlienVault OTX | ALIENVAULT_OTX_API_KEY | Unlimited | otx.alienvault.com |
| Shodan | SHODAN_API_KEY | 100 req/month | shodan.io |
| IPInfo | IPINFO_TOKEN | 50,000 req/month | ipinfo.io |
| URLScan.io | URLSCAN_API_KEY | 5,000 req/day | urlscan.io |
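As a starting point, the variables from the table can be exported in your shell before launching the server. The values below are placeholders, and only the services you actually use need a key (whether the project also loads them from a `.env` file is an assumption not confirmed here):

```shell
# OSINT API keys: placeholder values, replace with your own
export VIRUSTOTAL_API_KEY="your_virustotal_key"
export ABUSEIPDB_API_KEY="your_abuseipdb_key"
export ALIENVAULT_OTX_API_KEY="your_otx_key"
export SHODAN_API_KEY="your_shodan_key"
export IPINFO_TOKEN="your_ipinfo_token"
export URLSCAN_API_KEY="your_urlscan_key"
```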