Pick your mode

llmwiki runs in two modes that share the same three-layer pipeline (`raw/` → `wiki/` → `site/`) but differ in who calls the LLM:

| | Mode A — API | Mode B — Agent |
|---|---|---|
| How synthesis runs | Python → Anthropic API | Claude Code / Codex CLI → slash command |
| API key needed | Yes (`ANTHROPIC_API_KEY`) | No (uses your agent's subscription) |
| Batch + parallel | Yes (native API batching) | No (serial, one turn at a time) |
| Cost model | Pay per token (with prompt cache) | Included in your agent subscription |
| Runs headless? | Yes (cron / CI) | No (needs interactive agent session) |
| Best for | Large corpora, scheduled sync, CI | Exploratory + per-session enrichment |
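To make the "native API batching" and "prompt cache" rows concrete: Mode A's batching maps onto the Anthropic Message Batches API, where each page becomes one request keyed by a `custom_id`, and a shared system prompt can be marked cacheable. The sketch below only builds the request payloads; the page names, prompt text, and `build_batch_requests` helper are hypothetical illustrations, not llmwiki's actual code.

```python
# Sketch: one batch entry per raw/ page for Mode A synthesis.
# Page names and prompt text are hypothetical; only the request
# shape follows the Anthropic Message Batches API.

def build_batch_requests(pages, model="claude-sonnet-4-5"):
    """Return one batch entry per page, keyed by a custom_id."""
    requests = []
    for name, raw_text in pages.items():
        requests.append({
            "custom_id": f"synth-{name}",
            "params": {
                "model": model,
                "max_tokens": 4096,
                # The shared system prompt is marked cacheable, so its
                # tokens are billed at the cached rate across requests.
                "system": [{
                    "type": "text",
                    "text": "Synthesize this raw note into a wiki page.",
                    "cache_control": {"type": "ephemeral"},
                }],
                "messages": [{"role": "user", "content": raw_text}],
            },
        })
    return requests

reqs = build_batch_requests({"home": "raw notes", "api": "more notes"})
# Each entry is ready to pass to client.messages.batches.create(requests=reqs),
# which runs the whole set server-side — hence "runs headless" under cron/CI.
```

Mode B has no equivalent: the agent synthesizes one page per interactive turn, which is why the table lists it as serial.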

When to pick which

The two modes share

Everything except synthesis: