Huginn thinks.
Your AI coding assistant. Free, open source, and MIT-licensed.
Run fully local with zero network calls — or connect your own API key for cloud models. You choose what leaves your machine.
curl -fsSL https://huginn.sh/install.sh | sh

Why developers are looking for an alternative
$40/month forever
Cursor, GitHub Copilot, Claude Pro — the subscriptions add up. You pay every month, forever, for tools that do less than you need.
Your code, their cloud
Every request to a cloud AI sends your proprietary code to a third-party server. Your trade secrets, your logic, your IP — up in the cloud.
Setup hell
Other tools make you assemble the machine before you can use it. Install this runtime, configure that API key, debug why nothing connects. You just want to code.
Everything you need. Nothing you don't.
Batteries Included
One binary. Managed mode bundles its own llama.cpp runtime and pulls models automatically. Or connect Ollama, Anthropic, OpenRouter, or any OpenAI-compatible endpoint. Switch backends in one config line.
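As an illustration of what a one-line backend switch could look like, here is a hypothetical config fragment. The file path and key names below are assumptions for the sketch, not documented Huginn configuration:

```toml
# ~/.config/huginn/config.toml  (hypothetical path and keys)
# Swap the managed llama.cpp runtime for a local Ollama server
# by changing a single line:
backend = "ollama"   # was: "managed"
```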
Named Agents
Create persistent agent personas. Each remembers your history. They can even consult each other.
Multi-Slot Models
Separate planner, coder, and reasoner slots. Each model optimized for its role. Hot-swap at runtime.
Hybrid Search
BM25 keyword + HNSW vector search fused with Reciprocal Rank Fusion. Understands your entire codebase.
MCP + LSP
Wire in Model Context Protocol servers for custom tools. Auto-detects language servers for deep code nav.
Skills System
Bundled prompts + rules per project or globally. Load domain expertise into any session.
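Reciprocal Rank Fusion, mentioned in the Hybrid Search card, is a standard technique for merging ranked lists: each document scores 1/(k + rank) per list, summed across lists. The sketch below shows the general formula with made-up file names; it is illustrative, not Huginn's actual implementation:

```python
def rrf(rankings, k=60):
    """Fuse ranked lists (each a list of doc ids, best first).

    Each document d scores sum over lists of 1 / (k + rank(d));
    k=60 is the commonly used default from the RRF literature.
    """
    scores = {}
    for ranking in rankings:
        for rank, doc in enumerate(ranking, start=1):
            scores[doc] = scores.get(doc, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

# Example: fuse BM25 keyword hits with vector-search hits.
bm25 = ["query.go", "search.go", "index.go"]    # keyword retriever
vector = ["search.go", "embed.go", "query.go"]  # semantic retriever
fused = rrf([bm25, vector])
# "search.go" wins: it ranks highly in both lists.
```

Because RRF only looks at ranks, not raw scores, it fuses BM25 and HNSW results without having to normalise their incompatible scoring scales.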
An AI that actually knows you.
Most AI tools forget you the moment the session ends. Huginn's named agents are persistent personas — they remember your codebase, your preferences, your history. Create a team of specialists and they'll be there every time you open the project.
Remembers across sessions
Hot history + LLM-generated summaries stored in a local Pebble KV store. Steve knows what you worked on last week.
Agents consult each other
Route work to specialists. Ask Steve to consult Maya. Their exchange is logged, summarised, and available next session.
Custom personas
Each agent has its own system prompt, assigned model slot, and colour in the TUI. Build the team that fits your stack.
In Norse myth, Odin sent two ravens across the world each day. Huginn — Thought — flew out to observe and act. Muninn — Memory — returned carrying everything learned.
Huginn ships with first-class memory tools wired directly into the agent loop. Your agents remember what you built, what you decided, and why — across every session.
How Huginn stacks up
| Feature | Huginn | Claude CLI | Cursor | Aider | Continue |
|---|---|---|---|---|---|
| Supports Local Execution | ✓ | ✗ | ~ | ✓ | ~ |
| Free to Use | ✓ | ✗ | ✗ | ✓ | ✓ |
| Open Source | ✓ | ✗ | ✗ | ✓ | ✓ |
| Memory Across Sessions | ✓ | ✗ | ~ | ✗ | ✗ |
| Named Agents | ✓ | ✗ | ✗ | ✗ | ✗ |
| Plan→Approve→Implement | ✓ | ~ | ~ | ✗ | ✗ |
| Multi-Slot Models | ✓ | ✗ | ✗ | ✗ | ✗ |
| Single Binary Install | ✓ | ✗ | ✗ | ✗ | ✗ |
| MCP Support | ✓ | ✓ | ✗ | ✗ | ~ |
~ = partial support
One binary. Runs everywhere.
No npm, no pip, no separate runtime to install. Drop huginn in your PATH and you're done.
macOS
brew install huginn
curl -fsSL https://huginn.sh/install.sh | sh

Linux
curl -fsSL https://huginn.sh/install.sh | sh
wget https://github.com/scrypster/huginn/releases/latest/download/huginn-linux-amd64
chmod +x huginn-linux-amd64
sudo mv huginn-linux-amd64 /usr/local/bin/huginn

Windows
scoop install huginn
# Download huginn-windows-amd64.exe from GitHub Releases
# Add to PATH

After installing, run huginn in any git repository.