# Multi-Slot Model System
Huginn assigns different models to different roles. Each slot is independently configurable.
## Slots

| Slot | Default | Purpose |
|---|---|---|
| planner | qwen3-coder:30b | Generates structured plans; reasons about architecture |
| coder | qwen2.5-coder:14b | Implements changes; generates diffs |
| reasoner | qwen3-coder:30b | Deep reasoning for complex problems (/reason) |
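The slot mechanism amounts to a per-role lookup with fallback to the defaults above. A minimal sketch of that resolution logic (the names `DEFAULTS` and `resolve_models` are illustrative, not Huginn's actual internals):

```python
# Illustrative slot resolution: per-slot overrides from config.json
# fall back to the built-in defaults. Not Huginn's real implementation.
DEFAULTS = {
    "planner": "qwen3-coder:30b",
    "coder": "qwen2.5-coder:14b",
    "reasoner": "qwen3-coder:30b",
}

def resolve_models(user_config: dict) -> dict:
    """Merge any user-configured slot models over the defaults."""
    return {**DEFAULTS, **user_config.get("models", {})}

# Overriding one slot leaves the others at their defaults.
config = {"models": {"planner": "llama3.3:70b"}}
print(resolve_models(config))
```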
## Switching models

In your config file (~/.huginn/config.json):

```json
{
  "models": {
    "planner": "llama3.3:70b",
    "coder": "qwen2.5-coder:32b"
  }
}
```
At runtime, with /switch-model or natural language:

```
use llama3.3 for planning
use deepseek-r1 for reasoning
```
## Supported providers
Huginn supports models from any provider — local or cloud.
### Local (managed runtime or Ollama)
Huginn’s managed runtime and Ollama both work with any GGUF-compatible model. Recommended combinations:
| Use case | Planner | Coder |
|---|---|---|
| Balanced | qwen3-coder:30b | qwen2.5-coder:14b |
| Speed | qwen2.5-coder:14b | qwen2.5-coder:7b |
| Quality | llama3.3:70b | qwen2.5-coder:32b |
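For instance, the "Speed" combination above maps to the same `models` block shown earlier:

```json
{
  "models": {
    "planner": "qwen2.5-coder:14b",
    "coder": "qwen2.5-coder:7b"
  }
}
```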
### Anthropic

Use Claude models directly with your own API key:

```json
{
  "backend": { "type": "anthropic", "api_key": "$ANTHROPIC_API_KEY" },
  "models": {
    "planner": "claude-sonnet-4-6",
    "coder": "claude-sonnet-4-6",
    "reasoner": "claude-opus-4-6"
  }
}
```
### OpenRouter

Access 200+ models with a single API key:

```json
{
  "backend": { "type": "openrouter", "api_key": "$OPENROUTER_API_KEY" },
  "models": {
    "planner": "anthropic/claude-sonnet-4-6",
    "coder": "qwen/qwen2.5-coder-32b-instruct"
  }
}
```
### OpenAI

Connect to the OpenAI API with your own key:

```json
{
  "backend": { "type": "openai", "endpoint": "https://api.openai.com/v1", "api_key": "$OPENAI_API_KEY" },
  "models": { "coder": "gpt-4o" }
}
```
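The `$OPENAI_API_KEY` value above assumes Huginn expands `$`-prefixed config values from the environment, so the key must be exported in your shell before launch (bash/zsh shown; the key value is a placeholder):

```shell
# Export the key so the "$OPENAI_API_KEY" placeholder in
# ~/.huginn/config.json can be resolved at startup.
export OPENAI_API_KEY="sk-placeholder"
```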