# Multi-Slot Model System

Huginn assigns different models to different roles. Each slot is independently configurable.

## Slots

| Slot | Default | Purpose |
|------|---------|---------|
| `planner` | `qwen3-coder:30b` | Generates structured plans, thinks about architecture |
| `coder` | `qwen2.5-coder:14b` | Implements changes, generates diffs |
| `reasoner` | `qwen3-coder:30b` | Deep reasoning for complex problems (`/reason`) |

## Switching models

In config (`~/.huginn/config.json`):

```json
{
  "models": {
    "planner": "llama3.3:70b",
    "coder": "qwen2.5-coder:32b"
  }
}
```
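Slots you leave out of the config keep their defaults. A minimal sketch of that merge behavior (the defaults come from the table above; the helper name and merge logic are assumptions, not Huginn's actual code):

```python
# Sketch: per-slot overrides merged over the documented defaults.
DEFAULTS = {
    "planner": "qwen3-coder:30b",
    "coder": "qwen2.5-coder:14b",
    "reasoner": "qwen3-coder:30b",
}

def resolve_models(user_config: dict) -> dict:
    """Return the effective slot->model map: user overrides win, others keep defaults."""
    return {**DEFAULTS, **user_config.get("models", {})}

config = {"models": {"planner": "llama3.3:70b", "coder": "qwen2.5-coder:32b"}}
effective = resolve_models(config)
print(effective["planner"])   # the override from the config
print(effective["reasoner"])  # unchanged default
```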

At runtime with `/switch-model` or natural language:

```
use llama3.3 for planning
use deepseek-r1 for reasoning
```
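One way phrases like these could map onto slot updates (a regex sketch; the role-to-slot aliases and function name are assumptions, not Huginn's parser):

```python
import re

# Assumed aliases from the role words in the examples above to slot names.
SLOT_ALIASES = {"planning": "planner", "coding": "coder", "reasoning": "reasoner"}

def parse_switch(command: str):
    """Parse 'use <model> for <role>' into a (slot, model) pair, or None."""
    m = re.fullmatch(r"use\s+(\S+)\s+for\s+(\w+)", command.strip(), re.IGNORECASE)
    if not m:
        return None
    model, role = m.group(1), m.group(2).lower()
    slot = SLOT_ALIASES.get(role, role)
    return (slot, model)

print(parse_switch("use llama3.3 for planning"))    # ('planner', 'llama3.3')
print(parse_switch("use deepseek-r1 for reasoning"))
```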

## Supported providers

Huginn supports models from any provider — local or cloud.

### Local (managed runtime or Ollama)

Huginn’s managed runtime and Ollama both work with any GGUF-compatible model. Recommended combinations:

| Use case | Planner | Coder |
|----------|---------|-------|
| Balanced | `qwen3-coder:30b` | `qwen2.5-coder:14b` |
| Speed | `qwen2.5-coder:14b` | `qwen2.5-coder:7b` |
| Quality | `llama3.3:70b` | `qwen2.5-coder:32b` |

### Anthropic

Use Claude models directly with your own API key:

```json
{
  "backend": { "type": "anthropic", "api_key": "$ANTHROPIC_API_KEY" },
  "models": { "planner": "claude-sonnet-4-6", "coder": "claude-sonnet-4-6", "reasoner": "claude-opus-4-6" }
}
```
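The `$ANTHROPIC_API_KEY` value is an environment-variable reference rather than a literal key. A sketch of how such references can be expanded at load time (the expansion-at-load behavior is an assumption about Huginn, shown with Python's stdlib):

```python
import os

# Assumed behavior: "$NAME" values in the config are replaced with the
# corresponding environment variable when the config is loaded.
os.environ["ANTHROPIC_API_KEY"] = "sk-ant-example"  # stand-in value for this demo

def expand_value(value: str) -> str:
    """Expand $VAR / ${VAR} references in a config string."""
    return os.path.expandvars(value)

backend = {"type": "anthropic", "api_key": "$ANTHROPIC_API_KEY"}
backend = {k: expand_value(v) for k, v in backend.items()}
print(backend["api_key"])  # the environment variable's value, not the literal "$..." string
```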

### OpenRouter

Access 200+ models from a single API key:

```json
{
  "backend": { "type": "openrouter", "api_key": "$OPENROUTER_API_KEY" },
  "models": { "planner": "anthropic/claude-sonnet-4-6", "coder": "qwen/qwen2.5-coder-32b-instruct" }
}
```

### OpenAI

```json
{
  "backend": { "type": "openai", "endpoint": "https://api.openai.com/v1", "api_key": "$OPENAI_API_KEY" },
  "models": { "coder": "gpt-4o" }
}
```
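A quick way to sanity-check any of the configs above before pointing Huginn at them (a sketch; only the slot names and key layout from these examples are assumed):

```python
import json

KNOWN_SLOTS = {"planner", "coder", "reasoner"}

def check_config(text: str) -> dict:
    """Parse a config and reject unknown slot names; returns the parsed dict."""
    config = json.loads(text)
    unknown = set(config.get("models", {})) - KNOWN_SLOTS
    if unknown:
        raise ValueError(f"unknown slots: {sorted(unknown)}")
    return config

config = check_config("""{
  "backend": { "type": "openai", "endpoint": "https://api.openai.com/v1", "api_key": "$OPENAI_API_KEY" },
  "models": { "coder": "gpt-4o" }
}""")
print(config["models"]["coder"])  # gpt-4o
```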