▶ Open Beta — Free tier available

Stop paying frontier prices
for mechanical tasks

70% of agentic steps are mechanical — file reads, shell commands, test runs. mesh-router sends them to local models at ~$0. Reserves Claude/GPT for reasoning. Cut your agentic API bill by 70%+.

Get Pro — $15/mo · View on GitHub
mesh-sh — routing log
$ mesh-sh execute "add error handling to api/main.py"
 
# Reasoning Router classifying tasks...
 
[mechanical] read_file api/main.py → llama-3-8b $0.000
[mechanical] list_dir api/ → llama-3-8b $0.000
[reasoning] plan_changes → claude-3.7 $0.012
[reasoning] generate_patch → claude-3.7 $0.031
[mechanical] write_file api/main.py → llama-3-8b $0.000
[mechanical] run_tests pytest → llama-3-8b $0.000
 
✓ Done. Cost: $0.043 (saved $0.29 vs all-frontier)
$0 avg. mechanical task cost · 70%+ avg. cost reduction

A reasoning router inside your agent loop

mesh-router wraps your existing agentic workflow. Each tool call is classified before dispatch. Mechanical tasks never leave your machine.
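In spirit, the router is a cheap classifier sitting in front of your tool dispatcher. A minimal sketch of the idea (this is illustrative only, not mesh-router's actual classifier; the tool names and backend labels are assumptions taken from the demo log above):

```python
# Illustrative mechanical-vs-reasoning routing sketch. Tool names and
# backend labels are assumptions, not mesh-router's real implementation.

MECHANICAL_TOOLS = {"read_file", "write_file", "list_dir", "run_tests",
                    "ls", "grep", "cat"}
REASONING_TOOLS = {"plan_changes", "generate_patch", "review_diff"}

def classify(tool_name: str) -> str:
    """Return 'mechanical' or 'reasoning' for a tool call."""
    if tool_name in MECHANICAL_TOOLS:
        return "mechanical"
    if tool_name in REASONING_TOOLS:
        return "reasoning"
    # Unknown tools fall through to the frontier model:
    # prefer correctness over cost when unsure.
    return "reasoning"

def route(tool_name: str) -> str:
    """Map a classification to a backend."""
    if classify(tool_name) == "mechanical":
        return "llama-3-8b (local)"
    return "claude-3.7 (frontier)"

print(route("read_file"))     # mechanical step stays on your machine
print(route("plan_changes"))  # reasoning step goes to the frontier API
```

The key design choice is the fallback: anything the classifier can't place is sent to the frontier model, so misclassification degrades cost, never quality.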

Incoming task (e.g. read_file("main.py")) hits the Reasoning Router, a local Llama-3-70B classifier, which sends it one of two ways:

  • mechanical (ls, grep, cat, tests) → local mesh: your machines, running Ollama · DeepSeek · Llama
  • reasoning (plan, generate, review) → frontier: remote APIs (Claude · GPT · Gemini)
Task type    Example         Without mesh-router   With mesh-router   Saved
mechanical   read 500 files  $1.50                 $0.00              100%
mechanical   run test suite  $0.45                 $0.00              100%
reasoning    plan refactor   $0.08                 $0.08              0%
reasoning    generate code   $0.12                 $0.12              0%
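The savings follow from simple arithmetic over the example prices in the table above (illustrative per-task costs, not measured benchmarks); the reduction you see depends entirely on your mechanical-vs-reasoning mix:

```python
# Recompute the savings from the table above. Prices are the table's
# illustrative per-task costs, not benchmarks.

tasks = [
    # (type, example, cost_without, cost_with)
    ("mechanical", "read 500 files", 1.50, 0.00),
    ("mechanical", "run test suite", 0.45, 0.00),
    ("reasoning",  "plan refactor",  0.08, 0.08),
    ("reasoning",  "generate code",  0.12, 0.12),
]

without = sum(t[2] for t in tasks)       # all tasks on frontier models
with_router = sum(t[3] for t in tasks)   # mechanical tasks routed local
reduction = 100 * (1 - with_router / without)

print(f"without: ${without:.2f}, with: ${with_router:.2f}, "
      f"saved: {reduction:.0f}%")
# → without: $2.15, with: $0.20, saved: 91%
```

For this particular mix the reduction is 91%; a workload heavier on reasoning steps lands closer to the headline 70% figure.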

Up in 5 minutes

Works with any Ollama-compatible local model. No config required to start.

Step 01
Install mesh-sh
pip install mesh-router

Adds the mesh-sh CLI. Wraps your existing agentic tools — Claude Code, Aider, or bare API.
Step 02
Point to your mesh
Configure your local Ollama endpoint and frontier API keys in ~/.mesh/config.toml. Tailscale mesh? Set your worker IPs.
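A config along these lines might live in ~/.mesh/config.toml. The keys below are assumptions for illustration, not mesh-router's documented schema; check the project docs for the real field names:

```toml
# Hypothetical layout; field names are illustrative.
[local]
ollama_endpoint = "http://127.0.0.1:11434"   # Ollama's default port
workers = ["100.64.0.2", "100.64.0.3"]       # Tailscale worker IPs

[frontier]
anthropic_api_key = "sk-ant-..."
openai_api_key = "sk-..."
```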
Step 03
Run your loop
Replace claude with mesh-sh. The router handles classification automatically. Watch costs drop in real time.
Step 04 (Pro)
Cloud Router
Pro tier unlocks the cloud-hosted Reasoning Router: a 70B model tuned for task-type classification that distributes work optimally across your whole fleet.

Start free, scale as you grow

Bring your own hardware for Open Source. Pro adds cloud routing. Managed handles everything.

Open Source
$0
  • Self-hosted local workers only
  • Ollama + DeepSeek + Llama support
  • Basic mechanical/reasoning classifier
  • Works with Claude Code, Aider, bare API
  • No cloud router
  • No managed GPU workers
View on GitHub
Pro
$15/mo
  • Everything in Open Source
  • Cloud-hosted Reasoning Router
Get Pro
Managed
$99/mo
  • Everything in Pro
  • Hosted GPU workers (managed mesh)
  • No local hardware required
  • SLA: 99.9% uptime
  • Custom model fine-tuning
  • Dedicated Slack channel
Best for teams · Includes onboarding call