A powerful tool to route Claude Code requests to different models and customize any request.
- Model Routing: Route requests to different models based on your needs (e.g., background tasks, thinking, long context).
- Multi-Provider Support: Supports various model providers like OpenRouter, DeepSeek, Ollama, Gemini, Volcengine, and SiliconFlow.
- Request/Response Transformation: Customize requests and responses for different providers using transformers.
- Dynamic Model Switching: Switch models on the fly within Claude Code using the `/model` command.
- GitHub Actions Integration: Trigger Claude Code tasks in your GitHub workflows.
- Plugin System: Extend functionality with custom transformers.
First, ensure you have Claude Code installed:

```shell
npm install -g @anthropic-ai/claude-code
```

Then, install Claude Code Router:

```shell
npm install -g @musistudio/claude-code-router
```
Create and configure your `~/.claude-code-router/config.json` file. For more details, refer to `config.example.json`.

The `config.json` file has several key sections:
- `PROXY_URL` (optional): You can set a proxy for API requests, for example: `"PROXY_URL": "http://127.0.0.1:7890"`.
- `LOG` (optional): You can enable logging by setting it to `true`. The log file will be located at `$HOME/.claude-code-router.log`.
- `APIKEY` (optional): You can set a secret key to authenticate requests. When set, clients must provide this key in the `Authorization` header (e.g., `Bearer your-secret-key`) or the `x-api-key` header. Example: `"APIKEY": "your-secret-key"`.
- `HOST` (optional): You can set the host address for the server. If `APIKEY` is not set, the host is forced to `127.0.0.1` for security reasons to prevent unauthorized access. Example: `"HOST": "0.0.0.0"`.
- `Providers`: Used to configure different model providers.
- `Router`: Used to set up routing rules. `default` specifies the default model, which will be used for all requests if no other route matches.
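As a quick sanity check before starting the router, a short Node.js script (an illustrative sketch, not part of the router itself) can verify that a config has the required sections and that every `Router` entry points at a declared `provider,model` pair:

```javascript
// check-config.js — minimal, illustrative validation of a
// ~/.claude-code-router/config.json object. Not the router's own logic.
function checkConfig(config) {
  const errors = [];
  if (!Array.isArray(config.Providers) || config.Providers.length === 0) {
    errors.push("Providers must be a non-empty array");
  }
  if (!config.Router || !config.Router.default) {
    errors.push("Router.default is required");
  }
  // Collect every declared "provider,model" pair.
  const known = new Set();
  for (const p of config.Providers || []) {
    for (const m of p.models || []) known.add(`${p.name},${m}`);
  }
  // Each route must target a declared pair.
  for (const [route, target] of Object.entries(config.Router || {})) {
    if (!known.has(target)) {
      errors.push(`Router.${route} targets unknown "${target}"`);
    }
  }
  return errors;
}
```

Running it against the example config below should report no errors; a typo in a route value (e.g., a model not listed under its provider) shows up immediately instead of failing at request time.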
Here is a comprehensive example:

```json
{
  "APIKEY": "your-secret-key",
  "PROXY_URL": "http://127.0.0.1:7890",
  "LOG": true,
  "Providers": [
    {
      "name": "openrouter",
      "api_base_url": "https://openrouter.ai/api/v1/chat/completions",
      "api_key": "sk-xxx",
      "models": [
        "google/gemini-2.5-pro-preview",
        "anthropic/claude-sonnet-4",
        "anthropic/claude-3.5-sonnet"
      ],
      "transformer": { "use": ["openrouter"] }
    },
    {
      "name": "deepseek",
      "api_base_url": "https://api.deepseek.com/chat/completions",
      "api_key": "sk-xxx",
      "models": ["deepseek-chat", "deepseek-reasoner"],
      "transformer": {
        "use": ["deepseek"],
        "deepseek-chat": { "use": ["tooluse"] }
      }
    },
    {
      "name": "ollama",
      "api_base_url": "http://localhost:11434/v1/chat/completions",
      "api_key": "ollama",
      "models": ["qwen2.5-coder:latest"]
    }
  ],
  "Router": {
    "default": "deepseek,deepseek-chat",
    "background": "ollama,qwen2.5-coder:latest",
    "think": "deepseek,deepseek-reasoner",
    "longContext": "openrouter,google/gemini-2.5-pro-preview"
  }
}
```

Start Claude Code using the router:

```shell
ccr code
```
The `Providers` array is where you define the different model providers you want to use. Each provider object requires:

- `name`: A unique name for the provider.
- `api_base_url`: The full API endpoint for chat completions.
- `api_key`: Your API key for the provider.
- `models`: A list of model names available from this provider.
- `transformer` (optional): Specifies transformers to process requests and responses.
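To illustrate what these fields are used for, a provider entry maps directly onto an OpenAI-style chat-completions call. The sketch below (illustrative only, assuming an OpenAI-compatible endpoint; not the router's actual implementation) shows how each field is consumed:

```javascript
// Illustrative: assemble an OpenAI-style chat-completions request from a
// provider entry. api_base_url is the full endpoint, api_key goes in the
// Authorization header, and the model must be declared under "models".
function buildRequest(provider, model, messages) {
  if (!provider.models.includes(model)) {
    throw new Error(`${model} is not declared under provider ${provider.name}`);
  }
  return {
    url: provider.api_base_url,
    headers: {
      "Content-Type": "application/json",
      "Authorization": `Bearer ${provider.api_key}`,
    },
    body: JSON.stringify({ model, messages }),
  };
}
```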
Transformers allow you to modify the request and response payloads to ensure compatibility with different provider APIs.
- Global Transformer: Apply a transformer to all models from a provider. In this example, the `openrouter` transformer is applied to all models under the `openrouter` provider.

  ```json
  {
    "name": "openrouter",
    "api_base_url": "https://openrouter.ai/api/v1/chat/completions",
    "api_key": "sk-xxx",
    "models": [
      "google/gemini-2.5-pro-preview",
      "anthropic/claude-sonnet-4",
      "anthropic/claude-3.5-sonnet"
    ],
    "transformer": { "use": ["openrouter"] }
  }
  ```

- Model-Specific Transformer: Apply a transformer to a specific model. In this example, the `deepseek` transformer is applied to all models, and an additional `tooluse` transformer is applied only to the `deepseek-chat` model.

  ```json
  {
    "name": "deepseek",
    "api_base_url": "https://api.deepseek.com/chat/completions",
    "api_key": "sk-xxx",
    "models": ["deepseek-chat", "deepseek-reasoner"],
    "transformer": {
      "use": ["deepseek"],
      "deepseek-chat": { "use": ["tooluse"] }
    }
  }
  ```

- Passing Options to a Transformer: Some transformers, like `maxtoken`, accept options. To pass options, use a nested array where the first element is the transformer name and the second is an options object.

  ```json
  {
    "name": "siliconflow",
    "api_base_url": "https://api.siliconflow.cn/v1/chat/completions",
    "api_key": "sk-xxx",
    "models": ["moonshotai/Kimi-K2-Instruct"],
    "transformer": {
      "use": [
        ["maxtoken", { "max_tokens": 16384 }]
      ]
    }
  }
  ```
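Since a `use` array can mix bare names (`"deepseek"`) with `[name, options]` pairs, a small helper (illustrative; not the router's internal code) shows how both forms normalize to one uniform shape:

```javascript
// Illustrative: normalize "use" entries — either "name" or [name, options] —
// into { name, options } objects so downstream code handles one shape.
function normalizeUse(use) {
  return use.map((entry) =>
    Array.isArray(entry)
      ? { name: entry[0], options: entry[1] || {} }
      : { name: entry, options: {} }
  );
}
```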
Available Built-in Transformers:

- `deepseek`: Adapts requests/responses for the DeepSeek API.
- `gemini`: Adapts requests/responses for the Gemini API.
- `openrouter`: Adapts requests/responses for the OpenRouter API.
- `groq`: Adapts requests/responses for the Groq API.
- `maxtoken`: Sets a specific `max_tokens` value.
- `tooluse`: Optimizes tool usage for certain models via `tool_choice`.
- `gemini-cli` (experimental): Unofficial support for Gemini via Gemini CLI (`gemini-cli.js`).
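To make the transformer concept concrete, here is what a `maxtoken`-style request transformation might look like. This is a sketch of the idea only; the built-in transformer's actual implementation may differ:

```javascript
// Sketch of a maxtoken-style transform: cap max_tokens on an outgoing
// chat-completions request body, without mutating the original object.
function maxTokenTransform(request, options) {
  const limit = options.max_tokens;
  const current = request.max_tokens;
  return {
    ...request,
    // Apply the cap; if the request set no max_tokens, use the limit as-is.
    max_tokens: current === undefined ? limit : Math.min(current, limit),
  };
}
```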
Custom Transformers:

You can also create your own transformers and load them via the `transformers` field in `config.json`:

```json
{
  "transformers": [
    {
      "path": "$HOME/.claude-code-router/plugins/gemini-cli.js",
      "options": {
        "project": "xxx"
      }
    }
  ]
}
```
The `Router` object defines which model to use for different scenarios:

- `default`: The default model for general tasks.
- `background`: A model for background tasks. This can be a smaller, local model to save costs.
- `think`: A model for reasoning-heavy tasks, like Plan Mode.
- `longContext`: A model for handling long contexts (e.g., > 60K tokens).
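To make the routing rules concrete, a simplified selector might behave as below, falling back to `default` whenever a scenario has no entry. This is illustrative only (the 60K-token threshold is taken from the description above; the router's real dispatch logic is more involved):

```javascript
// Simplified route selection over a Router config object.
// Assumes a 60K-token long-context threshold; illustrative only.
function pickRoute(router, { tokens = 0, thinking = false, background = false } = {}) {
  if (tokens > 60000 && router.longContext) return router.longContext;
  if (thinking && router.think) return router.think;
  if (background && router.background) return router.background;
  return router.default;
}
```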
You can also switch models dynamically in Claude Code with the `/model` command:

```
/model provider_name,model_name
```

Example: `/model openrouter,anthropic/claude-3.5-sonnet`
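The `provider_name,model_name` pair is the same format used by `Router` values. One detail worth noting: the split should happen on the first comma only, since the model identifier itself is arbitrary. A small illustrative parser:

```javascript
// Illustrative parser for the "provider_name,model_name" pair used by
// /model and by Router values. Splits on the first comma only.
function parseModelRef(ref) {
  const i = ref.indexOf(",");
  if (i === -1) throw new Error(`expected "provider,model", got "${ref}"`);
  return { provider: ref.slice(0, i), model: ref.slice(i + 1) };
}
```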
Integrate Claude Code Router into your CI/CD pipeline. After setting up Claude Code Actions, modify your `.github/workflows/claude.yaml` to use the router:
```yaml
name: Claude Code

on:
  issue_comment:
    types: [created]
  # ... other triggers

jobs:
  claude:
    if: |
      (github.event_name == 'issue_comment' && contains(github.event.comment.body, '@claude')) ||
      # ... other conditions
    runs-on: ubuntu-latest
    permissions:
      contents: read
      pull-requests: read
      issues: read
      id-token: write
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
        with:
          fetch-depth: 1

      - name: Prepare Environment
        run: |
          curl -fsSL https://bun.sh/install | bash
          mkdir -p $HOME/.claude-code-router
          cat << 'EOF' > $HOME/.claude-code-router/config.json
          {
            "log": true,
            "OPENAI_API_KEY": "${{ secrets.OPENAI_API_KEY }}",
            "OPENAI_BASE_URL": "https://api.deepseek.com",
            "OPENAI_MODEL": "deepseek-chat"
          }
          EOF
        shell: bash

      - name: Start Claude Code Router
        run: |
          nohup ~/.bun/bin/bunx @musistudio/claude-code-router start &
        shell: bash

      - name: Run Claude Code
        id: claude
        uses: anthropics/claude-code-action@beta
        env:
          ANTHROPIC_BASE_URL: http://localhost:3456
        with:
          anthropic_api_key: "any-string-is-ok"
```
This setup allows for interesting automations, like running tasks during off-peak hours to reduce API costs.
If you find this project helpful, please consider sponsoring its development. Your support is greatly appreciated!
A huge thank you to all our sponsors for their generous support!
- @Simon Leischnig
- @duanshuaimin
- @vrgitadmin
- @*o
- @ceilwoo
- @*说
- @*更
- @K*g
- @R*R
- @bobleer
- @*苗
- @*划
- @Clarence-pan
- @carter003
- @S*r
- @*晖
- @*敏
- @Z*z
- @*然
- @cluic
- @*苗
- @PromptExpert
- @*应
- @yusnake
- @*飞
- @董*
(If your name is masked, please contact me via my homepage email to update it with your GitHub username.)