How to use Claude Code with Custom Providers (Z.ai & OpenRouter)

Ali Almahdi

Claude Code is an incredible tool, but did you know you aren't limited to just using it with Anthropic's direct API? By adjusting a few environment variables, you can route Claude Code through other providers like Z.ai or OpenRouter.
This is particularly useful if you have credits elsewhere or need access to specific models proxied through these services.
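The pattern is the same for every provider: point ANTHROPIC_BASE_URL at the provider's Anthropic-compatible endpoint and supply its token. As a minimal sketch with placeholder values, a one-off invocation looks like this:
# One-off run against a custom provider (placeholder URL and token)
ANTHROPIC_BASE_URL=https://api.z.ai/api/anthropic \
ANTHROPIC_AUTH_TOKEN="<YOUR_PROVIDER_TOKEN>" \
claude
The sections below wrap this pattern into reusable shell functions.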
Configuring for Z.ai
Z.ai offers an Anthropic-compatible API endpoint. To use it, we need to override the ANTHROPIC_BASE_URL and provide our Z.ai authentication token. We also want to disable non-essential traffic to keep things clean.
ZSH Configuration
If you are using ZSH, you can add this function to your ~/.zshrc. This allows you to run zaiclaude whenever you want to use the Z.ai provider, leaving your default claude command mapped to the standard API.
zaiclaude() {
  ANTHROPIC_BASE_URL=https://api.z.ai/api/anthropic \
  ANTHROPIC_AUTH_TOKEN="<YOUR_ZAI_AUTH_TOKEN>" \
  CLAUDE_CODE_DISABLE_NONESSENTIAL_TRAFFIC=1 \
  claude "$@"
}
Usage:
zaiclaude
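Because the function forwards "$@" to claude, any of the CLI's usual flags pass straight through. For example, a one-shot prompt using the -p (print) flag:
zaiclaude -p "Summarize the TODO comments in this project"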
Fish Shell Configuration
For Fish shell users (my personal favorite), you can set this up in ~/.config/fish/config.fish.
To set it globally for the session:
# Default claude with Z.ai
set -gx ANTHROPIC_BASE_URL https://api.z.ai/api/anthropic
set -gx ANTHROPIC_AUTH_TOKEN "<YOUR_ZAI_AUTH_TOKEN>"
set -gx CLAUDE_CODE_DISABLE_NONESSENTIAL_TRAFFIC 1
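To confirm the variables are active in your current session, you can print one of them:
echo $ANTHROPIC_BASE_URL
With these set, the plain claude command will route through Z.ai.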
Or, if you prefer a dedicated command similar to the ZSH example above:
function zaiclaude
    set -lx ANTHROPIC_BASE_URL https://api.z.ai/api/anthropic
    set -lx ANTHROPIC_AUTH_TOKEN "<YOUR_ZAI_AUTH_TOKEN>"
    set -lx CLAUDE_CODE_DISABLE_NONESSENTIAL_TRAFFIC 1
    claude $argv
end
Configuring for OpenRouter
OpenRouter is another popular option that aggregates various LLMs. The setup is similar, but there's a small catch: you must explicitly set the standard ANTHROPIC_API_KEY to an empty string to prevent conflicts.
ZSH Configuration
openrouterclaude() {
  ANTHROPIC_BASE_URL="https://openrouter.ai/api" \
  ANTHROPIC_AUTH_TOKEN="$OPENROUTER_API_KEY" \
  ANTHROPIC_API_KEY="" \
  CLAUDE_CODE_DISABLE_NONESSENTIAL_TRAFFIC=1 \
  claude "$@"
}
Fish Shell Configuration
function openrouterclaude
    set -lx ANTHROPIC_BASE_URL "https://openrouter.ai/api"
    set -lx ANTHROPIC_AUTH_TOKEN "$OPENROUTER_API_KEY"
    set -lx ANTHROPIC_API_KEY ""
    set -lx CLAUDE_CODE_DISABLE_NONESSENTIAL_TRAFFIC 1
    claude $argv
end
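Both versions assume OPENROUTER_API_KEY is already set in your environment. As a sketch with a placeholder key:
# zsh (~/.zshrc)
export OPENROUTER_API_KEY="<YOUR_OPENROUTER_API_KEY>"
# fish (~/.config/fish/config.fish)
set -gx OPENROUTER_API_KEY "<YOUR_OPENROUTER_API_KEY>"
Then run openrouterclaude just like zaiclaude above.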
Changing Models
By default, Claude Code uses specific Anthropic model aliases (Sonnet, Opus, Haiku) to map to different performance tiers. When using OpenRouter, you can override these defaults to use any model available on the platform.
Understanding Default Model Mapping
Claude Code uses these environment variables to control which models are used:
- ANTHROPIC_DEFAULT_SONNET_MODEL - Default balanced/performance model
- ANTHROPIC_DEFAULT_OPUS_MODEL - Default high-performance model
- ANTHROPIC_DEFAULT_HAIKU_MODEL - Default fast/lightweight model
OpenRouter automatically maps these to appropriate Anthropic models by default, but you can override them to use any provider's models.
Overriding Default Models
You can configure Claude Code to use specific models by setting the appropriate environment variable. For example, to use GPT models from OpenAI:
export ANTHROPIC_DEFAULT_SONNET_MODEL="openai/gpt-5.2"
export ANTHROPIC_DEFAULT_OPUS_MODEL="openai/gpt-5.2-pro"
export ANTHROPIC_DEFAULT_HAIKU_MODEL="openai/gpt-5.2-chat"
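The exports above are for ZSH (or bash). If you're on Fish, the equivalent would be:
set -gx ANTHROPIC_DEFAULT_SONNET_MODEL "openai/gpt-5.2"
set -gx ANTHROPIC_DEFAULT_OPUS_MODEL "openai/gpt-5.2-pro"
set -gx ANTHROPIC_DEFAULT_HAIKU_MODEL "openai/gpt-5.2-chat"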
Some popular model options include:
- OpenAI GPT-5.2: openai/gpt-5.2, openai/gpt-5.2-pro
- OpenAI GPT-5.1-Codex-Max: openai/gpt-5.1-codex-max
- GLM 4.7: z-ai/glm-4.7
- MiniMax M2.1: minimax/minimax-m2.1
- MiMo V2 Flash: xiaomi/mimo-v2-flash:free
- Mistral Devstral 2: mistralai/devstral-2512
Using GLM Models with OpenRouter
Here's a dedicated function to use GLM-4.7, Z.ai's latest flagship model, with Claude Code:
ZSH Configuration
glmoclaude() {
  ANTHROPIC_BASE_URL="https://openrouter.ai/api" \
  ANTHROPIC_AUTH_TOKEN="$OPENROUTER_API_KEY" \
  ANTHROPIC_API_KEY="" \
  ANTHROPIC_DEFAULT_SONNET_MODEL="z-ai/glm-4.7" \
  CLAUDE_CODE_DISABLE_NONESSENTIAL_TRAFFIC=1 \
  claude "$@"
}
Fish Shell Configuration
function glmoclaude
    set -lx ANTHROPIC_BASE_URL "https://openrouter.ai/api"
    set -lx ANTHROPIC_AUTH_TOKEN "$OPENROUTER_API_KEY"
    set -lx ANTHROPIC_API_KEY ""
    set -lx ANTHROPIC_DEFAULT_SONNET_MODEL "z-ai/glm-4.7"
    set -lx CLAUDE_CODE_DISABLE_NONESSENTIAL_TRAFFIC 1
    claude $argv
end
Important Considerations
Tool Use Support: Claude Code relies heavily on tool use capabilities (reading files, running commands, editing code). When selecting alternative models, ensure they support tool use. You can filter models by this capability on the OpenRouter models page.
Cleaner Requests: Always include CLAUDE_CODE_DISABLE_NONESSENTIAL_TRAFFIC=1 in your functions. This disables telemetry and other non-essential API calls, keeping your requests focused on what matters.
Model Performance: While you can use any model, highly capable models (like Claude Sonnet 4.5, GPT-5.2, GLM-4.7) generally provide the best experience for complex coding tasks that require strong reasoning.
For more detailed information about using Claude Code with OpenRouter, including advanced configuration with presets, check out the official OpenRouter documentation.
With these simple tweaks, you can maximize flexibility and choose the provider and model that works best for your workflow.