Using Claude Max API Proxy with OpenClaw


This guide shows you how to run the Claude Max API Proxy locally and wire it into OpenClaw as an OpenAI-compatible model provider. You will use your Claude Max or Pro subscription through the proxy instead of per-token Anthropic API billing.

By the end, your OpenClaw agents will call models like `openai/claude-opus-4` via `http://localhost:3456/v1` using your existing subscription.

Setup flow

Prerequisites

  • Node.js 20 or later installed on your machine (the proxy requires Node.js 20+).
  • Claude Code CLI installed and authenticated with an active Claude Max or Pro subscription.
  • An OpenClaw project where you can edit the environment and agent model configuration.
  • Ability to run a local service on http://localhost:3456.

Steps

  1.

    Install the Claude Max API Proxy globally

    Install the proxy as a global npm package so you can run the `claude-max-api` command from anywhere. This binary is what exposes your Claude Max/Pro subscription as an OpenAI-compatible endpoint on localhost.

    bash
    # Requires Node.js 20+ and Claude Code CLI
    npm install -g claude-max-api-proxy
    
    # Verify Claude CLI is authenticated
    claude --version
  2.

    Start the Claude Max API Proxy server

    Run the proxy so it listens for OpenAI-format requests from your tools and from OpenClaw. By default it binds to `http://localhost:3456`, which you will reference later in your OpenClaw config.

    bash
    claude-max-api
    # Server runs at http://localhost:3456
  3.

    Smoke-test the proxy with curl

    Before you involve OpenClaw, hit the proxy directly to confirm it is healthy and can talk to Claude via your subscription. Use the health check, model listing, and a simple chat completion to verify everything works end to end.

    bash
    # Health check
    curl http://localhost:3456/health
    
    # List models
    curl http://localhost:3456/v1/models
    
    # Chat completion
    curl http://localhost:3456/v1/chat/completions \
      -H "Content-Type: application/json" \
      -d '{
        "model": "claude-opus-4",
        "messages": [{"role": "user", "content": "Hello!"}]
      }'
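The same smoke test can be driven from Python's standard library. A minimal sketch: the payload below is the standard OpenAI chat-completions format, and actually calling `chat()` assumes the proxy is running on `localhost:3456`.

```python
import json
from urllib import request

# OpenAI-format chat completion payload; the proxy maps "claude-opus-4"
# to a Claude model behind the scenes.
payload = {
    "model": "claude-opus-4",
    "messages": [{"role": "user", "content": "Hello!"}],
}

def chat(base_url="http://localhost:3456/v1"):
    """Send the payload to the proxy. Requires `claude-max-api` to be running."""
    req = request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)

# Inspect the request body without sending it:
print(json.dumps(payload, indent=2))
```

Any OpenAI-compatible client can send the same request shape; that is what OpenClaw does in the next step.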
  4.

    Point OpenClaw at the Claude Max API Proxy

    Configure OpenClaw to treat the proxy as a custom OpenAI-compatible backend. You set a dummy `OPENAI_API_KEY`, point `OPENAI_BASE_URL` at the proxy’s `/v1` route, and set the default agent model to `openai/claude-opus-4` so OpenClaw routes traffic through the proxy.

    json
    {
      "env": {
        "OPENAI_API_KEY": "not-needed",
        "OPENAI_BASE_URL": "http://localhost:3456/v1"
      },
      "agents": {
        "defaults": {
          "model": { "primary": "openai/claude-opus-4" }
        }
      }
    }
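If your OpenClaw config file is strict JSON, keys must be quoted and trailing commas removed. A quick way to sanity-check an edit is to round-trip it through Python's `json` module; a small sketch with the same values:

```python
import json

config_text = """
{
  "env": {
    "OPENAI_API_KEY": "not-needed",
    "OPENAI_BASE_URL": "http://localhost:3456/v1"
  },
  "agents": {
    "defaults": {
      "model": { "primary": "openai/claude-opus-4" }
    }
  }
}
"""

# json.loads raises ValueError on unquoted keys or trailing commas
config = json.loads(config_text)
print(config["env"]["OPENAI_BASE_URL"])
print(config["agents"]["defaults"]["model"]["primary"])
```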
  5.

    Run the proxy automatically on macOS (optional)

    If you develop on macOS and want the proxy always available for OpenClaw, set it up as a LaunchAgent so macOS starts the proxy and keeps it alive in the background whenever you log in. Adjust the `node` and package paths in the plist to match your installation (for example, Homebrew on Apple Silicon installs under `/opt/homebrew`).

    bash
    cat > ~/Library/LaunchAgents/com.claude-max-api.plist << EOF
    <?xml version="1.0" encoding="UTF-8"?>
    <!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
    <plist version="1.0">
    <dict>
      <key>Label</key>
      <string>com.claude-max-api</string>
      <key>RunAtLoad</key>
      <true/>
      <key>KeepAlive</key>
      <true/>
      <key>ProgramArguments</key>
      <array>
        <string>/usr/local/bin/node</string>
        <string>/usr/local/lib/node_modules/claude-max-api-proxy/dist/server/standalone.js</string>
      </array>
      <key>EnvironmentVariables</key>
      <dict>
        <key>PATH</key>
        <string>/usr/local/bin:/opt/homebrew/bin:$HOME/.local/bin:/usr/bin:/bin</string>
      </dict>
    </dict>
    </plist>
    EOF
    
    launchctl bootstrap gui/$(id -u) ~/Library/LaunchAgents/com.claude-max-api.plist
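The plist can also be generated programmatically instead of via a heredoc; a sketch using Python's standard `plistlib`. The `node` and script paths are the same assumptions as above and may differ on your machine; launchd does not expand `~`, so the PATH is built as an absolute string.

```python
import os
import plistlib

home = os.path.expanduser("~")
agent = {
    "Label": "com.claude-max-api",
    "RunAtLoad": True,
    "KeepAlive": True,
    "ProgramArguments": [
        "/usr/local/bin/node",  # adjust if node lives elsewhere (e.g. /opt/homebrew/bin)
        "/usr/local/lib/node_modules/claude-max-api-proxy/dist/server/standalone.js",
    ],
    "EnvironmentVariables": {
        # launchd does not expand ~, so embed an absolute home directory
        "PATH": f"/usr/local/bin:/opt/homebrew/bin:{home}/.local/bin:/usr/bin:/bin",
    },
}

plist_bytes = plistlib.dumps(agent)
# On macOS, write this to ~/Library/LaunchAgents/com.claude-max-api.plist
print(plist_bytes.decode())
```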

Configuration

  • `OPENAI_API_KEY`: Dummy API key value required by OpenClaw when using the OpenAI-compatible proxy; the proxy does not validate it. Example: `not-needed`
  • `OPENAI_BASE_URL`: Base URL where OpenClaw sends OpenAI-format requests, pointing at the Claude Max API Proxy. Example: `http://localhost:3456/v1`
  • `agents.defaults.model.primary`: Default model identifier OpenClaw uses for agents, mapped here to the Claude Max API Proxy via the OpenAI provider. Example: `openai/claude-opus-4`
  • `model`: Model ID you send to the proxy in OpenAI-format requests; the proxy maps it to a Claude 4 family model. Example: `claude-opus-4`

Troubleshooting

curl http://localhost:3456/health fails or connection is refused

The proxy server is not running or not bound to the default port. Start it in a terminal with `claude-max-api` so it listens on `http://localhost:3456` before you send health checks or let OpenClaw connect.

bash
claude-max-api
# Server runs at http://localhost:3456
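To distinguish "proxy not running" from other failures, a small readiness probe helps; a sketch using only the standard library (the `/health` route is the one shown above):

```python
from urllib import error, request

def proxy_up(url="http://localhost:3456/health", timeout=2.0):
    """Return True if the proxy answers its health check, False if unreachable."""
    try:
        with request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (error.URLError, OSError):
        return False

if not proxy_up():
    print("Proxy is not reachable; start it with `claude-max-api`.")
```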

Chat completion calls hang or fail when using OpenClaw with the proxy

OpenClaw only talks to the proxy if `OPENAI_BASE_URL` points at the `/v1` route and the model name matches what the proxy exposes. Double-check that `OPENAI_BASE_URL` is set to `"http://localhost:3456/v1"` and that your agent model is `"openai/claude-opus-4"` or another available model.

json
{
  "env": {
    "OPENAI_API_KEY": "not-needed",
    "OPENAI_BASE_URL": "http://localhost:3456/v1"
  },
  "agents": {
    "defaults": {
      "model": { "primary": "openai/claude-opus-4" }
    }
  }
}
