Model providers

Using GLM (Zhipu AI) with OpenClaw

This guide shows you how to set up Z.AI as a model provider in OpenClaw, set `zai/glm-5.1` as the default, and verify that OpenClaw can see and use the GLM family.

By the end, your agents will call GLM models via the `zai` provider with the right auth and model refs wired up.

Setup flow

Prerequisites

  • A Z.AI account with access to GLM models and an API key that starts with `sk-...`.
  • OpenClaw installed with access to the `openclaw` CLI in your shell.
  • Network access from your environment to the Z.AI endpoints that match your region and plan (global or China).

Steps

  1. Choose your Z.AI auth route and run onboarding

    OpenClaw needs credentials for Z.AI so it can reach GLM models. Which auth choice is right depends on your Z.AI plan and region; this is where many people accidentally mix up Coding vs general API or global vs China.

    Run the onboarding command so OpenClaw stores the provider config and, for `zai-api-key`, auto-detects the correct endpoint from your key.

    bash
    # Example: generic auto-detect
    openclaw onboard --auth-choice zai-api-key
    
    # Example: Coding Plan global
    openclaw onboard --auth-choice zai-coding-global
  2. Understand the available Z.AI auth choices

    Map your Z.AI subscription to the right `--auth-choice`. Using the wrong one can point OpenClaw at the wrong regional endpoint or API surface, which leads to confusing auth or model-availability errors later.

    Use this table as your quick reference when deciding which onboarding flag to use.

    text
    Auth choice        | Best for
    -------------------|-------------------------------
    `zai-api-key`      | Generic API-key setup with endpoint auto-detection
    `zai-coding-global`| Coding Plan users (global)
    `zai-coding-cn`    | Coding Plan users (China region)
    `zai-global`       | General API (global)
    `zai-cn`           | General API (China region)
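If you script your environment setup, the table above can be captured in a small shell helper. This is only a sketch: `pick_zai_auth_choice` is a hypothetical name, not part of OpenClaw.

```shell
# Map a Z.AI plan ("coding" or "api") and region ("global" or "cn")
# to the matching --auth-choice value from the table above.
pick_zai_auth_choice() {
  plan="$1"
  region="$2"
  case "$plan/$region" in
    coding/global) echo "zai-coding-global" ;;
    coding/cn)     echo "zai-coding-cn" ;;
    api/global)    echo "zai-global" ;;
    api/cn)        echo "zai-cn" ;;
    # Anything else falls back to generic endpoint auto-detection.
    *)             echo "zai-api-key" ;;
  esac
}
```

For example, `openclaw onboard --auth-choice "$(pick_zai_auth_choice coding global)"` resolves to the Coding Plan global flow.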
  3. Set a GLM model as the default for your agents

    Once onboarding succeeds, point your agents at a specific GLM model so you do not have to specify it on every call. This guide uses `zai/glm-5.1`, which is a bundled GLM ref wired through the `zai` provider.

    Run this config command so new agents automatically use GLM unless you override the model per-agent.

    bash
    openclaw config set agents.defaults.model.primary "zai/glm-5.1"
  4. Verify GLM models are available from the zai provider

    Before wiring this into a larger system, confirm that OpenClaw can list GLM models via the `zai` provider. This catches issues like bad auth, wrong region, or missing plan access early.

    If the listing includes refs such as `glm-5.1` and `glm-5`, your provider wiring is good.

    bash
    openclaw models list --provider zai
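If you want to gate CI on this check, one approach is to assert that the refs this guide relies on appear in the listing. The sketch below assumes `openclaw models list --provider zai` prints one model ref per line; `check_glm_refs` is a hypothetical helper name, not part of OpenClaw.

```shell
# Fail (return 1) if an expected GLM ref is missing from the listing text.
check_glm_refs() {
  listing="$1"
  for ref in glm-5.1 glm-5; do
    # -x matches whole lines, -F treats the ref as a fixed string.
    if ! printf '%s\n' "$listing" | grep -qxF "$ref"; then
      echo "missing expected GLM ref: $ref" >&2
      return 1
    fi
  done
  echo "zai provider wiring looks good"
}

# Usage, assuming the openclaw CLI is installed:
#   check_glm_refs "$(openclaw models list --provider zai)"
```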
  5. Configure GLM in your OpenClaw environment

    If you manage OpenClaw config as code, mirror the CLI changes in your config object. Set `ZAI_API_KEY` in `env` and point `agents.defaults.model.primary` at `zai/glm-5.1`.

    This keeps your GLM setup reproducible across environments and CI.

    json
    {
      "env": { "ZAI_API_KEY": "sk-..." },
      "agents": { "defaults": { "model": { "primary": "zai/glm-5.1" } } }
    }
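Since a typo in this file breaks every environment that consumes it, it is worth sanity-checking the JSON before committing. A minimal sketch, assuming the config lives in a file named `openclaw.config.json` (the filename and the `sk-...` value are placeholders):

```shell
# Write the config snippet to a file and confirm it parses as valid JSON.
cat > openclaw.config.json <<'EOF'
{
  "env": { "ZAI_API_KEY": "sk-..." },
  "agents": { "defaults": { "model": { "primary": "zai/glm-5.1" } } }
}
EOF

# json.tool exits non-zero on invalid JSON, so this works as a CI gate.
python3 -m json.tool openclaw.config.json >/dev/null && echo "config is valid JSON"
```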
  6. Pick the right bundled GLM model reference

    OpenClaw seeds the `zai` provider with several GLM model refs so you can swap behavior without changing providers. Pick the ref that fits your needs and set `agents.defaults.model.primary` accordingly.

    Use this table to see what GLM variants are already wired in.

    text
    Model refs
    -----------------
    `glm-5.1`
    `glm-5`
    `glm-5-turbo`
    `glm-5v-turbo`
    `glm-4.5`
    `glm-4.5-air`
    `glm-4.5-flash`
    `glm-4.5v`
    `glm-4.7`
    `glm-4.7-flash`
    `glm-4.7-flashx`
    `glm-4.6`
    `glm-4.6v`
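If you switch refs often, the choice can be scripted. This is a rough sketch only: the capability of each ref is an assumption inferred from the naming convention (`v` suggesting vision, `turbo`/`flash` suggesting lower latency), and `glm_ref_for` is a hypothetical helper name.

```shell
# Hypothetical helper: map a rough capability need to a bundled GLM ref.
# The capability of each ref is inferred from its name, not documented here.
glm_ref_for() {
  case "$1" in
    vision) echo "zai/glm-5v-turbo" ;;
    fast)   echo "zai/glm-5-turbo" ;;
    # Everything else gets the default recommended in this guide.
    *)      echo "zai/glm-5.1" ;;
  esac
}

# Then point the default at the chosen ref, e.g.:
#   openclaw config set agents.defaults.model.primary "$(glm_ref_for vision)"
```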

Configuration

Option                          | Description                                                                                  | Example
--------------------------------|----------------------------------------------------------------------------------------------|---------------
`ZAI_API_KEY`                   | Z.AI API key that OpenClaw uses to authenticate to the `zai` provider and access GLM models. | `sk-...`
`agents.defaults.model.primary` | Default model reference for your agents; set this to a GLM model via the `zai` provider so GLM is used when no model is specified explicitly. | `zai/glm-5.1`

Troubleshooting

GLM models do not appear when you run `openclaw models list --provider zai`.

This usually means your auth choice does not match your Z.AI plan/region. Re-run onboarding with the correct `--auth-choice` (for example `zai-api-key` for generic keys or `zai-coding-global` for the global Coding Plan) so OpenClaw can reach the right endpoint.

bash
openclaw onboard --auth-choice zai-api-key

Your agent calls a non-GLM model even though you expect GLM.

OpenClaw uses the configured default model when you do not specify one explicitly. Check that `agents.defaults.model.primary` is set to `zai/glm-5.1` and not another provider, then restart any long-lived processes that cache config.

bash
openclaw config set agents.defaults.model.primary "zai/glm-5.1"
