Using Volcengine (Doubao) with OpenClaw


This guide walks you through configuring Volcengine (Doubao) as a model provider in OpenClaw, including both general and coding endpoints. You will wire up your `VOLCANO_ENGINE_API_KEY`, register the `volcengine` and `volcengine-plan` providers, and set a Doubao-based coding model as your default.

By the end, your OpenClaw agents will be able to call Doubao and other Volcano Engine–hosted models through OpenAI-compatible APIs.

Setup flow

Prerequisites

  • An existing OpenClaw installation with the `openclaw` CLI available in your shell.
  • A valid Volcengine (Volcano Engine) API key that you can expose as `VOLCANO_ENGINE_API_KEY`.
  • Network access from your OpenClaw environment to `ark.cn-beijing.volces.com` for both `/api/v3` and `/api/coding/v3` endpoints.

Steps

  1. Export your Volcengine API key for OpenClaw

    Before you run any onboarding commands, make sure your Volcengine API key is available as `VOLCANO_ENGINE_API_KEY`. OpenClaw’s non-interactive onboarding flow reads this variable when you pass `--volcengine-api-key "$VOLCANO_ENGINE_API_KEY"`, and daemon processes also rely on it being set in a non-interactive environment.

    bash
    export VOLCANO_ENGINE_API_KEY="sk-volc-..."
  2. Run interactive onboarding for Volcengine

    Use the OpenClaw CLI to register Volcengine as a provider with an interactive prompt. This command wires up both the general `volcengine` and coding `volcengine-plan` providers from a single API key, so you can use Doubao and coding models without separate setup.

    bash
    openclaw onboard --auth-choice volcengine-api-key
  3. Set a default Volcengine coding model for your agents

    Configure your agent defaults so OpenClaw uses a Volcengine coding model out of the box. The example below sets `volcengine-plan/ark-code-latest` as the primary model, which matches the default that onboarding currently applies and targets the coding endpoint.

    json
    {
      "agents": {
        "defaults": {
          "model": { "primary": "volcengine-plan/ark-code-latest" }
        }
      }
    }
  4. Verify Volcengine models are registered

    After onboarding and config, list the models for both providers to confirm OpenClaw can see the Volcengine catalogs. You should see general models under `volcengine/*` and coding models under `volcengine-plan/*`, which confirms both endpoints are wired correctly.

    bash
    openclaw models list --provider volcengine
    openclaw models list --provider volcengine-plan
  5. Automate Volcengine onboarding in CI or scripts

    For non-interactive environments like CI pipelines or container entrypoints, use the non-interactive onboarding mode. This command registers both providers using the `VOLCANO_ENGINE_API_KEY` you pass explicitly, and avoids any prompts that would hang a headless run.

    bash
    openclaw onboard --non-interactive \
      --mode local \
      --auth-choice volcengine-api-key \
      --volcengine-api-key "$VOLCANO_ENGINE_API_KEY"
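In headless runs it helps to fail fast when the secret is missing, rather than letting onboarding hang or fail later with a less obvious error. A minimal preflight sketch (the `preflight` helper and the fallback placeholder key are illustrative, not part of the OpenClaw CLI):

```shell
# Preflight sketch for CI: confirm the key is set before invoking onboarding.
# The fallback placeholder keeps the sketch runnable standalone; real pipelines
# should inject VOLCANO_ENGINE_API_KEY as a secret instead.
VOLCANO_ENGINE_API_KEY="${VOLCANO_ENGINE_API_KEY:-sk-volc-example}"

preflight() {
  if [ -z "${VOLCANO_ENGINE_API_KEY:-}" ]; then
    echo "error: VOLCANO_ENGINE_API_KEY is not set" >&2
    return 1
  fi
  echo "ok"
}

preflight
```

Run the preflight before `openclaw onboard --non-interactive ...` so a missing secret stops the job immediately with a clear message.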

Configuration

  • `VOLCANO_ENGINE_API_KEY`: API key used by OpenClaw to authenticate with Volcengine for both general (`volcengine`) and coding (`volcengine-plan`) endpoints. Example: `sk-volc-1234567890abcdef`
  • `agents.defaults.model.primary`: Default primary model reference for your agents; can point at a Volcengine coding model like `volcengine-plan/ark-code-latest`. Example: `volcengine-plan/ark-code-latest`
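The two options above can be set together in a bootstrap script: export the key for the current shell and write the default-model setting as JSON. The path `./openclaw-config.json` below is illustrative only; use your actual OpenClaw config location.

```shell
# Bootstrap sketch: export the key and write the default-model config.
# The example key value mirrors the table above; the config path is an assumption.
export VOLCANO_ENGINE_API_KEY="sk-volc-1234567890abcdef"

cat > ./openclaw-config.json <<'EOF'
{
  "agents": {
    "defaults": {
      "model": { "primary": "volcengine-plan/ark-code-latest" }
    }
  }
}
EOF
```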

Troubleshooting

Your agent calls fail when running under systemd, but work in your interactive shell.

When the Gateway runs as a daemon, it does not inherit environment variables from your interactive shell, so `VOLCANO_ENGINE_API_KEY` is missing. Set the variable in the daemon's environment, or load it via `shellEnv` as described in the daemon note.
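One common fix, assuming the Gateway runs under systemd (the unit name `openclaw-gateway.service` and drop-in path here are assumptions; adjust to your deployment), is a drop-in that sets the variable for the service:

```ini
# /etc/systemd/system/openclaw-gateway.service.d/override.conf
# Unit name and path are assumptions; adjust to your deployment.
[Service]
Environment=VOLCANO_ENGINE_API_KEY=sk-volc-...
```

After adding the drop-in, run `systemctl daemon-reload` and restart the service so the new environment takes effect.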

The model picker does not show any Volcengine models during onboarding.

The Volcengine auth choice prefers `volcengine/*` and `volcengine-plan/*` rows, but if those models are not loaded yet, OpenClaw falls back to the unfiltered catalog instead of showing an empty provider-scoped picker. Scroll the full catalog or run `openclaw models list --provider volcengine` and `openclaw models list --provider volcengine-plan` to confirm the models are available.

bash
openclaw models list --provider volcengine
openclaw models list --provider volcengine-plan
