Using Amazon Bedrock with OpenClaw

This guide walks you through configuring Amazon Bedrock as a model provider in OpenClaw using the Bedrock Converse streaming API. You will wire OpenClaw into AWS credentials, enable automatic Bedrock model discovery, and register a Claude Opus inference profile as your default agent model.
By the end, your OpenClaw agents will call Bedrock models and (optionally) use Bedrock embeddings for memory search.
Prerequisites
- An AWS account with Amazon Bedrock access enabled in at least one region (for example, us-east-1).
- IAM credentials or an EC2 instance role with permissions for bedrock:InvokeModel, bedrock:InvokeModelWithResponseStream, bedrock:ListFoundationModels, and bedrock:ListInferenceProfiles (or the AmazonBedrockFullAccess managed policy).
- An OpenClaw gateway already installed and reachable, where you can run the openclaw CLI and edit its config.
- The AWS CLI installed and configured, if you plan to use the EC2 IAM role quick-setup path.
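If you prefer least privilege over the AmazonBedrockFullAccess managed policy, a minimal IAM policy covering just the actions listed above could look like the following sketch (the wildcard resource is an assumption; scope it to specific model or inference-profile ARNs if your security posture requires it):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "OpenClawBedrockMinimal",
      "Effect": "Allow",
      "Action": [
        "bedrock:InvokeModel",
        "bedrock:InvokeModelWithResponseStream",
        "bedrock:ListFoundationModels",
        "bedrock:ListInferenceProfiles"
      ],
      "Resource": "*"
    }
  ]
}
```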
Steps
1. Export AWS credentials for Bedrock on the gateway host

Start by making sure the OpenClaw gateway host has AWS credentials that the AWS SDK default chain can pick up. This step is critical because Bedrock auth does not use an API key; OpenClaw relies entirely on AWS env vars, shared config, or instance roles.

If you skip this, discovery and model calls will fail even if your OpenClaw config looks correct.

```bash
export AWS_ACCESS_KEY_ID="AKIA..."
export AWS_SECRET_ACCESS_KEY="..."
export AWS_REGION="us-east-1"

# Optional:
export AWS_SESSION_TOKEN="..."
export AWS_PROFILE="your-profile"

# Optional (Bedrock API key/bearer token):
export AWS_BEARER_TOKEN_BEDROCK="..."
```
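If you would rather keep credentials out of shell profiles, the same AWS SDK default chain also reads the standard shared config files. A sketch (the profile name is just an example):

```ini
# ~/.aws/credentials
[default]
aws_access_key_id = AKIA...
aws_secret_access_key = ...

# ~/.aws/config
[default]
region = us-east-1
```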
2. Add the Amazon Bedrock provider and a model to OpenClaw

Next, register the amazon-bedrock provider in your OpenClaw config and point it at the Bedrock Converse streaming API. You do not set an apiKey here because auth flows through the AWS SDK; instead you define the baseUrl, api type, and at least one model, then make that model the default for your agents.

This gives you a stable, named model route even if you later enable automatic discovery.

```json
{
  models: {
    providers: {
      "amazon-bedrock": {
        baseUrl: "https://bedrock-runtime.us-east-1.amazonaws.com",
        api: "bedrock-converse-stream",
        auth: "aws-sdk",
        models: [
          {
            id: "us.anthropic.claude-opus-4-6-v1:0",
            name: "Claude Opus 4.6 (Bedrock)",
            reasoning: true,
            input: ["text", "image"],
            cost: { input: 0, output: 0, cacheRead: 0, cacheWrite: 0 },
            contextWindow: 200000,
            maxTokens: 8192,
          },
        ],
      },
    },
  },
  agents: {
    defaults: {
      model: { primary: "amazon-bedrock/us.anthropic.claude-opus-4-6-v1:0" },
    },
  },
}
```
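You can register more than one model under the same provider and route to each by its id. A hedged sketch of a second entry (the model ID and its capability fields here are illustrative; verify the exact ID available in your region before using it):

```json
models: [
  {
    id: "us.anthropic.claude-opus-4-6-v1:0",
    name: "Claude Opus 4.6 (Bedrock)",
    reasoning: true,
    input: ["text", "image"],
    contextWindow: 200000,
    maxTokens: 8192,
  },
  {
    // Illustrative second entry; check availability in your region
    id: "anthropic.claude-3-5-sonnet-20240620-v1:0",
    name: "Claude 3.5 Sonnet (Bedrock)",
    input: ["text", "image"],
    contextWindow: 200000,
    maxTokens: 8192,
  },
],
```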
3. Enable automatic Bedrock model discovery

Turn on discovery so OpenClaw can list Bedrock foundation models and inference profiles that support streaming and text output. Discovery uses bedrock:ListFoundationModels and bedrock:ListInferenceProfiles and caches results, so you get an up-to-date model catalog without hardcoding every ID.

This is especially useful when AWS adds new models or profiles in your region.

```json
{
  plugins: {
    entries: {
      "amazon-bedrock": {
        config: {
          discovery: {
            enabled: true,
            region: "us-east-1",
            providerFilter: ["anthropic", "amazon"],
            refreshInterval: 3600,
            defaultContextWindow: 32000,
            defaultMaxTokens: 4096,
          },
        },
      },
    },
  },
}
```
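The refreshInterval setting means discovery results are served from cache until the interval elapses, then refetched. The pattern can be illustrated with a small TTL cache sketch (purely illustrative, not OpenClaw's actual implementation):

```python
import time

class DiscoveryCache:
    """Illustrative TTL cache mirroring a discovery refreshInterval."""
    def __init__(self, refresh_interval, fetch, clock=time.monotonic):
        self.refresh_interval = refresh_interval  # seconds results stay fresh
        self.fetch = fetch                        # callable returning the model list
        self.clock = clock                        # injectable clock for testing
        self._models = None
        self._fetched_at = None

    def models(self):
        now = self.clock()
        stale = self._fetched_at is None or now - self._fetched_at >= self.refresh_interval
        if stale:
            self._models = self.fetch()           # re-list foundation models/profiles
            self._fetched_at = now
        return self._models

# Usage: a fake clock shows the cache refetching only after the interval.
calls = []
fake_now = [0.0]
cache = DiscoveryCache(3600, lambda: calls.append(1) or ["model-a"],
                       clock=lambda: fake_now[0])
cache.models()
cache.models()        # within the interval: served from cache, no second fetch
fake_now[0] = 3600.0
cache.models()        # interval elapsed: refetch
print(len(calls))     # → 2
```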
4. Configure EC2 instance roles for Bedrock (optional but recommended on AWS)

If you run OpenClaw on EC2, use an IAM role instead of long-lived access keys. This step creates an EC2 role with AmazonBedrockFullAccess, attaches it to your instance, and then enables discovery with the openclaw CLI.

It keeps credentials off disk and lets the AWS SDK use IMDS, while still giving OpenClaw the env markers it needs for auto mode if you want them.

```bash
# 1. Create IAM role and instance profile
aws iam create-role --role-name EC2-Bedrock-Access \
  --assume-role-policy-document '{
    "Version": "2012-10-17",
    "Statement": [{
      "Effect": "Allow",
      "Principal": {"Service": "ec2.amazonaws.com"},
      "Action": "sts:AssumeRole"
    }]
  }'
aws iam attach-role-policy --role-name EC2-Bedrock-Access \
  --policy-arn arn:aws:iam::aws:policy/AmazonBedrockFullAccess
aws iam create-instance-profile --instance-profile-name EC2-Bedrock-Access
aws iam add-role-to-instance-profile \
  --instance-profile-name EC2-Bedrock-Access \
  --role-name EC2-Bedrock-Access

# 2. Attach to your EC2 instance
aws ec2 associate-iam-instance-profile \
  --instance-id i-xxxxx \
  --iam-instance-profile Name=EC2-Bedrock-Access

# 3. On the EC2 instance, enable discovery explicitly
openclaw config set plugins.entries.amazon-bedrock.config.discovery.enabled true
openclaw config set plugins.entries.amazon-bedrock.config.discovery.region us-east-1

# 4. Optional: add an env marker if you want auto mode without explicit enable
echo 'export AWS_PROFILE=default' >> ~/.bashrc
echo 'export AWS_REGION=us-east-1' >> ~/.bashrc
source ~/.bashrc

# 5. Verify models are discovered
openclaw models list
```
5. Enable Bedrock embeddings for memory search

Finally, wire Bedrock into OpenClaw's memory search so your agents use Bedrock embeddings for retrieval. This is configured separately from inference and uses the same AWS SDK credential chain, so you do not need any extra keys.

Start with the default Amazon Titan Embed model and adjust later if you need different dimensions or providers.

```json
{
  agents: {
    defaults: {
      memorySearch: {
        provider: "bedrock",
        model: "amazon.titan-embed-text-v2:0", // default
      },
    },
  },
}
```
Configuration
| Option | Description | Example |
|---|---|---|
| AWS_ACCESS_KEY_ID | Access key ID used by the AWS SDK default credential chain for Bedrock auth when not using an instance role. | AKIAIOSFODNN7EXAMPLE |
| AWS_SECRET_ACCESS_KEY | Secret access key paired with AWS_ACCESS_KEY_ID for Bedrock authentication. | wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY |
| AWS_REGION | Primary AWS region for Bedrock calls and discovery when not overridden in the discovery config. | us-east-1 |
| AWS_SESSION_TOKEN | Optional session token for temporary AWS credentials when using STS or federated access. | IQoJb3JpZ2luX2VjEOr//////////wEaCXVzLWVhc3QtMSJHMEUCIQD... |
| AWS_PROFILE | Optional named profile that acts as an AWS auth marker for auto mode and can be used by the AWS SDK shared config. | default |
| AWS_BEARER_TOKEN_BEDROCK | Optional Bedrock API key or bearer token that OpenClaw uses as a high-priority credential source marker. | bedrock-bearer-token-abc123 |
| plugins.entries.amazon-bedrock.config.discovery.enabled | Controls whether OpenClaw performs automatic Bedrock model and inference profile discovery; auto mode when unset, explicit on/off when set. | true |
| plugins.entries.amazon-bedrock.config.discovery.region | Region used for Bedrock discovery when you want to override AWS_REGION or AWS_DEFAULT_REGION. | us-east-1 |
| plugins.entries.amazon-bedrock.config.discovery.providerFilter | List of Bedrock provider names to include during discovery, such as anthropic or amazon. | ["anthropic", "amazon"] |
| plugins.entries.amazon-bedrock.config.discovery.refreshInterval | Number of seconds that discovery results are cached before OpenClaw refreshes the Bedrock model list. | 3600 |
| plugins.entries.amazon-bedrock.config.discovery.defaultContextWindow | Default context window size applied to discovered models when OpenClaw does not know the exact limit. | 32000 |
| plugins.entries.amazon-bedrock.config.discovery.defaultMaxTokens | Default maxTokens value applied to discovered models when their limit is unknown. | 4096 |
| plugins.entries.amazon-bedrock.config.guardrail.guardrailIdentifier | Identifier or ARN of the Amazon Bedrock Guardrail to apply to all Bedrock invocations. | abc123 |
| plugins.entries.amazon-bedrock.config.guardrail.guardrailVersion | Guardrail version or DRAFT that Bedrock should use when applying the guardrail. | 1 |
| plugins.entries.amazon-bedrock.config.guardrail.streamProcessingMode | Whether guardrail evaluation runs synchronously or asynchronously during streaming. | sync |
| plugins.entries.amazon-bedrock.config.guardrail.trace | Controls whether guardrail trace output is included in API responses for debugging. | enabled_full |
| agents.defaults.memorySearch.provider | Sets Bedrock as the embedding provider for memory search. | bedrock |
| agents.defaults.memorySearch.model | Specifies which Bedrock embedding model to use for memory search. | amazon.titan-embed-text-v2:0 |
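Pulling the table's options together, a combined config touching the provider, discovery, guardrail, and memory-search blocks might look like the following sketch (values are the examples from the table above, not recommendations):

```json
{
  models: {
    providers: {
      "amazon-bedrock": {
        baseUrl: "https://bedrock-runtime.us-east-1.amazonaws.com",
        api: "bedrock-converse-stream",
        auth: "aws-sdk",
        models: [/* at least one model entry, as in step 2 */],
      },
    },
  },
  plugins: {
    entries: {
      "amazon-bedrock": {
        config: {
          discovery: { enabled: true, region: "us-east-1", refreshInterval: 3600 },
          guardrail: { guardrailIdentifier: "abc123", guardrailVersion: "1" },
        },
      },
    },
  },
  agents: {
    defaults: {
      model: { primary: "amazon-bedrock/us.anthropic.claude-opus-4-6-v1:0" },
      memorySearch: { provider: "bedrock", model: "amazon.titan-embed-text-v2:0" },
    },
  },
}
```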
Troubleshooting
openclaw models list shows no Amazon Bedrock models even though the provider is configured.
Automatic discovery only runs when discovery is explicitly enabled or when OpenClaw sees an AWS auth env marker. Either set discovery.enabled to true and configure a region, or export AWS_BEARER_TOKEN_BEDROCK, AWS_ACCESS_KEY_ID/AWS_SECRET_ACCESS_KEY, or AWS_PROFILE so the implicit provider activates.
```bash
openclaw config set plugins.entries.amazon-bedrock.config.discovery.enabled true
openclaw config set plugins.entries.amazon-bedrock.config.discovery.region us-east-1
```

Bedrock calls fail even though discovery works when running on EC2 with an instance role.
Discovery can use env markers while runtime auth still relies on the AWS SDK default chain and IMDS. Make sure the EC2 instance role has bedrock:InvokeModel and bedrock:InvokeModelWithResponseStream (or AmazonBedrockFullAccess) and that you did not rely on a fake API key, since OpenClaw does not use one for Bedrock.
Guardrails configuration is ignored and responses are not filtered.
Guardrails only apply when you configure the guardrail block under the amazon-bedrock plugin and the IAM principal has bedrock:ApplyGuardrail. Double-check guardrailIdentifier and guardrailVersion values and ensure the role or user includes bedrock:ApplyGuardrail in addition to the standard invoke permissions.
```json
{
  plugins: {
    entries: {
      "amazon-bedrock": {
        config: {
          guardrail: {
            guardrailIdentifier: "abc123", // guardrail ID or full ARN
            guardrailVersion: "1", // version number or "DRAFT"
            streamProcessingMode: "sync", // optional: "sync" or "async"
            trace: "enabled", // optional: "enabled", "disabled", or "enabled_full"
          },
        },
      },
    },
  },
}
```