OpenClaw MCP Server Details

OpenClaw MCP Server is a secure Model Context Protocol (MCP) bridge that connects Claude.ai with a self-hosted OpenClaw assistant. It provides OAuth2 authentication and safe, controlled communication between the Claude AI ecosystem and your local or hosted OpenClaw deployment. The server acts as an orchestration layer: it exposes MCP tools to Claude.ai, manages authentication, and enforces security boundaries such as CORS restrictions and transport options. It is designed to be deployed via Docker or run locally, with detailed installation, configuration, and security guidance in the documentation. As a bridge, it lets Claude.ai delegate tasks to your OpenClaw bot while you retain control over data flow and access, in line with the MCP specification and security best practices.

Use Case

The MCP server enables Claude.ai to securely interact with a self-hosted OpenClaw assistant. It exposes a set of tools that Claude can invoke, including synchronous chat and status checks, as well as asynchronous long-running tasks for more complex operations. It supports OAuth2 authentication, CORS configuration, and transport options (such as SSE) to fit your deployment scenario. Example usage includes running the server with Docker Compose, starting a local development instance, or running remotely behind a reverse proxy with proper issuer URL configuration to ensure OAuth metadata is advertised correctly. The server also supports adding more tools over time and provides integration guides for Claude Desktop and Claude.ai configurations.
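Under the hood, MCP tools are invoked over JSON-RPC 2.0. As a rough sketch of what a client such as Claude.ai sends to the bridge (the tool name "chat" and its arguments below are illustrative assumptions, not confirmed names from the docs — consult the server's tools/list response for the real ones):

```python
import json

# JSON-RPC 2.0 envelope an MCP client sends to enumerate the tools
# this bridge exposes (method names per the MCP specification).
list_tools = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}

# Invoking a tool. "chat" is a hypothetical tool name used for
# illustration only.
call_tool = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {"name": "chat", "arguments": {"message": "Hello from Claude"}},
}

print(json.dumps(list_tools))
print(json.dumps(call_tool))
```

In practice the MCP client library builds these envelopes for you; they are shown here only to make the wire protocol concrete.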

Key starting points from the docs include:

  • Docker deployment and environment variables to configure routing and authentication.

  • Local Claude Desktop integration with a JSON config.

  • Remote Claude.ai usage with OAuth2 and CORS settings.
  • Code examples from the documentation:

    Docker (Recommended) snippet for docker-compose:

    services:
      mcp-bridge:
        image: ghcr.io/freema/openclaw-mcp:latest
        container_name: openclaw-mcp
        restart: unless-stopped
        ports:
          - "3000:3000"
        environment:
          - OPENCLAW_URL=http://host.docker.internal:18789
          - OPENCLAW_GATEWAY_TOKEN=${OPENCLAW_GATEWAY_TOKEN}
          - AUTH_ENABLED=true
          - MCP_CLIENT_ID=openclaw
          - MCP_CLIENT_SECRET=${MCP_CLIENT_SECRET}
          - MCP_ISSUER_URL=${MCP_ISSUER_URL:-}
          - CORS_ORIGINS=https://claude.ai
        extra_hosts:
          - "host.docker.internal:host-gateway"
        read_only: true
        security_opt:
          - no-new-privileges

  • Generate secrets and start:

    export MCP_CLIENT_SECRET=$(openssl rand -hex 32)
    export OPENCLAW_GATEWAY_TOKEN=your-gateway-token
    docker compose up -d
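If openssl is unavailable, the client secret can be generated with any cryptographically secure random source; for example, the Python below is equivalent in output shape to `openssl rand -hex 32`:

```python
import secrets

# 32 random bytes rendered as 64 lowercase hex characters,
# matching the output of `openssl rand -hex 32`.
client_secret = secrets.token_hex(32)
print(client_secret)
```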

    Then in Claude.ai add a custom MCP connector pointing to your server with MCP_CLIENT_ID=openclaw and your MCP_CLIENT_SECRET.

    Tip: Pin a specific version instead of latest for production: ghcr.io/freema/openclaw-mcp:1.1.0.

    Local (Claude Desktop) usage:

    npx openclaw-mcp

    Claude Desktop config example:

    {
      "mcpServers": {
        "openclaw": {
          "command": "npx",
          "args": ["openclaw-mcp"],
          "env": {
            "OPENCLAW_URL": "http://127.0.0.1:18789",
            "OPENCLAW_GATEWAY_TOKEN": "your-gateway-token",
            "OPENCLAW_TIMEOUT_MS": "300000"
          }
        }
      }
    }

    Remote (Claude.ai) without Docker:

    AUTH_ENABLED=true MCP_CLIENT_ID=openclaw MCP_CLIENT_SECRET=your-secret \
    MCP_ISSUER_URL=https://mcp.your-domain.com \
    CORS_ORIGINS=https://claude.ai OPENCLAW_GATEWAY_TOKEN=your-gateway-token \
    npx openclaw-mcp --transport sse --port 3000

    > Important: When running behind a reverse proxy (Caddy, nginx, etc.), you must set MCP_ISSUER_URL (or --issuer-url) to your public HTTPS URL. Without this, OAuth metadata will advertise http://localhost:3000 and clients will fail to authenticate.
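OAuth authorization-server metadata is conventionally served at /.well-known/oauth-authorization-server (RFC 8414), and the failure mode described above shows up as a wrong "issuer" field there. A minimal sanity check can be sketched as follows — the JSON payloads are illustrative, not the server's exact output:

```python
import json

def check_issuer(metadata_json: str, expected_issuer: str) -> bool:
    """Return True if the advertised OAuth issuer matches the public URL.

    A server misconfigured behind a reverse proxy typically advertises
    http://localhost:3000 here, which clients will reject.
    """
    metadata = json.loads(metadata_json)
    return metadata.get("issuer") == expected_issuer

# Illustrative metadata payloads:
good = '{"issuer": "https://mcp.your-domain.com"}'
bad = '{"issuer": "http://localhost:3000"}'

print(check_issuer(good, "https://mcp.your-domain.com"))  # True
print(check_issuer(bad, "https://mcp.your-domain.com"))   # False
```

In a live deployment you would fetch the metadata with curl from your public HTTPS URL and inspect the issuer the same way.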

    Available Tools (9)


    Installation Guide

    Step-by-step installation instructions, with the actual commands from the documentation.

    Docker (Recommended)

  • Pull the image: docker pull ghcr.io/freema/openclaw-mcp:latest

  • Create docker-compose.yml as shown in the docs.

  • Generate secrets and start the container:

    export MCP_CLIENT_SECRET=$(openssl rand -hex 32)
    export OPENCLAW_GATEWAY_TOKEN=your-gateway-token
    docker compose up -d

    Local (Claude Desktop)

  • Run: npx openclaw-mcp

  • Claude Desktop config (example):

  • {
    "mcpServers": {
    "openclaw": {
    "command": "npx",
    "args": ["openclaw-mcp"],
    "env": {
    "OPENCLAW_URL": "http://127.0.0.1:18789",
    "OPENCLAW_GATEWAY_TOKEN": "your-gateway-token",
    "OPENCLAW_TIMEOUT_MS": "300000"
    }
    }
    }
    }

    Remote (Claude.ai) without Docker

  • Run with environment variables and flags as shown:

    AUTH_ENABLED=true MCP_CLIENT_ID=openclaw MCP_CLIENT_SECRET=your-secret \
    MCP_ISSUER_URL=https://mcp.your-domain.com \
    CORS_ORIGINS=https://claude.ai OPENCLAW_GATEWAY_TOKEN=your-gateway-token \
    npx openclaw-mcp --transport sse --port 3000


    Important Notes

    ⚠️ Always enable authentication in production!

    Configure CORS to restrict access:

    CORS_ORIGINS=https://claude.ai,https://your-app.com

    Prerequisites

    Requirements before using this MCP server (Node.js version, API keys, etc.), from the docs:

  • Node.js ≥ 20

  • OpenClaw gateway running with its HTTP API enabled:

    // openclaw.json
    { "gateway": { "http": { "endpoints": { "chatCompletions": { "enabled": true } } } } }

    Details

    Last Updated: 3/14/2026
    Website: github.com
    Source: GitHub

    Similar MCP Tools

    6 related tools
    Anki MCP Server

    A Model Context Protocol (MCP) server that enables AI assistants to interact with Anki, the spaced repetition flashcard application. The Anki MCP Server allows AI models to access Anki's card data, enabling features like automated flashcard creation, review, and management.

    1MCP Agent

    A unified Model Context Protocol server implementation that aggregates multiple MCP servers into one. The 1mcp-app/agent is an open-source project that provides a single entry point for multiple MCP servers, making it easier to manage and interact with various AI models and tools.

    Roundtable AI MCP Server

    Roundtable AI MCP Server is a zero-configuration local MCP server that unifies multiple AI coding assistants (Codex, Claude Code, Cursor, Gemini) through intelligent auto-discovery and a standardized interface. It coordinates specialized sub-agents from within your IDE to solve engineering problems in parallel, sharing context and synthesizing responses into a single, high-quality output. This documentation details installation, available MCP tools, integration with popular IDEs, and a broad ecosystem of specialized tools and CLIs that can be invoked as part of a roundtable-powered workflow, enabling developers to delegate tasks to the right AI for each facet of a problem without leaving their development environment.

    MCPJungle

    MCPJungle is a self-hosted MCP Gateway and Registry for AI agents. It serves as a central registry and gateway to manage Model Context Protocol (MCP) servers and the tools they expose. By consolidating MCP server registration, tool discovery, and access control, MCPJungle enables AI agents and clients to discover, group, and securely invoke tools from a single, unified gateway. The project provides a CLI, Docker-based deployment options, and enterprise-ready features such as tool grouping, access control, and observability to streamline MCP-based workflows across organizations.

    mcpmcp-server

    mcpmcp-server is a focused solution for discovering, setting up, and integrating MCP servers with your favorite clients to unlock AI-powered workflows. It streamlines how you connect MCP-powered servers to popular clients, enabling seamless AI-assisted interactions across your daily tools. The project emphasizes an approachable, config-driven approach to linking MCP servers with clients like Claude Desktop, while directing you to the homepage for variations across apps and platforms. This README highlights a practical JSON configuration example and notes on supported environments, helping you get started quickly and confidently.

    Imagen3-MCP

    Imagen3-MCP is an image generation service based on Google's Imagen 3.0 that exposes its functionality through MCP (Model Control Protocol). The project provides a server to run a local MCP service that accesses Google Gemini-powered image generation, enabling developers to integrate advanced image synthesis into their applications. The documentation covers prerequisites (Gemini API key), installation steps for Cherry Studio, and a Cursor-based JSON configuration example for embedding the MCP server in broader tooling. This MCP is designed to be deployment-friendly, with configurable environment variables and optional proxy settings to adapt to various network environments.