MCPJungle Details

MCPJungle is a self-hosted MCP Gateway and Registry for AI agents. It serves as a central registry and gateway to manage Model Context Protocol (MCP) servers and the tools they expose. By consolidating MCP server registration, tool discovery, and access control, MCPJungle enables AI agents and clients to discover, group, and securely invoke tools from a single, unified gateway. The project provides a CLI, Docker-based deployment options, and enterprise-ready features such as tool grouping, access control, and observability to streamline MCP-based workflows across organizations.

Use Case

MCPJungle acts as a centralized registry and gateway for MCP servers and their tools. It supports both Streamable HTTP and STDIO transports, enabling you to register remote MCP servers and invoke their tools through a single endpoint. It also lets you create Tool Groups to expose a curated subset of tools to specific clients, manage enabling/disabling of tools and prompts, and integrate with clients like Claude or Cursor. Example use cases include registering a calculator MCP server (streamable_http) and a filesystem MCP server (stdio), listing and invoking tools, and creating Claude-specific tool groups.

Example usage from the docs:

  • Register a streamable HTTP MCP server:

  • mcpjungle register --name calculator --description "Provides some basic math tools" --url http://127.0.0.1:8000/mcp

  • Access tools via MCPJungle:

  • mcpjungle list tools

    # Check tool usage
    mcpjungle usage calculator__multiply

    # Call a tool
    mcpjungle invoke calculator__multiply --input '{"a": 100, "b": 50}'


  • Connect Claude to MCPJungle (example config):

  • {
      "mcpServers": {
        "mcpjungle": {
          "command": "npx",
          "args": [
            "mcp-remote",
            "http://localhost:8080/mcp",
            "--allow-http"
          ]
        }
      }
    }

  • Register a STDIO MCP server (filesystem):

  • {
      "name": "filesystem",
      "transport": "stdio",
      "description": "filesystem mcp server",
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "."]
    }

  • Example tool group usage:

  • $ mcpjungle create group -c ./claude-tools-group.json

    Tool Group claude-tools created successfully
    It is now accessible at the following streamable http endpoint:

    http://127.0.0.1:8080/v0/groups/claude-tools/mcp


Examples & Tutorials

Real example code and usage patterns directly from the documentation:

  • Quickstart: Start the server using Docker Compose and verify health

  • curl -O https://raw.githubusercontent.com/mcpjungle/MCPJungle/refs/heads/main/docker-compose.yaml
    docker compose up -d

  • Start a local MCPJungle server via CLI installation (Homebrew)

  • brew install mcpjungle/mcpjungle/mcpjungle
    mcpjungle version

  • Register an MCP server (streamable HTTP)

  • mcpjungle register --name calculator --description "Provides some basic math tools" --url http://127.0.0.1:8000/mcp

  • Register a server via config file

  • cat ./calculator.json
    {
      "name": "calculator",
      "transport": "streamable_http",
      "description": "Provides some basic math tools",
      "url": "http://127.0.0.1:8000/mcp"
    }

    mcpjungle register -c ./calculator.json


  • List tools, check usage, and invoke

  • mcpjungle list tools

    # Check tool usage
    mcpjungle usage calculator__multiply

    # Call a tool
    mcpjungle invoke calculator__multiply --input '{"a": 100, "b": 50}'


  • Register a STDIO-based MCP server (filesystem)

  • {
      "name": "filesystem",
      "transport": "stdio",
      "description": "filesystem mcp server",
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "."]
    }
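
    To register this server, the JSON above can be saved to a file and passed with the same config-file flag shown for the calculator server. A minimal sketch (the filename ./filesystem.json is only an assumed example):

    # register the STDIO server from its config file (filename assumed)
    mcpjungle register -c ./filesystem.json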

  • Deregister servers

  • mcpjungle deregister calculator
    mcpjungle deregister filesystem

  • Claude integration config

  • {
      "mcpServers": {
        "mcpjungle": {
          "command": "npx",
          "args": [
            "mcp-remote",
            "http://localhost:8080/mcp",
            "--allow-http"
          ]
        }
      }
    }

  • Cursor integration config

  • {
      "mcpServers": {
        "mcpjungle": {
          "url": "http://localhost:8080/mcp"
        }
      }
    }

  • Enabling/Disabling tools and prompts

  • # disable a specific tool
    mcpjungle disable tool context7__get-library-docs
    # re-enable the tool
    mcpjungle enable tool context7__get-library-docs

    # disable all tools in a server
    mcpjungle disable tool context7

    # disable the whole server
    mcpjungle disable server context7

    # disable a prompt
    mcpjungle disable prompt "huggingface__Model Details"
    # disable all prompts in a server
    mcpjungle disable prompt context7


  • Prompts examples

  • $ mcpjungle list prompts --server huggingface

    $ mcpjungle get prompt "huggingface__Model Details" --arg model_id="openai/gpt-oss-120b"


  • Tool Groups examples

  • {
      "name": "claude-tools",
      "description": "This group only contains tools for Claude Desktop to use",
      "included_tools": [
        "filesystem__read_file",
        "deepwiki__read_wiki_contents",
        "time__get_current_time"
      ]
    }

  • Example 2: Including entire servers with exclusions

  • {
      "name": "claude-tools",
      "description": "All tools from time and deepwiki servers except time__convert_time",
      "included_servers": ["time", "deepwiki"],
      "excluded_tools": ["time__convert_time"]
    }

  • Example 3: Mixing approaches

  • {
      "name": "comprehensive-tools",
      "description": "Mix of manual tools, server inclusion, and exclusions",
      "included_tools": ["filesystem__read_file"],
      "included_servers": ["time"],
      "excluded_tools": ["time__convert_time"]
    }

  • Tool Group access endpoint example

  • $ mcpjungle create group -c ./claude-tools-group.json

    Tool Group claude-tools created successfully
    It is now accessible at the following streamable http endpoint:

    http://127.0.0.1:8080/v0/groups/claude-tools/mcp
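
    Because a group is exposed as its own streamable HTTP endpoint, a client can be pointed at the group URL instead of the gateway's main /mcp endpoint. Below is a minimal sketch reusing the mcp-remote based Claude config shown earlier, with the group URL printed above swapped in (the server key name "mcpjungle-claude-tools" is arbitrary):

    {
      "mcpServers": {
        "mcpjungle-claude-tools": {
          "command": "npx",
          "args": [
            "mcp-remote",
            "http://127.0.0.1:8080/v0/groups/claude-tools/mcp",
            "--allow-http"
          ]
        }
      }
    }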


Installation Guide

Step-by-step installation instructions from the docs:

  • Install via Homebrew:

  • brew install mcpjungle/mcpjungle/mcpjungle

  • Verify installation:

  • mcpjungle version

  • Alternative: Docker image for production deployment:

  • docker pull mcpjungle/mcpjungle

  • Quick start with Docker Compose (local development):

  • curl -O https://raw.githubusercontent.com/mcpjungle/MCPJungle/refs/heads/main/docker-compose.yaml

    docker compose up -d


  • Health check:

  • curl http://localhost:8080/health

  • Register a Streamable HTTP or STDIO MCP server (see the examples above) as part of setup.

Important Notes

Notes and warnings from the docs:

  • SSE support exists but is not yet mature.

  • Prompts are supported and registered when MCP servers provide them.

  • Enterprise mode enables stricter security policies, including authentication, ACLs, and observability; development mode disables telemetry by default.

  • OpenTelemetry metrics are available at /metrics when enabled. In enterprise mode, OTEL is enabled by default; in development mode you must enable it via OTEL_ENABLED (a minimal sketch follows these notes).

  • Tool Groups can expose only a subset of tools; prompts are not currently supported in Tool Groups; you cannot update an existing group—delete and recreate instead.
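
For the OpenTelemetry note above, here is a minimal sketch of opting in during development mode and checking the metrics endpoint. How OTEL_ENABLED is passed and the exact server start command depend on your setup (for example docker compose) and are assumptions here:

    # opt in to OpenTelemetry metrics in development mode (start command assumed)
    OTEL_ENABLED=true mcpjungle start

    # fetch the metrics exposed at /metrics
    curl http://localhost:8080/metrics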

Prerequisites

MCPJungle is distributed as a stand-alone binary. Install it via Homebrew or download it from the Releases page. Docker-based deployments are supported via docker-compose. For persistence, a Postgres DSN can be supplied; otherwise a SQLite database file, mcpjungle.db, is created by default. No Node.js prerequisite is required for MCPJungle itself.
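
A minimal sketch of supplying a Postgres DSN for persistence. The DATABASE_URL variable name and the start command are assumptions; check the docker-compose.yaml or server docs for the exact setting:

    # use Postgres instead of the default mcpjungle.db SQLite file (variable name assumed)
    DATABASE_URL="postgres://user:password@localhost:5432/mcpjungle" mcpjungle start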

Details

Last Updated: 1/1/2026
Website: github.com
Source: GitHub

Similar MCP Tools

9 related tools:
    Playwright MCP

    Playwright MCP server. A Model Context Protocol (MCP) server that provides browser automation capabilities using Playwright. This server enables large language models (LLMs) to interact with web pages through structured accessibility snapshots, bypassing the need for screenshots or visually-tuned models. The server is designed to be fast, lightweight, and deterministic, offering LLM-friendly tooling and a rich set of browser automation capabilities via MCP tools. It supports standalone operation, containerized deployments, and integration with a variety of MCP clients (Claude Desktop, VS Code, Copilot, Cursor, Goose, Windsurf, and others).

    Sequential Thinking MCP Server

    Sequential Thinking MCP Server provides a dedicated MCP tool that guides problem-solving through a structured, step-by-step thinking process. It supports dynamic adjustment of the number of thoughts and allows revision and branching within a controlled workflow, making it ideal for complex analysis and solution hypothesis development. This server is designed to register a single tool, sequential_thinking, and is integrated with common MCP deployment methods (NPX, Docker) as well as editor integrations like Claude Desktop and VS Code for quick setup. The documentation provides exact configuration snippets, usage patterns, and building instructions to help you deploy and use the MCP server effectively, including Codex CLI, NPX, and Docker installation examples.

    N8N MCP Server

    An MCP (Model Context Protocol) server designed to integrate Claude Desktop, Claude Code, Windsurf, and Cursor with n8n workflows. This MCP enables users to build, test, and orchestrate complex workflows by exposing a set of tools that bridge Claude’s capabilities with n8n’s automation platform. The project emphasizes robust trigger handling, multi-tenant readiness, and progressive documentation to help developers understand how tools map to real-world workflow tasks. It also outlines future tooling integration points (such as getNodeEssentials and getNodeInfo) to further enhance node-structure awareness within MCP-powered automations.

    Hugging Face MCP Server

    Hugging Face Official MCP Server connects your large language models (LLMs) to the Hugging Face Hub and thousands of Gradio AI Applications, enabling seamless MCP (Model Context Protocol) integration across multiple transports. It supports STDIO, SSE (to be deprecated but still commonly deployed), StreamableHTTP, and StreamableHTTPJson, with the Web Application allowing dynamic tool management and status updates. This MCP server is designed to be run locally or in Docker, and it provides integrations with Claude Desktop, Claude Code, Gemini CLI (and its extension), VSCode, and Cursor, making it easy to configure and manage MCP-enabled tools and endpoints. Tools such as hf_doc_search and hf_doc_fetch can be enabled to enhance document discovery, and an optional Authenticate tool can be included to handle OAuth challenges when called.

    Shadcn UI MCP Server v4

    Shadcn UI v4 MCP Server is an advanced MCP (Model Context Protocol) server designed to give AI assistants comprehensive access to shadcn/ui v4 components, blocks, demos, and metadata. It enables multi-framework support (React, Svelte, Vue, and React Native) with fast, cache-friendly access to component source code, demos, and directory structures, empowering AI-driven development workflows. The project emphasizes production-readiness with Docker Compose, SSE transport for multi-client deployments, and smart caching to optimize GitHub API usage while providing rich metadata and usage patterns for rapid prototyping and learning across frameworks.

    Figma MCP server

    The Figma MCP server enables design context delivery from Figma files to AI agents and code editors, empowering teams to generate code directly from design selections. It supports both a remote hosted server and a locally hosted desktop server, allowing seamless integration with popular editors through Code Connect and a suite of tools that extract design context, metadata, variables, and more. This guide covers enabling the MCP server, configuring clients (VS Code, Cursor, Claude Code, and others), and using a curated set of MCP tools to fetch structured design data for faster, more accurate code generation. It also explains best practices, prompts, and integration workflows that help teams align generated output with their design systems. The documentation includes concrete JSON examples for configuring servers in editors like VS Code and Cursor, as well as command examples for Claude Code integration and plugin installation.

    MarkItDown MCP

    MarkItDown-MCP is a lightweight MCP (Model Context Protocol) server provided as the markitdown-mcp package. It exposes a STDIO, Streamable HTTP, and SSE MCP server designed for calling MarkItDown to convert content to Markdown. The package focuses on simplicity and accessibility, enabling you to run the MCP server locally via a simple CLI, or in Docker for containerized workflows, with integration options for Claude Desktop. The core capability is exposed through a single tool, convert_to_markdown(uri), which accepts a URI in http:, https:, file:, or data: schemes to fetch content and convert it to Markdown. This MCP server is easy to install with pip and can be used in various transport modes, including STDIO and HTTP/SSE, making it a flexible choice for automations and integrations.

    Chrome MCP Server

    Chrome MCP Server is a Chrome extension-based Model Context Protocol (MCP) server that exposes your Chrome browser functionality to AI assistants like Claude, enabling complex browser automation, content analysis, and semantic search. It leverages your existing Chrome environment, including login states and configurations, to allow large language models and chatbots to control the browser natively without needing to launch a separate automation process. The project emphasizes privacy by remaining fully local and offers capabilities such as cross-tab context, streamable HTTP communication, and a built-in vector database for semantic search and content analysis. As an early-stage project, it includes a growing set of tools for browser control, inspection, and automation, with ongoing development to broaden compatibility and features.

    MCP server for Appwrite docs

    The MCP server for Appwrite docs enables LLMs and code-generation tools to interact with comprehensive Appwrite documentation. It empowers AI assistants to access up-to-date API references, SDK guides, and implementation examples, facilitating intelligent code generation, troubleshooting, and best-practice guidance directly from the official docs. This MCP brings real-time context, semantic search, and seamless integration with popular editors and IDEs to accelerate development workflows around Appwrite's APIs and SDKs.