MCP Registry
Browse the complete directory of MCP servers. Search by name, description, or filter by category.
WayStation MCP server
WayStation MCP server is a universal remote MCP server that connects Claude (and other clients) to a broad range of productivity tools through a no-code, secure integration hub. It supports both Streamable HTTP and SSE transports and negotiates transport and authentication using a default endpoint at https://waystation.ai/mcp. The server also exposes preauthenticated endpoints (for example, https://waystation.ai/mcp/Iddq66dIdkfARDNb3K) that any registered user can obtain from their dashboard at https://waystation.ai/dashboard. Through the Integrations Marketplace, users can discover and connect to Notion, Monday, Airtable, Jira, and many other providers with OAuth2-based authentication flows, enabling seamless workflows without writing code.
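As a sketch of how a remote endpoint like this is typically wired into a client, the snippet below shows a Claude Desktop configuration that bridges to the default endpoint via the community mcp-remote package. This is an illustrative assumption, not taken from WayStation's docs; the dashboard provides the authoritative snippet, and preauthenticated users would substitute their personal URL.

```json
{
  "mcpServers": {
    "waystation": {
      "command": "npx",
      "args": ["mcp-remote", "https://waystation.ai/mcp"]
    }
  }
}
```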
plugged.in MCP Hub — Proxy · Knowledge · Memory · Tools
The plugged.in MCP Proxy Server operates as a central hub that aggregates multiple Model Context Protocol (MCP) servers into a single, unified interface. It orchestrates knowledge, memory, and tools across connected MCPs, enabling clients to query documents, manage memory, and invoke tools from various servers through one connection. With support for STDIO, Server-Sent Events (SSE), and Streamable HTTP transports, it enables seamless integration with popular MCP clients like Claude Desktop, Cline, and Cursor while providing policy, telemetry, and registry features for scalable deployments.
This proxy fetches tool, prompt, and resource configurations from the plugged.in App APIs and exposes a unified catalog of capabilities. It supports static built-in tools, memory clipboard operations, and dynamic tools discovered from connected MCP servers, including tool discovery, RAG-based search, document management, and notifications. The hub also offers configuration options for HTTP transport, authentication, and session management, making it possible to run as a stateless HTTP service or a stateful STDIO proxy, with optional API-key protection for HTTP endpoints.
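For the optional API-key protection mentioned above, a client connecting over HTTP would typically pass the key as a header. The host URL, header name, and use of mcp-remote below are all hypothetical placeholders for illustration; plugged.in's own docs define the actual endpoint and auth scheme.

```json
{
  "mcpServers": {
    "pluggedin": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "https://your-pluggedin-host/mcp",
        "--header",
        "Authorization: Bearer YOUR_API_KEY"
      ]
    }
  }
}
```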
MCP Access Point
MCP Access Point is a lightweight gateway that turns existing HTTP services into MCP (Model Context Protocol) endpoints with zero code changes. Built on the high-performance Pingora proxy, it enables seamless protocol conversion between HTTP and MCP, supporting both SSE and Streamable HTTP. Designed for multi-tenant deployments, it offers a RESTful Admin API for real-time configuration management, dynamic updates, and resource administration without restarting the service. This repository provides a clear Quick Start, multi-tenancy guidance, and admin operations to manage upstreams, services, routes, and more, making it easy to expose legacy HTTP APIs to MCP clients like Cursor Desktop and the MCP Inspector.
openai-gpt-image-mcp
A Model Context Protocol (MCP) tool server designed for OpenAI's GPT-4o and gpt-image-1 image generation and editing APIs. This MCP server exposes image-generation capabilities via two primary tools, create-image and edit-image, enabling developers to generate images from prompts and perform inpainting, outpainting, or compositing edits with fine-grained prompt control. It also provides file-output options so generated content can be saved to disk or returned as base64, and it supports a range of MCP-compatible clients, including Claude Desktop, Cursor, VSCode, and Windsurf, among others. Built on the MCP SDK and OpenAI (and OpenAI-compatible) tooling, this server offers a ready-to-run solution for integrating image APIs into MCP-enabled workflows.
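A stdio server like this is usually registered in a client config with its launch command plus the API credential in the environment. The command, path, and key placeholder below are hypothetical; check the project's README for the actual entry point and any additional variables it expects.

```json
{
  "mcpServers": {
    "gpt-image": {
      "command": "node",
      "args": ["/path/to/openai-gpt-image-mcp/dist/index.js"],
      "env": { "OPENAI_API_KEY": "YOUR_OPENAI_API_KEY" }
    }
  }
}
```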
MCP Bundles Hub MCP Server
MCP Bundles Hub MCP Server provides direct, unified access to tools from all your enabled MCP bundles through a single authenticated endpoint. It enables executing tools, discovering what tools are available across bundles, searching by name, provider, or capability, and checking readiness and details for each tool. This hub server consolidates bundle tools into one MCP interface, streamlining AI-assisted workflows by securely managing credentials and ensuring bundle-aware execution. Built with OAuth and API key authentication, it supports tool discovery, readiness checks, and detailed tool information, making it easier to scale tool access across multiple providers and bundles.
Magg: The MCP Aggregator
Magg is an MCP Aggregator – a meta-MCP server that manages, aggregates, and proxies multiple MCP servers. It acts as a central hub for discovering, configuring, and orchestrating MCP servers, allowing large language models to extend their capabilities at runtime. Magg exposes a suite of tools to search, add, configure, enable/disable, and proxy MCP servers and their tools, merging them under unified prefixes and persisting configurations across sessions. It also includes built-in health and status tools, real-time notifications, and MBro (MCP Browser) for interactive exploration, making it easier to compose, manage, and monitor complex MCP ecosystems. Whether you’re running stdio, HTTP, or hybrid transports, Magg provides flexible deployment modes, kit management, and secure access with optional JWT-based authentication.
Pipedream MCP Server
Pipedream MCP Server is a reference implementation for self-hosting a Model Context Protocol (MCP) server. It showcases how to manage and serve MCP-based apps and tools in your own environment, providing you with a way to run MCP servers locally or within your organization. Note that this MCP server is a reference implementation and is no longer actively maintained; for production workloads, Pipedream recommends using the remote MCP server, which offers hosted reliability and scaling. The server supports two primary modes and integrates with Pipedream Connect for authentication and API management, enabling automatic app discovery and credential storage with enterprise-grade security.
NCP - Natural Context Provider
NCP is a unified MCP platform that consolidates 50+ tools, skills, and Photons into a single, intelligent interface. It enables code-mode execution, on-demand loading, scheduling, and semantic tool discovery, dramatically reducing token usage and latency while enabling AI assistants to work with external MCPs, skills, and Photons. This documentation covers how NCP works, the available MCPs and tools, installation and integration steps for popular clients (Claude Desktop, VS Code, and more), and practical examples that demonstrate how to find, run, and compose tools across MCPs. Whether you’re building with internal MCPs or exploring external tools, NCP provides a scalable, vendor-agnostic foundation for AI-powered automation and tool orchestration.
MindsDB MCP server
MindsDB ships with a built-in MCP (Model Context Protocol) server that enables MCP applications to connect, unify, and respond to questions over large-scale federated data. It spans databases, data warehouses, and SaaS applications, allowing you to query across diverse sources in a unified manner. The MCP server is integrated into MindsDB's architecture with the Connect-Unify-Respond philosophy, and you can learn more about MCP at docs.mindsdb.com/mcp/overview. You can install MindsDB via Docker or Docker Desktop, and the MCP server is part of the standard MindsDB deployment.
MetaMCP
MetaMCP is an MCP proxy that lets you dynamically aggregate MCP servers into a unified MCP server and apply middlewares. MetaMCP is itself an MCP server, so it can be easily plugged into any MCP client. It functions as an MCP Aggregator, Orchestrator, Middleware, and Gateway all in one Docker image, enabling scalable, configurable hosting of multiple MCP servers behind a single endpoint with flexible authentication, tooling, and annotations. This README introduces core concepts such as MCP Server configurations, Namespaces, Endpoints, Middleware, Inspector, and Tool Overrides & Annotations, and provides quick-start guidance for running MetaMCP with Docker, building a development environment, and integrating with clients like Claude Desktop via proxies. It also covers MCP protocol compatibility, authentication options (including API keys, OAuth, and OIDC), and integration guidance for developers looking to remix MCP tool flows and middleware pipelines.
Claude Skills MCP Server
Claude Skills MCP Server is an MCP server that enables intelligent search and retrieval of Claude Agent Skills using vector embeddings and semantic similarity. It implements a progressive disclosure architecture so AI applications can discover and load skills in stages (metadata → full content → files) while remaining fast and local. The server can load skills from multiple sources, including Official Anthropic Skills, K-Dense AI Scientific Skills, and local directories, providing a zero-configuration experience out of the box for Cursor or standalone usage. The architecture is split into a lightweight frontend and a heavy backend, enabling instant startup and background backend download, with no API keys required and the ability to connect to remote hosted backends if desired.
External MCP Server
Neurolink includes an External MCP Server capability, enabling seamless integration with external Model Context Protocol (MCP) servers. This feature loads and manages external MCP servers from a dedicated configuration file (.mcp-config.json), enables real JSON-RPC based communication, and supports end-to-end tool execution within the NeuroLink platform. It is designed for multi-provider AI workflows, allowing providers to delegate tool execution to external servers while preserving type safety, robust error handling, and deterministic behavior. The documentation highlights how to configure external MCP servers, register and discover tools, and perform end-to-end tool execution through the CLI, ensuring a production-ready MCP ecosystem.
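The .mcp-config.json file name comes from the docs above, but its exact schema is not specified here; the sketch below assumes the common mcpServers convention used by most MCP clients, with a well-known reference server as the example entry. Treat both the key layout and the server choice as assumptions to verify against NeuroLink's documentation.

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/workspace"]
    }
  }
}
```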
Anyquery MCP
Anyquery MCP is the Model Context Protocol endpoint for the Anyquery SQL engine, enabling large language models (LLMs) like ChatGPT and Claude to connect to and query data through MCP. This MCP server acts as a bridge between LLMs and Anyquery’s data integrations, allowing LLMs to contextually access files, databases, and apps via the MCP interface. It complements Anyquery’s SQL querying capabilities by providing a standardized, secure channel for LLMs to request data access, execute SQL-like interactions, and receive structured results. The MCP integration is designed to be used in conjunction with Anyquery’s SQL runtime, including its MySQL-compatible server mode for client tooling, and is part of the broader MCP-enabled ecosystem described in the project docs.
Imagen3-MCP
Imagen3-MCP is an image generation service based on Google's Imagen 3.0 that exposes its functionality through MCP (Model Context Protocol). The project provides a server to run a local MCP service that accesses Google Gemini-powered image generation, enabling developers to integrate advanced image synthesis into their applications. The documentation covers prerequisites (Gemini API key), installation steps for Cherry Studio, and a Cursor-based JSON configuration example for embedding the MCP server in broader tooling. This MCP is designed to be deployment-friendly, with configurable environment variables and optional proxy settings to adapt to various network environments.
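A Cursor-style configuration for a server like this would plausibly look as follows, with the Gemini key and optional proxy supplied as environment variables. The command name, variable names, and proxy address are hypothetical stand-ins; the project's own Cursor example is authoritative.

```json
{
  "mcpServers": {
    "imagen3": {
      "command": "imagen3-mcp",
      "env": {
        "GEMINI_API_KEY": "YOUR_GEMINI_API_KEY",
        "HTTPS_PROXY": "http://127.0.0.1:7890"
      }
    }
  }
}
```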
mcpmcp-server
mcpmcp-server is a focused solution for discovering, setting up, and integrating MCP servers with your favorite clients to unlock AI-powered workflows. It streamlines how you connect MCP-powered servers to popular clients, enabling seamless AI-assisted interactions across your daily tools. The project emphasizes a simple, config-driven way of linking MCP servers with clients like Claude Desktop, and points to its homepage for variations across apps and platforms. This README highlights a practical JSON configuration example and notes on supported environments, helping you get started quickly and confidently.
MCPJungle
MCPJungle is a self-hosted MCP Gateway and Registry for AI agents. It serves as a central registry and gateway to manage Model Context Protocol (MCP) servers and the tools they expose. By consolidating MCP server registration, tool discovery, and access control, MCPJungle enables AI agents and clients to discover, group, and securely invoke tools from a single, unified gateway. The project provides a CLI, Docker-based deployment options, and enterprise-ready features such as tool grouping, access control, and observability to streamline MCP-based workflows across organizations.
Roundtable AI MCP Server
Roundtable AI MCP Server is a zero-configuration local MCP server that unifies multiple AI coding assistants (Codex, Claude Code, Cursor, Gemini) through intelligent auto-discovery and a standardized interface. It coordinates specialized sub-agents from within your IDE to solve engineering problems in parallel, sharing context and synthesizing responses into a single, high-quality output. This documentation details installation, available MCP tools, integration with popular IDEs, and a broad ecosystem of specialized tools and CLIs that can be invoked as part of a roundtable-powered workflow, enabling developers to delegate tasks to the right AI for each facet of a problem without leaving their development environment.
1MCP Agent
A unified Model Context Protocol server implementation that aggregates multiple MCP servers into one. The 1mcp-app/agent is an open-source project that provides a single entry point for multiple MCP servers, making it easier to manage and interact with various AI models and tools.
Anki MCP Server
A Model Context Protocol (MCP) server that enables AI assistants to interact with Anki, the spaced repetition flashcard application. The Anki MCP Server allows AI models to access Anki's card data, enabling features like automated flashcard creation, review, and management.