server

Discover the best server MCP servers for AI agents. Browse tools, use cases, installation guides, and integration documentation for server-focused Model Context Protocol implementations.

8 results found

Sequential Thinking MCP Server
Sequential Thinking MCP Server provides a dedicated MCP tool that guides problem-solving through a structured, step-by-step thinking process. It supports dynamic adjustment of the number of thoughts and allows revision and branching within a controlled workflow, making it well suited to complex analysis and solution hypothesis development. The server registers a single tool, sequential_thinking, and integrates with common MCP deployment methods (NPX, Docker) as well as clients like Claude Desktop and VS Code for quick setup. The documentation provides exact configuration snippets, usage patterns, and build instructions to help you deploy and use the MCP server effectively, including Codex CLI, NPX, and Docker installation examples.
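As a minimal sketch of the NPX-based client setup the documentation describes, a Claude Desktop entry could look like the following; the package name @modelcontextprotocol/server-sequential-thinking is assumed here and should be checked against the server's own install instructions.

```json
{
  "mcpServers": {
    "sequential-thinking": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-sequential-thinking"]
    }
  }
}
```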
Hugging Face MCP Server
The official Hugging Face MCP Server connects your large language models (LLMs) to the Hugging Face Hub and thousands of Gradio AI applications, enabling seamless Model Context Protocol (MCP) integration across multiple transports. It supports STDIO, SSE (slated for deprecation but still commonly deployed), StreamableHTTP, and StreamableHTTPJson, and its web application allows dynamic tool management and status updates. The server can be run locally or in Docker, and it provides integrations with Claude Desktop, Claude Code, Gemini CLI (and its extension), VS Code, and Cursor, making it easy to configure and manage MCP-enabled tools and endpoints. Tools such as hf_doc_search and hf_doc_fetch can be enabled to enhance document discovery, and an optional Authenticate tool can be included to handle OAuth challenges when called.
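A hedged sketch of a StreamableHTTP client entry is shown below; the https://huggingface.co/mcp URL and the Authorization header format are assumptions to verify against the official Hugging Face MCP documentation.

```json
{
  "mcpServers": {
    "hf-mcp-server": {
      "url": "https://huggingface.co/mcp",
      "headers": {
        "Authorization": "Bearer <YOUR_HF_TOKEN>"
      }
    }
  }
}
```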
Figma MCP server
The Figma MCP server delivers design context from Figma files to AI agents and code editors, empowering teams to generate code directly from design selections. It supports both a remote hosted server and a locally hosted desktop server, allowing seamless integration with popular editors through Code Connect and a suite of tools that extract design context, metadata, variables, and more. This guide covers enabling the MCP server, configuring clients (VS Code, Cursor, Claude Code, and others), and using a curated set of MCP tools to fetch structured design data for faster, more accurate code generation. It also explains best practices, prompts, and integration workflows that help teams align generated output with their design systems. The documentation includes concrete JSON examples for configuring the server in editors like VS Code and Cursor, as well as command examples for Claude Code integration and plugin installation.
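For the locally hosted desktop server, a client entry might resemble the sketch below; the 127.0.0.1:3845 address and /mcp path are assumptions based on the desktop server's typical defaults, so confirm them in Figma's guide before use.

```json
{
  "mcpServers": {
    "figma": {
      "url": "http://127.0.0.1:3845/mcp"
    }
  }
}
```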
GitHub MCP Server
GitHub's official MCP server. This repository hosts the implementation that enables Model Context Protocol (MCP) tooling for GitHub data and workflows, exposing a wide registry of MCP tools spanning code management, repository operations, issues, pull requests, workflows, gists, and more. The documentation and commit history reveal a broad set of tools (GetMe, GetTeams, ListIssues, CreateOrUpdateFile, GetRepositoryTree, and many others) designed to be wired into dynamic toolsets and accessed via a consistent ServerTool pattern. The server is built with extensibility in mind, supporting features like tool dependencies, dynamic toolsets, and feature flags to adapt to varied prompts and use cases. The project emphasizes a registry-driven approach where tools, resources, and prompts are defined and validated, enabling robust integration with client apps and AI models.
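A hedged sketch of a Docker-based client entry is below; the ghcr.io/github/github-mcp-server image and the GITHUB_PERSONAL_ACCESS_TOKEN variable follow the pattern shown in the repository's documentation, but verify them there before relying on this exact form.

```json
{
  "mcpServers": {
    "github": {
      "command": "docker",
      "args": [
        "run", "-i", "--rm",
        "-e", "GITHUB_PERSONAL_ACCESS_TOKEN",
        "ghcr.io/github/github-mcp-server"
      ],
      "env": {
        "GITHUB_PERSONAL_ACCESS_TOKEN": "<YOUR_TOKEN>"
      }
    }
  }
}
```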
MCP Bundles Hub MCP Server
MCP Bundles Hub MCP Server provides direct, unified access to tools from all your enabled MCP bundles through a single authenticated endpoint. It enables executing tools, discovering what tools are available across bundles, searching by name, provider, or capability, and checking readiness and details for each tool. This hub server consolidates bundle tools into one MCP interface, streamlining AI-assisted workflows by securely managing credentials and ensuring bundle-aware execution. Built with OAuth and API key authentication, it supports tool discovery, readiness checks, and detailed tool information, making it easier to scale tool access across multiple providers and bundles.
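Since the hub exposes a single authenticated endpoint, a client entry reduces to a remote URL plus credentials. Everything in the sketch below (the hub.example.com URL and the bearer-token header) is a placeholder; the actual endpoint and auth flow come from your MCP Bundles Hub account.

```json
{
  "mcpServers": {
    "mcp-bundles-hub": {
      "url": "https://hub.example.com/mcp",
      "headers": {
        "Authorization": "Bearer <YOUR_API_KEY>"
      }
    }
  }
}
```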
MindsDB MCP server
MindsDB ships with a built-in MCP (Model Context Protocol) server that enables MCP applications to connect to, unify, and answer questions over large-scale federated data. It spans databases, data warehouses, and SaaS applications, allowing you to query across diverse sources in a unified manner. The MCP server is integrated into MindsDB's architecture following the Connect-Unify-Respond philosophy, and you can learn more about MCP at docs.mindsdb.com/mcp/overview. You can install MindsDB via Docker Desktop or Docker, and the MCP server is part of the standard MindsDB deployment.
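A rough sketch of pointing an MCP client at a locally running MindsDB instance is shown below; the 47337 port and /sse path are assumptions, so check docs.mindsdb.com/mcp/overview for the endpoint your deployment actually exposes.

```json
{
  "mcpServers": {
    "mindsdb": {
      "url": "http://127.0.0.1:47337/sse"
    }
  }
}
```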
Imagen3-MCP
Imagen3-MCP is an image generation service based on Google's Imagen 3.0 that exposes its functionality through the Model Context Protocol (MCP). The project provides a server that runs a local MCP service backed by Google Gemini-powered image generation, enabling developers to integrate advanced image synthesis into their applications. The documentation covers prerequisites (a Gemini API key), installation steps for Cherry Studio, and a Cursor-based JSON configuration example for embedding the MCP server in broader tooling. The server is designed to be deployment-friendly, with configurable environment variables and optional proxy settings to adapt to various network environments.
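A hedged Cursor-style configuration sketch follows; the imagen3-mcp command name and the HTTPS_PROXY variable are illustrative placeholders, while the Gemini API key requirement comes from the project's stated prerequisites.

```json
{
  "mcpServers": {
    "imagen3": {
      "command": "imagen3-mcp",
      "env": {
        "GEMINI_API_KEY": "<YOUR_GEMINI_API_KEY>",
        "HTTPS_PROXY": "http://127.0.0.1:7890"
      }
    }
  }
}
```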
mcpmcp-server
mcpmcp-server is a focused solution for discovering, setting up, and integrating MCP servers with your favorite clients to unlock AI-powered workflows. It streamlines connecting MCP-powered servers to popular clients, enabling seamless AI-assisted interactions across your daily tools. The project takes an approachable, config-driven approach to linking MCP servers with clients like Claude Desktop, and points to the homepage for variations across apps and platforms. The README highlights a practical JSON configuration example and notes on supported environments, helping you get started quickly and confidently.
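As a hedged illustration of the kind of JSON configuration the README highlights, a Claude Desktop entry bridging to a remote endpoint might look like the sketch below; the mcp-remote bridge and the https://mcpmcp.io/sse URL are assumptions, so substitute the values given on the project homepage.

```json
{
  "mcpServers": {
    "mcpmcp": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "https://mcpmcp.io/sse"]
    }
  }
}
```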