
Tools

Discover the best Tools MCP servers for AI agents. Browse tools, use cases, installation guides, and integration documentation for tools-focused Model Context Protocol implementations.

5 results found

Playwright MCP
A Model Context Protocol (MCP) server that provides browser automation capabilities using Playwright. It enables large language models (LLMs) to interact with web pages through structured accessibility snapshots, bypassing the need for screenshots or visually tuned models. The server is designed to be fast, lightweight, and deterministic, offering LLM-friendly tooling and a rich set of browser automation capabilities via MCP tools. It supports standalone operation, containerized deployments, and integration with a variety of MCP clients (Claude Desktop, VS Code, Copilot, Cursor, Goose, Windsurf, and others).
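For most of the clients listed above, registration is a short entry in the client's MCP configuration file (e.g. claude_desktop_config.json). A minimal sketch, assuming the published npm package @playwright/mcp and the npx launcher:

```json
{
  "mcpServers": {
    "playwright": {
      "command": "npx",
      "args": ["@playwright/mcp@latest"]
    }
  }
}
```

With this in place, the client spawns the server over STDIO and the model can drive the browser through the exposed MCP tools; consult the server's README for flags such as headless or browser selection.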
Hugging Face MCP Server
The official Hugging Face MCP Server connects your large language models (LLMs) to the Hugging Face Hub and thousands of Gradio AI applications, enabling seamless Model Context Protocol (MCP) integration across multiple transports: STDIO, SSE (slated for deprecation but still commonly deployed), StreamableHTTP, and StreamableHTTPJson. A companion web application allows dynamic tool management and status updates. The server can run locally or in Docker, and it integrates with Claude Desktop, Claude Code, Gemini CLI (and its extension), VSCode, and Cursor, making it easy to configure and manage MCP-enabled tools and endpoints. Tools such as hf_doc_search and hf_doc_fetch can be enabled to enhance documentation discovery, and an optional Authenticate tool can be included to handle OAuth challenges when called.
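For clients that support remote (HTTP) MCP servers, a sketch of the configuration might look like the following. The endpoint URL and bearer-token header shown here are assumptions; check Hugging Face's MCP documentation for the exact hosted endpoint and authentication scheme:

```json
{
  "mcpServers": {
    "hf-mcp-server": {
      "url": "https://huggingface.co/mcp",
      "headers": {
        "Authorization": "Bearer <HF_TOKEN>"
      }
    }
  }
}
```

Replace &lt;HF_TOKEN&gt; with a Hugging Face access token; running locally via Docker instead swaps the url entry for a command/args pair.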
plugged.in MCP Hub — Proxy · Knowledge · Memory · Tools
The plugged.in MCP Proxy Server operates as a central hub that aggregates multiple Model Context Protocol (MCP) servers into a single, unified interface. It orchestrates knowledge, memory, and tools across connected MCPs, enabling clients to query documents, manage memory, and invoke tools from various servers through one connection. With support for STDIO, Server-Sent Events (SSE), and Streamable HTTP transports, it enables seamless integration with popular MCP clients like Claude Desktop, Cline, and Cursor while providing policy, telemetry, and registry features for scalable deployments. This proxy fetches tool, prompt, and resource configurations from the plugged.in App APIs and exposes a unified catalog of capabilities. It supports static built-in tools, memory clipboard operations, and dynamic tools discovered from connected MCP servers, including tool discovery, RAG-based search, document management, and notifications. The hub also offers configuration options for HTTP transport, authentication, and session management, making it possible to run as a stateless HTTP service or a stateful STDIO proxy, with optional API-key protection for HTTP endpoints.
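Running the proxy as a stateful STDIO server from a client configuration could be sketched as follows. The package name and environment-variable name are illustrative assumptions, not confirmed identifiers; the plugged.in documentation is the authority here:

```json
{
  "mcpServers": {
    "pluggedin": {
      "command": "npx",
      "args": ["-y", "@pluggedin/pluggedin-mcp-proxy@latest"],
      "env": {
        "PLUGGEDIN_API_KEY": "<your-pluggedin-api-key>"
      }
    }
  }
}
```

The API key ties the proxy to your plugged.in App workspace, from which it fetches the tool, prompt, and resource configurations described above; the stateless Streamable HTTP mode would instead expose a URL endpoint with optional API-key protection.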
openai-gpt-image-mcp
A Model Context Protocol (MCP) tool server for OpenAI's GPT-4o and gpt-image-1 image generation and editing APIs. This MCP server exposes image-generation capabilities via two primary tools, create-image and edit-image, enabling developers to generate images from prompts and perform inpainting, outpainting, or compositing edits with fine-grained prompt control. It also provides file-output options, so generated content can be saved to disk or returned as base64, and it supports a range of MCP-compatible clients, including Claude Desktop, Cursor, VSCode, and Windsurf. Built on the MCP SDK and targeting OpenAI and OpenAI-compatible tooling, this server offers a ready-to-run solution for integrating image APIs into MCP-enabled workflows.
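Since the server needs an OpenAI API key at startup, a client configuration would pass it through the environment. A minimal sketch, assuming a local checkout built to dist/index.js (the path is a placeholder; the repository's README documents the actual launch command):

```json
{
  "mcpServers": {
    "openai-gpt-image": {
      "command": "node",
      "args": ["/path/to/openai-gpt-image-mcp/dist/index.js"],
      "env": {
        "OPENAI_API_KEY": "<your-openai-api-key>"
      }
    }
  }
}
```

Once registered, the model can call create-image with a prompt (and edit-image with a source image plus mask) and receive the result as a saved file or base64 payload, per the file-output options described above.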
Pipedream MCP Server
Pipedream MCP Server is a reference implementation for self-hosting a Model Context Protocol (MCP) server. It showcases how to manage and serve MCP-based apps and tools in your own environment, providing you with a way to run MCP servers locally or within your organization. Note that this MCP server is a reference implementation and is no longer actively maintained; for production workloads, Pipedream recommends using the remote MCP server, which offers hosted reliability and scaling. The server supports two primary modes and integrates with Pipedream Connect for authentication and API management, enabling automatic app discovery and credential storage with enterprise-grade security.