Imagen3-MCP Details
Imagen3-MCP is an image generation service based on Google's Imagen 3.0 that exposes its functionality through MCP (Model Context Protocol). The project provides a server that runs a local MCP service backed by Google Gemini image generation, letting developers integrate image synthesis into their applications. The documentation covers prerequisites (a Gemini API key), installation steps for Cherry Studio, and a Cursor-based JSON configuration example for embedding the MCP server in broader tooling. The server is deployment-friendly, with configurable environment variables and optional proxy settings to adapt to various network environments.
Use Case
Imagen3-MCP acts as a server that provides access to Google's Imagen 3.0 image generation via MCP. It is intended for developers who want to run a local or containerized image generator and expose it to their applications through MCP. The documentation shows how to configure the MCP server with a command, environment variables (notably GEMINI_API_KEY), and optional settings for proxying and server addresses. Example usage includes a Cursor-style JSON configuration that demonstrates how to structure the mcpServers block and environment variables for Gemini authentication and optional server customization.
Examples & Tutorials
{
  "mcpServers": {
    "imagen3": {
      "command": "C:\\bin\\imagen3-mcp.exe",
      "env": {
        "GEMINI_API_KEY": "<GEMINI_API_KEY>"
        // Optional environment variables:
        // "BASE_URL": "<PROXY_URL>",
        // "SERVER_LISTEN_ADDR": "0.0.0.0", // Example: listen on all interfaces
        // "SERVER_PORT": "9981",
        // "IMAGE_RESOURCE_SERVER_ADDR": "your.domain.com" // Example: use a domain name for image URLs
      }
    }
  }
}

This demonstrates how to configure the MCP server in a Cursor-style JSON snippet as documented.
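Once the server is registered, a client invokes image generation through MCP's standard tools/call request. The sketch below is illustrative only: the tool name generate_image and its prompt argument are assumptions rather than documented names, so confirm the actual schema with a tools/list request.

{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "generate_image", // hypothetical tool name; confirm via tools/list
    "arguments": {
      "prompt": "A watercolor fox in a snowy forest"
    }
  }
}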
Installation Guide
Installation steps (Cherry Studio):
1. Place imagen3-mcp.exe at C:\bin\imagen3-mcp.exe and use that path as the command.
2. Fill in GEMINI_API_KEY with your Gemini API key.
3. Optionally fill in BASE_URL with a proxy address, e.g. https://lingxi-proxy.hamflx.dev/api/provider/google (this address works around the GFW, but it does not solve Google's IP-based restrictions, so a VPN is still required).

Optional server settings:
- SERVER_LISTEN_ADDR: the IP address the server listens on (default 127.0.0.1).
- SERVER_PORT: the port the server listens on, also used in generated image URLs (default 9981).
- IMAGE_RESOURCE_SERVER_ADDR: the server address used in image URLs (default 127.0.0.1). This is useful when the server runs in a container or on a remote machine; see the sketch below.
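When the server runs in a container or on a remote machine, image URLs must carry an address the client can actually reach. A minimal sketch of the env block for that case, reusing the optional values from the configuration example above (your.domain.com is a placeholder):

"env": {
  "GEMINI_API_KEY": "<GEMINI_API_KEY>",
  "SERVER_LISTEN_ADDR": "0.0.0.0", // listen on all interfaces
  "SERVER_PORT": "9981",
  "IMAGE_RESOURCE_SERVER_ADDR": "your.domain.com" // address embedded in generated image URLs
}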
Integration Guides
Frequently Asked Questions
A valid Google Gemini API key (GEMINI_API_KEY) is required. Optionally, BASE_URL can route API requests through a proxy, which works around the GFW but does not lift Google's IP-based restrictions. The remaining optional environment variables, SERVER_LISTEN_ADDR, SERVER_PORT, and IMAGE_RESOURCE_SERVER_ADDR, customize where the server listens and how image URLs are generated.
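As an illustration, a proxied setup reuses the env block from the example above with BASE_URL filled in; the key value is a placeholder:

"env": {
  "GEMINI_API_KEY": "<GEMINI_API_KEY>",
  "BASE_URL": "https://lingxi-proxy.hamflx.dev/api/provider/google" // documented proxy example; a VPN is still needed for Google's IP restrictions
}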
Compare Alternatives
Similar MCP Tools
9 related tools

Graphiti MCP Server
Graphiti MCP Server is an experimental implementation that exposes Graphiti's real-time, temporally-aware knowledge graph capabilities through the MCP (Model Context Protocol) interface. It enables AI agents and MCP clients to interact with Graphiti's knowledge graph for structured extraction, reasoning, and memory across conversations, documents, and enterprise data. The server supports multiple backends (FalkorDB by default and Neo4j), a variety of LLM providers (OpenAI, Anthropic, Gemini, Groq, Azure OpenAI), and multiple embedder options, all accessible via an HTTP MCP endpoint at /mcp/ for broad client compatibility. It also includes queue-based asynchronous episode processing, rich entity types for structured data, and flexible configuration through config.yaml, environment variables, or CLI arguments.
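Because the server is reachable over HTTP at /mcp/, a client entry typically needs only a URL. A minimal sketch, assuming a local instance; the host and port are assumptions, not documented values:

{
  "mcpServers": {
    "graphiti": {
      "url": "http://localhost:8000/mcp/" // assumed host and port; the /mcp/ path comes from the description above
    }
  }
}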
Context7 MCP Server
Context7 MCP Server delivers up-to-date, code-first documentation and examples for LLMs and AI code editors by pulling content directly from the source. It supports multiple MCP clients and exposes tools that help you resolve library IDs and retrieve library documentation, ensuring prompts use current APIs and usage patterns. The repository provides installation and integration guides for Cursor, Claude Code, Opencode, and other clients, along with practical configuration samples and OAuth options for remote HTTP connections. This MCP server is designed to keep prompts in sync with the latest library docs, reducing hallucinations and outdated code snippets.
TrendRadar MCP
TrendRadar MCP is an AI-driven Model Context Protocol (MCP) based analysis server that exposes a suite of specialized tools for cross-platform news analysis, trend tracking, and intelligent push notifications. It integrates with TrendRadar’s multi-platform data aggregation (RSS and trending topics) and provides advanced AI-powered insights, sentiment analysis, and cross-platform correlation. The MCP server enables developers to query, analyze, and compare news across platforms using a consistent toolset, with ongoing updates that expand capabilities such as RSS querying, date parsing, and multi-date trend analysis. This documentation references the MCP module updates, tool additions, and architecture changes that enhance extensibility, cross-platform data handling, and AI-assisted reporting.
ChainAware Behavioural Prediction MCP
The ChainAware Behavioural Prediction MCP is an MCP-based server that provides AI-powered tools for wallet behaviour prediction, fraud detection, and rug-pull prediction. Designed for Web3 security and DeFi analytics, it enables developers and platforms to integrate risk assessment, predictive wallet behaviour insights, and rug-pull detection through MCP-compatible clients. The server exposes three specialized tools and uses Server-Sent Events (SSE) for real-time responses, helping safeguard DeFi users, monitor liquidity risks, and score wallet or contract trustworthiness. Access to production endpoints is API-key gated, reflecting a private backend architecture that supports secure, scalable risk analytics across wallets, contracts, and pools.
Playwright MCP
A Model Context Protocol (MCP) server that provides browser automation capabilities using Playwright. It enables large language models (LLMs) to interact with web pages through structured accessibility snapshots, bypassing the need for screenshots or visually tuned models. The server is designed to be fast, lightweight, and deterministic, offering LLM-friendly tooling and a rich set of browser automation capabilities via MCP tools. It supports standalone operation, containerized deployments, and integration with a variety of MCP clients (Claude Desktop, VS Code, Copilot, Cursor, Goose, Windsurf, and others).
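For clients that launch servers as subprocesses, Playwright MCP is conventionally started via npx. A minimal sketch, assuming the @playwright/mcp package name; verify the exact invocation against the project README:

{
  "mcpServers": {
    "playwright": {
      "command": "npx",
      "args": ["@playwright/mcp@latest"]
    }
  }
}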
Sequential Thinking MCP Server
Sequential Thinking MCP Server provides a dedicated MCP tool that guides problem-solving through a structured, step-by-step thinking process. It supports dynamic adjustment of the number of thoughts and allows revision and branching within a controlled workflow, making it ideal for complex analysis and solution hypothesis development. This server is designed to register a single tool, sequential_thinking, and is integrated with common MCP deployment methods (NPX, Docker) as well as editor integrations like Claude Desktop and VS Code for quick setup. The documentation provides exact configuration snippets, usage patterns, and building instructions to help you deploy and use the MCP server effectively, including Codex CLI, NPX, and Docker installation examples.
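As an NPX-based setup sketch, assuming the conventional @modelcontextprotocol/server-sequential-thinking package name; verify against the project's documented configuration snippets:

{
  "mcpServers": {
    "sequential-thinking": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-sequential-thinking"]
    }
  }
}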
N8N MCP Server
An MCP (Model Context Protocol) server designed to integrate Claude Desktop, Claude Code, Windsurf, and Cursor with n8n workflows. This MCP enables users to build, test, and orchestrate complex workflows by exposing a set of tools that bridge Claude’s capabilities with n8n’s automation platform. The project emphasizes robust trigger handling, multi-tenant readiness, and progressive documentation to help developers understand how tools map to real-world workflow tasks. It also outlines future tooling integration points (such as getNodeEssentials and getNodeInfo) to further enhance node-structure awareness within MCP-powered automations.
Hugging Face MCP Server
Hugging Face Official MCP Server connects your large language models (LLMs) to the Hugging Face Hub and thousands of Gradio AI Applications, enabling seamless MCP (Model Context Protocol) integration across multiple transports. It supports STDIO, SSE (slated for deprecation but still commonly deployed), StreamableHTTP, and StreamableHTTPJson, with the Web Application allowing dynamic tool management and status updates. This MCP server can run locally or in Docker, and it provides integrations with Claude Desktop, Claude Code, Gemini CLI (and its extension), VSCode, and Cursor, making it easy to configure and manage MCP-enabled tools and endpoints. Tools such as hf_doc_search and hf_doc_fetch can be enabled to enhance document discovery, and an optional Authenticate tool can be included to handle OAuth challenges when called.
Shadcn UI MCP Server v4
Shadcn UI v4 MCP Server is an advanced MCP (Model Context Protocol) server designed to give AI assistants comprehensive access to shadcn/ui v4 components, blocks, demos, and metadata. It enables multi-framework support (React, Svelte, Vue, and React Native) with fast, cache-friendly access to component source code, demos, and directory structures, empowering AI-driven development workflows. The project emphasizes production-readiness with Docker Compose, SSE transport for multi-client deployments, and smart caching to optimize GitHub API usage while providing rich metadata and usage patterns for rapid prototyping and learning across frameworks.