Stop Searching. Start Trusting.
The curated directory of MCP servers, vetted for security, efficiency, and quality.
Tired of the MCP "Marketplace" Chaos?
We built MCPScout.ai to solve the ecosystem's biggest pain points.
No Insecure Dumps
We manually analyze every server for basic security flaws.
Easy Setup
Our gotcha notes warn you about complex setups.
Avoid "Token Hogs"
We estimate each server's token cost so you can build cost-effective agents.
Products, Not Demos
We filter out "Hello World" demos.
Vetted Servers (8,554)
django-mcp-integration
by mosco23
Provides a seamless integration for exposing Model Context Protocol (MCP) tools as an asynchronous API within a Django application using FastMCP.
Setup Requirements
- ⚠️Requires Django (version 3.2 or higher).
- ⚠️Requires an ASGI server like Uvicorn for production deployment.
- ⚠️All exposed tools (functions or class methods) must be asynchronous (`async def`).
- ⚠️If API key authentication is desired, API keys must be generated and managed via the Django admin interface.
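The async-only constraint above means every exposed tool must be a coroutine. A minimal stdlib sketch (the `lookup_user` tool is hypothetical; the real integration would register it through FastMCP's decorator rather than call it directly):

```python
import asyncio

# Hypothetical tool: every function exposed through the integration
# must be declared `async def`, per the setup notes above.
async def lookup_user(user_id: int) -> dict:
    # Stand-in for an async ORM call (e.g., Django's `aget`).
    await asyncio.sleep(0)
    return {"id": user_id, "name": "example"}

# A plain `def` here would not qualify; only coroutines do.
result = asyncio.run(lookup_user(42))
print(result)
```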
Verified Safe
mcp-reference-server
by chief-builder
An AI agent server that orchestrates LLM interactions with tool execution for the Model Context Protocol (MCP).
Setup Requirements
- ⚠️The `MCP_CURSOR_SECRET` environment variable is mandatory for server startup and must be at least 32 characters long.
- ⚠️The client-side AI agent requires at least one LLM API key (`ANTHROPIC_API_KEY` or `OPENROUTER_API_KEY`) for full operation.
- ⚠️The default HTTP server configuration allows all CORS origins (`*`), which is insecure for production environments and should be explicitly restricted.
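One way to satisfy the 32-character minimum for `MCP_CURSOR_SECRET` is Python's `secrets` module; the variable name comes from the listing, the generation method is our suggestion:

```python
import secrets

# token_hex(32) yields 64 hex characters, comfortably over the
# 32-character minimum required for MCP_CURSOR_SECRET.
secret = secrets.token_hex(32)
print(len(secret))  # 64
```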
Verified Safe
dynamic-java-mcp-server
by gemo12123
Dynamically registers and exposes internal Java tools as HTTP APIs for AI models using the Model Context Protocol (MCP), enabling a single-process multi-MCP server setup.
Setup Requirements
- ⚠️Requires Java Development Kit (JDK) 17+ (typical for modern Spring Boot applications).
- ⚠️Relies on the Model Context Protocol (MCP) and Spring AI components, which might require specific configurations or dependencies to integrate with actual AI models.
- ⚠️Dynamic tool registration and service discovery depend on the source of `ModuleDefinition` and `ServiceInstance` data, which is not detailed in the truncated code.
Review Required
embabel-confluence-mcp-server
by BootcampToProd
A demo Model Context Protocol (MCP) server built with Embabel Framework to connect AI agents (like Claude Desktop) with the Atlassian Confluence REST API for documentation management.
Setup Requirements
- ⚠️Requires Java 21 or higher.
- ⚠️Requires an OpenRouter API Key (free tier available).
- ⚠️Requires Confluence Base URL and a base64-encoded Confluence API Token for authentication.
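The listing doesn't spell out what gets base64-encoded; the common Atlassian Cloud convention is Basic auth over an "email:api_token" pair, sketched here with placeholder credentials:

```python
import base64

# Placeholder credentials, not real values. Atlassian Cloud REST APIs
# typically expect base64("email:api_token") in a Basic auth header.
raw = "user@example.com:your-confluence-api-token"
encoded = base64.b64encode(raw.encode("ascii")).decode("ascii")
print(encoded)
```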
Verified Safe
open-wearables-mcp
by healthkowshik
Enables AI assistants to query wearable health data through natural language by integrating with an Open Wearables backend.
Setup Requirements
- ⚠️Requires a separate Open Wearables backend to be running or access to a deployed instance.
- ⚠️Requires a valid Open Wearables API key to access any user data.
- ⚠️Requires the 'uv' package manager (Python 3.11+) for dependency management and execution, not just pip.
Verified Safe
mcp-server-gitee-pull-request
by liliangshan
Automates Gitee Pull Request creation, review, testing, and merging for development workflows.
Setup Requirements
- ⚠️Requires a Gitee account and a registered OAuth application to generate API credentials (client_id, client_secret).
- ⚠️Eight specific environment variables are mandatory: Gitee account credentials (scope_username, scope_password), OAuth app credentials (scope_client_id, scope_client_secret), and repository details (owner, repo, head, base).
- ⚠️Gitee source (`head`) and target (`base`) branch names are automatically reformatted to 'branch (name)' if they are not already in that format, which may surprise some users.
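The eight mandatory variables can be set in one block; names come from the listing, all values below are placeholders:

```shell
# Placeholder values; variable names are taken from the listing above.
export scope_username="your-gitee-username"
export scope_password="your-gitee-password"
export scope_client_id="oauth-app-client-id"
export scope_client_secret="oauth-app-client-secret"
export owner="your-org"
export repo="your-repo"
export head="feature-branch"
export base="master"
```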
Verified Safe
spring-kt-mcp-server
by jaydenchuljinlee
This project implements a Model Context Protocol (MCP) server that exposes news aggregation functionality as discoverable tools for an MCP client.
Setup Requirements
- ⚠️Requires Java 21 to run.
- ⚠️Requires `OPENAI_API_KEY` environment variable, even if the provided tools don't directly use OpenAI (it's configured for Spring AI).
- ⚠️The server must be running before a client connects for full functionality (e.g., the `ping` tool check).
Verified Safe
mcp-srver-starter-pack
by namurokuro
Integrate Blender 3D software with Cursor IDE via Model Context Protocol (MCP) to enable natural language control, create 3D scenes, query operation history, access code patterns, monitor agent activities, and leverage specialized AI agents.
Setup Requirements
- ⚠️Requires local Ollama LLM server running on http://localhost:11434.
- ⚠️Requires a custom Python addon running within Blender on http://localhost:9876 for communication.
- ⚠️Requires a local Stable Diffusion server (e.g., via Docker) for image/video generation tools.
- ⚠️Python dependencies must be installed via `pip install -r requirements.txt`.
Review Required
HomeMCP
by Blizarre
A personal home automation server that plays radio and sends Telegram messages, designed to be controlled by an AI assistant like Claude.
Setup Requirements
- ⚠️Requires Python 3.13+
- ⚠️Requires `ffmpeg` (specifically `ffplay` CLI) or another CLI music player installed locally
- ⚠️Requires `telegram_bot_token`, `my_chat_id`, `madame_chat_id`, `player_args`, `port`, and `host` to be configured in a `.env` file
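A sample `.env` covering the required keys; key names come from the listing, and the value formats (especially `player_args`) are assumptions:

```shell
# Sample .env for HomeMCP; all values are placeholders.
telegram_bot_token="123456:ABC-placeholder-token"
my_chat_id="111111111"
madame_chat_id="222222222"
player_args="ffplay -nodisp -autoexit"
port="8000"
host="127.0.0.1"
```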
Verified Safe
A multi-agent LangGraph application for generating Cognitive Behavioral Therapy (CBT) exercises, persisted using a PostgreSQL checkpointer.
Setup Requirements
- ⚠️Requires a running PostgreSQL database instance with appropriate credentials.
- ⚠️The PostgreSQL connection string is hardcoded in `main.py` and should be moved to an environment variable.
- ⚠️Requires Python 3.13 or higher, as specified in `pyproject.toml`.
- ⚠️Requires a Groq API Key (e.g., `GROQ_API_KEY`) to be set in environment variables for the `ChatGroq` LLM.
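The hardcoded-connection-string issue above is usually fixed by reading from the environment; the `DATABASE_URL` name and the fallback value here are our suggestions, not the project's:

```python
import os

# Read the PostgreSQL connection string from the environment instead of
# hardcoding it in main.py; fall back to a local placeholder for dev.
conn_string = os.environ.get(
    "CBT_DATABASE_URL",  # hypothetical variable name
    "postgresql://user:pass@localhost:5432/cbt",
)
print(conn_string)
```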
Review Required
mcp_hello_server
by akeredolukola
Minimal MCP-style server implemented with FastAPI for demonstrating and extending Model Context Protocol interactions.
Setup Requirements
- ⚠️Requires Python 3 environment setup (venv recommended)
- ⚠️Requires uvicorn and other dependencies from requirements.txt
Verified Safe
pix-bi-mcp
by aiqidao-dg
Backend server for PIX BI platform, integrating with its API and designed for deployment on Render.com.
Setup Requirements
- ⚠️Requires Git installed locally.
- ⚠️Requires a GitHub repository to be created (or existing) for pushing code.
- ⚠️Requires a PIX BI API Token for server functionality, which must be manually configured as an environment variable in Render.com.
- ⚠️Requires a Render.com account for deployment.
- ⚠️The 'publish_to_github.py' script requires a GitHub Personal Access Token or SSH keys configured for pushing to GitHub; note that the PAT method is insecure.