Stop Searching. Start Trusting.
The curated directory of MCP servers, vetted for security, efficiency, and quality.
Tired of the MCP "Marketplace" Chaos?
We built MCPScout.ai to solve the ecosystem's biggest pain points.
No Insecure Dumps
We manually analyze every server for basic security flaws.
Easy Setup
Our gotcha notes warn you about complex setups.
Avoid "Token Hogs"
We estimate each server's token overhead so your agents stay cost-effective.
Products, Not Demos
We filter out "Hello World" demos.
Vetted Servers (7756)
MCPControl
by claude-did-this
A Windows control server for the Model Context Protocol, enabling AI models to programmatically control system operations such as mouse, keyboard, window management, and screen capture.
Setup Requirements
- ⚠️Supports Windows only.
- ⚠️The `keysender` provider requires native build tools (VC++ workload, Python for node-gyp) to compile native modules.
- ⚠️The `autohotkey` provider requires AutoHotkey v2.0+ to be installed on the system.
- ⚠️Optimal click accuracy is achieved in a virtual machine at 1280x720 resolution, suggesting potential issues at other resolutions or in multi-monitor setups.
- ⚠️HTTPS/TLS certificates are mandatory for secure remote access in production deployments.
Review Required · View Analysis
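If you want to smoke-test MCPControl from a client before handing it to an agent, a minimal sketch with the official Python `mcp` SDK looks like the one below. The npm package name passed to `npx` is an assumption for illustration; check the project's README for the exact launch command.

```python
# Minimal sketch: launch MCPControl over stdio and list its tools.
# Assumes the server is runnable via npx; "mcp-control" is a placeholder
# package name, not confirmed against the project's README.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    params = StdioServerParameters(
        command="npx",
        args=["-y", "mcp-control"],  # hypothetical package name
    )
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])


asyncio.run(main())
```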
gemini-cli-desktop
by Piebald-AI
A cross-platform desktop and web interface for AI coding agents (Gemini CLI, Qwen Code, LLxprt Code) offering visual tool confirmation, real-time thought processes, code diff viewing, chat history management, and file system interaction.
Setup Requirements
- ⚠️Requires Rust, Node.js, pnpm, and the `just` task runner for building from source.
- ⚠️Specific system dependencies are required for Linux operating systems.
- ⚠️Requires at least one backend CLI installed (Gemini CLI, Qwen Code, or an LLxprt-compatible provider) and/or API keys for your chosen AI providers.
- ⚠️The application itself does not include the AI models; it acts as an interface to external AI services.
Verified Safe · View Analysis
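Because building from source needs several toolchains at once, a quick preflight script can save a failed build. This is only an illustrative check; the `gemini` binary name assumes the Gemini CLI backend, so trim the list to the tools and backends you actually use.

```python
# Illustrative preflight check for building gemini-cli-desktop from source.
# The tool list mirrors the requirements above; adjust it to your setup.
import shutil

REQUIRED_TOOLS = ["rustc", "cargo", "node", "pnpm", "just", "gemini"]

missing = [tool for tool in REQUIRED_TOOLS if shutil.which(tool) is None]
if missing:
    print("Missing prerequisites:", ", ".join(missing))
else:
    print("All listed prerequisites found on PATH.")
```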
mcpcan
by Kymo-MCP
An open-source platform for unified management, deployment, monitoring, and protocol conversion of Model Context Protocol (MCP) services through a modern web interface.
Setup Requirements
- ⚠️Requires a running MySQL 8.0+ database.
- ⚠️Requires a running Redis instance.
- ⚠️Requires either a Kubernetes cluster or Docker daemon for service orchestration.
- ⚠️Default admin credentials ('admin' / 'admin123') are set and must be changed immediately post-installation.
Verified Safe · View Analysis
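Before installing mcpcan, it is worth confirming that the MySQL and Redis instances it expects are reachable. The sketch below is a hedged preflight check; hostnames, ports, and credentials are placeholders for your own values.

```python
# Hedged preflight check: confirm MySQL and Redis are reachable before
# installing mcpcan. Connection details below are placeholders.
import pymysql
import redis

conn = pymysql.connect(host="127.0.0.1", port=3306,
                       user="root", password="change-me")
with conn.cursor() as cur:
    cur.execute("SELECT VERSION()")
    print("MySQL version:", cur.fetchone()[0])  # should report 8.0+
conn.close()

r = redis.Redis(host="127.0.0.1", port=6379)
print("Redis ping:", r.ping())
```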
asya
by deliveryhero
An asynchronous, event-driven microservices platform enabling scalable, composable actor-based processing with JSON-RPC 2.0 communication, PostgreSQL job storage, and SSE streaming.
Setup Requirements
- ⚠️Full deployment requires Kubernetes (Kind, EKS, etc.) and Helm for infrastructure management.
- ⚠️Requires external dependencies for persistence and messaging: PostgreSQL (for gateway job store, with Sqitch migrations) and either RabbitMQ or AWS SQS.
- ⚠️Local development/running requires both Go (for Gateway/Sidecar) and Python (for Actor Runtime) environments set up.
Verified Safe · View Analysis
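Because the gateway speaks JSON-RPC 2.0, a first request is just a standard envelope over HTTP. The endpoint path and method name below are assumptions for illustration, not asya's documented API; consult the project docs for the real actor and method names.

```python
# Hypothetical JSON-RPC 2.0 call to an asya gateway. URL and method name
# are placeholders; only the envelope shape is standard JSON-RPC 2.0.
import requests

payload = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "submit_job",  # hypothetical method name
    "params": {"actor": "example", "input": {"text": "hello"}},
}
resp = requests.post("http://localhost:8080/rpc", json=payload, timeout=10)
resp.raise_for_status()
print(resp.json())
```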
apple-rag-mcp
by BingoWon
Provides a comprehensive RAG (Retrieval-Augmented Generation) server for AI agents to search and retrieve content from Apple's developer documentation and WWDC transcripts.
Setup Requirements
- ⚠️Requires an external PostgreSQL database with pgvector extension for RAG data storage.
- ⚠️Requires a Cloudflare D1 database for authentication, rate limiting, and logging, configured via Wrangler bindings.
- ⚠️Requires a DeepInfra API key (a paid service) for embedding generation and AI reranking.
- ⚠️Designed to be deployed as a Cloudflare Worker, requiring a Cloudflare account and Wrangler CLI setup.
Verified Safe · View Analysis
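The pgvector requirement is the easiest one to get wrong, so a quick check against the target database helps. A hedged sketch using psycopg, with a placeholder DSN:

```python
# Check whether the target PostgreSQL database has pgvector installed.
# The DSN is a placeholder; use your own connection string.
import psycopg

DSN = "postgresql://user:password@localhost:5432/apple_rag"  # placeholder

with psycopg.connect(DSN) as conn:
    row = conn.execute(
        "SELECT extversion FROM pg_extension WHERE extname = 'vector'"
    ).fetchone()
    if row:
        print("pgvector installed, version:", row[0])
    else:
        print("pgvector extension is not installed in this database")
```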
opencti_mcp_server
by CooperCyberCoffee
Connects Claude Desktop to OpenCTI's threat intelligence platform for AI-augmented threat intelligence analysis and reporting, enabling natural language queries and context-aware responses.
Setup Requirements
- ⚠️Requires Claude Desktop for MCP integration.
- ⚠️Requires an OpenCTI 6.x instance.
- ⚠️Requires a Claude Pro subscription (when using the cloud LLM) or a running local LLM (e.g., via Ollama) for AI analysis.
Verified Safe · View Analysis
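A token that cannot reach OpenCTI is the most common failure mode, so verify connectivity before wiring up the MCP server. The sketch below assumes the conventional `OPENCTI_URL`/`OPENCTI_TOKEN` environment variables and OpenCTI's GraphQL `about` query; the server's own configuration keys may differ.

```python
# Hedged connectivity check against an OpenCTI 6.x instance.
import os

import requests

url = os.environ["OPENCTI_URL"].rstrip("/") + "/graphql"
token = os.environ["OPENCTI_TOKEN"]

resp = requests.post(
    url,
    json={"query": "{ about { version } }"},
    headers={"Authorization": f"Bearer {token}"},
    timeout=15,
)
resp.raise_for_status()
print("OpenCTI version:", resp.json()["data"]["about"]["version"])
```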
skilder
by skilder-ai
Skilder is an infrastructure layer for AI agent tooling, providing a private tool registry with embedded runtimes that works across any agent environment.
Setup Requirements
- ⚠️Requires Docker to run the entire platform.
- ⚠️A one-time setup step (`npm run setup-local`) generates cryptographic keys and requires Node.js to be installed locally.
- ⚠️Local development requires Node.js v22+.
Verified Safe · View Analysis
hf-mcp-server
by huggingface
Connects LLMs to the Hugging Face Hub and Gradio AI applications, enabling access to models, datasets, documentation, and job management.
Setup Requirements
- ⚠️Requires a Hugging Face account and personal access token (`HF_TOKEN`) for most functionalities and higher API rate limits.
- ⚠️Requires a Node.js runtime (v18+ recommended) to run the server.
- ⚠️Requires active internet connectivity to interact with the Hugging Face Hub and remote Gradio Spaces.
Review Required · View Analysis
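Since most functionality hinges on a valid `HF_TOKEN`, a short check with the official `huggingface_hub` client confirms the token before you start the server:

```python
# Verify that HF_TOKEN authenticates against the Hugging Face Hub.
import os

from huggingface_hub import whoami

info = whoami(token=os.environ["HF_TOKEN"])
print("Authenticated to the Hugging Face Hub as:", info["name"])
```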
mcp-foundry
by azure-ai-foundry
A Model Context Protocol (MCP) server for Azure AI Foundry, providing a unified set of tools for interacting with Azure AI models, knowledge bases (AI Search), evaluation services, and finetuning operations.
Setup Requirements
- ⚠️Requires `uv` (a fast Python package and project manager) for execution.
- ⚠️Requires Azure CLI to be installed and configured for Azure resource management tools.
- ⚠️Depends on multiple Azure cloud services (Azure AI Search, Azure OpenAI, Azure AI Project, Azure Cognitive Services), which require active subscriptions and API keys/credentials.
- ⚠️Requires Python 3.10 or higher.
- ⚠️Setting the `SWAGGER_PATH` environment variable is required for dynamic Swagger tool registration.
Review Required · View Analysis
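Because the tools lean on Azure credentials, confirm that a credential resolves in your environment before launching the server. This is a generic check, not part of mcp-foundry itself; `DefaultAzureCredential` falls back across Azure CLI login, environment variables, and managed identity.

```python
# Generic Azure credential preflight using azure-identity.
from azure.identity import DefaultAzureCredential

credential = DefaultAzureCredential()
token = credential.get_token("https://management.azure.com/.default")
print("Acquired ARM token, expires at (epoch):", token.expires_on)
```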
2ly
by AlpinAI
An infrastructure layer for AI agent tooling, providing a private tool registry and embedded runtimes for integrating with various agent frameworks and custom tools.
Setup Requirements
- ⚠️Requires Docker for deployment (local or production).
- ⚠️Node.js v22+ is required for local development.
- ⚠️Requires initial generation of cryptographic keys via `npm run setup-local` (or `sh ./generate-keys.sh`) for local development, which are then stored in `dev/.docker-keys/`.
- ⚠️Relies on NATS and Dgraph as core infrastructure components, which are managed via Docker Compose.
Verified Safe · View Analysis
KiCAD-MCP-Server
by mixelpixx
Enables AI assistants to interact with KiCAD for PCB design automation, providing comprehensive tool schemas and real-time project state access for intelligent PCB design workflows.
Setup Requirements
- ⚠️Requires KiCAD 9.0 or higher, with its Python module (`pcbnew`) installed and accessible in the Python environment.
- ⚠️Requires proper configuration of `PYTHONPATH` in the MCP client's environment variables to locate KiCAD's Python modules.
- ⚠️For real-time UI synchronization, the KiCAD 9.0+ IPC API Server must be manually enabled in KiCAD preferences (`Preferences > Plugins > Enable IPC API Server`), and KiCAD must be running with a board open.
- ⚠️Node.js 18+ and Python 3.10+ are required, with specific versions of `kicad-python` and `kicad-skip`.
Verified Safe · View Analysis
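The `PYTHONPATH` requirement is easiest to verify directly: run a small import check in the same environment (and with the same `PYTHONPATH`) that you configure for the MCP client.

```python
# Confirm that KiCAD's pcbnew module is importable in this environment.
import sys

try:
    import pcbnew  # KiCAD's Python API
except ImportError:
    sys.exit("pcbnew not importable; add KiCAD's Python site-packages to PYTHONPATH")

print("KiCAD build:", pcbnew.GetBuildVersion())
```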
1mcp
by buremba
Orchestrates AI agent tool calls by executing JavaScript/TypeScript code in a WASM sandbox, reducing LLM context bloat and managing security policies.
Setup Requirements
- ⚠️Requires Node.js version >=22.0.0.
- ⚠️Initial setup requires network access to download the WASM runtimes (QuickJS/Pyodide) from a CDN.
- ⚠️Python dependencies must be 'wheel-only' (no native extensions or sdists) and compatible with Pyodide.
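The 'wheel-only' constraint means a dependency is usable only if PyPI ships a pure-Python wheel for it (or Pyodide bundles a prebuilt emscripten build). An illustrative way to check a package against PyPI's public JSON API; the package names are just examples:

```python
# Illustrative check for the "wheel-only" constraint: does the latest PyPI
# release of a package include a pure-Python (py3-none-any) wheel?
import requests


def has_pure_wheel(package: str) -> bool:
    data = requests.get(f"https://pypi.org/pypi/{package}/json", timeout=10).json()
    return any(f["filename"].endswith("py3-none-any.whl") for f in data["urls"])


for pkg in ["requests", "numpy"]:
    print(pkg, "pure wheel on PyPI:", has_pure_wheel(pkg))
```

Packages without a pure wheel (such as numpy here) only work if Pyodide ships its own build of them.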