Stop Searching. Start Trusting.
The curated directory of MCP servers, vetted for security, efficiency, and quality.
Tired of the MCP "Marketplace" Chaos?
We built MCPScout.ai to solve the ecosystem's biggest pain points.
No Insecure Dumps
We manually analyze every server for basic security flaws.
Easy Setup
Our gotcha notes warn you about complex setups.
Avoid "Token Hogs"
We estimate token costs so you can build cost-effective agents.
Products, Not Demos
We filter out "Hello World" demos.
Vetted Servers (1547)
deep-research
by u14app
An AI-powered research assistant that generates comprehensive reports, leverages various LLMs and web search engines, and offers integration as a SaaS or MCP service.
Setup Requirements
- ⚠️Requires API Keys for multiple Large Language Models (LLMs) and search providers, most of which are paid services.
- ⚠️Requires `ACCESS_PASSWORD` environment variable for authentication, particularly in proxy mode.
- ⚠️If using Ollama, a local Ollama instance must be running at `http://localhost:11434`.
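If you plan to use the Ollama backend, it is worth confirming the local instance is reachable before wiring it into the server. A minimal sketch, assuming the default URL noted above; `/api/tags` is Ollama's standard list-models endpoint.

```python
import requests

OLLAMA_URL = "http://localhost:11434"

try:
    # /api/tags lists the models already pulled into the local Ollama instance.
    resp = requests.get(f"{OLLAMA_URL}/api/tags", timeout=5)
    resp.raise_for_status()
    models = [m["name"] for m in resp.json().get("models", [])]
    print(f"Ollama is up at {OLLAMA_URL}; models available: {models or 'none pulled yet'}")
except requests.RequestException as exc:
    print(f"Ollama not reachable at {OLLAMA_URL}: {exc}")
```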
Review Required
mcp
by MicrosoftDocs
Provides AI assistants with direct, real-time access to official Microsoft Learn documentation to prevent hallucinations and retrieve accurate technical information.
Setup Requirements
- ⚠️Requires an MCP-compatible IDE or client (e.g., VS Code, Claude Desktop, Cursor) for integration.
- ⚠️The remote endpoint does not support direct browser access; it returns a '405 Method Not Allowed' error if accessed manually.
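The 405 response is easy to misread as an outage. A rough sketch of what a manual check looks like; the endpoint URL below is a placeholder, so substitute the actual remote endpoint from the project's documentation.

```python
import requests

# Placeholder URL: substitute the real Microsoft Learn MCP endpoint from the project docs.
ENDPOINT = "https://example.com/api/mcp"

resp = requests.get(ENDPOINT, timeout=10)
# A plain GET from a browser or script is expected to return 405 Method Not Allowed;
# the endpoint only speaks MCP to a compatible client, so this is not an error in your setup.
print(resp.status_code)
```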
Verified Safe
genai-toolbox
by googleapis
MCP Toolbox for Databases is an open-source server enabling AI agents to interact with various databases through defined tools, simplifying development, improving performance, and enhancing security for Gen AI applications.
Setup Requirements
- ⚠️Requires setup and configuration of specific database instances (e.g., PostgreSQL, MySQL, BigQuery, MongoDB, Neo4j) to be accessible.
- ⚠️Configuration relies on a 'tools.yaml' file, which can contain sensitive credentials if not managed via secret managers (e.g., Google Cloud Secret Manager for Cloud Run deployments).
- ⚠️Many tools, especially 'execute-sql' types and those with 'templateParameters', expose direct database interaction that can lead to injection vulnerabilities if not carefully controlled and reviewed.
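One way to guard against the credential risk called out above is a quick pre-deploy scan of `tools.yaml` for values that look like inline secrets rather than references resolved at runtime. This is a hypothetical helper, not part of the toolbox itself; the key names it looks for are assumptions.

```python
import sys
import yaml  # pip install pyyaml

SENSITIVE_KEYS = {"password", "secret", "token", "api_key"}  # assumed naming conventions

def find_inline_secrets(node, path=""):
    """Recursively flag sensitive-looking keys whose values are literal strings
    rather than environment/secret-manager references (e.g. ${DB_PASSWORD})."""
    hits = []
    if isinstance(node, dict):
        for key, value in node.items():
            child = f"{path}.{key}" if path else key
            if key.lower() in SENSITIVE_KEYS and isinstance(value, str) and not value.startswith("${"):
                hits.append(child)
            hits.extend(find_inline_secrets(value, child))
    elif isinstance(node, list):
        for i, item in enumerate(node):
            hits.extend(find_inline_secrets(item, f"{path}[{i}]"))
    return hits

with open("tools.yaml") as f:
    config = yaml.safe_load(f)

for hit in find_inline_secrets(config):
    print(f"WARNING: possible inline credential at {hit}", file=sys.stderr)
```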
Verified Safe
awesome-mcp-servers
by punkpeye
This repository serves as a curated directory for discovering a wide range of Model Context Protocol (MCP) servers, designed to extend AI capabilities by enabling interaction with local and remote resources.
Setup Requirements
- ⚠️To inform an LLM about the Model Context Protocol and how to utilize the servers listed, users must manually provide external documentation (e.g., `https://modelcontextprotocol.io/llms-full.txt`) to their AI client.
Verified Safe
activepieces
by activepieces
An open-source, extensible AI automation platform designed as a Zapier alternative, supporting low-code/no-code workflows and integration with Large Language Models (LLMs) through a type-safe TypeScript framework.
Setup Requirements
- ⚠️Requires Docker for production deployment, or Node.js v18/v20 and Bun for local development.
- ⚠️Production deployments (e.g., via Pulumi) require an AWS account, configured Route 53 for custom domains, and familiarity with AWS ECS Fargate, RDS (PostgreSQL), and ElastiCache (Redis).
- ⚠️CRITICAL: The default `AP_EXECUTION_MODE` in Pulumi deployment is `UNSANDBOXED`, enabling arbitrary code execution in user flows. For secure operation, it MUST be explicitly set to `SANDBOX_CODE_ONLY` or `SANDBOX_PROCESS`.
- ⚠️A hardcoded `POSTGRES_PASSWORD` exists in `docker-compose.dev.yml`; this should be replaced with a secure environment variable for any shared development setup.
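Given the severity of the execution-mode and password notes above, a small preflight check before deploying can catch both. This is a hypothetical sketch; the acceptable execution modes come from the warning above, and the password check only verifies that a value is supplied via the environment rather than the hardcoded dev default.

```python
import os
import sys

errors = []

# AP_EXECUTION_MODE defaults to UNSANDBOXED in the Pulumi deployment; require an explicit sandbox mode.
mode = os.environ.get("AP_EXECUTION_MODE", "UNSANDBOXED")
if mode not in {"SANDBOX_CODE_ONLY", "SANDBOX_PROCESS"}:
    errors.append(f"AP_EXECUTION_MODE is '{mode}'; set it to SANDBOX_CODE_ONLY or SANDBOX_PROCESS.")

# Require POSTGRES_PASSWORD to come from the environment instead of docker-compose.dev.yml.
if not os.environ.get("POSTGRES_PASSWORD"):
    errors.append("POSTGRES_PASSWORD is not set; do not rely on the hardcoded development value.")

if errors:
    print("\n".join(errors), file=sys.stderr)
    sys.exit(1)
print("Preflight checks passed.")
```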
Review Required
lemonade
by lemonade-sdk
The Lemonade C++ Server provides a lightweight, high-performance HTTP API for local Large Language Model (LLM) inference and model management, leveraging hardware accelerators like AMD Ryzen AI NPU, integrated GPUs, and discrete GPUs.
Setup Requirements
- ⚠️Requires a C++ development environment (e.g., Visual Studio 2019+ on Windows, build-essential on Linux, Xcode tools on macOS) and CMake to build from source.
- ⚠️Initial setup requires an active internet connection to download build dependencies, `ryzenai-server`, `llama.cpp` binaries, `whisper.cpp` binaries, and LLM models from GitHub/Hugging Face.
- ⚠️Optimal performance, especially for NPU acceleration with Ryzen AI, depends on having up-to-date and compatible GPU/NPU drivers installed on the host system.
Verified Safe
terraform-mcp-server
by hashicorp
The Terraform MCP Server provides seamless integration with Terraform Registry APIs and HCP Terraform/Terraform Enterprise, enabling AI assistants (LLMs) to generate high-quality Terraform code and automate IaC workflows.
Setup Requirements
- ⚠️Requires Docker to be installed and running.
- ⚠️Requires an AI assistant (LLM) that supports the Model Context Protocol (MCP) to interact with the server.
- ⚠️Full functionality, especially with HCP Terraform/Terraform Enterprise, requires a valid TFE_TOKEN and TFE_ADDRESS to be configured (typically via environment variables).
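A minimal sketch of launching the server with the token passed through the environment rather than baked into a config file. The container image name is an assumption, so confirm it against the project's README.

```python
import os
import subprocess

# Image name is an assumption; confirm it against the project's README.
IMAGE = "hashicorp/terraform-mcp-server"

env = {
    **os.environ,
    "TFE_ADDRESS": "https://app.terraform.io",  # or your Terraform Enterprise URL
    "TFE_TOKEN": os.environ["TFE_TOKEN"],       # fail fast if the token is missing
}

# Passing only the variable names with -e forwards values from the launcher's environment
# without echoing secrets into shell history or the container command line.
subprocess.run(
    ["docker", "run", "-i", "--rm", "-e", "TFE_TOKEN", "-e", "TFE_ADDRESS", IMAGE],
    env=env,
    check=True,
)
```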
Verified Safe
mcp-server-cloudflare
by cloudflare
Enable Large Language Models (LLMs) to interact with and automate tasks across various Cloudflare services through a standardized Model Context Protocol (MCP).
Setup Requirements
- ⚠️Requires a Cloudflare account for deployment and API access.
- ⚠️Requires `wrangler` CLI for deployment and local development.
- ⚠️Setting up OAuth involves creating Cloudflare API tokens with specific scopes, KV namespaces, and secrets.
- ⚠️Some advanced features may require a paid Cloudflare Workers plan.
- ⚠️Local development for external contributors requires setting `DEV_DISABLE_OAUTH=true` and providing a `DEV_CLOUDFLARE_API_TOKEN` (global API token with broad permissions, which is sensitive) or setting up OAuth credentials.
- ⚠️The project is a monorepo; each server (app) is a distinct deployable unit, and there is no single command that runs the entire monorepo as one application.
Review Required
claude-flow
by ruvnet
Orchestrates AI agents (Claude) for development workflows, including code generation, testing, analysis, research, and project migration, with MLOps capabilities.
Setup Requirements
- ⚠️Requires Node.js (>=14.0.0) and npm for core system and CLI.
- ⚠️Requires Python (3.x) with ML libraries (e.g., pandas, numpy, scikit-learn, torch) for MLE-STAR agents.
- ⚠️Requires Claude Code CLI (`claude`) to be installed and configured with an Anthropic API Key (Paid service).
- ⚠️Requires GitHub CLI (`gh`) for GitHub integration features.
- ⚠️Utilizes SQLite database, often requiring specific `better-sqlite3` native bindings.
- ⚠️Relies heavily on environment variables (e.g., `ANTHROPIC_API_KEY`, `CLAUDE_FLOW_ENV`, `GITHUB_TOKEN`).
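With this many external tools in play, a quick dependency check saves a failed first run. A hypothetical sketch; the binary and variable names simply mirror the requirements listed above.

```python
import os
import shutil

# External CLIs listed in the setup requirements above.
required_binaries = ["node", "npm", "python3", "claude", "gh"]
missing = [name for name in required_binaries if shutil.which(name) is None]

if missing:
    print(f"Missing executables on PATH: {', '.join(missing)}")

# Environment variables the orchestrator relies on.
for var in ("ANTHROPIC_API_KEY", "GITHUB_TOKEN"):
    if not os.environ.get(var):
        print(f"Environment variable {var} is not set.")
```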
Review Required
UltraRAG
by OpenBMB
A low-code RAG framework for researchers to build and iterate on complex multi-stage, multimodal Retrieval-Augmented Generation (RAG) pipelines using a Model Context Protocol (MCP) architecture.
Setup Requirements
- ⚠️Requires Node.js (version 20+) for launching remote MCP servers (`npx mcp-remote`).
- ⚠️Many functionalities (e.g., web search, OpenAI LLMs) require external API keys (Exa, Tavily, ZhipuAI, OpenAI) which incur usage costs.
- ⚠️Leverages GPU hardware extensively for performance (e.g., vLLM, FAISS-GPU, sentence-transformers, infinity-emb); specific CUDA versions (e.g., CUDA 12.x) may be required depending on chosen dependencies.
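Because several components assume a working GPU stack, it is worth confirming CUDA visibility before building a pipeline. A minimal sketch using PyTorch, which the GPU-backed dependencies above typically pull in.

```python
import torch

if torch.cuda.is_available():
    device = torch.cuda.get_device_name(0)
    print(f"CUDA {torch.version.cuda} available on {device}")
else:
    print("No CUDA device visible; GPU-backed retrievers and vLLM serving will not work.")
```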
Verified Safe
mcp
by awslabs
Enables AI assistants to interact with AWS DocumentDB databases by providing tools for connection management, database/collection operations, document CRUD, aggregation, schema analysis, and query planning.
Setup Requirements
- ⚠️Requires Python 3.10+ and 'uv' package manager for installation and local development.
- ⚠️Requires network access to an AWS DocumentDB cluster.
- ⚠️Requires a valid SSL/TLS certificate ('global-bundle.pem') for DocumentDB connections if TLS is enabled.
- ⚠️Requires AWS credentials with appropriate permissions to access DocumentDB.
- ⚠️Manual configuration of the MCP client's JSON settings file is needed for local server or 'uvx' package usage.
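The TLS bundle requirement in particular trips people up. A minimal connectivity check with `pymongo` (DocumentDB is MongoDB-compatible) looks roughly like this; the hostname and credentials are placeholders, ideally sourced from AWS Secrets Manager.

```python
from pymongo import MongoClient

# Placeholders: substitute your cluster endpoint and credentials.
client = MongoClient(
    "mongodb://<user>:<password>@<cluster-endpoint>:27017/",
    tls=True,
    tlsCAFile="global-bundle.pem",  # the DocumentDB certificate bundle noted above
    retryWrites=False,              # DocumentDB does not support retryable writes
)

# 'ping' confirms connectivity and credentials without touching any data.
client.admin.command("ping")
print(client.list_database_names())
```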
Verified Safe
osaurus
by dinoki-ai
Osaurus is a native macOS LLM server that runs local language models behind OpenAI- and Ollama-compatible APIs, with tool calling and a plugin ecosystem for AI agents.
Setup Requirements
- ⚠️Requires macOS 15.5+ and Apple Silicon (M1 or newer) for native execution and optimized performance.
- ⚠️Users must manually download LLM models via the application's UI or CLI after installation.
- ⚠️Integration with external MCP clients (e.g., Cursor) requires adding specific JSON configuration to the client.
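Since the server exposes OpenAI-compatible endpoints, any OpenAI client can talk to it once a model is downloaded. A minimal sketch; the port and model name are assumptions, so use the values the Osaurus UI or CLI reports.

```python
from openai import OpenAI

# Port and model name are assumptions; use the values shown in the Osaurus UI/CLI.
client = OpenAI(base_url="http://localhost:1337/v1", api_key="not-needed-for-local")

response = client.chat.completions.create(
    model="<model-you-downloaded>",
    messages=[{"role": "user", "content": "Say hello from a local model."}],
)
print(response.choices[0].message.content)
```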