Stop Searching. Start Trusting.
The curated directory of MCP servers, vetted for security, efficiency, and quality.
Tired of the MCP "Marketplace" Chaos?
We built MCPScout.ai to solve the ecosystem's biggest pain points.
No Insecure Dumps
We manually analyze every server for basic security flaws.
Easy Setup
Our gotcha notes warn you about complex setups.
Avoid "Token Hogs"
We estimate each server's token footprint so your agents stay cost-effective.
Products, Not Demos
We filter out "Hello World" demos.
Vetted Servers (8,554)
checkmk-mcp-server
by ry-ops
Provides a Model Context Protocol (MCP) server for AI assistants to manage CheckMK monitoring systems, including hosts, folders, services, and configurations.
Setup Requirements
- ⚠️Requires Python 3.10 or higher.
- ⚠️Requires `uv` package manager, which the `setup.sh` attempts to install via `curl | sh`, a method that might be restricted or undesirable in some environments. Manual installation may be needed.
- ⚠️Requires access to an existing CheckMK instance (version 2.0+) and an automation user with appropriate permissions configured on that instance.
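Once `uv` and a CheckMK automation user are in place, wiring the server into an MCP client generally looks like the following. This is a hedged sketch: the command, entry-point name, and environment variable names are assumptions, so confirm them against the repo's README.

```json
{
  "mcpServers": {
    "checkmk": {
      "command": "uv",
      "args": ["run", "checkmk-mcp-server"],
      "env": {
        "CHECKMK_SERVER_URL": "https://monitoring.example.com",
        "CHECKMK_SITE": "mysite",
        "CHECKMK_USERNAME": "automation",
        "CHECKMK_PASSWORD": "<automation-secret>"
      }
    }
  }
}
```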
Verified Safe · View Analysis
tsai-s10-multi-agent-orchestration
by RoyRushreeta
Orchestrates a multi-agent loop to answer user queries by leveraging Google Gemini models, MCP tool servers, and a retrieval pipeline.
Setup Requirements
- ⚠️Requires `GEMINI_API_KEY` environment variable for Google Gemini API access (a paid service).
- ⚠️Requires a local Ollama server running with specific models installed (`nomic-embed-text`, `gemma3:12b`, `phi4:latest`, `qwen2.5:32b-instruct-q4_0`) for RAG and semantic chunking.
- ⚠️The `cwd` paths in `config/mcp_server_config.yaml` are hardcoded to absolute Windows paths (e.g., `C:/Users/Rushreeta Roy/...`) and MUST be updated to reflect the user's local environment.
- ⚠️Requires Python >= 3.11.
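Before first run, those `cwd` entries must be rewritten for your machine. A hedged sketch of what an edited `config/mcp_server_config.yaml` entry might look like; the surrounding key names are assumptions, and only the `cwd` field is confirmed above:

```yaml
servers:
  - name: rag_server          # illustrative name
    command: python
    args: ["server.py"]
    cwd: /home/you/tsai-s10-multi-agent-orchestration/servers  # was a hardcoded C:/Users/... path
```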
Review Required · View Analysis
asddsaadsadssad-mcp-server
by Traia-IO
Provides an MCP (Model Context Protocol) server to enable AI agents and LLMs to interact with the asddsaadsadssad API.
Setup Requirements
- ⚠️Requires Docker for recommended setup, utilizing `run_local_docker.sh` to build and run the container.
- ⚠️Requires configuration of D402 payment protocol environment variables (e.g., SERVER_ADDRESS, MCP_OPERATOR_PRIVATE_KEY, D402_FACILITATOR_URL), which may involve blockchain wallet setup and management.
- ⚠️Python 3.12 or newer is specified as a requirement in `pyproject.toml`.
- ⚠️The `run_local_docker.sh` script modifies `pyproject.toml` and attempts to locate a local `traia-iatp` project, which might cause friction if the path is not correctly configured or if `rsync` is unavailable.
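The D402 variables are typically supplied via a `.env` file. A hedged sketch follows: the variable names are the ones listed above, every value is a placeholder, and a real private key should come from a secret manager, never version control.

```bash
SERVER_ADDRESS=0xYourOperatorAddress
MCP_OPERATOR_PRIVATE_KEY=<load-from-a-secret-manager>
D402_FACILITATOR_URL=https://facilitator.example.com
```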
Verified Safe · View Analysis
mcp-partner
by Ericwyn
A Postman-like client for testing and interacting with Model Context Protocol (MCP) servers via Server-Sent Events (SSE) or Streamable HTTP.
Setup Requirements
- ⚠️Requires an existing Model Context Protocol (MCP) server to connect to.
- ⚠️Web browser CORS (Cross-Origin Resource Sharing) policies can prevent direct connection to local or different-domain MCP servers; users may need to enable the built-in proxy (if on Vercel), use a public proxy, or run a local proxy (like Pancors).
- ⚠️The built-in '/cors' proxy functionality is only available when deployed on Vercel; GitHub Pages deployments will need an external proxy.
Verified Safe · View Analysis
ai-assistant-public
by turmex
The AI Assistant provides a local-first conversational interface, enabling users to chat with LLMs directly in their browser via WebLLM (WebGPU) or locally using an Ollama server. It features intelligent model management, hardware-aware recommendations, and token optimization for efficient conversation.
Setup Requirements
- ⚠️Requires Ollama to be installed and running locally for server-based models (`ollama serve`).
- ⚠️Requires Chrome 113+ or Edge 113+ with WebGPU support for browser-based (WebLLM) inference.
- ⚠️Requires Python 3.8+.
Verified Safe · View Analysis
mcp-server-github-review
by mendesrobson
A server designed for managing or facilitating review processes related to GitHub repositories.
Review Required · View Analysis
graphiti-fastmcp
by donbr
Provides AI agents with persistent, temporally-aware knowledge graph memory through episodic ingestion, entity extraction, and semantic search.
Setup Requirements
- ⚠️Requires Docker and Docker Compose for the recommended default setup (or a running Neo4j/FalkorDB instance).
- ⚠️Requires API keys for LLM (e.g., OPENAI_API_KEY) and Embedder (e.g., OpenAI).
- ⚠️Requires Python 3.10+ and the `uv` package manager.
- ⚠️Tuning `SEMAPHORE_LIMIT` is critical to avoid LLM rate limits (429 errors) and manage API costs, as episode ingestion involves multiple LLM calls.
- ⚠️Database connection issues (e.g., 'Connection refused') are common setup friction points.
- ⚠️The default Neo4j password 'demodemo' in config files should be changed for production.
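A hedged `.env` sketch covering the points above. Values are illustrative, and variable names other than `SEMAPHORE_LIMIT` and `OPENAI_API_KEY` should be checked against the repo's sample config:

```bash
OPENAI_API_KEY=sk-your-key-here
SEMAPHORE_LIMIT=5            # lower this if ingestion hits 429 rate-limit errors
NEO4J_URI=bolt://localhost:7687
NEO4J_PASSWORD=change-me     # replace the shipped default 'demodemo'
```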
Verified Safe · View Analysis
mcp
by abhishekkhasgiwala
An MCP server that exposes Activiti BPM case and transaction data as AI-consumable tools for IDE-integrated, natural language case analysis.
Setup Requirements
- ⚠️Requires Docker to run the PostgreSQL database (`activiti-bpm-docker/docker-compose.yml`).
- ⚠️Requires Java 17 runtime environment.
- ⚠️Hardcoded database credentials make it unsafe for production environments without modification.
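One low-friction way to avoid baking credentials into the repo is a compose override that injects them from the environment. This is a sketch only; the service name and variable names are assumptions about this project's `docker-compose.yml`:

```yaml
# docker-compose.override.yml
services:
  postgres:
    environment:
      POSTGRES_USER: ${ACTIVITI_DB_USER}
      POSTGRES_PASSWORD: ${ACTIVITI_DB_PASSWORD}
```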
Review Required · View Analysis
MCP-server-built-in-Rust-to-interact-with-the-Ethereum-blockchain
by maybleeess-collab
Interacts with the Ethereum blockchain for balance queries, token price fetching, and Uniswap V3 swap simulations.
Setup Requirements
- ⚠️Requires an Ethereum RPC URL (e.g., from Alchemy or Infura), which may be a paid service or have rate limits.
- ⚠️Requires a private key for signing transactions/simulations, which must be kept secure.
- ⚠️Communication is via Stdio, not a typical HTTP API, which may require specific integration patterns.
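Because the server speaks newline-delimited JSON-RPC over stdin/stdout rather than HTTP, a client must spawn it as a subprocess and exchange framed messages. A minimal sketch of that pattern in Python; for illustration it pipes through `cat`, which echoes the request back the way a real server would reply on stdout, and the Rust binary path shown in the comment is an assumption.

```python
import json
import subprocess

# MCP servers speaking over stdio expect newline-delimited JSON-RPC messages.
# Build an 'initialize' request; field names follow the MCP handshake.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",
        "capabilities": {},
        "clientInfo": {"name": "demo-client", "version": "0.1"},
    },
}
line = json.dumps(request) + "\n"

# Stand-in process: `cat` echoes the message. For real use, swap in the
# built server binary, e.g. ["./target/release/eth-mcp-server"] (path assumed).
proc = subprocess.run(["cat"], input=line, capture_output=True, text=True)
reply = json.loads(proc.stdout)
print(reply["method"])  # -> initialize
```

A long-running client would keep the process open with `subprocess.Popen` and read replies line by line instead of using `subprocess.run`.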
Verified Safe · View Analysis
vibe-workflow-mcp
by Ch1nyzzz
Automates AI-assisted software development workflows by managing project initialization, documentation (PRD/GDD, tech stack, architecture, changelog), and progress tracking within a 'memory-bank' directory.
Setup Requirements
- ⚠️Requires Python 3.12+.
- ⚠️Requires `fastmcp` library (`pip install fastmcp`).
- ⚠️Requires manual configuration for AI tools (e.g., Claude Code/Desktop, Codex CLI) to register the MCP server with the correct file path.
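Registering the server with Claude Desktop usually means adding an entry like the following to `claude_desktop_config.json`. This is a hedged sketch; the entry name and the server's actual entry-point file are assumptions, so mirror the repo's README:

```json
{
  "mcpServers": {
    "vibe-workflow": {
      "command": "python",
      "args": ["/absolute/path/to/vibe-workflow-mcp/server.py"]
    }
  }
}
```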
Verified Safe · View Analysis
mcp-basics
by RealGustavoHerrera
A minimal example of building an MCP client and server in Python for connecting AI models to external tools and data, demonstrating AI agent capabilities.
Setup Requirements
- ⚠️Requires an OpenAI API key (paid service).
- ⚠️Requires manual creation of a `.env` file for the API key.
- ⚠️Requires a Python 3 environment and `pip install -r requirements.txt`.
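A common failure mode with `.env`-based setups is running the demo before the key is loaded. A minimal sketch (not the repo's actual code) of the fail-fast check such a client needs at startup; `OPENAI_API_KEY` is the standard variable name for the OpenAI SDK:

```python
def require_api_key(env) -> str:
    """Return the OpenAI key from an environment mapping, or fail loudly."""
    key = env.get("OPENAI_API_KEY")
    if not key:
        raise RuntimeError("OPENAI_API_KEY missing -- create a .env file and load it first")
    return key

# os.environ works here too; a plain dict keeps the example self-contained.
print(require_api_key({"OPENAI_API_KEY": "sk-placeholder"}))  # -> sk-placeholder
```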
Verified Safe · View Analysis
juju-mcp
by nsklikas
Enables LLMs to interact with and manage Juju environments by exposing CLI commands as Model Context Protocol (MCP) tools.
Setup Requirements
- ⚠️Requires Juju CLI installed and available in system PATH.
- ⚠️Requires Kubectl CLI installed and available in system PATH.
- ⚠️Requires being logged in to a Juju controller (juju login).
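The CLI requirements above can be verified before configuring the server. A small pre-flight sketch, assuming only that juju-mcp shells out to these two binaries:

```python
import shutil

def missing_tools(required=("juju", "kubectl")):
    """Return the required CLI tools that are not resolvable on PATH."""
    return [tool for tool in required if shutil.which(tool) is None]

gaps = missing_tools()
if gaps:
    print("install before use:", ", ".join(gaps))
else:
    print("juju and kubectl found on PATH")  # still run `juju login` yourself
```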