Stop Searching. Start Trusting.
The curated directory of MCP servers, vetted for security, efficiency, and quality.
Tired of the MCP "Marketplace" Chaos?
We built MCPScout.ai to solve the ecosystem's biggest pain points.
No Insecure Dumps
We manually analyze every server for basic security flaws.
Easy Setup
Our gotcha notes flag complicated setups before you install.
Avoid "Token Hogs"
We estimate each server's token cost so you can keep your agents cost-effective.
Products, Not Demos
We filter out "Hello World" demos.
Vetted Servers (8554)
SerialMemoryServer
by sblanchard
A temporal knowledge graph memory system for AI agents, enabling semantic search, multi-hop reasoning, and user persona tracking.
Setup Requirements
- ⚠️Docker & Docker Compose required for infrastructure (PostgreSQL, Redis, RabbitMQ)
- ⚠️Python 3.11+ required
- ⚠️Manual download of spaCy language model (en_core_web_sm)
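The spaCy model download noted above is a one-line command (run it in the same Python environment the server uses):

```shell
# Download spaCy's small English model required by the server.
python -m spacy download en_core_web_sm
```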
Verified Safe
SimpleGitWorkShop
by berserus
A simple repository likely intended for a Git workshop or tutorial.
Verified Safe
mem0-mcp-server
by MAnders333
Provides a Model Context Protocol (MCP) server for persistent memory, enabling AI agents to store and semantically search conversational context.
Setup Requirements
- ⚠️Requires an OpenAI API Key (or compatible LLM API Key via `LLM_API_KEY` or `OPENAI_API_KEY`) for embeddings and LLM calls, incurring paid service costs.
- ⚠️Requires Python 3.10 or higher.
- ⚠️By default, stores ChromaDB data locally in '/app/data' within the container; ensure appropriate volume mapping for data persistence when using Docker.
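One way to satisfy the volume-mapping note above is a Docker bind mount onto `/app/data`; the image name and API key below are placeholders, not values from the project:

```shell
# Bind-mount a host directory over /app/data so ChromaDB data
# survives container restarts. Image name is illustrative.
docker run -d \
  -e OPENAI_API_KEY="your-openai-key" \
  -v "$(pwd)/mem0-data:/app/data" \
  mem0-mcp-server
```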
Verified Safe
durion-chat
by louisburroughs
Provides a Moqui-hosted chat UI for interacting with a backend MCP server, facilitating operational support, diagnostics, and guided workflows.
Setup Requirements
- ⚠️Requires a running Moqui framework instance for deployment and hosting.
- ⚠️Requires a separate backend MCP server endpoint to function.
Review Required
google-search-mcp
by ShogoOkamoto
Enables Large Language Models (LLMs) to perform real-time web searches using Google Custom Search Engine.
Setup Requirements
- ⚠️Requires a Google Custom Search API key (from Google Cloud Console) and a Custom Search Engine ID (from Programmable Search Engine).
- ⚠️Google Custom Search API usage can incur costs depending on usage beyond the free tier.
- ⚠️Windows users integrating with Claude Desktop might need to specify the full path to their Python executable.
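For the Windows note above, Claude Desktop reads its MCP entries from `claude_desktop_config.json`; a sketch of such an entry follows, where the paths, the environment-variable names, and the key values are all placeholders rather than documented settings of this server:

```json
{
  "mcpServers": {
    "google-search": {
      "command": "C:\\Users\\me\\AppData\\Local\\Programs\\Python\\Python312\\python.exe",
      "args": ["C:\\mcp\\google-search-mcp\\server.py"],
      "env": {
        "GOOGLE_API_KEY": "your-api-key",
        "GOOGLE_CSE_ID": "your-cse-id"
      }
    }
  }
}
```

Pointing `command` at the full Python path avoids the PATH-resolution issues the warning describes.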
Verified Safe
MCP-SERVER
by annmalavet
This server acts as a Model Context Protocol (MCP) tool provider, exposing functionalities like email search, appointment creation, and email sending for consumption by an MCP client or AI agent.
Setup Requirements
- ⚠️Requires `RESEND_API_KEY` environment variable for email sending functionality.
- ⚠️Requires `EMAIL_SEARCH_API_URL` environment variable pointing to an external email search service.
- ⚠️Requires `APPOINTMENT_SERVICE_URL` environment variable pointing to an external appointment booking service.
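The three environment variables above can be set before launching the server; the values here are placeholders:

```shell
# Placeholder values — substitute your own key and service URLs.
export RESEND_API_KEY="your-resend-key"
export EMAIL_SEARCH_API_URL="https://example.com/email-search"
export APPOINTMENT_SERVICE_URL="https://example.com/appointments"
```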
Verified Safe
Lab-7-Weather-MCP-Server
by cayour
Provides real-time weather forecasts and alerts by integrating with the National Weather Service (NWS) API, primarily intended as a tool for an LLM.
Setup Requirements
- ⚠️Requires the 'mcp-server' Python library to be installed.
- ⚠️Requires the 'httpx' Python library for async HTTP requests.
- ⚠️Designed to be run as a sub-process by an MCP client that handles its standard input/output (stdio).
Verified Safe
A Model Context Protocol (MCP) server for basic Universal Scene Description (USD) operations and NVIDIA Omniverse integration.
Setup Requirements
- ⚠️Requires Pixar USD ('pxr') Python bindings, which can be complex to install and might need specific system configurations or pre-built binaries.
- ⚠️The project is explicitly marked as 'WARNING totally WIP!!!' and 'not working in any meaningful way!!!' in the README, indicating it is not production-ready and may be unstable.
- ⚠️Python 3.7+ is required.
Review Required
playwright-mcp-vercel
by HaolongChen
Provides a REST API endpoint to proxy requests to the Playwright Model Context Protocol (MCP) tool, enabling serverless browser automation.
Setup Requirements
- ⚠️Requires a Vercel account for serverless deployment.
- ⚠️During cold starts in the serverless environment, `npx playwright-mcp` will likely download `playwright-mcp` and potentially Playwright browser binaries, leading to increased latency and resource consumption.
- ⚠️The `package.json` specifies `main: "index.js"` and `start: "node index.js"`, but the server logic lives in `server.js`, which the Vercel configuration explicitly targets for deployment. This discrepancy could cause confusion during local development if `index.js` is a wrapper file that is not included in the repository.
Review Required
iwsdk-rag-mcp
by felixtrz
Provides a Model Context Protocol (MCP) server for AI assistants to perform semantic code search and understand the Immersive Web SDK (IWSDK) codebase using Retrieval-Augmented Generation (RAG) with vector embeddings.
Setup Requirements
- ⚠️Requires Node.js version 20.19.0 or higher.
- ⚠️pnpm is the recommended package manager for installation and development.
- ⚠️The first run involves downloading a ~420MB embedding model (`jinaai/jina-embeddings-v2-base-code`) from Hugging Face, which can cause a slow initial startup.
- ⚠️This is an MCP server designed for integration with tools like Claude Code/Desktop, not a standalone web service.
Verified Safe
MCP-Server-y-SAP-B1
by JeanmarcoLujan
Acts as a Smart Connector for SAP Business One (B1) by exposing a subset of its data via the Model Context Protocol (MCP), enabling integration with AI models or other MCP-compatible clients.
Setup Requirements
- ⚠️Requires an active SAP HANA database instance with specified schema and valid credentials.
- ⚠️Environment variables (HANA_HOST, HANA_PORT, HANA_USER, HANA_PASSWORD, HANA_DB_SCHEMA) must be configured for the application to connect to SAP HANA.
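The HANA connection variables listed above can be exported before starting the server; every value below is a placeholder to replace with your own instance details:

```shell
# Placeholder values — point these at your SAP HANA instance.
export HANA_HOST="hana.example.internal"
export HANA_PORT="30015"
export HANA_USER="your-hana-user"
export HANA_PASSWORD="change-me"
export HANA_DB_SCHEMA="your-b1-schema"
```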
- ⚠️The server is designed to be consumed by a Model Context Protocol (MCP) client or an LLM, not a generic HTTP client, requiring specific client-side integration.
Review Required
mcp-agent
by sanjanb
A production-ready HR Assistant that answers HR policy questions using RAG, performs resume screening, and tracks onboarding tasks.
Setup Requirements
- ⚠️Requires an OpenAI or Gemini API key (paid) for AI responses; otherwise it falls back to basic retrieval.
- ⚠️Requires Streamlit for the user interface.
- ⚠️Optional: Redis server for conversation caching; otherwise, an in-memory fallback is used.
- ⚠️Python 3.8+ required.
- ⚠️Embedding models (e.g., 'all-MiniLM-L6-v2') are downloaded on first use, which may cause an initial delay.