Stop Searching. Start Trusting.
The curated directory of MCP servers, vetted for security, efficiency, and quality.
Tired of the MCP "Marketplace" Chaos?
We built MCPScout.ai to solve the ecosystem's biggest pain points.
No Insecure Dumps
We manually analyze every server for basic security flaws.
Easy Setup
Our gotcha notes warn you about complex setups.
Avoid "Token Hogs"
We estimate token costs so you can build cost-effective agents.
Products, Not Demos
We filter out "Hello World" demos.
Vetted Servers (9,120)
UltraRAG
by OpenBMB
An open-source RAG framework for building, experimenting, and evaluating complex Retrieval-Augmented Generation (RAG) pipelines with low-code YAML configurations and native multimodal support.
Setup Requirements
- ⚠️Requires GPUs for optimal performance, especially for vLLM, FAISS-GPU, and certain embedding models; `gpu_ids` is frequently configured (see the sanity-check sketch after this list).
- ⚠️Requires various API keys (e.g., OPENAI_API_KEY, EXA_API_KEY, TAVILY_API_KEY, ZHIPUAI_API_KEY) for accessing external LLM and search services, which are typically paid.
- ⚠️External system dependencies include Node.js (version >=20 is checked) for remote MCP servers and the `mineru` executable for advanced document parsing.
- ⚠️FAISS (faiss-cpu or faiss-gpu-cu12, specific to CUDA version) is an optional dependency for the retriever backend.
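Before configuring `gpu_ids`, a quick sanity check can confirm which devices the framework will actually see. A minimal sketch, assuming a CUDA-enabled PyTorch build is installed:

```python
import torch  # assumes a CUDA-enabled PyTorch build is installed

# Lists the devices available for `gpu_ids`; warns if GPU-backed paths will not work.
if torch.cuda.is_available():
    for i in range(torch.cuda.device_count()):
        print(f"gpu_id {i}: {torch.cuda.get_device_name(i)}")
else:
    print("No CUDA device visible; vLLM / FAISS-GPU components will fall back to CPU or fail.")
```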
Verified Safe · View Analysis
mcp
by awslabs
Enables AI assistants to interact with AWS DocumentDB databases, providing tools for connection management, database/collection operations, document querying, aggregation pipelines, query planning, and schema analysis. It acts as a bridge for safe and efficient database operations through the Model Context Protocol (MCP).
Setup Requirements
- ⚠️Requires network access to the DocumentDB cluster (e.g., via VPC peering, security group rules).
- ⚠️Requires an SSL/TLS certificate (typically `global-bundle.pem`) for TLS-enabled DocumentDB clusters.
- ⚠️The DocumentDB connection string must explicitly include `retryWrites=false` (see the sketch after this list).
- ⚠️Requires the `uv` Python package manager for installation (`uvx` command in examples).
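To illustrate the connection-string and TLS notes above, here is a hedged sketch of a DocumentDB URI with `tlsCAFile=global-bundle.pem` and the mandatory `retryWrites=false`; host and credentials are placeholders, and your cluster may need additional parameters:

```python
from urllib.parse import quote_plus

# Placeholder credentials and host; substitute your own cluster details.
user = quote_plus("myuser")
password = quote_plus("mypassword")
host = "my-cluster.cluster-xxxxxxxx.us-east-1.docdb.amazonaws.com"

# TLS-enabled DocumentDB URI: note the CA bundle and the mandatory retryWrites=false.
connection_string = (
    f"mongodb://{user}:{password}@{host}:27017/"
    "?tls=true&tlsCAFile=global-bundle.pem&retryWrites=false"
)
print(connection_string)
```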
Review Required · View Analysis
osaurus
by dinoki-ai
Osaurus is an AI edge runtime for macOS, enabling users to run local and cloud AI models, orchestrate tools via the Model Context Protocol (MCP), and power AI applications and workflows on Apple Silicon.
Setup Requirements
- ⚠️Requires macOS 15.5+ and Apple Silicon (M1 or newer) due to MLX Runtime optimization.
- ⚠️Initial setup involves downloading Whisper models for voice input and LLM models from Hugging Face, requiring an internet connection and several gigabytes of disk space.
- ⚠️Voice input (WhisperKit) and Transcription Mode require granting specific macOS permissions: Microphone, Screen Recording (for system audio), and Accessibility (for global dictation).
Verified Safe · View Analysis
5ire
by nanbingxyz
A desktop AI assistant client that integrates with various LLM providers and connects to Model Context Protocol (MCP) servers for extended tool-use and knowledge base capabilities.
Setup Requirements
- ⚠️Requires Python, Node.js, and the `uv` Python package manager if local MCP servers (for the tools feature) are to be used.
- ⚠️Requires API keys/credentials for external LLM providers (e.g., OpenAI, Anthropic, Google, Mistral, Grok), incurring monetary costs for API usage.
- ⚠️Downloads embedding models (e.g., Xenova/bge-m3) from Hugging Face on first use; the initial download can be sizable.
Review Required · View Analysis
mcp-memory-service
by doobidoo
A Model Context Protocol (MCP) server providing persistent, semantic memory storage and retrieval capabilities for AI agents. It supports lightweight semantic reasoning (contradiction, causal inference), content chunking, multi-backend storage (SQLite-vec, Cloudflare, Hybrid), autonomous memory consolidation (decay, association, clustering, compression, forgetting), and real-time updates via SSE. It's designed for token-efficient interaction with LLMs.
Setup Requirements
- ⚠️Requires Python dependencies like PyTorch (or ONNX Runtime & Tokenizers for CPU-only), sentence-transformers, sqlite-vec, mcp, aiohttp, fastapi, and uvicorn. Installation might be complex due to platform-specific PyTorch/GPU setup.
- ⚠️Initial model downloads (~300MB for `all-MiniLM-L6-v2`) can cause timeouts during first-time startup if the network is slow or dependencies are not pre-cached (see the pre-caching sketch after this list).
- ⚠️The Cloudflare storage backend requires the `CLOUDFLARE_API_TOKEN` and `CLOUDFLARE_ACCOUNT_ID` environment variables, alongside other D1/Vectorize/R2 settings.
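One way to sidestep the first-start timeout noted above is to pre-cache the embedding model before launching the server. A minimal sketch, assuming `sentence-transformers` is already installed (the model name comes from the note above):

```python
from sentence_transformers import SentenceTransformer

# Downloads and caches the ~300MB model ahead of time so the server's
# first startup does not stall on a slow network.
SentenceTransformer("all-MiniLM-L6-v2")
print("Embedding model cached.")
```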
Verified Safe · View Analysis
context7
by upstash
Provides up-to-date, version-specific documentation and code examples to Large Language Models (LLMs) and AI coding assistants to improve code generation accuracy and relevance, preventing outdated or hallucinated information.
Setup Requirements
- ⚠️Requires Node.js >= v18.0.0
- ⚠️Requires an MCP client (e.g., Cursor, Claude Code, VS Code); a scripted alternative is sketched after this list.
- ⚠️A Context7 API key is optional but recommended for higher rate limits and private repositories.
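If your client is a script rather than one of the GUI tools above, the MCP Python SDK can launch and query the server directly. A minimal sketch, assuming the `mcp` package and that the npm package is named `@upstash/context7-mcp` (confirm the name, and any API-key option, in the project README):

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Assumed npm package name; confirm against the Context7 README.
server = StdioServerParameters(command="npx", args=["-y", "@upstash/context7-mcp"])

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("Context7 exposes:", [tool.name for tool in tools.tools])

asyncio.run(main())
```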
Verified Safe · View Analysis
excel-mcp-server
by haris-musa
This server allows AI agents to manipulate Excel files (create, read, update, format, chart, pivot, validate) without requiring Microsoft Excel to be installed.
Setup Requirements
- ⚠️Requires Python 3.10 or newer.
- ⚠️The `uvx` command requires the `uv` Python package manager to be installed.
- ⚠️When using SSE or Streamable HTTP transports, the `EXCEL_FILES_PATH` environment variable must be set (defaults to `./excel_files`).
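For the SSE or Streamable HTTP transports, `EXCEL_FILES_PATH` must reach the server process's environment. A hedged sketch of one way to do that from Python; the `sse` subcommand is an assumption based on the transports named above, so confirm the exact invocation in the project's README:

```python
import os
import subprocess

# Default path per the note above; override to wherever your workbooks live.
env = {**os.environ, "EXCEL_FILES_PATH": "./excel_files"}

# Assumed CLI form (`uvx excel-mcp-server sse`); verify against the README.
# Popen because the server keeps running until terminated.
server = subprocess.Popen(["uvx", "excel-mcp-server", "sse"], env=env)
```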
Verified Safe · View Analysis
npcpy
by NPC-Worldwide
Core library of the NPC Toolkit that supercharges natural language processing pipelines and agent tooling. It's a flexible framework for building state-of-the-art applications and conducting novel research with LLMs. Supports multi-agent systems, fine-tuning, reinforcement learning, genetic algorithms, model ensembling, and NumPy-like operations for AI models (NPCArray). Includes a built-in Flask server for deploying agent teams via REST APIs, and multimodal generation (image, video, audio).
Setup Requirements
- ⚠️Requires Ollama to be installed and running for local LLMs (e.g., llama3.2, gemma3:4b).
- ⚠️Platform-specific dependencies for audio (ffmpeg, portaudio, espeak) and screenshots (pywin32 on Windows, screencapture on Mac, gnome-screenshot/scrot on Linux).
- ⚠️Requires API keys for various cloud LLM/generation providers (e.g., OPENAI_API_KEY, ANTHROPIC_API_KEY, GEMINI_API_KEY, DEEPSEEK_API_KEY, PERPLEXITY_API_KEY, ELEVENLABS_API_KEY).
- ⚠️Dependencies for local fine-tuning and diffusion models require PyTorch, diffusers, transformers, and sentence-transformers, which can be resource-intensive and require CUDA for optimal performance.
- ⚠️Uses SQLite for conversation history and internal state; `psycopg2-binary` is required for PostgreSQL.
- ⚠️Jinja templates can be complex to debug without prior experience.
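For the Jinja note above, a generic debugging aid (not specific to npcpy's templates) is to render with `StrictUndefined`, which turns silently missing variables into immediate errors:

```python
from jinja2 import Environment, StrictUndefined

env = Environment(undefined=StrictUndefined)
template = env.from_string("Hello {{ agent_name }}, your task is {{ task }}.")

# Raises jinja2.exceptions.UndefinedError because `task` is missing,
# instead of silently rendering an empty string.
print(template.render(agent_name="demo"))
```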
Review Required · View Analysis
firecrawl-mcp-server
by firecrawl
A Model Context Protocol (MCP) server for integrating Firecrawl's web scraping, crawling, search, and structured data extraction capabilities with AI agents.
Setup Requirements
- ⚠️Requires a Firecrawl API Key (paid service, unless self-hosting your own Firecrawl instance).
- ⚠️Requires Node.js version 18.0.0 or higher.
- ⚠️Crawl operations can return very large amounts of data, potentially exceeding token limits or incurring high costs if processed by an LLM.
- ⚠️Windows users running via `npx` might need to prefix the command with `cmd /c`.
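A hedged sketch of the Windows workaround above, building the command/args/env triple an MCP client config expects; the npm package name `firecrawl-mcp` and the `FIRECRAWL_API_KEY` variable name are assumptions, so confirm both against the Firecrawl docs:

```python
import sys

# Assumed names; verify against the Firecrawl MCP documentation.
PACKAGE = "firecrawl-mcp"

def launch_spec(api_key: str) -> dict:
    """Return the command/args/env for launching the server, prefixing `cmd /c` on Windows."""
    if sys.platform == "win32":
        command, args = "cmd", ["/c", "npx", "-y", PACKAGE]
    else:
        command, args = "npx", ["-y", PACKAGE]
    return {"command": command, "args": args, "env": {"FIRECRAWL_API_KEY": api_key}}

print(launch_spec("YOUR_API_KEY"))
```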
Verified Safe · View Analysis
Skill_Seekers
by yusufkaraaslan
Automate the conversion of diverse documentation (websites, GitHub repos, PDFs, local codebases) into high-quality AI skills for various LLM coding agents like Claude Code, Gemini, and OpenAI.
Setup Requirements
- ⚠️Requires the `mcp` Python package (`pip install mcp`) for core functionality.
- ⚠️Provider API keys (e.g., ANTHROPIC_API_KEY, GOOGLE_API_KEY, OPENAI_API_KEY) are mandatory for most core features and uploads.
- ⚠️A GitHub personal access token (GITHUB_TOKEN) is required for GitHub scraping and config submission.
- ⚠️Requires a local installation of `git` for repository operations (see the pre-flight sketch after this list).
- ⚠️For local AI enhancement, the `claude` command-line tool (Claude Code CLI) must be installed and on PATH.
- ⚠️HTTP transport mode requires the `uvicorn` and `starlette` Python packages.
- ⚠️PDF OCR features require `pytesseract` and `Pillow`, plus a Tesseract OCR engine installation.
- ⚠️Python 3.8+ is required.
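Given how many external binaries and keys these notes list, a quick pre-flight check is worth running before setup. A minimal sketch using only the standard library and the names mentioned above (trim the lists to the features you actually use):

```python
import os
import shutil

REQUIRED_BINARIES = ["git", "claude", "tesseract"]         # per the notes above
REQUIRED_ENV_KEYS = ["ANTHROPIC_API_KEY", "GITHUB_TOKEN"]  # add GOOGLE_/OPENAI_ keys as needed

missing = [name for name in REQUIRED_BINARIES if shutil.which(name) is None]
missing += [key for key in REQUIRED_ENV_KEYS if not os.environ.get(key)]

if missing:
    print("Missing before setup:", ", ".join(missing))
else:
    print("All listed binaries and keys found.")
```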
Verified Safe · View Analysis
UI-TARS-desktop
by bytedance
UI-TARS-desktop is a native GUI Agent application powered by multimodal AI models, enabling users to control their computer and browser through natural language instructions.
Setup Requirements
- ⚠️Requires Node.js >=20.x (and >=22 for multimodal workspace)
- ⚠️Requires pnpm for package management
- ⚠️Requires API keys for LLM providers (e.g., VolcEngine, Anthropic, OpenAI) which are paid services
- ⚠️On macOS, requires granting 'Accessibility' and 'Screen Recording' permissions to the application
Review Required · View Analysis
mcp-for-beginners
by microsoft
Building custom Model Context Protocol (MCP) servers for AI agent development, including weather data retrieval and GitHub repository automation.
Setup Requirements
- ⚠️Requires Python 3.10+.
- ⚠️Requires `uv` or `pip` for dependency management.
- ⚠️Requires `Node.js` and `npm` for MCP Inspector and `@playwright/mcp` dependency.
- ⚠️Requires `Git CLI` to be installed and in PATH for `git_clone_repo` tool.
- ⚠️Requires `VS Code` or `VS Code Insiders` to be installed in standard paths for `open_in_vscode` tool.
- ⚠️Requires environment variables (`AZURE_OPENAI_CHAT_DEPLOYMENT_NAME`, `AZURE_OPENAI_API_KEY`, `AZURE_OPENAI_ENDPOINT`, `AZURE_OPENAI_API_VERSION`, `GITHUB_TOKEN`) for full client functionality and Azure OpenAI integration, which may incur costs.
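A minimal sketch of how those Azure OpenAI variables are typically consumed, assuming the `openai` Python package (v1+); the curriculum's own samples may wire this up differently:

```python
import os

from openai import AzureOpenAI

# Reads the environment variables listed above; fails fast with KeyError if one is missing.
client = AzureOpenAI(
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version=os.environ["AZURE_OPENAI_API_VERSION"],
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
)

response = client.chat.completions.create(
    model=os.environ["AZURE_OPENAI_CHAT_DEPLOYMENT_NAME"],  # deployment name, not a base model ID
    messages=[{"role": "user", "content": "Ping from the MCP client."}],
)
print(response.choices[0].message.content)
```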