Stop Searching. Start Trusting.
The curated directory of MCP servers, vetted for security, efficiency, and quality.
Tired of the MCP "Marketplace" Chaos?
We built MCPScout.ai to solve the ecosystem's biggest pain points.
No Insecure Dumps
We manually analyze every server for basic security flaws.
Easy Setup
Our gotcha notes warn you about complex setups.
Avoid "Token Hogs"
We estimate each server's token footprint so you can build cost-effective agents.
Products, Not Demos
We filter out "Hello World" demos.
Vetted Servers (8554)
ares
by dirmacs
A production-grade agentic chatbot server with multi-provider LLM support, tool calling, Retrieval Augmented Generation (RAG), and advanced research capabilities.
Setup Requirements
- ⚠️ Requires a local Ollama server running with a compatible LLM (e.g., 'ministral-3:3b') for default LLM functionality.
- ⚠️ Mandatory environment variables (JWT_SECRET, API_KEY) must be set for the server to start, even in development.
- ⚠️ For UI development, requires the Rust toolchain (with the wasm32-unknown-unknown target), the Trunk bundler, and Node.js/npm for Tailwind CSS.
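Since the server refuses to start without JWT_SECRET and API_KEY, a minimal environment sketch may help (the variable names and model tag come from the notes above; the values are placeholders):

```shell
# Required by ares even in development (values here are placeholders):
export JWT_SECRET="replace-with-a-long-random-secret"
export API_KEY="replace-with-your-api-key"

# A local Ollama server with a compatible model must already be running, e.g.:
#   ollama pull ministral-3:3b
#   ollama serve
```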
Verified Safe
mcp-4get
by yshalsager
An MCP server providing LLM clients seamless access to the 4get Meta Search engine API for web, image, and news searches.
Setup Requirements
- ⚠️ Requires Python 3.13+.
- ⚠️ Requires uv for dependency management and running.
- ⚠️ Relies on the external 4get.ca API, which may require an optional pass token for rate-limited instances.
Verified Safe
chrome-mcp-docker
by null-runner
Provides a persistent, stable Chrome DevTools environment for AI coding assistants to perform UI debugging and web interactions.
Setup Requirements
- ⚠️ Requires Docker Desktop (or Docker Engine) to be installed and running.
- ⚠️ The official Docker MCP Gateway has known bugs affecting custom servers; standalone installation (using 'docker run' directly) or a patched Gateway fork is recommended.
- ⚠️ When configuring for Windows/WSL, catalog paths in `~/.claude.json` must use Windows-style paths (e.g., `C:\Users\...`) with double backslashes for escaping.
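For the Windows/WSL path note above, a hedged sketch of what such an entry in `~/.claude.json` might look like. The surrounding keys and the `catalog` field name are illustrative assumptions, not the project's documented schema; the point is only the double-backslash escaping:

```json
{
  "mcpServers": {
    "chrome-mcp-docker": {
      "catalog": "C:\\Users\\you\\mcp\\catalog.yaml"
    }
  }
}
```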
Verified Safe
awesome-ionic-mcp
by Tommertom
Acts as an intelligent server for AI assistants to access Ionic Framework and Capacitor component definitions, plugin documentation, code examples, and execute CLI commands for mobile app development.
Setup Requirements
- ⚠️ Requires the GITHUB_TOKEN environment variable for full functionality; without it, GitHub API rate limits are hit during startup when fetching community and CapGo plugin data.
- ⚠️ Puppeteer launches a visible browser window (`headless: false`) for some documentation lookups, so a graphical environment or Xvfb is needed on a headless server.
- ⚠️ Requires an active internet connection for initial data loading and most documentation lookups.
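A sketch of working around the two startup gotchas above. GITHUB_TOKEN is named in the listing (the value is a placeholder); the `npx awesome-ionic-mcp` invocation is an assumption, not a documented command:

```shell
# GITHUB_TOKEN avoids GitHub API rate limits during startup
# (token value is a placeholder):
export GITHUB_TOKEN="ghp_replace_me"

# Because Puppeteer runs with headless: false, a display is required.
# On a headless Linux host, xvfb-run can supply a virtual one:
#   xvfb-run -a npx awesome-ionic-mcp
```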
Verified Safe
chronosphere-mcp
by chronosphereio
The Chronosphere MCP server acts as an intermediary, exposing Chronosphere monitoring data (logs, metrics, traces, events) and configuration as tools for AI applications and agents.
Setup Requirements
- ⚠️ Requires a Chronosphere API Token or OAuth configuration for backend authentication.
- ⚠️ Requires a Chronosphere Organization Name for API endpoint construction.
- ⚠️ Relies on 'unstable' Chronosphere APIs which are not recommended for production use and may change without warning.
- ⚠️ Requires a YAML configuration file to define server transports (stdio, SSE, HTTP) and Chronosphere API details.
Verified Safe
mcp-server
by exasol
Gives an LLM access to the Exasol database via MCP tools, enabling metadata browsing, SQL query execution, and BucketFS file system interaction.
Setup Requirements
- ⚠️ Requires an Exasol database to connect to, configured via environment variables.
- ⚠️ Requires Python >=3.10,<3.14.
- ⚠️ The `uv` package is required for the recommended installation/execution methods.
- ⚠️ Extensive environment variable configuration is needed for advanced features such as OAuth2 or SaaS authentication and/or BucketFS access.
Verified Safe
fluidmcp
by Fluid-AI
Orchestrates Model Context Protocol (MCP) servers and LLM inference engines (like vLLM) via a unified FastAPI gateway, enabling dynamic management, tool invocation, and multi-model LLM serving.
Setup Requirements
- ⚠️ Requires Python 3.9+ to run.
- ⚠️ For LLM features, vLLM must be separately installed (`pip install vllm>=0.6.0`).
- ⚠️ A CUDA-capable GPU is strongly recommended for vLLM to function efficiently.
- ⚠️ A Hugging Face Hub token (set as `HUGGING_FACE_HUB_TOKEN`) is required for accessing gated LLM models.
- ⚠️ Requires a running MongoDB instance for persistent configuration and logging by default; otherwise, data is lost on server restart. Persistence can be enforced with `--require-persistence`.
- ⚠️ Careful management of ports is necessary to avoid conflicts when running multiple MCP servers or LLM models simultaneously.
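A hedged setup sketch for the token and persistence notes above. `HUGGING_FACE_HUB_TOKEN` and `--require-persistence` come from the listing; the token value is a placeholder and the `fluidmcp serve` entry point is an assumed CLI name:

```shell
# Token for gated Hugging Face models (variable name from the listing;
# value is a placeholder):
export HUGGING_FACE_HUB_TOKEN="hf_replace_me"

# With --require-persistence, startup fails fast if MongoDB is unreachable
# instead of silently losing state on restart (invocation name assumed):
#   fluidmcp serve --require-persistence
```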
Verified Safe
containerized-strands-agents
by mkmeral
Hosts isolated Strands AI agents in Docker containers, managing their lifecycle, persistence, and tool access.
Setup Requirements
- ⚠️ Requires Docker to be running for agent isolation and execution.
- ⚠️ Requires AWS credentials configured with access to Amazon Bedrock (Claude models) for default LLM functionality.
- ⚠️ Requires Python 3.11+.
Review Required
get-biji-dev-by-gemini3pro
by PancrePal-xiaoyibao
Integrates the Get Notes API with a Model Context Protocol (MCP) server to provide AI-powered knowledge search and recall from multiple knowledge bases.
Setup Requirements
- ⚠️ Requires a valid API key for the 'Get Notes' service (a paid service).
- ⚠️ Requires Node.js (version 18+ recommended).
- ⚠️ Configuring multiple knowledge bases is complex, via either a 'knowledge_bases.json' file or a large JSON-string environment variable.
Verified Safe
slither-mcp
by trailofbits
Provides static analysis for Solidity smart contracts using Slither via the Model Context Protocol (MCP), making contract metadata, inheritance, function calls, and security vulnerabilities accessible to LLMs and other tools.
Setup Requirements
- ⚠️ Requires Python 3.11+.
- ⚠️ Requires a Solidity development environment (e.g., Foundry's 'forge' or Hardhat's 'npx') installed and discoverable on the system PATH for static analysis.
- ⚠️ For some LLM client integrations (e.g., Claude, Cursor), 'uvx' may need to be installed and on the system PATH.
Verified Safe
Dynamic-Smart-MCP
by UAEpro
An intelligent FastMCP 2 server that converts natural language questions into SQL queries or API requests for any SQL database or OpenAPI-defined API using AI.
Setup Requirements
- ⚠️ Requires an LLM API Key (e.g., OpenAI, OpenWebUI) which might be a paid service.
- ⚠️ Requires specific database drivers (e.g., psycopg2-binary for PostgreSQL, pymysql for MySQL) if not using the default SQLite.
- ⚠️ Requires manual execution of `python generate_schema.py` to create initial database/API context.
Verified Safe
n8n-mcp-server-custom
by burnham
This server acts as a Model Context Protocol (MCP) intermediary, enabling AI assistants like Antigravity to interact with and manage n8n automation workflows via its REST API.
Setup Requirements
- ⚠️ Requires Node.js v18.0.0 or higher.
- ⚠️ Requires an active n8n instance with a generated API Key and its corresponding URL.
- ⚠️ Specific configuration for Antigravity involves editing `mcp_config.json` with absolute paths, which can be error-prone across different operating systems.