mojo-manual-mcp
Verified Safe · by jpotter80
Overview
A framework for building and deploying self-contained, searchable Model Context Protocol (MCP) servers for technical documentation, leveraging hybrid semantic and keyword search.
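The overview mentions hybrid semantic and keyword search. The project's exact fusion method isn't specified here, but a common way to merge two ranked result lists is reciprocal rank fusion (RRF); the sketch below is illustrative, not the project's implementation, and the document IDs are made up.

```python
def rrf(rankings: list[list[str]], k: int = 60) -> list[str]:
    """Merge ranked result lists with reciprocal rank fusion.

    Each document's score is the sum of 1 / (k + rank) over every
    list it appears in, so documents ranked well by both the semantic
    and the keyword retriever rise to the top.
    """
    scores: dict[str, float] = {}
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical doc IDs: one list from vector search, one from keyword search.
semantic = ["intro", "ownership", "traits"]
keyword = ["intro", "traits", "simd"]
print(rrf([semantic, keyword]))  # "intro" wins: top-ranked in both lists
```

The constant `k` damps the influence of any single list's top ranks; 60 is the value from the original RRF paper and works well without tuning.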
Installation
cd servers/mojo-manual-mcp && pixi run serve
Environment Variables
- MAX_SERVER_URL
- EMBED_MODEL_NAME
- AUTO_START_MAX
- EMBED_CACHE_SIZE
- MAX_BINARY
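The variables above could be set before launching the server, for example in a shell profile or an `.env` file. Every value below is an illustrative placeholder, not a documented project default.

```shell
# Placeholder configuration for the MCP server (all values are assumptions).
export MAX_SERVER_URL="http://localhost:8000/v1"   # local MAX embedding endpoint
export EMBED_MODEL_NAME="my-embedding-model"       # model served by MAX
export AUTO_START_MAX="1"                          # auto-launch MAX if not running
export EMBED_CACHE_SIZE="1024"                     # embedding cache entries
export MAX_BINARY="/usr/local/bin/max"             # path to the MAX executable

cd servers/mojo-manual-mcp && pixi run serve
```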
Security Notes
The system uses `subprocess.Popen` to auto-start a local `MAX` embedding server, a common pattern for local infrastructure orchestration. Spawning processes carries inherent risk, but the command executed is fixed rather than derived from user input, which mitigates command-injection vulnerabilities. All `openai.OpenAI` client instances are configured with `api_key="EMPTY"`, so no real API keys can be accidentally exposed. The primary server interaction occurs via `stdio` with an MCP-compatible host (e.g., VS Code, Claude Desktop), limiting direct external network exposure. File operations are confined to predefined paths for database and configuration files, preventing arbitrary file system access.
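The two safety patterns described above can be sketched in Python. This is a minimal illustration of the general technique, not the project's actual code; the binary path, arguments, and URL are assumptions mirroring the environment variables listed earlier.

```python
import subprocess

# Assumed values standing in for the MAX_BINARY and MAX_SERVER_URL env vars.
MAX_BINARY = "/usr/local/bin/max"
MAX_SERVER_URL = "http://localhost:8000/v1"

def max_serve_command(binary: str = MAX_BINARY) -> list[str]:
    """Build the fixed argv for auto-starting the local MAX server.

    The command is a literal argument list, never a shell string built
    from user input, so there is nothing for an attacker to inject into.
    It would be launched with subprocess.Popen(max_serve_command()).
    """
    return [binary, "serve", "--port", "8000"]  # arguments are illustrative

def embedding_client_kwargs() -> dict:
    """Keyword arguments for an openai.OpenAI client pointed at MAX.

    api_key="EMPTY" is a deliberate placeholder: the local server does
    not check it, and no real secret ever enters the configuration.
    """
    return {"base_url": MAX_SERVER_URL, "api_key": "EMPTY"}

print(max_serve_command())
print(embedding_client_kwargs())
```

Keeping the command as a constant argv list (and never passing `shell=True`) is what makes the `subprocess.Popen` pattern safe here, and the `"EMPTY"` key sentinel keeps secrets out of local-only deployments.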
Similar Servers
Context-Engine
Self-improving code search and context engine for IDEs and AI agents, providing hybrid semantic/lexical search, symbol graph navigation, and persistent memory.
blz
Provides fast, local documentation search and retrieval for AI agents, using llms.txt files for line-accurate citations.
qdrant-mcp-server
Provides semantic search using the Qdrant vector database, focused on code vectorization for intelligent codebase indexing and semantic code search, as well as general document search.
doc-mcp-server
Provides real-time access to up-to-date documentation from various package ecosystems (PyPI, npm, GitHub, etc.) for LLM-powered coding agents, mitigating hallucination and outdated information.