# rmcp_memex

Verified Safe by LibraxisAI

## Overview
Lightweight Model Context Protocol (MCP) server providing local Retrieval-Augmented Generation (RAG) capabilities with embedded vector store and local/remote embeddings.
## Installation

```bash
cargo run --release -- --log-level info
```

## Environment Variables
- `DISABLE_MLX`
- `DRAGON_BASE_URL`
- `MLX_JIT_MODE`
- `MLX_JIT_PORT`
- `EMBEDDER_PORT`
- `RERANKER_PORT`
- `EMBEDDER_MODEL`
- `RERANKER_MODEL`
- `FASTEMBED_CACHE_PATH`
- `HF_HUB_CACHE`
- `LANCEDB_PATH`
- `PROTOC`
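The server reads its configuration from the environment before launch. A startup script might look like the following; the values shown are illustrative assumptions, not documented defaults:

```shell
# Example configuration (values are illustrative, not defaults):
export DISABLE_MLX=1                                    # assumed: skip MLX-backed local embeddings
export EMBEDDER_PORT=8081                               # assumed port for the embedding service
export RERANKER_PORT=8082                               # assumed port for the reranking service
export LANCEDB_PATH="$HOME/.cache/rmcp_memex/lancedb"   # assumed vector store location
export HF_HUB_CACHE="$HOME/.cache/huggingface/hub"      # standard Hugging Face cache directory

cargo run --release -- --log-level info
```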
## Security Notes

The `rag_index` tool accepts a `path` argument, allowing a client to specify arbitrary local file paths for indexing. If the server is exposed to untrusted input or runs with elevated privileges, this could lead to information disclosure by indexing sensitive files. Additionally, LanceDB filter predicates are built by manually escaping single quotes (`replace('\'', "''")`); while this mitigates the obvious breakout, manual string sanitization carries a higher injection risk than parameterized queries.
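A minimal sketch of the escaping pattern the note refers to; the helper names and the predicate shape are assumptions for illustration, not the server's actual code:

```rust
// Hypothetical sketch of the sanitization described above: single quotes
// in a user-supplied value are doubled before the value is spliced into
// a LanceDB-style SQL filter string.
fn escape_single_quotes(value: &str) -> String {
    value.replace('\'', "''")
}

fn build_filter(collection: &str) -> String {
    format!("collection = '{}'", escape_single_quotes(collection))
}

fn main() {
    // A benign value passes through unchanged.
    assert_eq!(build_filter("notes"), "collection = 'notes'");

    // An attempted quote breakout is neutralized by doubling the quote...
    assert_eq!(
        build_filter("x' OR 1=1 --"),
        "collection = 'x'' OR 1=1 --'"
    );

    // ...but safety hinges on this one function being applied at every
    // call site, which is why parameterized queries are generally
    // preferred over manual escaping.
    println!("ok");
}
```

The escaping is correct for single quotes, but the risk the note describes is structural: any predicate built without the helper is injectable, whereas parameterized queries make the safe path the only path.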
## Similar Servers

### mcp-local-rag
Provides a local, RAG-like web search tool for Large Language Models to retrieve current information and context.

### rag-server-mcp
Provides Retrieval-Augmented Generation (RAG) capabilities to Model Context Protocol (MCP) clients by indexing project documents and retrieving relevant content for LLMs.

### local_faiss_mcp
Provides a local FAISS-based vector database as an MCP server for Retrieval-Augmented Generation (RAG) applications, enabling document ingestion, semantic search, and prompt generation.

### mcp-memory-server
Provides semantic memory storage for AI assistants, enabling them to store and retrieve decisions, patterns, and context across sessions using vector embeddings.