doc-rag-mcp-server
Verified Safe (by aruc-dev)
Overview
A Retrieval Augmented Generation (RAG) system for ingesting documents and performing AI-powered semantic search.
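The ingest-then-search flow can be sketched minimally. This toy uses word-count vectors and cosine similarity in place of a real embedding model; every name here is illustrative, not taken from the server's code:

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy "embedding": a sparse word-count vector. A real RAG system
    # would call an embedding model (e.g. via Ollama or Gemini) instead.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    # Rank ingested documents by similarity to the query and keep top-k.
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]
```

In the real server the retrieved chunks would then be passed to the LLM as context for generation; this sketch covers only the retrieval half.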
Installation
python rag_mcp_server.py
Environment Variables
- LLM_PROVIDER
- GOOGLE_GEMINI_API_KEY
- OLLAMA_BASE_URL
- OLLAMA_MODEL
- OLLAMA_EMBEDDING_MODEL
Security Notes
The system uses environment variables for API keys and adheres to good practices by excluding `.env` from version control. The core MCP server communicates via stdio, limiting direct network exposure. However, the `ingest_document` tool directly accepts a `file_path: str`. If the server were deployed in an environment accessible to untrusted users, this could be exploited for Local File Inclusion (LFI) to read arbitrary files from the server's filesystem, or for Denial of Service (DoS) by ingesting extremely large files. For a local, trusted user setup, this risk is mitigated.
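A guard against both failure modes could look like the sketch below: confine `file_path` to an allow-listed directory and cap the file size. `ALLOWED_ROOT` and `MAX_BYTES` are hypothetical values, and this is not code from the server itself:

```python
from pathlib import Path

ALLOWED_ROOT = Path("/data/docs").resolve()  # assumed ingest directory
MAX_BYTES = 50 * 1024 * 1024                 # assumed 50 MB cap against DoS

def validate_ingest_path(file_path: str) -> Path:
    """Reject paths outside ALLOWED_ROOT and oversized files."""
    path = Path(file_path).resolve()
    # resolve() collapses ".." segments, so a traversal attempt like
    # "/data/docs/../../etc/passwd" fails the containment check below.
    if not path.is_relative_to(ALLOWED_ROOT):
        raise ValueError(f"{file_path!r} is outside the allowed directory")
    if path.is_file() and path.stat().st_size > MAX_BYTES:
        raise ValueError(f"{file_path!r} exceeds the size limit")
    return path
```

Checks like these matter only if the server is ever exposed beyond a single trusted user, but they are cheap to add up front.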
Similar Servers
flexible-graphrag
The Flexible GraphRAG MCP Server integrates document processing, knowledge graph building, hybrid search, and AI query capabilities via the Model Context Protocol (MCP) for clients like Claude Desktop and MCP Inspector.
rag-server-mcp
Provides Retrieval Augmented Generation (RAG) capabilities to Model Context Protocol (MCP) clients by indexing project documents and retrieving relevant content for LLMs.
concept-rag
This MCP server provides conceptual search, document analysis, and library exploration capabilities over a knowledge base using LanceDB and LLM-based concept extraction.
the-pensieve
The Pensieve server acts as a RAG-based knowledge management system, allowing users to store, query, and analyze their knowledge using natural language and LLM-powered insights.