rag-mcp-server
by Pond500
Overview
This server implements a Multi-Knowledge Base RAG system, allowing AI agents to upload, manage, and semantically search documents across multiple knowledge bases through a FastAPI-based MCP (Model Context Protocol) API.
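The multi-knowledge-base idea can be sketched in a few lines: documents are grouped into named knowledge bases and retrieved by vector similarity. This is an illustrative toy (hashed bag-of-words embeddings, in-memory storage, hypothetical names like `MultiKBStore`), not the server's actual implementation:

```python
# Toy sketch of multi-KB semantic search. A real deployment would use an
# embedding model and a vector store; here, hashed bag-of-words stands in.
from collections import defaultdict
import math

def embed(text, dim=64):
    # Hypothetical toy embedding: hash words into a fixed-size vector, then normalize.
    vec = [0.0] * dim
    for word in text.lower().split():
        vec[hash(word) % dim] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

class MultiKBStore:
    def __init__(self):
        self.kbs = defaultdict(list)  # kb_name -> list of (doc_text, vector)

    def upload(self, kb_name, text):
        self.kbs[kb_name].append((text, embed(text)))

    def search(self, kb_name, query, top_k=3):
        # Rank documents in one knowledge base by cosine similarity to the query.
        q = embed(query)
        scored = [(sum(a * b for a, b in zip(q, v)), doc)
                  for doc, v in self.kbs[kb_name]]
        return [doc for _, doc in sorted(scored, reverse=True)[:top_k]]

store = MultiKBStore()
store.upload("handbook", "vacation policy and paid leave rules")
store.upload("handbook", "expense reporting procedure")
print(store.search("handbook", "paid vacation leave", top_k=1)[0])
```

In the real server each knowledge base would map to persisted storage and the upload/search operations would be exposed as MCP tools over the FastAPI app.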
Installation
python -B -m uvicorn mcp_server_multi_kb:app --host 0.0.0.0 --port 8000
Environment Variables
- LLM_API_BASE
- LLM_API_KEY
- LLM_MODEL_NAME
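The three variables can be set in a `.env` file or exported in the shell before starting uvicorn. The values below are placeholders, not real endpoints or credentials:

```shell
# Placeholder values — substitute your own provider endpoint, key, and model.
export LLM_API_BASE="https://api.example.com/v1"
export LLM_API_KEY="sk-replace-me"       # never commit a real key to the repo
export LLM_MODEL_NAME="gpt-4o-mini"
```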
Security Notes
CRITICAL: several configuration choices weaken this server's security posture:
- `LLM_API_KEY` has a hardcoded default value in `app/config.py`. If the `.env` file is not properly configured, or the server runs in an exposed environment, the bundled key can leak.
- `OCR_API_ENDPOINT` is hardcoded to an external IP, creating a single point of failure and a potential data-privacy concern.
- The server listens on `0.0.0.0` with no explicit authentication or authorization layer at the API level; access control is delegated entirely to the integrating AI agent (e.g., Dify). Exposing the server directly to the internet is therefore highly risky.
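One way to remove the hardcoded-key risk is to read the key strictly from the environment and refuse to start when it is absent. The variable name `LLM_API_KEY` comes from this README; the `require_env` helper is an illustrative sketch, not code from `app/config.py`:

```python
import os

def require_env(name: str) -> str:
    # Fail fast instead of falling back to a hardcoded default value.
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f"{name} is not set; refusing to start with a default key")
    return value

# Demonstration only: simulate a configured, then an unconfigured, environment.
os.environ["LLM_API_KEY"] = "sk-demo-only"
print(require_env("LLM_API_KEY"))
os.environ.pop("LLM_API_KEY")
try:
    require_env("LLM_API_KEY")
except RuntimeError as exc:
    print("startup refused:", exc)
```

Pairing this with a reverse proxy that terminates TLS and enforces authentication would also mitigate the open `0.0.0.0` binding.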
Similar Servers
fastmcp
FastMCP is an ergonomic interface for the Model Context Protocol (MCP), providing a comprehensive framework for building and interacting with AI agents, tools, resources, and prompts across various transports and authentication methods.
mcpo
Exposes Model Context Protocol (MCP) tools as OpenAPI-compatible HTTP servers.
mcp-context-forge
Converts web content (HTML, PDF, DOCX, etc.) and local files from a URL into high-quality Markdown format. It supports multiple conversion engines, content optimization, batch processing, and image handling.
context-portal
Manages structured project context for AI assistants and developer tools, enabling Retrieval Augmented Generation (RAG) and prompt caching within IDEs.