llm-rag-mcp-example
Verified Safe by giuliolibrando
Overview
Deploys an on-premise AI platform utilizing RAG and LLMs to manage IT infrastructure by indexing data from Redmine and Wiki.js.
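Indexing content from sources like Redmine and Wiki.js for RAG typically means splitting each fetched document into overlapping chunks before embedding and storing them. A minimal sketch of that chunking step, assuming character-based chunks; the function name, chunk size, and overlap are illustrative and not taken from this project:

```python
def chunk_text(text: str, chunk_size: int = 500, overlap: int = 100) -> list[str]:
    """Split text into overlapping character chunks for embedding."""
    if chunk_size <= overlap:
        raise ValueError("chunk_size must exceed overlap")
    step = chunk_size - overlap
    chunks = []
    for start in range(0, max(len(text), 1), step):
        chunks.append(text[start:start + chunk_size])
    return chunks

# Placeholder for a page body fetched from the Wiki.js or Redmine API
# (1500 characters), then split into overlapping chunks.
page_body = "word " * 300
chunks = chunk_text(page_body)
```

Each chunk would then be embedded and written to a vector store; the overlap preserves context across chunk boundaries so answers are not cut mid-sentence.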
Installation
No command provided
Security Notes
The project emphasizes local deployment, with Nginx Proxy Manager handling external access and TLS termination. The README gives explicit security guidance: change the default credentials, protect the NPM dashboard, and expose only the necessary ports. Following these recommendations is essential for keeping the deployment secure.
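In line with that guidance, an NPM deployment would typically publish only ports 80/443 to the network and bind the admin dashboard (port 81) to localhost. A hedged sketch of such a launch command; the volume paths are assumptions and not taken from the project's own configuration:

```shell
# Publish HTTP/HTTPS publicly; bind the admin UI (port 81) to localhost only.
docker run -d --name npm \
  -p 80:80 -p 443:443 \
  -p 127.0.0.1:81:81 \
  -v ./npm/data:/data \
  -v ./npm/letsencrypt:/etc/letsencrypt \
  jc21/nginx-proxy-manager:latest
# After first login, change the default admin credentials immediately.
```

Binding the dashboard to 127.0.0.1 means it is reachable only from the host itself (e.g. via an SSH tunnel), which satisfies the README's advice to protect the NPM dashboard without extra firewall rules.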
Similar Servers
MaxKB
MaxKB (Max Knowledge Brain) is an enterprise-grade intelligent agent platform designed to lower the technical barrier and deployment cost of adopting AI. It helps businesses quickly integrate mainstream large language models, build proprietary knowledge bases, and follow a progressive upgrade path from RAG to complex workflow automation and advanced agents, for scenarios such as smart customer service and office assistants.
Context-Engine
Self-improving code search and context engine for IDEs and AI agents, providing hybrid semantic/lexical search, symbol graph navigation, and persistent memory.
mcp_massive
An AI agent orchestration server, likely interacting with LLMs and managing multi-agent workflows.
flexible-graphrag
The Flexible GraphRAG MCP Server integrates document processing, knowledge graph building, hybrid search, and AI query capabilities via the Model Context Protocol (MCP) for clients like Claude Desktop and MCP Inspector.