llm-rag-mcp-example

Verified Safe

by giuliolibrando

Overview

Deploys an on-premise AI platform that uses retrieval-augmented generation (RAG) and LLMs to manage IT infrastructure, indexing data from Redmine and Wiki.js.

Installation

Run Command
No command provided

Security Notes

The project is designed for local deployment, with Nginx Proxy Manager (NPM) handling external access and TLS termination. The README explicitly advises changing default credentials, protecting the NPM dashboard, and exposing only the necessary ports; following these recommendations is essential for keeping the deployment secure.
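The port-exposure and dashboard-protection advice above can be sketched as a Docker Compose fragment. This is a minimal illustration, not the project's actual compose file: the `jc21/nginx-proxy-manager` image is the real upstream image, but the service name and volume paths are assumptions for the example.

```yaml
# Hypothetical compose sketch for the reverse-proxy layer.
# Only the image name is taken from upstream; service name and
# volume paths are illustrative, not from this repository.
services:
  proxy:
    image: jc21/nginx-proxy-manager:latest
    restart: unless-stopped
    ports:
      - "80:80"            # HTTP, typically redirected to HTTPS
      - "443:443"          # HTTPS / TLS termination
      - "127.0.0.1:81:81"  # admin dashboard, bound to localhost only
    volumes:
      - ./npm/data:/data
      - ./npm/letsencrypt:/etc/letsencrypt
```

Binding the dashboard port (81) to `127.0.0.1` keeps it reachable only from the host itself (or through an SSH tunnel), which covers the "protect the NPM dashboard" and "expose only necessary ports" recommendations; changing NPM's default admin credentials on first login covers the rest.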

Stats

Interest Score: 0
Security Score: 8
Cost Class: Low
Avg Tokens: 1500
Stars: 0
Forks: 0
Last Update: 2025-11-17

Tags

RAG, On-Premise AI, LLM, IT Operations, Docker