DOREMUS_MCP
Verified Safe by SimoneFassio
Overview
A Model Context Protocol (MCP) server for accessing the DOREMUS Knowledge Graph, enabling LLMs to query classical music metadata including composers, works, performances, recordings, and instrumentation using natural language.
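To give a sense of what a query against the knowledge graph looks like, here is a minimal sketch of a SPARQL lookup against the public DOREMUS endpoint. The endpoint URL, vocabulary prefixes, and property paths are assumptions for illustration, not code taken from this server:

```python
import json
import urllib.parse
import urllib.request

# Assumed public DOREMUS SPARQL endpoint (not taken from this project's code).
ENDPOINT = "https://data.doremus.org/sparql"


def build_works_query(composer_name: str, limit: int = 10) -> str:
    """Build an illustrative SPARQL query for titled works matching a composer.

    The CIDOC-CRM property paths below are a plausible sketch only; the real
    DOREMUS model uses a richer FRBRoo structure.
    """
    return f"""
PREFIX ecrm: <http://erlangen-crm.org/current/>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
SELECT DISTINCT ?work ?title WHERE {{
  ?work ecrm:P102_has_title ?title .
  ?work rdfs:label ?name .
  FILTER(CONTAINS(LCASE(STR(?name)), LCASE("{composer_name}")))
}} LIMIT {limit}
"""


def run_query(query: str) -> dict:
    """POST the query to the endpoint and parse SPARQL JSON results."""
    data = urllib.parse.urlencode({"query": query}).encode()
    req = urllib.request.Request(
        ENDPOINT,
        data=data,
        headers={"Accept": "application/sparql-results+json"},
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp)
```

In practice the MCP server hides this layer: the LLM expresses intent in natural language and the server builds and runs the SPARQL for it.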
Installation
docker-compose up --build
Environment Variables
- HOST
- PORT
- LLM_SAMPLING_PROVIDER
- LLM_SAMPLING_MODEL
- OPENAI_API_KEY
- GROQ_API_KEY
- CEREBRAS_API_KEY
- OLLAMA_API_URL
- OLLAMA_API_KEY
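A minimal .env sketch for docker-compose, assuming the variables above are read at startup. All values below are placeholders; the defaults and model name are assumptions, not documented settings:

```shell
HOST=0.0.0.0
PORT=8000
LLM_SAMPLING_PROVIDER=groq
LLM_SAMPLING_MODEL=llama-3.1-8b-instant
GROQ_API_KEY=your-groq-key
# Only needed when the corresponding provider is selected:
# OPENAI_API_KEY=...
# CEREBRAS_API_KEY=...
# OLLAMA_API_URL=http://localhost:11434
# OLLAMA_API_KEY=...
```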
Security Notes
The project uses environment variables for API keys, which is good practice. It makes outgoing network requests to a public SPARQL endpoint and to third-party LLM providers (OpenAI, Groq, Cerebras, Ollama) for 'sampling' (disambiguation) during query building. This introduces data-privacy considerations, as user-derived query intents are sent to external services. The 'validate_doremus_uri' function makes POST requests to DOREMUS URIs to detect hallucinated identifiers; although these requests are restricted to the DOREMUS domain, a highly sophisticated attack could theoretically exploit redirects or vulnerabilities within that domain. No direct 'eval' calls, obfuscation, or other obvious malicious patterns were found.
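The domain restriction described above might look like the following sketch. The function name, the exact host, and the scheme check are assumptions for illustration; they are not the project's actual implementation:

```python
import urllib.parse

# Assumed DOREMUS host; only URIs under this authority should be dereferenced,
# which limits the surface for SSRF-style redirection tricks.
DOREMUS_HOST = "data.doremus.org"


def is_doremus_uri(uri: str) -> bool:
    """Return True only for http(s) URIs whose authority is the DOREMUS host.

    Hypothetical sketch of the kind of check 'validate_doremus_uri' performs
    before issuing any network request.
    """
    parsed = urllib.parse.urlparse(uri)
    return parsed.scheme in ("http", "https") and parsed.netloc == DOREMUS_HOST
```

Checking the parsed authority (rather than a substring match on the raw string) avoids being fooled by URIs like `https://evil.example/data.doremus.org/`.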
Similar Servers
MaxKB
MaxKB (Max Knowledge Brain) is an enterprise-grade intelligent agent platform designed to lower the technical barrier and deployment costs of AI adoption. It helps businesses quickly integrate mainstream large language models, build proprietary knowledge bases, and follow a progressive upgrade path from RAG to complex workflow automation and advanced agents, for scenarios such as smart customer service and office assistants.
npcpy
Core library of the NPC Toolkit that supercharges natural language processing pipelines and agent tooling. It's a flexible framework for building state-of-the-art applications and conducting novel research with LLMs. Supports multi-agent systems, fine-tuning, reinforcement learning, genetic algorithms, model ensembling, and NumPy-like operations for AI models (NPCArray). Includes a built-in Flask server for deploying agent teams via REST APIs, and multimodal generation (image, video, audio).
Lynkr
Lynkr is an AI orchestration layer that acts as an LLM gateway, routing language model requests to various providers (Ollama, Databricks, OpenAI, etc.). It provides an OpenAI-compatible API and enables AI-driven coding tasks via a rich set of tools and a multi-agent framework, with a strong focus on security, performance, and token efficiency. It allows AI agents to interact with a defined workspace (reading/writing files, executing shell commands, performing Git operations) and leverages long-term memory and agent learning to enhance task execution.
mcp_massive
An AI agent orchestration server, likely interacting with LLMs and managing multi-agent workflows.