# fleetpkg-mcp

by andrewkroh

## Overview
Enables LLMs to query low-level metadata about Elastic Fleet integration packages from a SQLite database.
## Installation

```shell
go run github.com/andrewkroh/fleetpkg-mcp@main -dir /path/to/elastic/integrations
```

## Security Notes
The server allows LLMs to execute arbitrary SQLite queries against a read-only database via the `fleetpkg_execute_sql_query` tool. While the database is read-only, preventing data modification, this design still poses risks:

1. **Resource Exhaustion (DoS)**: Malicious or poorly optimized queries crafted by an LLM could consume significant CPU and memory, leading to a denial of service.
2. **Information Leakage**: Complex queries could extract unintended correlations or meta-information from the database schema that might not be intended for direct exposure.
3. **Path Disclosure**: Although the database is opened in `mode=ro`, certain SQLite features (if enabled in the `modernc.org/sqlite` driver's build, e.g., `readfile()`) could theoretically be exploited to read files on the system, though this is less likely with standard builds.

Without explicit query sandboxing, validation, or rate limiting for the `fleetpkg_execute_sql_query` tool, allowing untrusted LLMs to issue arbitrary SQL queries is a significant security risk. The `mode=ro` database connection is a strong mitigation against data corruption, but not against resource abuse or some forms of information exposure.
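The distinction above can be demonstrated outside the server itself. The sketch below uses Python's stdlib `sqlite3` (the server is Go with `modernc.org/sqlite`, which is not reproduced here) to show that a `mode=ro` connection rejects writes but happily runs an unbounded query, and that a progress handler is one way such queries could be cut off. All names (`packages.db`, the table schema, the opcode budget) are illustrative assumptions, not part of fleetpkg-mcp.

```python
import os
import sqlite3
import tempfile

# Build a throwaway database standing in for the package-metadata DB.
path = os.path.join(tempfile.mkdtemp(), "packages.db")
rw = sqlite3.connect(path)
rw.execute("CREATE TABLE packages (name TEXT, version TEXT)")
rw.execute("INSERT INTO packages VALUES ('nginx', '1.0.0')")
rw.commit()
rw.close()

# Open read-only via a URI, analogous to the server's mode=ro connection.
ro = sqlite3.connect(f"file:{path}?mode=ro", uri=True)

# Reads work normally.
rows = ro.execute("SELECT name FROM packages").fetchall()

# Writes are rejected by SQLite itself, not by application code.
write_blocked = False
try:
    ro.execute("DELETE FROM packages")
except sqlite3.OperationalError:
    write_blocked = True

# mode=ro does nothing about resource exhaustion. A progress handler
# invoked every 10 SQLite VM opcodes can abort a runaway query by
# returning a non-zero value once a budget is exceeded.
calls = {"n": 0}

def watchdog():
    calls["n"] += 1
    return 1 if calls["n"] > 1_000 else 0  # non-zero aborts the query

ro.set_progress_handler(watchdog, 10)
aborted = False
try:
    # Infinite recursive CTE: without the watchdog this never returns.
    ro.execute(
        "WITH RECURSIVE c(x) AS (SELECT 1 UNION ALL SELECT x + 1 FROM c) "
        "SELECT count(*) FROM c"
    ).fetchone()
except sqlite3.OperationalError:
    aborted = True
ro.set_progress_handler(None, 0)  # remove the watchdog again
```

A per-query opcode budget like this (or a statement timeout enforced by the host process) is the kind of mitigation the note above suggests is missing; it bounds CPU use without restricting which read-only queries the LLM may issue.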
## Similar Servers
- **logfire-mcp**: Enables LLMs to retrieve and analyze application telemetry data (OpenTelemetry traces and metrics) from Pydantic Logfire, including executing arbitrary SQL queries.
- **memory-mcp-server-go**: A Model Context Protocol server providing knowledge graph management capabilities for LLMs to maintain memory across conversations.
- **mkp**: A Model Context Protocol (MCP) server for Kubernetes, enabling LLM-powered applications to interact with clusters through tools for listing, getting, applying, and deleting resources, and for executing commands.
- **blz**: Provides fast, local documentation search and retrieval for AI agents, using llms.txt files for line-accurate citations.