mcp-grafana
Verified Safe by grafana
Overview
Provides a Model Context Protocol (MCP) server for Grafana, enabling AI agents to interact with Grafana features such as dashboards, datasources, alerting, incidents, and more through a structured tool-based interface.
Installation
docker run --rm -i -e GRAFANA_URL="http://localhost:3000" -e GRAFANA_SERVICE_ACCOUNT_TOKEN="<your_service_account_token>" grafana/mcp-grafana -t stdio
Environment Variables
- GRAFANA_URL
- GRAFANA_SERVICE_ACCOUNT_TOKEN
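For clients that launch MCP servers themselves, the docker command above can be wired into a client configuration file. This is a hedged sketch assuming the common `mcpServers` configuration shape used by clients such as Claude Desktop; the `"grafana"` key and the token value are placeholders, and the exact config location depends on the client:

```json
{
  "mcpServers": {
    "grafana": {
      "command": "docker",
      "args": [
        "run", "--rm", "-i",
        "-e", "GRAFANA_URL",
        "-e", "GRAFANA_SERVICE_ACCOUNT_TOKEN",
        "grafana/mcp-grafana", "-t", "stdio"
      ],
      "env": {
        "GRAFANA_URL": "http://localhost:3000",
        "GRAFANA_SERVICE_ACCOUNT_TOKEN": "<your_service_account_token>"
      }
    }
  }
}
```

Passing the variables via the client's `env` block keeps the token out of the command line while still reaching the container through `-e`.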
Security Notes
The server demonstrates good security practices for an integration component. It reads authentication credentials (API keys, basic auth, access tokens) from environment variables, preventing hardcoding. Network communication with Grafana and its datasources is handled with TLS configuration options. Response bodies are read with limits to prevent memory exhaustion, and non-200 HTTP statuses are handled with error messages. No obvious 'eval' or obfuscation patterns were found. The 'disable-write' flag and RBAC guidance are crucial for secure deployments.
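The write-protection guidance above can be sketched as a read-only deployment. This assumes the flag is spelled `--disable-write` as referenced in the notes; verify the exact spelling against the image's `--help` output, and note that `grafana.example.com` and the token are placeholders:

```
# Read-only deployment sketch: write tools disabled at the server,
# paired with a token scoped to Viewer-level RBAC in Grafana.
docker run --rm -i \
  -e GRAFANA_URL="https://grafana.example.com" \
  -e GRAFANA_SERVICE_ACCOUNT_TOKEN="<your_service_account_token>" \
  grafana/mcp-grafana -t stdio --disable-write
```

Defense in depth applies here: even with the flag set, the service account's own permissions should be the narrowest that still covers the dashboards and datasources the agent needs.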
Similar Servers
inspector
A web-based client and proxy server for inspecting and interacting with Model Context Protocol (MCP) servers, allowing users to browse resources, prompts, and tools, perform requests, and debug OAuth authentication flows.
opentelemetry-mcp-server
Enables AI assistants to query and analyze OpenTelemetry traces from LLM applications for debugging, performance, and cost optimization.
mcp-apache-spark-history-server
Connect AI agents to Apache Spark History Server for intelligent job analysis and performance monitoring.
shinzo
Shinzo is an open-source observability platform for monitoring and analyzing the performance, usage, and telemetry data of AI agents and Model Context Protocol (MCP) servers.