lc-mcp-server
Verified Safe by refractionPOINT
Overview
This server bridges AI assistants with the LimaCharlie security platform, enabling natural language interaction for querying telemetry, investigating endpoints, responding to threats, and managing security content.
Installation
docker run -d -e LC_OID="your-org-id" -e LC_API_KEY="your-api-key" -e MCP_MODE="http" -e PORT="8080" -p 8080:8080 lc-mcp-server:latest
Environment Variables
- LC_OID
- LC_API_KEY
- MCP_MODE
- MCP_PROFILE
- LOG_LEVEL
- REDIS_ENCRYPTION_KEY
- GOOGLE_API_KEY
- SDK_CACHE_TTL
- AUDIT_LOG_ENABLED
- LC_UID
- LC_JWT
- PORT
- MCP_SERVER_URL
- REDIS_URL
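Which of the variables above are required depends on how the server is run; for example, the installation command sets LC_OID, LC_API_KEY, MCP_MODE, and PORT. A minimal sketch of a startup check, assuming LC_OID and LC_API_KEY are always required and PORT only in HTTP mode (the grouping and function name are hypothetical, not taken from the server's code):

```python
# Hypothetical startup validation for the environment variables listed above.
# Which variables are mandatory in which mode is an assumption for illustration.
REQUIRED = ["LC_OID", "LC_API_KEY"]
HTTP_ONLY = ["PORT"]

def missing_vars(env: dict) -> list:
    """Return the names of required variables absent (or empty) in `env`."""
    needed = list(REQUIRED)
    if env.get("MCP_MODE") == "http":
        needed += HTTP_ONLY
    return [name for name in needed if not env.get(name)]

# Example: an HTTP-mode configuration that forgot to set PORT.
cfg = {"LC_OID": "your-org-id", "LC_API_KEY": "your-api-key", "MCP_MODE": "http"}
print(missing_vars(cfg))  # ['PORT']
```

In a real deployment the same check would read from `os.environ` and fail fast with a clear error before the server starts accepting connections.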
Security Notes
The project demonstrates a strong focus on security, especially for multi-tenant environments. Key points:
- Mandatory AES-256-GCM encryption for OAuth tokens (REDIS_ENCRYPTION_KEY).
- Extensive testing for credential isolation and concurrency safety (100% test coverage in internal/auth/), plus UID validation.
- The `lc_call_tool` meta-tool allows dynamic invocation of other tools, but is protected by a meta-tool filter mechanism (allow/deny lists).
- Powerful operations such as direct sensor command execution and payload upload/download are core to the security platform integration and inherently high-risk, though file path validation is implemented.
- A hardcoded Firebase API key is noted, but is generally considered low risk for client-side Firebase interaction.
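The allow/deny filtering described above can be sketched as follows. This is a minimal illustration, not the server's actual implementation: the class name, shell-style glob patterns, and deny-wins precedence are all assumptions.

```python
from fnmatch import fnmatch

class MetaToolFilter:
    """Hypothetical allow/deny filter for dynamically invoked tools.

    Pattern syntax (shell-style globs) and precedence (an explicit deny
    always overrides an allow) are assumptions for illustration.
    """

    def __init__(self, allow, deny):
        self.allow = allow
        self.deny = deny

    def permits(self, tool_name: str) -> bool:
        # A deny match rejects the call regardless of any allow match.
        if any(fnmatch(tool_name, pat) for pat in self.deny):
            return False
        # Otherwise the tool must match at least one allow pattern.
        return any(fnmatch(tool_name, pat) for pat in self.allow)

f = MetaToolFilter(allow=["lc_query_*", "lc_list_*"], deny=["*_delete*"])
print(f.permits("lc_query_sensors"))  # True
print(f.permits("lc_sensor_delete"))  # False
```

Deny-before-allow ordering is the conservative choice for high-risk operations like sensor command execution: a tool that matches both lists stays blocked.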
Similar Servers
mcp-filesystem-server
Provides secure and controlled access to the local filesystem via the Model Context Protocol (MCP) for AI agents and other applications.
tmcp
A server implementation for the Model Context Protocol (MCP) to enable LLMs to access external context and tools.
opensearch-mcp-server-py
Enables AI assistants and LLMs to interact with OpenSearch clusters by providing a standardized Model Context Protocol (MCP) interface through built-in and dynamic tools.
toolhive-studio
ToolHive is a desktop application (Electron UI) for discovering, deploying, and managing Model Context Protocol (MCP) servers in isolated containers, and connecting them to AI agents and clients.