ros2_mcp
by wise-vision
Overview
The ROS 2 MCP server lets AI tooling interact with ROS 2 nodes, topics, services, and actions via the Model Context Protocol over stdio or SSE, enabling real-time robotic control, data analysis, and AI-powered debugging.
Installation
docker run -i --rm wisevision/ros2_mcp:humble
Environment Variables
- MCP_CUSTOM_PROMPTS
- MCP_PROMPTS_LOCAL
- MCP_PROMPTS_PATH
- MCP_PROMPTS_MODULE
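As a rough sketch of how a client could launch and talk to the container over stdio, the example below assumes the official `mcp` Python SDK; the volume mount and the `MCP_PROMPTS_PATH`/`MCP_PROMPTS_MODULE` values are hypothetical illustrations, not documented defaults.

```python
# Minimal sketch: connect an MCP client to the dockerized server over stdio.
# Assumes the official `mcp` Python SDK; the mounted prompts directory and the
# environment variable values below are hypothetical examples.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server = StdioServerParameters(
    command="docker",
    args=[
        "run", "-i", "--rm",
        "-v", "/path/to/prompts:/prompts",        # hypothetical host prompts directory
        "-e", "MCP_PROMPTS_PATH=/prompts",        # hypothetical value
        "-e", "MCP_PROMPTS_MODULE=my_prompts",    # hypothetical value
        "wisevision/ros2_mcp:humble",
    ],
)

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # List the ROS 2 tools the server exposes (topics, services, actions, ...).
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

asyncio.run(main())
```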
Security Notes
- Custom prompts are loaded with `importlib.import_module` based on user-controlled environment variables (`MCP_PROMPTS_PATH`, `MCP_PROMPTS_MODULE`) or CLI arguments. This is a critical Remote Code Execution (RCE) vulnerability if an attacker can point these settings at a malicious Python module.
- Prompt rendering via `template.format(**arguments)` risks template injection if prompts or arguments are untrusted.
- The primary transport is stdio, but an SSE (HTTP) transport is available via `uvicorn` and `Starlette` with no explicit authentication or authorization mechanisms shown, creating potential network exposure.
- Deserialization of `lora_msgs/srv/GetMessages` responses relies on trusted ROS 2 libraries, but could become an indirect vector if message types are supplied maliciously by a compromised AI agent.
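One possible hardening approach, sketched below and not taken from the server's codebase, is to check the requested module against an operator-defined allowlist before importing it, and to render prompts with `string.Template.safe_substitute` instead of `str.format`, which disallows attribute and index lookups inside the template. The names `ALLOWED_PROMPT_MODULES`, `load_prompts`, and `render_prompt` are hypothetical.

```python
# Sketch of a hardened prompt loader; these names are hypothetical and
# not part of ros2_mcp.
import importlib
import os
from string import Template

# Only modules explicitly allowlisted by the operator may be imported.
ALLOWED_PROMPT_MODULES = {"wisevision_prompts.default", "wisevision_prompts.debug"}


def load_prompts(module_name: str):
    """Import a prompts module only if it is on the allowlist."""
    if module_name not in ALLOWED_PROMPT_MODULES:
        raise ValueError(f"Refusing to import non-allowlisted module: {module_name!r}")
    return importlib.import_module(module_name)


def render_prompt(template_text: str, arguments: dict) -> str:
    """Render with Template.safe_substitute: unknown placeholders stay intact
    and no attribute/index access is possible, unlike str.format."""
    return Template(template_text).safe_substitute(arguments)


if __name__ == "__main__":
    # Rendering: $robot is left as-is because no value is supplied for it.
    print(render_prompt("Echo $topic on $robot", {"topic": "/scan"}))
    # A module outside the allowlist is rejected instead of being imported.
    try:
        load_prompts(os.environ.get("MCP_PROMPTS_MODULE", "evil.payload"))
    except ValueError as err:
        print(err)
```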
Similar Servers
ros-mcp-server
Connects large language models (LLMs) with ROS/ROS2 robots, enabling natural language control and real-time observation without modifying robot code.
kubernetes-mcp-server
Provides a Model Context Protocol (MCP) server for AI agents to interact with Kubernetes and OpenShift clusters, enabling AI-driven cluster management and diagnosis.
gdb-mcp-server
Provides an AI-assisted debugging server for GDB using the Model Context Protocol, enabling AI agents to interact with and control GDB sessions.
Polymcp
A comprehensive toolkit and agent framework for building Model Context Protocol (MCP) servers and orchestrating them with Large Language Models (LLMs) across Python and TypeScript environments.