VLArm
Verified Safe
by gconsigli
Overview
Enables a physical robotic arm (SO-101) to be controlled by a local Large Language Model via the Model Context Protocol (MCP) for autonomous, language-based task execution.
Installation
ollmcp --mcp-server server.py --model qwen2.5
Environment Variables
- API_BASE_URL
- ROBOT_ID
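As a minimal sketch of how a server like this might consume these two variables (the default values and the helper name are assumptions for illustration, not taken from the actual `server.py`):

```python
import os


def load_config(env: dict = os.environ) -> dict:
    """Read the two configuration values, falling back to local defaults.

    The defaults below are illustrative assumptions: a localhost Phosphobot
    endpoint and robot ID 0, matching the local-only setup described above.
    """
    return {
        "api_base_url": env.get("API_BASE_URL", "http://localhost:80"),
        "robot_id": int(env.get("ROBOT_ID", "0")),
    }
```

With no variables set, the helper would return the local defaults; setting `ROBOT_ID=3` before launch would select a different arm.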
Security Notes
The server primarily acts as a proxy, forwarding commands from a local LLM to a locally running Phosphobot API. It uses hardcoded local URLs and a default robot ID (0), which is safe for local operation. No dynamic code execution (e.g., 'eval') or obvious injection vulnerabilities were found in the provided code. The primary risks would stem from vulnerabilities in the local Phosphobot API itself, or from the 'localhost' endpoint being inadvertently exposed externally.
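The proxy pattern described above can be sketched as follows. Note that the endpoint path, port, and function name here are illustrative assumptions, not taken from the actual server code:

```python
# Hardcoded local values, mirroring the setup described in the security notes.
# The port and endpoint path are assumptions for illustration.
API_BASE_URL = "http://localhost:80"
ROBOT_ID = 0


def build_command_url(endpoint: str) -> str:
    """Construct the local Phosphobot API URL a tool call would forward to.

    Because the base URL is fixed to localhost, commands never leave the
    machine unless this endpoint is deliberately exposed.
    """
    return f"{API_BASE_URL}/{endpoint}?robot_id={ROBOT_ID}"
```

An MCP tool would POST the LLM's command payload to such a URL, so the attack surface is effectively that of the local Phosphobot API.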
Similar Servers
mcp-client-for-ollama
An interactive terminal client for connecting local Ollama LLMs to Model Context Protocol (MCP) servers, enabling advanced tool use and workflow automation for local LLMs.
ollama-fastmcp-wrapper
A proxy service that bridges Ollama with FastMCP, enabling local LLM tool-augmented reasoning by exposing MCP servers' functionality to Ollama models.
mcp-server-llmling
mcp-server-llmling serves as a Model Context Protocol (MCP) server, providing a YAML-based system to configure and manage LLM applications, including resources, prompts, and tools.
ollama-mcp-server
Provides a self-contained Model Context Protocol (MCP) server for local Ollama management, enabling features like listing models, chatting, server control, and intelligent model recommendations.