VLArm
Verified Safe (by gconsigli)
Overview
Enables a physical robotic arm (SO-101) to be controlled by a local Large Language Model via the Model Context Protocol (MCP) for autonomous, language-based task execution.
Installation
ollmcp --mcp-server server.py --model qwen2.5
Environment Variables
- API_BASE_URL
- ROBOT_ID
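A minimal sketch of how a server like this might read the two variables above, with hypothetical defaults (the `http://localhost:80` base URL and the fallback of `0` mirror the local setup described in the Security Notes, but are assumptions, not the project's confirmed values):

```python
import os

# Hypothetical defaults for illustration: the actual server hardcodes
# local URLs and a default robot ID of 0, per the Security Notes below.
API_BASE_URL = os.environ.get("API_BASE_URL", "http://localhost:80")
ROBOT_ID = int(os.environ.get("ROBOT_ID", "0"))
```

Reading both values through `os.environ.get` with defaults keeps the server runnable out of the box while still letting users point it at a different Phosphobot instance or robot.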
Security Notes
The server primarily acts as a proxy, forwarding commands from a local LLM to a locally running Phosphobot API. It uses hardcoded local URLs and a default robot ID (0), which is safe for local operation. No dynamic code execution (e.g., `eval`) or obvious injection vulnerabilities were found in the provided code. The primary risk would stem from vulnerabilities in the local Phosphobot API itself, or from the localhost endpoint being inadvertently exposed externally.
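The proxy pattern described above can be sketched as follows. The endpoint path (`/move/relative`), payload shape, and parameter names are assumptions for illustration, not the actual Phosphobot API schema:

```python
import json
import urllib.request

API_BASE_URL = "http://localhost:80"  # assumed local Phosphobot endpoint
ROBOT_ID = 0  # default robot ID noted in the Security Notes

def build_move_request(dx: float, dy: float, dz: float) -> urllib.request.Request:
    """Build (but do not send) an HTTP request forwarding a relative-move
    command to the local Phosphobot API. The path and payload fields here
    are hypothetical, chosen only to illustrate the proxy pattern."""
    url = f"{API_BASE_URL}/move/relative?robot_id={ROBOT_ID}"
    payload = json.dumps({"x": dx, "y": dy, "z": dz}).encode()
    return urllib.request.Request(
        url,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_move_request(1.0, 0.0, 0.5)
```

Because every request targets a hardcoded localhost URL, the attack surface is limited to the local machine unless that port is exposed; this is the property the Security Notes rely on.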
Similar Servers
osaurus
Osaurus is a native macOS LLM server running local language models with OpenAI and Ollama compatible APIs, enabling tool calling and a plugin ecosystem for AI agents.
mcp-client-for-ollama
An interactive Python client for connecting local Ollama LLMs to Model Context Protocol (MCP) servers, enabling advanced tool use and workflow automation.
ollama-fastmcp-wrapper
A proxy service that bridges Ollama models with FastMCP servers, enabling local LLM-tool augmented reasoning and persistent conversational history via an API or CLI.
mcp-server-llmling
mcp-server-llmling serves as a Model Context Protocol (MCP) server, providing a YAML-based system to configure and manage LLM applications, including resources, prompts, and tools.