coral-tpu-mcp
by marc-shade
Overview
Provides fast, local, hardware-accelerated ML inference (image classification, object detection, pose estimation, semantic segmentation, audio classification, keyword spotting) and text embeddings for agentic AI systems using Google Coral TPUs.
Installation
python -m coral_tpu_mcp.server
Environment Variables
- AGENTIC_SYSTEM_PATH: directory added to `sys.path` so the server can import the `tpu_monitor` module (see Security Notes)
- TF_CPP_MIN_LOG_LEVEL: controls TensorFlow's C++ log verbosity; higher values suppress more output
- ABSL_MIN_LOG_LEVEL: controls Abseil logging verbosity in the same way
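A minimal launch sketch using the variables above. The path `/opt/agentic` is a hypothetical example; point it at whatever directory actually contains `tpu_monitor` in your deployment:

```shell
# Hypothetical location of the directory containing tpu_monitor
export AGENTIC_SYSTEM_PATH=/opt/agentic
# Quiet TensorFlow and Abseil logging so stdio stays clean for MCP traffic
export TF_CPP_MIN_LOG_LEVEL=3
export ABSL_MIN_LOG_LEVEL=3
python -m coral_tpu_mcp.server
```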
Security Notes
The `AGENTIC_SYSTEM_PATH` environment variable is used to add a directory to `sys.path` at runtime so the server can import the `tpu_monitor` module. If an untrusted party controls this variable, it can point the interpreter at a malicious module, yielding arbitrary code execution (a Remote Code Execution vector via Python module injection). Model file validation (checksums, size, and extension checks) mitigates risks during model loading, but it does not cover the `sys.path` injection vector. The server communicates over stdio and does not appear to open any direct network listeners.
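The server's actual guard logic is not shown here; the following is a minimal sketch of one possible mitigation, constraining `AGENTIC_SYSTEM_PATH` to an allowlisted base directory before extending `sys.path`. The function name and allowlist are illustrative, not part of the project's API:

```python
import os
import sys

def safe_extend_sys_path(raw_path: str, allowed_bases: tuple) -> bool:
    """Add raw_path to sys.path only if it resolves inside an allowed base.

    Returns True if the path was accepted, False if it was rejected.
    """
    resolved = os.path.realpath(raw_path)  # collapse symlinks and ".."
    if not os.path.isdir(resolved):
        return False
    for base in allowed_bases:
        base_real = os.path.realpath(base)
        # commonpath rejects sibling-prefix tricks like /opt/agentic-evil
        if os.path.commonpath([resolved, base_real]) == base_real:
            if resolved not in sys.path:
                sys.path.insert(0, resolved)
            return True
    return False

# Usage sketch: validate the untrusted env var before importing tpu_monitor.
# "/opt/agentic" is a hypothetical known-good root for this example.
raw = os.environ.get("AGENTIC_SYSTEM_PATH", "")
if raw and safe_extend_sys_path(raw, ("/opt/agentic",)):
    pass  # importing tpu_monitor would be reasonable at this point
```

Resolving with `os.path.realpath` before the prefix check matters: a symlink or `..` segment inside an allowed base could otherwise escape it.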
Similar Servers
lemonade
Lemonade Server is a high-performance C++ HTTP server providing local OpenAI-compatible API endpoints for various AI inference tasks including large language models (LLMs), embeddings, reranking, and audio transcription, with a focus on AMD Ryzen AI hardware acceleration.
Lynkr
Lynkr is an AI orchestration layer that acts as an LLM gateway, routing language model requests to various providers (Ollama, Databricks, OpenAI, etc.). It provides an OpenAI-compatible API and enables AI-driven coding tasks via a rich set of tools and a multi-agent framework, with a strong focus on security, performance, and token efficiency. It allows AI agents to interact with a defined workspace (reading/writing files, executing shell commands, performing Git operations) and leverages long-term memory and agent learning to enhance task execution.
AgentUp
A developer-first framework for building, deploying, and managing AI agents, bringing Docker-like consistency and operational ease to AI agent development.
2ly
Skilder is an infrastructure layer for AI agent tooling, providing a private tool registry and embedded runtimes for integrating with various agent frameworks and custom tools.