llm_mcp
Verified Safe by AuraFriday
Overview
Provides a local, offline LLM inference server with integrated tool-calling capabilities for the MCP ecosystem, enabling autonomous AI agents without cloud dependencies.
Installation
No command provided
Environment Variables
- HF_HOME
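`HF_HOME` controls where HuggingFace libraries cache downloaded models and other Hub artifacts. A minimal sketch of redirecting that cache before model downloads begin (the `hf-cache` directory name is an assumption for illustration, not part of llm_mcp):

```python
import os
import tempfile

# Point the HuggingFace cache at a dedicated directory. This must be set
# before transformers/huggingface_hub read the environment (assumed path).
cache_dir = os.path.join(tempfile.gettempdir(), "hf-cache")
os.environ["HF_HOME"] = cache_dir

# Libraries that honour HF_HOME will now store downloaded models under cache_dir.
print(os.environ["HF_HOME"])
```

Setting this to a dedicated directory also makes it easier to audit or purge the models the server pulls down.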
Security Notes
The server auto-installs and upgrades PyTorch and Transformers via `pip`, which can uninstall or replace existing Python packages, and it downloads models from the HuggingFace Hub, so it requires network access and trust in those upstream sources. It can also trigger internal server restarts (`mcp_bridge.call("server_control", {"operation": "restart"})`). The `tool_unlock_token` provides a local access-control mechanism for these operations.
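Given the auto-install behaviour above, operators who want reproducible environments could pin dependency versions themselves before starting the server. A minimal sketch under that assumption (the version numbers and the `pinned_install_cmd` helper are hypothetical, not part of llm_mcp):

```python
import sys

# Hypothetical mitigation: install exact, pre-vetted versions of the heavy
# dependencies so a later auto-upgrade has nothing to change.
PINNED = {"torch": "2.3.1", "transformers": "4.44.0"}  # assumed versions

def pinned_install_cmd(packages: dict) -> list:
    """Build a pip invocation that installs exact versions only."""
    specs = [f"{name}=={version}" for name, version in packages.items()]
    # --no-deps prevents pip from pulling in (or upgrading) transitive packages.
    return [sys.executable, "-m", "pip", "install", "--no-deps", *specs]

print(pinned_install_cmd(PINNED))
```

Running the server inside a dedicated virtual environment gives the same isolation with less ceremony; the point is simply that its `pip` activity should never touch a shared interpreter.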
Similar Servers
toolsdk-mcp-registry
An API-driven registry for Model Context Protocol (MCP) servers, enabling discovery, detail retrieval, and execution of various AI tools and agents.
mcpc
Build and compose agentic Model Context Protocol (MCP) servers and tools, enabling AI assistants to discover, integrate, and orchestrate other MCP servers for complex tasks.
mcp-servers
An MCP server for managing files in Google Cloud Storage, supporting CRUD operations (save, get, search, delete) and exposing files as resources.
1xn-vmcp
An open-source platform for composing, customizing, and extending multiple Model Context Protocol (MCP) servers into a single logical, virtual MCP server, enabling fine-grained context engineering for AI workflows and agents.