llm_mcp
Verified Safe by AuraFriday
Overview
Provides a local, offline LLM inference server with integrated tool-calling capabilities for the MCP ecosystem, enabling autonomous AI agents without cloud dependencies.
Installation
No command provided
Environment Variables
- HF_HOME — overrides the default Hugging Face cache directory used for downloaded model weights.
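Because the server pulls model weights from the Hugging Face Hub, `HF_HOME` controls where those downloads are cached. A minimal sketch of setting it from Python before any Hugging Face library is imported (the directory name is illustrative):

```python
import os
from pathlib import Path

# Point the Hugging Face cache at a dedicated, inspectable directory.
# HF_HOME must be set before importing huggingface_hub or transformers,
# because those libraries read it at import time.
cache_root = Path("hf-cache")  # illustrative path
cache_root.mkdir(parents=True, exist_ok=True)
os.environ["HF_HOME"] = str(cache_root.resolve())

print(os.environ["HF_HOME"])
```

Setting the variable in the launching shell (`export HF_HOME=...`) has the same effect and also covers subprocesses spawned by the server.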
Security Notes
On startup the server auto-installs or upgrades PyTorch and Transformers via `pip`, which can uninstall or replace existing Python packages. It also downloads models from the Hugging Face Hub, so it requires network access and trust in those upstream sources, and it can trigger internal server restarts (`mcp_bridge.call("server_control", {"operation": "restart"})`). The `tool_unlock_token` provides a local access-control mechanism for these sensitive operations.
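The restart call and `tool_unlock_token` imply a gate around privileged operations. A hypothetical sketch of such a guard — the `PRIVILEGED_OPS` set, `call_server_control` helper, and token-checking logic are assumptions for illustration, not the server's actual implementation:

```python
import hmac

# Hypothetical: operations that should require the unlock token.
PRIVILEGED_OPS = {"restart", "upgrade_dependencies"}

def call_server_control(operation: str, token: str, expected_token: str) -> str:
    """Execute a server_control operation only if the caller is authorized."""
    if operation in PRIVILEGED_OPS:
        # Constant-time comparison avoids leaking the token via timing.
        if not hmac.compare_digest(token, expected_token):
            raise PermissionError(f"tool_unlock_token required for {operation!r}")
    # In the real server this step would be the bridge call from the notes above:
    #   mcp_bridge.call("server_control", {"operation": operation})
    return f"server_control: {operation} accepted"

print(call_server_control("restart", token="s3cret", expected_token="s3cret"))
```

The point of the sketch is that a locally configured secret, compared in constant time, is what stands between any MCP client and a package upgrade or restart.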
Similar Servers
toolsdk-mcp-registry
A unified registry and API gateway for discovering, managing, and executing Model Context Protocol (MCP) servers, supporting local and secure sandbox execution with OAuth 2.1 integration.
agentor
Deploy scalable AI agents with tool integrations (weather, email, GitHub, etc.) and support for A2A and MCP communication protocols.
mcp-servers
An MCP server for fetching, cleaning, and intelligently extracting content from web pages, designed for agent-building frameworks.
zeromcp
A minimal, pure Python Model Context Protocol (MCP) server for exposing tools, resources, and prompts via HTTP/SSE and Stdio transports.