
llm_mcp

Verified Safe

by AuraFriday

Overview

Provides a local, offline LLM inference server with integrated tool-calling capabilities for the MCP ecosystem, enabling autonomous AI agents without cloud dependencies.

Installation

Run Command
No command provided

Environment Variables

  • HF_HOME
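`HF_HOME` is the standard HuggingFace environment variable that controls where models and other hub artifacts are cached. A minimal sketch of pointing it at a dedicated directory before any HuggingFace library is imported (the path here is purely illustrative):

```python
import os
import tempfile

# Hypothetical cache location; any writable directory works.
cache_dir = os.path.join(tempfile.gettempdir(), "hf-cache")

# Must be set before transformers/huggingface_hub is imported,
# since the libraries read HF_HOME at import time.
os.environ["HF_HOME"] = cache_dir
```

Setting this is useful when the server's model downloads should land on a large or isolated volume rather than the default `~/.cache/huggingface`.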

Security Notes

The server auto-installs and upgrades PyTorch and Transformers via `pip`, which can uninstall or replace existing Python packages, and it downloads models from HuggingFace Hub, so it requires network access and trust in those upstream sources. It can also trigger internal server restarts (`mcp_bridge.call("server_control", {"operation": "restart"})`). The `tool_unlock_token` provides a local access-control mechanism for privileged operations.
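How a token gate like `tool_unlock_token` might guard a privileged operation such as the restart call can be sketched as follows. This is a hypothetical illustration, not the server's actual implementation; `guarded_restart` and its parameters are invented for the example:

```python
import hmac

def is_unlocked(provided: str, expected: str) -> bool:
    # Constant-time comparison avoids leaking the token via timing.
    return hmac.compare_digest(provided, expected)

def guarded_restart(call, token: str, unlock_token: str):
    """Invoke the restart operation only when the caller holds the token.

    `call` stands in for an MCP bridge callable like mcp_bridge.call.
    """
    if not is_unlocked(token, unlock_token):
        raise PermissionError("tool_unlock_token required")
    return call("server_control", {"operation": "restart"})
```

The point of the sketch is that destructive operations (restarts, package upgrades) are reachable only through an explicit, locally held secret rather than being open to any connected client.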

Stats

Interest Score: 0
Security Score: 8
Cost Class: Low
Avg Tokens: 1000
Stars: 0
Forks: 0
Last Update: 2025-12-02

Tags

Local LLM · Offline AI · Tool-Calling · Autonomous Agents · MCP Integration