LLM_with_MCP_server
Verified Safe by ishant1234567890
Overview
Facilitates real-time, AI-powered chat interaction between a Large Language Model and players within a Minecraft Pi Edition server.
Installation
python main.py
Environment Variables
- LLM_MODEL_PATH
- MC_HOST
- MC_PORT
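As a rough sketch of how a server like this might consume these variables and bridge chat, the snippet below reads the configuration with stdlib defaults and polls Minecraft chat via the `mcpi` package (`Minecraft.create`, `events.pollChatPosts`, `postToChat` are real `mcpi` calls; the default values, function names, and the `generate_reply` callback are assumptions, not the project's actual code):

```python
import os

def load_config():
    """Read connection settings from the environment.

    The fallback values are assumptions for illustration;
    4711 is the default port of the Minecraft Pi Edition API.
    """
    return {
        "model_path": os.environ.get("LLM_MODEL_PATH", "models/model.gguf"),
        "host": os.environ.get("MC_HOST", "localhost"),
        "port": int(os.environ.get("MC_PORT", "4711")),
    }

def chat_loop(config, generate_reply):
    """Poll player chat and post LLM replies back (requires `pip install mcpi`)."""
    from mcpi.minecraft import Minecraft  # lazy import keeps load_config stdlib-only
    mc = Minecraft.create(config["host"], config["port"])
    while True:
        for event in mc.events.pollChatPosts():
            # generate_reply is a placeholder for whatever local LLM call
            # the server makes; it takes a chat message and returns a string.
            mc.postToChat(generate_reply(event.message))
```

`chat_loop(load_config(), my_llm_fn)` would then run the bridge; the real `main.py` may differ in structure, but any implementation has to cover these same two steps of reading the connection settings and relaying chat both ways.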
Security Notes
The server does not contain 'eval' calls, obfuscated code, or apparent hardcoded secrets. It connects to a local LLM and a Minecraft server, posting LLM responses to chat. The primary security consideration would be the content generated by the LLM if deployed in an uncontrolled environment, but the code itself does not introduce direct system vulnerabilities.
Similar Servers
mcp-omnisearch
Provides a unified interface for various search, AI response, content processing, and enhancement tools via Model Context Protocol (MCP).
mcp-rubber-duck
An MCP (Model Context Protocol) server that acts as a bridge to query multiple OpenAI-compatible LLMs, enabling multi-agent AI workflows and providing an AI 'rubber duck' debugging panel.
mcp-servers
An MCP server for managing files in Google Cloud Storage, supporting CRUD operations (save, get, search, delete) and exposing files as resources.
cross-llm-mcp
Provides unified access to multiple Large Language Model APIs (ChatGPT, Claude, DeepSeek, Gemini, Grok, Kimi, Perplexity, Mistral) for AI coding environments, enabling intelligent model selection, preferences, and prompt logging.