lmstudio-bridge-enhanced
Verified Safe by ahmedibrahim085
Overview
Bridges local LLMs running in LM Studio with the Model Context Protocol (MCP) ecosystem, enabling autonomous AI agents to use external tools (filesystem, web, knowledge graph, GitHub, vision) with multi-model orchestration, structured JSON output, and intelligent model capability management.
Installation
`python3 /absolute/path/to/lmstudio-bridge-enhanced/main.py`
Environment Variables
- LMSTUDIO_HOST
- LMSTUDIO_PORT
- DEFAULT_MODEL
- MCP_JSON_PATH
- GITHUB_PERSONAL_ACCESS_TOKEN
- WORKING_DIR
- LMS_MAX_RETRIES
- LMS_RETRY_BASE_DELAY
- LMS_RETRY_MAX_DELAY
- LMS_EXTRA_NUMERIC_PARAMS
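The `LMS_MAX_RETRIES`, `LMS_RETRY_BASE_DELAY`, and `LMS_RETRY_MAX_DELAY` names suggest a capped exponential-backoff retry scheme. As a minimal sketch (the default values below are illustrative assumptions, not the project's actual defaults), such settings are typically read from the environment like this:

```python
import os

# Illustrative defaults -- the real defaults live in the project's code.
host = os.environ.get("LMSTUDIO_HOST", "localhost")
port = int(os.environ.get("LMSTUDIO_PORT", "1234"))
max_retries = int(os.environ.get("LMS_MAX_RETRIES", "3"))
base_delay = float(os.environ.get("LMS_RETRY_BASE_DELAY", "1.0"))
max_delay = float(os.environ.get("LMS_RETRY_MAX_DELAY", "30.0"))

def backoff_delays(retries, base, cap):
    """Exponential backoff schedule: base * 2**attempt, capped at `cap` seconds."""
    return [min(base * (2 ** attempt), cap) for attempt in range(retries)]

print(backoff_delays(max_retries, base_delay, max_delay))
```

With the assumed defaults, `backoff_delays(3, 1.0, 30.0)` yields `[1.0, 2.0, 4.0]`; the cap keeps later retries from growing unboundedly.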
Security Notes
The server acts as an MCP client, dynamically discovering and executing other MCP servers defined in a user-configurable `.mcp.json` file. Input parameters such as `mcp_name` and `working_directory` are validated (a `validate_mcp_name` regex and a `validate_working_directory` path-traversal check, fixed in v3.2.1), but the `command` and `args` fields read from `.mcp.json` to spawn subprocesses are inherently trusted: a malicious or compromised `.mcp.json` file can still lead to arbitrary command execution on the host system. HTML escaping (`html.escape`) is applied to LLM reasoning output to prevent XSS. `GITHUB_PERSONAL_ACCESS_TOKEN` is read from environment variables rather than hardcoded. Logging is standardized and bare `except` clauses have been replaced (fixed in v3.2.1). Overall, the server is generally safe, with the strong recommendation that the `.mcp.json` configuration file be kept secure and trusted.
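To make that trust boundary concrete, here is a minimal sketch (the config layout, key names, and regex are assumptions for illustration, not the project's exact code) of how an entry from a `.mcp.json` file might be checked before spawning a subprocess. The server name is validated, but `command` and `args` pass through verbatim:

```python
import json
import re

# Assumed shape of the name check -- an alphanumeric/dash/underscore whitelist.
MCP_NAME_RE = re.compile(r"^[A-Za-z0-9_-]+$")

def load_server_entry(config_text, mcp_name):
    """Validate the server name, but trust command/args from the config."""
    if not MCP_NAME_RE.match(mcp_name):
        raise ValueError(f"invalid mcp_name: {mcp_name!r}")
    config = json.loads(config_text)
    entry = config["mcpServers"][mcp_name]
    # `command` and `args` are taken verbatim -- a hostile .mcp.json
    # therefore controls exactly which subprocess gets executed.
    return entry["command"], entry.get("args", [])

sample = '{"mcpServers": {"filesystem": {"command": "npx", "args": ["-y", "server-filesystem"]}}}'
print(load_server_entry(sample, "filesystem"))
```

The name validation blocks injection through the tool-call parameter, but nothing constrains what the config file itself asks to run, which is why securing `.mcp.json` is the key recommendation above.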
Similar Servers
fastmcp
FastMCP is an ergonomic interface for the Model Context Protocol (MCP), providing a comprehensive framework for building and interacting with AI agents, tools, resources, and prompts across various transports and authentication methods.
mcp-interviewer
A Python CLI tool to evaluate Model Context Protocol (MCP) servers for agentic use-cases, by inspecting capabilities, running functional tests, and providing LLM-as-a-judge evaluations.
zeromcp
A minimal, pure Python Model Context Protocol (MCP) server for exposing tools, resources, and prompts via HTTP/SSE and Stdio transports.
memory-mcp-server
Provides long-term memory and context storage/retrieval for Large Language Models (LLMs) via an API, adhering to the Model Context Protocol (MCP).