interactive-terminal-mcp
by WangYihang
Overview
Provides LLMs with stateful, interactive terminal access for persistent processes, remote management, and debugging.
Installation
uvx interactive-terminal-mcp
Security Notes
The server provides full terminal access to the connected LLM via `pexpect.spawn`, allowing execution of arbitrary commands with the permissions of the user running the server. The application logic performs no sandboxing or command validation of its own. The README explicitly warns that it should "only be used in trusted environments or sandboxed containers (e.g., Docker) to prevent unauthorized system modifications."
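To make the risk concrete, the sketch below shows a minimal stateful shell session using only the standard library's `subprocess` module. This is an illustrative analogue, not the server's actual code: the real server uses `pexpect.spawn` for full PTY-backed interaction, but the security property is the same — every command string the caller sends runs with the permissions of the hosting process.

```python
import subprocess

# Minimal sketch of a stateful shell session (illustrative analogue;
# the real server uses pexpect.spawn for full PTY/interactive support).
# Any command the caller sends executes with this process's permissions,
# which is why the README restricts use to trusted or sandboxed hosts.
shell = subprocess.Popen(
    ["/bin/sh"],
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
    stderr=subprocess.STDOUT,
    text=True,
)

# State persists across commands within one session:
# the `cd` changes the directory that the later `pwd` reports.
out, _ = shell.communicate("cd /tmp\npwd\n")
print(out.strip())
```

Persistence across commands is what distinguishes this from one-shot `subprocess.run` calls: a single long-lived shell keeps its working directory, environment variables, and any background processes between tool invocations, which is exactly what makes the server useful for debugging and remote management — and equally useful to a misbehaving model.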
Similar Servers
wcgw
Empowering chat applications to code, build, and run on your local machine by providing tightly integrated shell and code editing tools.
Lynkr
Lynkr is an AI orchestration layer that acts as an LLM gateway, routing language model requests to various providers (Ollama, Databricks, OpenAI, etc.). It provides an OpenAI-compatible API and enables AI-driven coding tasks via a rich set of tools and a multi-agent framework, with a strong focus on security, performance, and token efficiency. It allows AI agents to interact with a defined workspace (reading/writing files, executing shell commands, performing Git operations) and leverages long-term memory and agent learning to enhance task execution.
llms
A centralized configuration and documentation management system for LLMs, providing tools for building skills, commands, agents, prompts, and managing MCP servers across multiple LLM providers.
Local_MCP_Client
The client acts as a cross-platform web and API interface for natural language interaction with configurable MCP servers, facilitating structured tool execution and dynamic agent behavior using local LLMs.