Chatbot-the-MCP-way
by jayyprajapati
Overview
An interactive command-line chat application that lets an AI model retrieve documents and execute extensible commands via the Model Context Protocol (MCP).
Installation
uv run main.py
Environment Variables
- LLAMA_MODEL
- OLLAMA_BASE_URL
- USE_UV
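The variables above are presumably read at startup. A minimal sketch of how the client might resolve them, with hypothetical fallback values (check `main.py` for the actual defaults, which are assumptions here):

```python
import os

# Hypothetical defaults -- the real fallbacks live in main.py.
LLAMA_MODEL = os.environ.get("LLAMA_MODEL", "llama3")
OLLAMA_BASE_URL = os.environ.get("OLLAMA_BASE_URL", "http://localhost:11434")
# Treat USE_UV as a boolean flag: set to "1" to launch servers via `uv run`.
USE_UV = os.environ.get("USE_UV", "1") == "1"
```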
Security Notes
The `MCPClient` in `mcp_client.py` executes arbitrary commands (`command` and `args`). The `doc_client` in `main.py` uses fixed, safe commands for `mcp_server.py`, but the application also iterates over `sys.argv[1:]` and launches an additional `MCPClient` instance with `uv run server_script` for each argument, so any script passed on the command line to `main.py` is executed. Because no input validation or allowlisting is applied to these extra server scripts, this poses a significant command-injection risk whenever `main.py` is invoked with untrusted arguments. Document editing via `edit_document` also carries a minor data-integrity risk if the LLM is unconstrained.
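One mitigation for the risk described above would be to filter `sys.argv[1:]` against an allowlist before handing anything to `MCPClient`. The project performs no such check; the sketch below is a hypothetical hardening step, and the allowlisted script names are assumptions:

```python
import sys
from pathlib import Path

# Hypothetical allowlist -- the project itself does not define one.
ALLOWED_SERVER_SCRIPTS = {"mcp_server.py"}

def validate_server_scripts(argv: list[str]) -> list[str]:
    """Return only arguments whose basenames appear in the allowlist."""
    safe = []
    for arg in argv:
        if Path(arg).name in ALLOWED_SERVER_SCRIPTS:
            safe.append(arg)
        else:
            print(f"Refusing to launch untrusted script: {arg}", file=sys.stderr)
    return safe
```

Each surviving path could then be launched with `uv run` as before; everything else is rejected loudly instead of being executed.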
Similar Servers
wcgw
Empowering chat applications to code, build, and run on your local machine by providing tightly integrated shell and code editing tools.
mcp-client-for-ollama
An interactive terminal client for connecting local Ollama LLMs to Model Context Protocol (MCP) servers, enabling advanced tool use and workflow automation for local LLMs.
mcp-server
Provides a Model Context Protocol (MCP) server for AI agents to search and retrieve curated documentation for the Strands Agents framework, facilitating AI coding assistance.
mcp-use-cli
An interactive command-line interface (CLI) tool for connecting to and interacting with Model Context Protocol (MCP) servers using natural language, acting as an AI client that orchestrates LLM responses with external tools.