mcp-explore
Verified Safe by AteetAgarwal
Overview
A Streamlit chat application integrating multiple MCP servers (local and remote) to orchestrate LLM tool calls.
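Mixing local and remote MCP servers typically comes down to a per-server transport declaration. The sketch below is a hypothetical illustration of such a configuration; the actual `SERVERS` layout in `client_v2.py`, and the server names and URL shown, are assumptions.

```python
# Hypothetical sketch of a mixed local/remote MCP server configuration.
# Names, paths, and the URL are illustrative, not taken from client_v2.py.
SERVERS = {
    "expense-tracker": {            # local server launched as a subprocess
        "transport": "stdio",
        "command": ["uv", "run", "expense_server.py"],
    },
    "web-search": {                 # remote server reached over HTTP
        "transport": "streamable-http",
        "url": "https://example.com/mcp",
    },
}
```

Keeping the `command` lists hardcoded, as noted under Security Notes below, is what prevents user input from reaching process launch arguments.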
Installation
uv run streamlit run .\client_v2.py
Environment Variables
- AZURE_OPENAI_API_KEY
- AZURE_OPENAI_ENDPOINT
- OPENAI_API_VERSION
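Since the app loads these from a `.env` file at startup, a small preflight check can fail fast with a clear message when one is missing. This is a minimal sketch; the helper name `missing_env_vars` is hypothetical, not part of the project.

```python
import os

# The three variables the Azure OpenAI client expects (per the list above).
REQUIRED_VARS = ("AZURE_OPENAI_API_KEY", "AZURE_OPENAI_ENDPOINT", "OPENAI_API_VERSION")

def missing_env_vars(env=os.environ):
    """Return the names of required variables that are unset or empty."""
    return [name for name in REQUIRED_VARS if not env.get(name)]

if missing_env_vars():
    print("Missing environment variables:", ", ".join(missing_env_vars()))
```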
Security Notes
The code generally follows good security practices for a demo application. API keys are loaded from environment variables (.env). SQL queries for the expense tracker use parameterized statements, preventing common SQL injection vulnerabilities. The `SERVERS` configuration launches external processes via `uv run` with hardcoded command arguments, limiting the risk of user-input-driven arbitrary command execution. Tool arguments from the LLM are expected to be JSON or dicts and are parsed with `json.loads()`, which is generally safe in this context. No explicit `eval` or `exec` on user-controlled input was found.
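The parameterized-statement pattern mentioned above can be illustrated with `sqlite3`; the `expenses` table, its columns, and the query function are assumptions for illustration, not the project's actual schema.

```python
import sqlite3

# In-memory stand-in for the expense tracker's database. Table and
# column names here are illustrative assumptions.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE expenses (category TEXT, amount REAL)")
conn.execute("INSERT INTO expenses VALUES ('food', 12.5), ('travel', 40.0)")

def total_for_category(category: str) -> float:
    # The "?" placeholder lets sqlite3 bind the value as data, so input
    # like "food'; DROP TABLE expenses;--" cannot alter the SQL itself.
    row = conn.execute(
        "SELECT COALESCE(SUM(amount), 0) FROM expenses WHERE category = ?",
        (category,),
    ).fetchone()
    return row[0]
```

Because the value never enters the SQL text, a malicious category string simply matches no rows instead of executing as a statement.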
Similar Servers
fastmcp
FastMCP is a Python framework for building and interacting with Model Context Protocol (MCP) servers. It provides client and server capabilities, enabling the creation of AI agents and services through definable tools, resources, and prompts. It supports various transports, authentication methods, logging, and background task execution, with strong integration for OpenAPI specifications.
Docker_MCPGUIApp
A conversational AI chatbot leveraging Docker and the Model Context Protocol (MCP) to integrate with LLMs and perform various tool-augmented searches (web, academic papers).
fastchat-mcp
A Python client integrating Language Models (LLMs) with Model Context Protocol (MCP) servers, enabling natural language interaction with external tools, resources, and prompts via terminal or a FastAPI/WebSocket API.
polybrain-mcp
Connects AI agents to multiple LLM models, providing conversation history management and model switching capabilities.