htm-mcp-server
by cserock
Overview
Provides a user-friendly Streamlit interface to a LangGraph ReAct AI agent, letting it interact with various external tools and data sources via the Model Context Protocol (MCP).
Installation
```shell
docker compose -f dockers/docker-compose.yaml up -d
```
Environment Variables
- ANTHROPIC_API_KEY
- OPENAI_API_KEY
- LANGSMITH_API_KEY
- LANGSMITH_PROJECT
- LANGSMITH_TRACING
- LANGSMITH_ENDPOINT
- USE_LOGIN
- USER_ID
- USER_PASSWORD
- UPSTAGE_API_KEY
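These variables are typically supplied through a `.env` file in the project root, which `docker compose` reads automatically. A minimal sketch; every value below is a placeholder, not a real credential, and the grouping is only a suggestion:

```shell
# .env — placeholder values; replace with your own credentials
ANTHROPIC_API_KEY=your-anthropic-key
OPENAI_API_KEY=your-openai-key
UPSTAGE_API_KEY=your-upstage-key

# Optional LangSmith tracing
LANGSMITH_TRACING=true
LANGSMITH_ENDPOINT=https://api.smith.langchain.com
LANGSMITH_API_KEY=your-langsmith-key
LANGSMITH_PROJECT=htm-mcp-server

# Optional simple login for the Streamlit UI
USE_LOGIN=true
USER_ID=admin
USER_PASSWORD=change-me
```

Keep this file out of version control, since it holds API keys and the UI password.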
Security Notes
The project passes `allow_dangerous_deserialization=True` when loading FAISS vector stores (in `resources/mcp_rag_kbs/rag/kbs.py`). These stores are loaded via pickle, so deserializing a file from an untrusted source this way can lead to arbitrary code execution. The project currently loads only from internal paths, but the pattern remains risky. Additionally, the MCP servers bind to `0.0.0.0` (e.g., `mcp_server_time.py`), exposing them on every network interface unless a firewall restricts access.
Similar Servers
Polymcp
A comprehensive TypeScript framework for building and orchestrating Model Context Protocol (MCP) servers and AI agents, enabling LLMs to intelligently discover, select, and execute external tools.
chatbot-with-search
An advanced AI client for autonomous web research, utilizing a decoupled microservices architecture with the Model Context Protocol (MCP) for tool communication.
AGAI09-MCP-Server
An AI agent leveraging LangGraph and OpenAI to interact with external tools via the Model Context Protocol (MCP) using JSON-RPC over STDIO.
mcp-langchain-client
Enables LLMs to interact with external tools from MCP servers through a LangChain-integrated client, offering both CLI and Streamlit web interfaces.