fastchat-mcp
by rb58853
Overview
A Python client that integrates Large Language Models (LLMs) with Model Context Protocol (MCP) servers, enabling natural language interaction with external tools, resources, and prompts from the terminal or through a FastAPI/WebSocket API.
Installation
```shell
uvicorn api:app --host 0.0.0.0 --port 8000 --ws-ping-interval 0 --ws-ping-timeout 1200 --workers 1
```
Environment Variables
- CRIPTOGRAFY_KEY
- OPENAI_API_KEY
- MASTER_TOKEN
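All three variables must be present in the environment before the server starts. A minimal startup check could look like the sketch below; the helper function is illustrative and not part of the project's actual code:

```python
import os

# Variable names taken from the list above
REQUIRED_VARS = ["CRIPTOGRAFY_KEY", "OPENAI_API_KEY", "MASTER_TOKEN"]

def check_required_env(required=REQUIRED_VARS):
    """Return the names of required variables missing from the environment."""
    return [name for name in required if not os.environ.get(name)]

if __name__ == "__main__":
    missing = check_required_env()
    if missing:
        raise SystemExit(f"Missing environment variables: {', '.join(missing)}")
```

Failing fast like this avoids confusing runtime errors later, e.g. an OpenAI authentication failure on the first chat request.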
Security Notes
The WebSocket API routes (`/chat/user`, `/chat/admin`) accept an `aditional_servers` header whose value is parsed as JSON, allowing clients to inject MCP server configurations dynamically. A malicious client that supplies a `stdio`-protocol server configuration with an arbitrary command can achieve Remote Code Execution (RCE) on the host running the FastAPI application. This is a severe vulnerability for any publicly exposed deployment.
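One mitigation is to validate the header server-side before registering any injected configuration, rejecting `stdio` transports outright. A minimal sketch, where the function name, the config shape, and the allowed-protocol set are assumptions rather than the project's actual API:

```python
import json

# Assumption: only remote transports are acceptable from untrusted clients
ALLOWED_PROTOCOLS = {"http", "sse"}

def parse_additional_servers(header_value: str) -> dict:
    """Parse the `aditional_servers` header, rejecting `stdio` configs.

    Raises ValueError on malformed JSON or a disallowed transport.
    """
    try:
        servers = json.loads(header_value)
    except json.JSONDecodeError as exc:
        raise ValueError(f"invalid JSON in aditional_servers: {exc}") from exc
    if not isinstance(servers, dict):
        raise ValueError("aditional_servers must be a JSON object")
    for name, config in servers.items():
        if not isinstance(config, dict):
            raise ValueError(f"server {name!r} config must be an object")
        protocol = config.get("protocol", "")
        if protocol not in ALLOWED_PROTOCOLS:
            raise ValueError(f"server {name!r} uses disallowed protocol {protocol!r}")
    return servers
```

An allowlist of protocols is safer than a `stdio` blocklist: any transport added later is rejected by default until it is explicitly reviewed.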
Similar Servers
fastmcp
FastMCP is a Python framework for building and interacting with Model Context Protocol (MCP) servers. It provides client and server capabilities, enabling the creation of AI agents and services through definable tools, resources, and prompts. It supports various transports, authentication methods, logging, and background task execution, with strong integration for OpenAPI specifications.
slack-mcp-server
Model Context Protocol (MCP) server providing real-time and historical Slack data access to AI models.
mcp-client-for-ollama
An interactive Python client for connecting local Ollama LLMs to Model Context Protocol (MCP) servers, enabling advanced tool use and workflow automation.
zeromcp
A minimal, pure Python Model Context Protocol (MCP) server for exposing tools, resources, and prompts via HTTP/SSE and Stdio transports.