medium_gcp_mcp_fastapi
Verified Safe by markwkiehl
Overview
Provides a template for deploying a Model Context Protocol (MCP) Server using FastAPI on Google Cloud Run, demonstrating server deployment, client interaction, and GCS FUSE integration.
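For orientation, below is a minimal sketch of the kind of FastAPI application the template is built around; the endpoint name and default mount path are illustrative assumptions, not the repository's actual code.

```python
# Hypothetical sketch in the spirit of api_mcp_fastapi_server.py;
# the endpoint and the default mount path are assumptions for illustration.
import os

from fastapi import FastAPI

app = FastAPI(title="MCP FastAPI server (sketch)")

# GCS FUSE mount path injected by Cloud Run (assumed local default).
MOUNT_PATH = os.getenv("MOUNT_PATH", "/mnt/gcs")


@app.get("/health")
async def health() -> dict:
    """Simple liveness probe, e.g. for Cloud Run health checks."""
    return {"status": "ok", "mount_path": MOUNT_PATH}
```

The app is served with the uvicorn command shown under Installation.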
Installation
```
uvicorn api_mcp_fastapi_server:app --reload
```
Environment Variables
- OPENAI_API_KEY
- MOUNT_PATH
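Both variables can be read the same way locally and on Cloud Run. The snippet below is a minimal sketch assuming python-dotenv is installed; the fallback default for `MOUNT_PATH` is an assumption.

```python
# Sketch of configuration loading; not the repository's exact code.
import os

from dotenv import load_dotenv

# Locally this reads a .env file; on Cloud Run the variables are injected
# directly into the environment and load_dotenv() is effectively a no-op.
load_dotenv()

OPENAI_API_KEY = os.environ["OPENAI_API_KEY"]   # required secret
MOUNT_PATH = os.getenv("MOUNT_PATH", "/mnt/gcs")  # GCS FUSE mount path (assumed default)
```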
Security Notes
Secrets (such as `OPENAI_API_KEY`) are handled correctly via environment variables, loaded from a `.env` file locally or from the Cloud Run environment. The code avoids logging full API keys. It also follows good practice by keeping database storage in the ephemeral `/tmp` directory rather than on the GCS FUSE mount, since GCS FUSE does not provide full POSIX semantics such as file locking. No `eval` calls or other obviously malicious patterns were found in the provided source code.
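As a rough illustration of those two practices (masked key logging and keeping the database off the FUSE mount), here is a hedged sketch; the helper name, database path, and file name are assumptions, not the project's code.

```python
# Illustrative only; names and paths are assumptions.
import os
import sqlite3


def masked(secret: str) -> str:
    """Log-safe representation of a secret; never print the full value."""
    return f"{secret[:4]}...{secret[-2:]}" if len(secret) > 8 else "***"


openai_key = os.environ["OPENAI_API_KEY"]
print(f"Using OpenAI key {masked(openai_key)}")

# SQLite lives on the ephemeral local filesystem (/tmp), not on the GCS FUSE
# mount, because the mount lacks full POSIX semantics such as file locking.
db = sqlite3.connect("/tmp/app.db")

# The GCS FUSE mount is used only for plain file reads and writes.
mount_path = os.getenv("MOUNT_PATH", "/mnt/gcs")
os.makedirs(mount_path, exist_ok=True)
with open(os.path.join(mount_path, "example.txt"), "w") as f:
    f.write("hello from Cloud Run\n")
```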
Similar Servers
fastapi_mcp
Automatically converts FastAPI endpoints into Model Context Protocol (MCP) tools for seamless integration with LLM agents.
fastify-mcp-server
Enables Fastify applications to act as high-performance, streamable HTTP servers for the Model Context Protocol (MCP), facilitating secure communication between AI assistants and external services.
mcp_server
Provides a Python server that exposes various external APIs (Microsoft Graph, GitHub, OpenWeatherMap) as tools to be consumed by AI assistants via the Model Context Protocol (MCP).
MCP-Servers-using-Python
Demonstrates how to build Model Context Protocol (MCP) servers using `fastmcp` and `fastapi_mcp` libraries through various examples.