fastmcp-example
Verified Safe · by gauravsingh8026
Overview
Integrate Model Context Protocol (MCP) with LangChain and LangGraph to build AI agent workflows by exposing a variety of custom and pre-defined tools.
Installation
Run the server with `python server.py`.
Environment Variables
- TAVILY_API_KEY
- OPENAI_API_KEY
- HTTP_TIMEOUT_SECONDS
- CALENDLY_CLIENT_ID
- CALENDLY_CLIENT_SECRET
- CALENDLY_REDIRECT_URI
- FASTAPI_PORT
- MCP_SERVER_PORT
- ENVIRONMENT
- MCP_SERVER_URL
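As a sketch, the server could load these settings from the environment at startup. The function name and the default values below are illustrative assumptions, not taken from the repository; only the variable names come from the list above:

```python
import os

def load_settings() -> dict:
    """Read the server's configuration from environment variables.

    API keys are treated as required; ports, timeout, and URLs fall back
    to illustrative defaults (assumptions, not values from the repo).
    """
    return {
        "tavily_api_key": os.environ["TAVILY_API_KEY"],    # required, raises KeyError if unset
        "openai_api_key": os.environ["OPENAI_API_KEY"],    # required, raises KeyError if unset
        "http_timeout": float(os.environ.get("HTTP_TIMEOUT_SECONDS", "30")),
        "fastapi_port": int(os.environ.get("FASTAPI_PORT", "8000")),
        "mcp_server_port": int(os.environ.get("MCP_SERVER_PORT", "8001")),
        "environment": os.environ.get("ENVIRONMENT", "development"),
        "mcp_server_url": os.environ.get("MCP_SERVER_URL", "http://localhost:8001"),
    }
```

Keeping secrets like `TAVILY_API_KEY` out of the codebase and in the environment matches the approach noted in the Security Notes below.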
Security Notes
The `http_request` tool in `server.py` and the `_make_http_request` function in `config/custom_tools.py` enable making arbitrary HTTP requests. If an LLM's input can be manipulated via prompt injection, this could lead to Server-Side Request Forgery (SSRF) or unauthorized access to internal network resources. This is an inherent risk in tool-using AI agents.
OAuth tokens for Calendly are stored locally in `data/calendly_tokens.json`. The code does not use `eval` or `exec` directly, and relies on environment variables for sensitive API keys.
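One common mitigation for the SSRF risk described above is to validate request targets against a host allowlist before the tool dispatches them. A minimal sketch, assuming an allowlist approach; the helper name and the hosts listed are illustrative, not from the repository:

```python
from urllib.parse import urlparse

# Hosts the agent's HTTP tool would be allowed to reach (illustrative examples).
ALLOWED_HOSTS = {"api.tavily.com", "api.calendly.com"}

def is_allowed_url(url: str) -> bool:
    """Reject non-HTTP(S) schemes and hosts outside the allowlist."""
    parsed = urlparse(url)
    if parsed.scheme not in ("http", "https"):
        return False  # blocks file://, ftp://, gopher://, etc.
    return parsed.hostname in ALLOWED_HOSTS
```

Checking the parsed hostname (rather than substring-matching the raw URL) avoids bypasses like `https://api.tavily.com.evil.example`, and blocking non-HTTP schemes prevents the tool from being steered at local files or internal services.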
Similar Servers
End-to-End-Agentic-Ai-Automation-Lab
This MCP Server provides an API gateway for an AutoGen multi-agent system to interact with Notion via the Model Context Protocol (MCP), enabling AI-driven automation of Notion tasks and public exposure through ngrok.
MultiServer-Mcp
Demonstrates building and interacting with multiple Model Context Protocol (MCP) servers for math and text processing using a LangChain MCP client for direct tool invocation.
ToolStore
A proof-of-concept pipeline for automatic tool discovery, toolchain assembly, and agentic reasoning powered by semantic search and LLMs.
SimpleMCPServer_Langchain
This project demonstrates an AI agent using LangGraph and LangChain's MCP adapters to orchestrate calls to multiple local MCP services (Math and Weather tools) for processing user queries.