ollama_langchain_mcp_server
by mlb0925
Overview
Develop a custom MCP (Model Context Protocol) server in Python to integrate external tools with LangChain agents, enabling automatic tool registration and invocation for LLMs.
Installation
python demo.py
Environment Variables
- OPENAI_API_KEY
- OPENAI_API_BASE
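The demo presumably reads these variables at startup. A minimal sketch of how `demo.py` could load them from the environment instead of hardcoding placeholders (the helper name `load_openai_config` and the default base URL are illustrative assumptions, not the repository's actual code):

```python
import os

def load_openai_config():
    # Pull the OpenAI-compatible endpoint settings from the environment
    # rather than hardcoding them in source. Failing fast on a missing
    # key gives a clear error before the agent starts.
    api_key = os.environ.get("OPENAI_API_KEY")
    api_base = os.environ.get("OPENAI_API_BASE", "https://api.openai.com/v1")
    if not api_key:
        raise RuntimeError("OPENAI_API_KEY is not set")
    return {"api_key": api_key, "api_base": api_base}
```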
Security Notes
The `pymcp.py` server communicates via standard input/output, which limits direct network exposure. However, the `demo.py` agent hardcodes placeholder `OPENAI_API_KEY` and `OPENAI_API_BASE` values; replace them with real environment variables or a secure configuration source before deployment. Tool invocation in `pymcp.py` calls Python functions directly with the supplied arguments. This is safe for the provided demo tools, but less vetted or more complex custom tools could be risky unless their input schemas are sufficiently restrictive.
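One way to enforce restrictive input schemas is to validate arguments against the tool's declared schema before dispatch. The sketch below is illustrative only (the `invoke_tool` helper and the `add` tool are assumptions, not `pymcp.py`'s actual API): a tool only ever receives parameters it explicitly allows.

```python
def invoke_tool(tool, schema, args):
    # Reject calls that are missing required parameters or that pass
    # parameters the schema does not declare, before touching the tool.
    allowed = set(schema.get("properties", {}))
    required = set(schema.get("required", []))
    missing = required - set(args)
    if missing:
        raise ValueError(f"missing required arguments: {sorted(missing)}")
    unexpected = set(args) - allowed
    if unexpected:
        raise ValueError(f"unexpected arguments: {sorted(unexpected)}")
    # Forward only schema-declared keys to the underlying function.
    return tool(**{k: args[k] for k in allowed & set(args)})

# Example tool with a matching JSON-Schema-style declaration.
def add(a, b):
    return a + b

ADD_SCHEMA = {"properties": {"a": {}, "b": {}}, "required": ["a", "b"]}
```

With this guard, an LLM-supplied argument dictionary cannot smuggle extra keyword arguments into the function call.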
Similar Servers
zeromcp
A minimal, pure Python Model Context Protocol (MCP) server for exposing tools, resources, and prompts via HTTP/SSE and Stdio transports.
fastmcp-example
Integrate Model Context Protocol (MCP) with LangChain and LangGraph to build AI agent workflows by exposing a variety of custom and pre-defined tools.
fastchat-mcp
A Python client for integrating Language Models with Model Context Protocol (MCP) servers, allowing natural language interaction with external tools, resources, and prompts.
MultiServer-Mcp
Demonstrates building and interacting with multiple Model Context Protocol (MCP) servers for math and text processing using a LangChain MCP client for direct tool invocation.