ollama-fastmcp-wrapper
Verified Safe · by andreamoro
Overview
A proxy service that bridges Ollama models with FastMCP servers, enabling local LLM-tool augmented reasoning and persistent conversational history via an API or CLI.
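The tool-augmented loop described above can be sketched as follows. This is a minimal illustration with stubbed model and tool layers, not the project's actual code: in the real wrapper, `model` would call Ollama's chat API and `tools` would dispatch through a FastMCP client, and all names here are assumptions.

```python
# Sketch of a tool-augmented reasoning loop with persistent history.
# `model` and `tools` are stand-ins; the real wrapper talks to Ollama
# and FastMCP servers instead.
def run_turn(prompt, model, tools):
    """Ask the model; while it requests a tool, execute it and re-ask."""
    history = [{"role": "user", "content": prompt}]
    reply = model(history)
    while "tool_call" in reply:
        call = reply["tool_call"]
        # Execute the requested tool and feed the result back as context.
        result = tools[call["tool_name"]](**call["arguments"])
        history.append({"role": "tool", "content": result})
        reply = model(history)
    history.append({"role": "assistant", "content": reply["content"]})
    return reply["content"], history
```

Keeping `history` as an explicit list is what makes the conversation persistent across tool calls: each tool result becomes context for the model's next pass.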
Installation
Run the wrapper in API mode:

python ollama_wrapper.py api

Security Notes
- Network exposure: the wrapper's FastAPI instance binds to 0.0.0.0:8000 by default, making it reachable from any network interface. Exposing it to the public internet without firewall rules or access control is a security risk.
- Command execution: FastMCP servers configured with STDIO transport use `subprocess.run` to launch commands (e.g., `uv run --with fastmcp ...`). The default commands are safe, but if the configuration file is writable by an attacker, this mechanism could be abused for remote code execution.
- LLM-driven tool calls: `tool_name` and `arguments` are derived from the model's output, so a hallucinated or manipulated response could trigger arbitrary tool execution or data manipulation if server-side validation and sanitization are insufficient.
- Secrets: the project appropriately keeps API tokens in `.toml` files that are designed to be git-ignored, preventing hardcoded secrets.
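One mitigation for the tool-call risk above is to validate model-proposed calls against an allowlist before executing them. The sketch below is illustrative, not the project's implementation; the registry contents and function names are assumptions.

```python
# Sketch: server-side validation of LLM-proposed tool calls against a
# known registry. Tool names and argument sets here are hypothetical.
ALLOWED_TOOLS = {
    "search_docs": {"query"},
    "read_file": {"path"},
}

def validate_tool_call(tool_name: str, arguments: dict) -> bool:
    """Reject unknown tool names and unexpected argument keys."""
    expected = ALLOWED_TOOLS.get(tool_name)
    if expected is None:
        return False  # tool not registered: likely hallucinated
    return set(arguments) <= expected  # no extra, unexpected arguments
```

Checking argument keys as well as the tool name blocks a model from smuggling extra parameters (e.g., a shell flag) into an otherwise legitimate tool call.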
Similar Servers
mcpo
Exposes Model Context Protocol (MCP) tools as OpenAPI-compatible HTTP servers for integration with LLM agents and other applications.
mcp-language-server
Proxies a Language Server Protocol (LSP) server to provide semantic code intelligence tools to Model Context Protocol (MCP) clients, enabling LLMs to interact with codebases.
mcp-client-for-ollama
An interactive Python client for connecting local Ollama LLMs to Model Context Protocol (MCP) servers, enabling advanced tool use and workflow automation.
modular-mcp
A proxy server that efficiently manages and loads large tool collections from multiple Model Context Protocol (MCP) servers on-demand for LLMs, reducing context overhead.