mcp-server
Verified Safe · by izardy
Overview
Serves a local Ollama Large Language Model (LLM) via the Model Context Protocol (MCP) over standard I/O, enabling client applications to invoke the LLM as a tool.
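Over stdio, MCP clients and servers exchange JSON-RPC 2.0 messages. Below is a minimal sketch of the request a client might send to invoke the LLM tool; the tool name `ask_ollama` and its `prompt` argument are illustrative assumptions, not taken from this server's actual schema.

```python
import json

def build_tool_call(request_id, tool_name, arguments):
    """Build a JSON-RPC 2.0 'tools/call' request as used by MCP over stdio.

    The tool name and argument keys here are assumptions for illustration;
    the real schema is defined by the server's tool registration.
    """
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {
            "name": tool_name,
            "arguments": arguments,
        },
    })

# Example: ask the (hypothetical) ask_ollama tool a question.
message = build_tool_call(1, "ask_ollama", {"prompt": "Why is the sky blue?"})
```

In practice the client writes this message to the server's stdin and reads the JSON-RPC response from its stdout.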
Installation
python ollama-mcp-server.py
Security Notes
The server code itself contains no obvious vulnerabilities such as eval calls or hardcoded secrets. It relies on a locally running Ollama instance, so its security posture depends largely on the security of that local Ollama setup. Input prompts are passed directly to Ollama, which is the expected behavior for an LLM tool.
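Internally, a server like this would forward each prompt to the local Ollama HTTP API (by default on port 11434). The sketch below shows how such a request might be assembled using only the standard library; the model name is an assumption, and `/api/generate` with `"stream": False` is Ollama's non-streaming completion endpoint.

```python
import json
import urllib.request

# Ollama's default local endpoint for single-shot completions.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_ollama_request(prompt, model="llama3"):
    """Assemble (but do not send) a POST request to Ollama's generate endpoint.

    The model name is an assumption; "stream": False asks Ollama to return
    the full completion as a single JSON response.
    """
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,
    }).encode("utf-8")
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Actually sending the request requires a running Ollama instance:
# with urllib.request.urlopen(build_ollama_request("hello")) as resp:
#     print(json.loads(resp.read())["response"])
```

Because the prompt is embedded verbatim in the payload, any input sanitization has to happen before this point, which is why the security posture depends on the local Ollama setup.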
Similar Servers
fastmcp
FastMCP is an ergonomic interface for the Model Context Protocol (MCP), providing a comprehensive framework for building and interacting with AI agents, tools, resources, and prompts across various transports and authentication methods.
mcp-client-for-ollama
An interactive terminal client for connecting local Ollama LLMs to Model Context Protocol (MCP) servers, enabling advanced tool use and workflow automation for local LLMs.
zeromcp
A minimal, pure Python Model Context Protocol (MCP) server for exposing tools, resources, and prompts via HTTP/SSE and Stdio transports.
ollama-fastmcp-wrapper
A proxy service that bridges Ollama with FastMCP, enabling local LLM tool-augmented reasoning by exposing MCP servers' functionality to Ollama models.