mcp-server

Verified Safe

by izardy

Overview

Serves a local Ollama Large Language Model (LLM) over the Model Context Protocol (MCP) using standard I/O transport, enabling client applications to invoke the LLM as a tool.
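The server's source is not shown on this page, but the pattern it describes can be sketched as follows. This is a minimal illustration, not the actual implementation: the function names are invented here, and it assumes Ollama's standard local HTTP API (`/api/generate` on port 11434).

```python
import json
import urllib.request

# Ollama's default local endpoint (assumption: a stock local install).
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_payload(model: str, prompt: str) -> dict:
    """Build a non-streaming request body for Ollama's /api/generate."""
    return {"model": model, "prompt": prompt, "stream": False}


def ask_ollama(prompt: str, model: str = "llama3") -> str:
    """Forward the prompt to the local Ollama instance and return its reply."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(model, prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


# To expose this as an MCP tool over standard I/O with the official
# `mcp` Python SDK, the server would register the function roughly like:
#
#     from mcp.server.fastmcp import FastMCP
#     server = FastMCP("ollama-mcp-server")
#     server.tool()(ask_ollama)
#     server.run(transport="stdio")
```

As the Security Notes below observe, the prompt is passed through to Ollama unmodified, so the trust boundary sits at the local Ollama instance itself.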

Installation

Run Command
python ollama-mcp-server.py
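MCP clients that spawn stdio servers usually reference this run command in their configuration. As an illustrative example (the entry name and file path are assumptions), a Claude Desktop style config entry might look like:

```json
{
  "mcpServers": {
    "ollama": {
      "command": "python",
      "args": ["ollama-mcp-server.py"]
    }
  }
}
```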

Security Notes

The server code itself does not contain obvious vulnerabilities such as calls to eval or hardcoded secrets. It relies on a locally running Ollama instance, so its security posture largely depends on how that local Ollama setup is secured. Input prompts are passed directly to Ollama, which is expected behavior for an LLM tool.

Stats

Interest Score: 0
Security Score: 9
Cost Class: Medium
Avg Tokens: 100
Stars: 0
Forks: 0
Last Update: 2025-11-28

Tags

MCP, Ollama, LLM, Python, AI