
ollama-mcp-server

Verified Safe

by paolodalprato

Overview

Provides a self-contained Model Context Protocol (MCP) server for local Ollama management, enabling features like listing models, chatting, server control, and intelligent model recommendations.

Installation

Run Command
`ollama-mcp-server`

Environment Variables

  • OLLAMA_HOST
  • OLLAMA_PORT
  • OLLAMA_TIMEOUT
  • HARDWARE_ENABLE_GPU_DETECTION
  • HARDWARE_GPU_MEMORY_FRACTION
  • HARDWARE_ENABLE_CPU_FALLBACK
  • HARDWARE_MEMORY_THRESHOLD_GB
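A minimal sketch of how a server like this might read the variables above. The helper name `load_config` and every default value (e.g. port 11434, Ollama's conventional default) are assumptions for illustration, not taken from this server's source:

```python
import os

def load_config(env=os.environ):
    """Read the server's environment variables.

    All default values below are illustrative assumptions, not
    the server's documented defaults.
    """
    return {
        "host": env.get("OLLAMA_HOST", "localhost"),
        "port": int(env.get("OLLAMA_PORT", "11434")),
        "timeout": float(env.get("OLLAMA_TIMEOUT", "30")),
        "gpu_detection": env.get(
            "HARDWARE_ENABLE_GPU_DETECTION", "true").lower() == "true",
        "gpu_memory_fraction": float(
            env.get("HARDWARE_GPU_MEMORY_FRACTION", "0.9")),
        "cpu_fallback": env.get(
            "HARDWARE_ENABLE_CPU_FALLBACK", "true").lower() == "true",
        "memory_threshold_gb": float(
            env.get("HARDWARE_MEMORY_THRESHOLD_GB", "8")),
    }
```

Passing a mapping explicitly (e.g. `load_config({"OLLAMA_PORT": "8080"})`) makes the parsing easy to test without touching the real environment.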

Security Notes

The server uses `subprocess.run` and `subprocess.Popen` to interact with the local `ollama` command-line tool and other system utilities (`nvidia-smi`, `rocm-smi`, `lspci`, `sysctl`). While the command arguments appear to be well-controlled and do not directly expose arbitrary command injection from raw user input, executing external binaries always carries an inherent risk. The primary security consideration is the integrity and security of the locally installed `ollama` executable and other system tools.
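The pattern described above, passing `subprocess.run` a fixed argument list rather than a shell string, can be sketched as follows. The helper names `build_args` and `run_ollama` are hypothetical, not the server's actual functions:

```python
import subprocess
from shutil import which

def build_args(subcommand: str, *extra: str) -> list[str]:
    # The argument vector is assembled as a list, never concatenated
    # into a shell string, so a model name containing shell
    # metacharacters is passed through as a single literal argument.
    return ["ollama", subcommand, *extra]

def run_ollama(subcommand: str, *extra: str, timeout: float = 30.0) -> str:
    """Invoke the local `ollama` CLI with a controlled argument list."""
    if which("ollama") is None:
        raise FileNotFoundError("ollama executable not found on PATH")
    # shell=False is the default: no shell ever interprets the arguments,
    # which rules out classic command injection. The residual risk is the
    # integrity of the `ollama` binary itself, as noted above.
    result = subprocess.run(
        build_args(subcommand, *extra),
        capture_output=True, text=True, timeout=timeout, check=True,
    )
    return result.stdout
```

For example, `run_ollama("list")` would enumerate installed models; even a hostile input like `"llama3; rm -rf /"` reaches `ollama` as one inert argument.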


Stats

Interest Score: 23
Security Score: 8
Cost Class: Low
Stars: 2
Forks: 1
Last Update: 2026-01-02

Tags

Ollama, MCP Server, Local AI, Model Management, Cross-Platform