
vision-mcp-server

Verified Safe

by hiroki-yokoyama

Overview

A Model Context Protocol (MCP) server for local, CPU-based vision-language model inference using GGUF models via llama-cpp-python. It is designed to run as a resident process on Windows and analyze images.

Installation

Run Command
scripts\run_server.ps1

Environment Variables

  • HF_ENDPOINT
  • HF_TOKEN
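These variables configure access to the Hugging Face Hub for model downloads. As a minimal sketch of how a server like this might consume them, assuming the function name and the default endpoint URL (both are illustrative, not taken from the server's code):

```python
import os

def hf_settings() -> dict:
    """Read Hugging Face configuration from the environment.

    HF_ENDPOINT overrides the Hub URL (useful behind a mirror or proxy);
    HF_TOKEN authorizes access to gated or private models.
    """
    return {
        # Assumed default: the public Hugging Face Hub.
        "endpoint": os.environ.get("HF_ENDPOINT", "https://huggingface.co"),
        # None means anonymous access; public models still download fine.
        "token": os.environ.get("HF_TOKEN"),
    }
```

Both variables are optional for public models; HF_TOKEN is only needed for gated repositories.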

Security Notes

The server operates primarily on local file paths for images and models, relying on PIL for image processing and llama-cpp-python for inference. No direct `eval` or execution of arbitrary code from user input was found, and dynamic loading of chat handlers goes through a controlled dictionary, which mitigates injection risks. The main remaining risks, inherent to the use case, are running a potentially malicious GGUF model or feeding the server large or malformed image files.
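The "controlled dictionary" pattern mentioned above can be sketched as follows: handler classes are resolved from a fixed whitelist rather than imported by name from user input. The keys and class names here are illustrative assumptions, not the server's actual table:

```python
# Whitelist mapping user-facing model keys to handler class names.
# Illustrative entries only; the real server's table may differ.
ALLOWED_CHAT_HANDLERS = {
    "llava-1.5": "Llava15ChatHandler",
    "moondream2": "MoondreamChatHandler",
}

def resolve_chat_handler(name: str) -> str:
    """Return the handler class name for a whitelisted key.

    Raises ValueError for anything outside the whitelist, so user
    input can never trigger an arbitrary import or attribute lookup.
    """
    if name not in ALLOWED_CHAT_HANDLERS:
        raise ValueError(f"unsupported chat handler: {name!r}")
    return ALLOWED_CHAT_HANDLERS[name]
```

Because lookup keys are checked against a closed set, a malicious model name fails fast instead of reaching any import machinery.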


Stats

  • Interest Score: 0
  • Security Score: 9
  • Cost Class: Low
  • Avg Tokens: 256
  • Stars: 0
  • Forks: 0
  • Last Update: 2025-11-20

Tags

  • Vision LLM
  • MCP
  • llama-cpp-python
  • Local Inference
  • Windows Server