
windsurf-llama-cpp-mcp-bridge

Verified Safe

by waqasm86

Overview

Provides a Model Context Protocol (MCP) interface for local GGUF Large Language Models, enabling tool-based interaction.

Installation

Run Command
python server.py

Environment Variables

  • LLAMA_COMPLETION_URL
  • LLAMA_TIMEOUT_S
  • MODEL_PATH
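A typical local launch might export these variables before starting the server. The values below are illustrative assumptions, not documented defaults of this project:

```shell
# Point the bridge at a local llama.cpp completion endpoint (illustrative values).
export LLAMA_COMPLETION_URL="http://127.0.0.1:8080/completion"
export LLAMA_TIMEOUT_S=60
export MODEL_PATH="$HOME/models/example-model.Q4_K_M.gguf"

python server.py
```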

Security Notes

The primary FastAPI server binds to localhost (127.0.0.1) by default, limiting network exposure. The 'server_stdio.py' component, if run directly as an HTTP server, would bind to all network interfaces (0.0.0.0) and could be exposed; however, it appears intended for subprocess management via stdio. Both servers make HTTP requests to a configurable LLAMA_COMPLETION_URL; ensure this URL points to a trusted local llama.cpp instance to prevent data leakage or external compromise. No 'eval', obfuscation, or hardcoded secrets were found. Basic input validation for numeric parameters is present.
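As a hedged sketch of the configuration handling described above: the server reads LLAMA_COMPLETION_URL, LLAMA_TIMEOUT_S, and MODEL_PATH from the environment and applies basic numeric validation. The function name, default URL, and fallback timeout below are assumptions for illustration, not the project's actual code:

```python
import os

# Assumed default: a llama.cpp server on localhost, matching the
# listing's advice that the URL should point to a trusted local instance.
DEFAULT_COMPLETION_URL = "http://127.0.0.1:8080/completion"
DEFAULT_TIMEOUT_S = 60.0


def load_config(env=None):
    """Hypothetical helper: read and validate the listed env vars."""
    if env is None:
        env = os.environ
    url = env.get("LLAMA_COMPLETION_URL", DEFAULT_COMPLETION_URL)

    # Basic numeric validation, as the security notes mention: fall back
    # to a sane default if LLAMA_TIMEOUT_S is missing or non-numeric.
    try:
        timeout_s = float(env.get("LLAMA_TIMEOUT_S", DEFAULT_TIMEOUT_S))
    except (TypeError, ValueError):
        timeout_s = DEFAULT_TIMEOUT_S
    if timeout_s <= 0:
        timeout_s = DEFAULT_TIMEOUT_S

    model_path = env.get("MODEL_PATH")  # path to the local GGUF model file
    return {"url": url, "timeout_s": timeout_s, "model_path": model_path}


config = load_config({"LLAMA_TIMEOUT_S": "30", "MODEL_PATH": "/models/demo.gguf"})
print(config["timeout_s"])  # 30.0
```

Validating the timeout before use keeps a malformed environment from crashing the bridge at request time, which is consistent with the listing's note that numeric parameters are checked.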

Stats

Interest Score: 0
Security Score: 8
Cost Class: Medium
Avg Tokens: 256
Stars: 0
Forks: 0
Last Update: 2025-12-13

Tags

MCP, LLM, GGUF, FastAPI, Local Inference