windsurf-llama-cpp-mcp-bridge
Verified Safe by waqasm86
Overview
Provides a Model Context Protocol (MCP) interface for local GGUF Large Language Models, enabling tool-based interaction.
Installation
python server.py
Environment Variables
- LLAMA_COMPLETION_URL
- LLAMA_TIMEOUT_S
- MODEL_PATH
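A minimal sketch of how a server like this might read the variables above at startup. The variable names come from the list; the default values and the `float` conversion for the timeout are assumptions for illustration, not taken from the actual server.py:

```python
import os

# Defaults below are hypothetical -- the real server may use different ones.
LLAMA_COMPLETION_URL = os.getenv(
    "LLAMA_COMPLETION_URL", "http://127.0.0.1:8080/completion"
)
LLAMA_TIMEOUT_S = float(os.getenv("LLAMA_TIMEOUT_S", "120"))
MODEL_PATH = os.getenv("MODEL_PATH", "")  # path to the local GGUF model file

print(LLAMA_COMPLETION_URL)
print(LLAMA_TIMEOUT_S)
```

Reading configuration through `os.getenv` with explicit defaults keeps the server runnable out of the box while letting each deployment override the endpoint, timeout, and model location.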
Security Notes
The primary FastAPI server binds to localhost (127.0.0.1) by default, limiting network exposure. The server_stdio.py component, if run directly as an HTTP server, would bind to all network interfaces (0.0.0.0), potentially exposing it; however, it appears intended for subprocess management via stdio.

Both servers make HTTP requests to a configurable LLAMA_COMPLETION_URL; ensuring this URL points to a trusted local llama.cpp instance is crucial to prevent data leakage or external compromise.

No eval, obfuscation, or hardcoded secrets were found. Basic input validation for numeric parameters is present.
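One way to act on the "trusted local instance" guidance is to validate the configured URL before the server starts forwarding requests to it. The helper below is an illustrative sketch (the function name and the set of accepted hosts are assumptions, not part of the project):

```python
from urllib.parse import urlparse

# Loopback hosts we treat as "local"; adjust per deployment (assumption).
LOCAL_HOSTS = {"127.0.0.1", "localhost", "::1"}

def is_local_completion_url(url: str) -> bool:
    """Return True only if the URL targets the loopback interface over HTTP(S)."""
    parsed = urlparse(url)
    return parsed.scheme in {"http", "https"} and parsed.hostname in LOCAL_HOSTS

print(is_local_completion_url("http://127.0.0.1:8080/completion"))  # True
print(is_local_completion_url("http://example.com/completion"))     # False
```

A startup check like this turns a silent misconfiguration (pointing the bridge at an untrusted remote endpoint) into an immediate, visible failure.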
Similar Servers
fastapi_mcp
Automatically converts FastAPI endpoints into Model Context Protocol (MCP) tools for seamless integration with LLM agents.
mcp-use-cli
An interactive command-line interface (CLI) tool for connecting to and interacting with Model Context Protocol (MCP) servers using natural language, acting as an AI client that orchestrates LLM responses with external tools.
fabric_mcp
Provides a Model Context Protocol (MCP) server to expose FABRIC Testbed API and inventory queries to LLM clients.
MCP-Server
A server for exposing local tool APIs via the Model Context Protocol (MCP) to be consumed by AI/ML clients or agents.