hf-mcp-server
Verified Safe by huggingface
Overview
The Hugging Face MCP Server acts as a universal adapter, allowing various LLM clients (like Claude, Gemini, VSCode, Cursor) to interact with the Hugging Face Hub, Gradio applications, and other Hugging Face services through a standardized Model Context Protocol (MCP) interface.
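As a concrete illustration of how a client connects, here is a minimal sketch of the JSON configuration most MCP clients (e.g. Claude Desktop) use to register a stdio server. The server name key is arbitrary, and the exact config file location and transport behavior depend on the client; consult the client's documentation.

```json
{
  "mcpServers": {
    "hf-mcp-server": {
      "command": "npx",
      "args": ["@llmindset/hf-mcp-server"]
    }
  }
}
```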
Installation
npx @llmindset/hf-mcp-server
Environment Variables
- TRANSPORT
- PORT
- JSON_MODE
- DEFAULT_HF_TOKEN
- HF_TOKEN
- HF_API_TIMEOUT
- USER_CONFIG_API
- MCP_STRICT_COMPLIANCE
- AUTHENTICATE_TOOL
- SEARCH_ENABLES_FETCH
- GRADIO_DISCOVERY_CONCURRENCY
- GRADIO_SPACE_INFO_TIMEOUT
- GRADIO_SCHEMA_TIMEOUT
- GRADIO_SPACE_CACHE_TTL
- GRADIO_SCHEMA_CACHE_TTL
- HSTS
- CORS_ALLOWED_ORIGINS
- ANALYTICS_MODE
- TEMPLOG_MAX
- LOG_QUERY_EVENTS
- LOG_SYSTEM_EVENTS
- LOGGING_DATASET_ID
- LOGGING_HF_TOKEN
- LOG_LEVEL
- DYNAMIC_SPACE_DATA
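A sketch of launching the server with some of the variables above set inline. The specific values shown (transport name, port, token placeholder) are illustrative assumptions, not documented defaults; check the project README for the accepted values of each variable.

```shell
# Illustrative only: variable names come from the list above,
# but the values are assumptions (e.g. the accepted TRANSPORT
# values are not specified here).
TRANSPORT=streamableHttp \
PORT=3000 \
HF_TOKEN="hf_your_token_here" \
LOG_LEVEL=info \
npx @llmindset/hf-mcp-server
```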
Security Notes
The server's design focuses on proxying and facilitating interactions with external Hugging Face APIs and Gradio Spaces. It handles authentication via Hugging Face tokens, which can be provided in the Authorization header or as a `DEFAULT_HF_TOKEN` environment variable. The `start.sh` script explicitly warns about the security implications of `DEFAULT_HF_TOKEN`, indicating awareness of this risk.

Extensive network interactions occur with `huggingface.co` and `*.hf.space`. CORS is configured with a default allowlist but can be overridden via environment variables. Input validation for tool calls is performed with `zod` schemas, mitigating common injection risks.

While the server constructs commands for remote job execution (e.g., `uv run` commands for the Hugging Face Jobs API), it does not execute these commands locally; that security boundary shifts to the remote Hugging Face Jobs platform. No direct instances of `eval()` for local server execution were found.

The primary security considerations for operators are secure management of Hugging Face API tokens and careful configuration of environment variables, especially `DEFAULT_HF_TOKEN` and `CORS_ALLOWED_ORIGINS`.
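The per-request token handling described above can be sketched with a plain HTTP call. This assumes the server is running an HTTP transport on port 3000 and that it accepts JSON-RPC at an `/mcp` path; both the port and the path are assumptions for illustration, not documented facts.

```shell
# Sketch: pass a Hugging Face token per request via the
# Authorization header instead of baking it into DEFAULT_HF_TOKEN.
# Endpoint path and port are assumptions; adjust to your deployment.
curl -s http://localhost:3000/mcp \
  -H "Authorization: Bearer $HF_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"jsonrpc":"2.0","id":1,"method":"tools/list"}'
```

Keeping the token in the request (rather than in `DEFAULT_HF_TOKEN`) scopes credentials to the caller, which is the safer pattern the `start.sh` warning points toward.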
Similar Servers
tmcp
A server implementation for the Model Context Protocol (MCP) to enable LLMs to access external context and tools.
mcp-rubber-duck
An MCP (Model Context Protocol) server that acts as a bridge to query multiple OpenAI-compatible LLMs, enabling multi-agent AI workflows and providing an AI 'rubber duck' debugging panel.
mcp-servers
An MCP server for managing files in Google Cloud Storage, supporting CRUD operations (save, get, search, delete) and exposing files as resources.
compound-mcp-server
Provides a Model Context Protocol (MCP) server for interacting with Groq models, including compound/meta models, exposing tools for real-time information retrieval and code execution via the Groq API.