hf-mcp-server
by huggingface
Overview
Connects LLMs to the Hugging Face Hub and Gradio AI applications, enabling access to models, datasets, documentation, and job management.
Installation
npx @llmindset/hf-mcp-server
Environment Variables
- DEFAULT_HF_TOKEN
- TRANSPORT
- USER_CONFIG_API
- LOGGING_DATASET_ID
- LOGGING_HF_TOKEN
- HF_API_TIMEOUT
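As an illustrative sketch of how these variables fit together, the server could be launched like this (all values below are placeholders and the `stdio` transport name and millisecond timeout unit are assumptions, not confirmed by this listing — check the project README):

```shell
# Illustrative launch only; token values are placeholders, never commit real tokens.
export DEFAULT_HF_TOKEN="hf_xxxxxxxxxxxx"   # fallback token -- see Security Notes below
export TRANSPORT="stdio"                    # assumed transport value
export HF_API_TIMEOUT="30000"               # assumed to be milliseconds
npx @llmindset/hf-mcp-server
```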
Security Notes
The server uses `child_process.spawn` and `shell-quote` for the `hf_jobs` tool, which can execute arbitrary commands on the host. Inputs are validated with Zod, but direct shell execution of user-supplied commands remains an inherently high-risk area. The README explicitly warns about `DEFAULT_HF_TOKEN`, yet its use as a fallback credential for unauthenticated requests still carries risk. All external network calls go to trusted Hugging Face domains, which mitigates certain network-level risks.
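The validate-then-spawn pattern described above can be sketched as follows. This is a minimal illustration of the general technique, not the server's actual code: the allowlist, regex, and `buildArgs` helper are hypothetical, and `echo` stands in for the real CLI being wrapped.

```typescript
import { spawnSync } from "node:child_process";

// Hypothetical allowlist of permitted subcommands (illustrative only).
const ALLOWED_SUBCOMMANDS = new Set(["jobs", "whoami"]);

// Validate user-supplied input before it reaches any process invocation.
function buildArgs(subcommand: string, rest: string[]): string[] {
  if (!ALLOWED_SUBCOMMANDS.has(subcommand)) {
    throw new Error(`subcommand not allowed: ${subcommand}`);
  }
  // Reject shell metacharacters outright rather than trying to quote them.
  for (const arg of rest) {
    if (/[;&|<>`$(){}!\\"']/.test(arg)) {
      throw new Error(`unsafe argument: ${arg}`);
    }
  }
  return [subcommand, ...rest];
}

// With an argv array and shell: false (the default), arguments are passed
// to the child process verbatim -- nothing is re-parsed by a shell.
const args = buildArgs("jobs", ["ps"]);
const result = spawnSync("echo", args, { encoding: "utf8" });
console.log(result.stdout.trim()); // "jobs ps"
```

The key design choice is passing an argument array with `shell: false`, so validation failures are the only barrier a hostile string has to cross; there is no second parsing step where quoting mistakes could reintroduce injection.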
Similar Servers
mcphub
MCPHub acts as a centralized gateway for managing and orchestrating multiple Model Context Protocol (MCP) servers and OpenAPI-compatible services. It provides a unified API, OAuth 2.0 authorization, user management, and AI-powered "smart routing" for dynamic tool discovery and invocation.
tmcp
Build Model Context Protocol (MCP) servers for AI agents to interact with external tools and data sources, enabling LLMs to access context and perform actions.
mcp-rubber-duck
An MCP server acting as a bridge to query and orchestrate multiple OpenAI-compatible LLMs for rubber duck debugging and multi-agent operations.
mcp-servers
An MCP Server for robust web content fetching, anti-bot bypassing, intelligent caching, and LLM-powered information extraction from the open internet, designed for agent-building frameworks and MCP clients.