
fluidmcp

Verified Safe

by Fluid-AI

Overview

Orchestrates Model Context Protocol (MCP) servers and LLM inference engines (like vLLM) via a unified FastAPI gateway, enabling dynamic management, tool invocation, and multi-model LLM serving.

Installation

Run Command
fluidmcp run examples/vllm-config.json --file --start-server
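The referenced `examples/vllm-config.json` bundles MCP server definitions with vLLM model settings. The exact schema is not shown on this page; the fragment below is an illustrative sketch only, and every field name in it is an assumption rather than the actual fluidmcp schema:

```json
{
  "mcpServers": {
    "fetch": {
      "command": "uvx",
      "args": ["mcp-server-fetch"]
    }
  },
  "models": [
    {
      "name": "meta-llama/Llama-3.1-8B-Instruct",
      "engine": "vllm"
    }
  ]
}
```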

Environment Variables

  • HUGGING_FACE_HUB_TOKEN
  • S3_BUCKET_NAME
  • S3_ACCESS_KEY
  • S3_SECRET_KEY
  • S3_REGION
  • MCP_FETCH_URL
  • MCP_TOKEN
  • FMCP_BEARER_TOKEN
  • FMCP_SECURE_MODE
  • FMCP_GITHUB_TOKEN
  • GITHUB_TOKEN
  • FMCP_MONGODB_SERVER_TIMEOUT
  • FMCP_MONGODB_CONNECT_TIMEOUT
  • FMCP_MONGODB_SOCKET_TIMEOUT
  • FMCP_MONGODB_ALLOW_INVALID_CERTS
  • LLM_STREAMING_TIMEOUT
  • MCP_PORT_RELEASE_TIMEOUT
  • FMCP_ALLOWED_COMMANDS
  • MCP_CLIENT_SERVER_PORT
  • MCP_CLIENT_SERVER_ALL_PORT
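As a minimal sketch, a few of the core variables might be exported before launching the gateway. The values below are placeholders, and the per-variable comments are purposes inferred from the variable names, not confirmed documentation:

```shell
# Placeholder values – substitute real credentials before running.
export HUGGING_FACE_HUB_TOKEN="hf_xxx"          # presumably for model downloads from the HF Hub
export FMCP_SECURE_MODE="true"                  # presumably enables bearer-token auth
export FMCP_BEARER_TOKEN="change-me"            # token checked by the management API
export FMCP_ALLOWED_COMMANDS="python,node,npx"  # presumably the command whitelist
```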

Security Notes

The server includes robust validation against command injection (e.g., whitelisting executable commands and stripping dangerous shell patterns from arguments) and against MongoDB injection (input sanitization). It supports configurable bearer-token authentication for its management API and emits explicit warnings for insecure CORS settings. While running external processes inherently carries some risk, the input validation and whitelisting significantly mitigate common vulnerabilities.
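The whitelist-plus-pattern-check approach described above can be sketched as follows. This is a hypothetical illustration, not fluidmcp's actual validation code; the command set and pattern list are assumptions (cf. the `FMCP_ALLOWED_COMMANDS` variable):

```python
# Illustrative sketch of command-whitelist validation; the real
# fluidmcp logic may differ in both the allowed set and the checks.
ALLOWED_COMMANDS = {"python", "node", "npx", "uvx"}
DANGEROUS_PATTERNS = (";", "&&", "||", "|", "`", "$(", ">", "<")

def validate_command(command: str, args: list[str]) -> list[str]:
    """Reject non-whitelisted commands and shell-injection patterns in args."""
    if command not in ALLOWED_COMMANDS:
        raise ValueError(f"command not allowed: {command}")
    for arg in args:
        if any(p in arg for p in DANGEROUS_PATTERNS):
            raise ValueError(f"dangerous pattern in argument: {arg!r}")
    # Returning a list (not a joined string) keeps the invocation
    # suitable for subprocess calls with shell=False.
    return [command, *args]

print(validate_command("python", ["-m", "some_mcp_server"]))
```

Passing the validated list to `subprocess.run(..., shell=False)` avoids shell interpretation entirely, which is why the pattern check is a defense-in-depth measure rather than the only barrier.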

Stats

Interest Score: 35
Security Score: 9
Cost Class: Low
Stars: 6
Forks: 6
Last Update: 2026-01-19

Tags

MCP, LLM, FastAPI, Orchestration, AI, Developer Tools