llama-stack-mcp-server
by rh-ai-quickstart
Overview
Provides a comprehensive REST API for HR operations, designed to be integrated as a tool for AI agents via the Model Context Protocol (MCP).
Installation
npm start
Environment Variables
- PORT
- NODE_ENV
- ALLOWED_ORIGINS
- ENABLE_SWAGGER
- ENABLE_RATE_LIMITING
- HR_API_BASE_URL
- HR_API_KEY
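A minimal sketch of how these variables might be set before starting the services; the values shown are illustrative placeholders (port, origin, base URL, and key are assumptions, not defaults documented in this repository):
export PORT=3000
export NODE_ENV=production
export ALLOWED_ORIGINS=https://hr-frontend.example.com
export ENABLE_SWAGGER=false
export ENABLE_RATE_LIMITING=true
export HR_API_BASE_URL=http://localhost:3000
export HR_API_KEY=replace-with-a-real-secret
npm start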
Security Notes
The custom MCP server (Python) and the HR Enterprise API (Node.js) rely on an `X-API-Key` header for authentication. Critically, `custom-mcp-server/server.py` hardcodes `HR_API_KEY="hr-api-default-key"` as a default, which is a major security vulnerability for any non-trivial use. The HR API also defaults `ALLOWED_ORIGINS` to `*` for CORS, which is insecure in production. Swagger UI is exposed by default and can leak the API's structure if left unsecured, and the `/health` endpoint of the custom MCP server reveals the internal `HR_API_BASE_URL`.
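As a hedged illustration of the `X-API-Key` scheme described above, the sketch below shows a Python client reading the key and base URL from the environment (rather than relying on the insecure hardcoded default) and sending the key with each request; the endpoint path /api/employees is hypothetical, not an endpoint documented here.
import os
import requests

# Read configuration from the environment; fail fast rather than fall back to the insecure default key.
base_url = os.environ["HR_API_BASE_URL"]
api_key = os.environ["HR_API_KEY"]

response = requests.get(
    f"{base_url}/api/employees",      # hypothetical HR endpoint, for illustration only
    headers={"X-API-Key": api_key},   # authentication header expected by the HR API
    timeout=10,
)
response.raise_for_status()
print(response.json())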
Similar Servers
mcpo
Exposes Model Context Protocol (MCP) tools as OpenAPI-compatible HTTP servers.
kubernetes-mcp-server
Provides a Model Context Protocol (MCP) interface for AI agents to interact with and manage Kubernetes and OpenShift clusters.
mcp-client-for-ollama
An interactive terminal client for connecting local Ollama LLMs to Model Context Protocol (MCP) servers, enabling advanced tool use and workflow automation.
mcp-servers
An MCP Server for robust web content fetching, anti-bot bypassing, intelligent caching, and LLM-powered information extraction from the open internet, designed for agent-building frameworks and MCP clients.