# llama-stack-mcp-server
Verified Safe by rh-ai-quickstart
## Overview
Integrates HR operations (employee, vacation, job, performance management) into a Llama Stack AI agent as a custom Model Context Protocol (MCP) tool.
## Installation

```shell
cd custom-mcp-server && python server.py
```

## Environment Variables
- HR_API_BASE_URL
- HR_API_KEY
- PORT
- NODE_ENV
- ALLOWED_ORIGINS
- ENABLE_SWAGGER
- ENABLE_RATE_LIMITING
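How the Python MCP server actually consumes these variables is not shown here, and some of them (e.g. `NODE_ENV`) belong to the Express-based `hr-api` rather than to `server.py`. A minimal sketch, assuming standard `os.environ` lookups; the defaults shown are illustrative, not the repository's actual defaults:

```python
import os

def load_config() -> dict:
    """Read the documented environment variables.

    Defaults here are placeholders for illustration; the real
    server may use different fallbacks (see Security Notes).
    """
    return {
        "hr_api_base_url": os.environ.get("HR_API_BASE_URL", "http://localhost:3000"),
        "hr_api_key": os.environ.get("HR_API_KEY"),  # no baked-in default here
        "port": int(os.environ.get("PORT", "8000")),
        "allowed_origins": os.environ.get("ALLOWED_ORIGINS", "").split(","),
        "enable_swagger": os.environ.get("ENABLE_SWAGGER", "false").lower() == "true",
        "enable_rate_limiting": os.environ.get("ENABLE_RATE_LIMITING", "true").lower() == "true",
    }
```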
## Security Notes
The HR Enterprise API (`hr-api`) implements security headers (`helmet`), rate limiting, and input validation (`express-validator`). However, its default CORS setting of `ALLOWED_ORIGINS=*` is highly insecure for production use with sensitive HR data. The Custom MCP Server falls back to a hardcoded `HR_API_KEY='hr-api-default-key'` when the variable is not set, which is also a security concern for non-demo deployments. All data in the HR API is in-memory and non-persistent, making it unsuitable for real production use with sensitive data, though acceptable for a quickstart demo. There are no signs of `eval`, obfuscation, or overtly malicious patterns.
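One way to avoid the hardcoded-default pitfall noted above is to fail fast when the key is absent outside of demo use. The following is a hedged sketch, not code from the repository; the `require_api_key` helper and the `DEMO_MODE` flag are hypothetical:

```python
import os

def require_api_key() -> str:
    """Return HR_API_KEY, refusing to silently fall back to the
    baked-in demo default unless demo mode is explicitly enabled."""
    key = os.environ.get("HR_API_KEY")
    if key:
        return key
    if os.environ.get("DEMO_MODE") == "1":
        # Acceptable only for the quickstart demo, never for real HR data.
        return "hr-api-default-key"
    raise RuntimeError("HR_API_KEY must be set for non-demo deployments")
```

The point of the pattern is that an unset key becomes a loud startup error rather than a silently insecure deployment.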
## Similar Servers

- **mcpo**: Exposes Model Context Protocol (MCP) tools as OpenAPI-compatible HTTP servers.
- **mcp-client-for-ollama**: An interactive terminal client for connecting local Ollama LLMs to Model Context Protocol (MCP) servers, enabling advanced tool use and workflow automation for local LLMs.
- **mcp-servers**: An MCP server for managing files in Google Cloud Storage, supporting CRUD operations (save, get, search, delete) and exposing files as resources.
- **1xn-vmcp**: An open-source platform for composing, customizing, and extending multiple Model Context Protocol (MCP) servers into a single logical, virtual MCP server, enabling fine-grained context engineering for AI workflows and agents.