mcp-container-ts
Verified Safe by Azure-Samples
Overview
An MCP server providing secure, role-based access to external tools (like a TODO list) for Large Language Models via Streamable HTTP, with built-in observability.
Installation
npm run dev
Environment Variables
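A typical local setup might look like the following (a sketch assuming standard npm conventions; the exact repository URL and scripts should be checked against the project's own README):

```shell
# Clone the repository (URL assumed from the project name)
git clone https://github.com/Azure-Samples/mcp-container-ts.git
cd mcp-container-ts

# Install dependencies, then start the dev server
npm install
npm run dev
```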
- NODE_ENV
- DEBUG
- APPLICATIONINSIGHTS_CONNECTION_STRING
- JWT_SECRET
- JWT_AUDIENCE
- JWT_ISSUER
- JWT_EXPIRY
- JWT_TOKEN
- PORT
- ALLOWED_ORIGINS
- API_KEYS
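The variables above could be supplied via a local `.env` file along these lines (all values are illustrative placeholders, not real defaults from the project):

```ini
# Illustrative .env sketch -- replace every value before use
NODE_ENV=development
DEBUG=true
APPLICATIONINSIGHTS_CONNECTION_STRING=
JWT_SECRET=change-me
JWT_AUDIENCE=mcp-clients
JWT_ISSUER=mcp-container-ts
JWT_EXPIRY=1h
JWT_TOKEN=
PORT=3000
ALLOWED_ORIGINS=http://localhost:3000
API_KEYS=
```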
Security Notes
The server implements robust JWT-based authentication and role-based access control (RBAC) to secure API endpoints and tools. It uses `helmet` for common security headers and `express-rate-limit` to mitigate brute-force attacks. OpenTelemetry is integrated for detailed logging and tracing, which aids in security monitoring. The primary concern is that the `validationMiddleware` for JSON-RPC request body validation is commented out by default in `src/server-middlewares.ts`, potentially allowing malformed requests to bypass initial structural validation. While individual tool handlers use Zod for argument validation, a top-level JSON-RPC validation layer would enhance overall robustness. The `generate-token.ts` script explicitly warns about its demo nature and advises against using generated `.env` tokens directly in production.
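The top-level JSON-RPC structural check described above could be sketched as a type guard like the one below. This is a hypothetical illustration of what such a validation layer might verify, not the actual code from `src/server-middlewares.ts`; the names `JsonRpcRequest` and `isJsonRpcRequest` are invented here.

```typescript
// Minimal structural check for a JSON-RPC 2.0 request body.
// Illustrative sketch only; not the project's actual middleware.
type JsonRpcRequest = {
  jsonrpc: "2.0";
  method: string;
  id?: string | number | null;
  params?: unknown;
};

function isJsonRpcRequest(body: unknown): body is JsonRpcRequest {
  if (typeof body !== "object" || body === null) return false;
  const b = body as Record<string, unknown>;
  return (
    // The spec requires the exact string "2.0"
    b.jsonrpc === "2.0" &&
    // method must be a string naming the procedure to invoke
    typeof b.method === "string" &&
    // id, when present, must be a string, number, or null
    (b.id === undefined ||
      b.id === null ||
      typeof b.id === "string" ||
      typeof b.id === "number")
  );
}
```

A middleware built on such a guard could reject malformed bodies with a JSON-RPC error response before any tool handler (and its Zod argument validation) is reached.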
Similar Servers
mcp-openapi-server
Exposes OpenAPI endpoints as Model Context Protocol (MCP) tools, enabling Large Language Models (LLMs) to discover and interact with REST APIs through a standardized protocol.
opentelemetry-mcp-server
The OpenTelemetry Model Context Protocol (MCP) server enables LLMs to efficiently use the OpenTelemetry stack by providing tools to configure an OpenTelemetry collector and returning strict JSON schemas for collector components to ensure correct configuration.
mcp-http-agent-md
This server provides a Minimal Model Context Protocol (MCP) HTTP server for managing AI agent projects, structured tasks, and versioned history, enabling subagent orchestration and document management.
mcp-typescript-simple
A production-ready MCP (Model Context Protocol) server for building AI agent backends, offering dual-mode operation (STDIO + Streamable HTTP with OAuth), multi-LLM integration, and comprehensive observability.