
AI-Gateway

Verified Safe

by Azure-Samples

Overview

AI-Gateway provides a playground and lab environment for experimenting with the Model Context Protocol (MCP), using Azure API Management to enable plug-and-play AI tools for Large Language Models (LLMs).

Installation

Run Command
uvicorn shared.mcp-servers.weather.http.mcp_server:app --host 0.0.0.0 --port 8080

Environment Variables

  • APIM_GATEWAY_URL
  • SUBSCRIPTION_ID
  • RESOURCE_GROUP_NAME
  • APIM_SERVICE_NAME
  • AZURE_TENANT_ID
  • AZURE_CLIENT_ID
  • POST_LOGIN_REDIRECT_URL
  • APIM_IDENTITY_OBJECT_ID
  • AZURE_MANAGED_IDENTITY_CLIENT_ID
  • HOST
  • PORT
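As a sketch of how a server like this might consume the variables above (the settings class, field selection, and defaults here are illustrative assumptions, not taken from the repository):

```python
import os
from dataclasses import dataclass


@dataclass
class GatewaySettings:
    """Illustrative settings holder; names mirror the environment variables above."""
    apim_gateway_url: str
    azure_tenant_id: str
    host: str
    port: int


def load_settings(env=os.environ) -> GatewaySettings:
    # The HOST/PORT defaults are assumptions chosen to match the run command.
    return GatewaySettings(
        apim_gateway_url=env.get("APIM_GATEWAY_URL", ""),
        azure_tenant_id=env.get("AZURE_TENANT_ID", ""),
        host=env.get("HOST", "0.0.0.0"),
        port=int(env.get("PORT", "8080")),
    )
```

Keeping all Azure AD/APIM configuration in environment variables like this avoids hard-coding tenant or subscription identifiers into the server code.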

Security Notes

The MCP servers themselves (Starlette/FastMCP apps) show no direct code-injection vulnerabilities such as 'eval' or 'subprocess.run' on user input. Sensitive Azure AD/APIM configuration comes from environment variables, and network calls are made with 'httpx' to the configured 'APIM_GATEWAY_URL'. The 'shared/utils.py' module does use 'subprocess.run(command, shell=True)' to execute Azure CLI commands. This function is used only in deployment/cleanup scripts within the Jupyter notebook labs; if it were ever exposed to arbitrary user input in a production server, it would be a critical command-injection vulnerability, but it is not part of the core runtime of the MCP tools. Overall, the security of the solution depends heavily on the correct configuration of Azure API Management, Azure AD, and the underlying cloud resources, which is external to the MCP server code itself but central to the project's purpose.
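To illustrate the 'shell=True' concern in generic terms (the actual helper in 'shared/utils.py' is not reproduced here; this is a hypothetical sketch of the safer pattern):

```python
import shlex
import subprocess


def run_cli(command: str) -> str:
    """Run a CLI command without shell=True.

    shlex.split turns the string into an argument list, so a value like
    'group list; rm -rf /' is passed as literal arguments to the program
    rather than being interpreted by /bin/sh.
    """
    args = shlex.split(command)
    result = subprocess.run(args, capture_output=True, text=True, check=False)
    return result.stdout

# By contrast, the pattern flagged above:
#   subprocess.run(command, shell=True)   # the whole string hits the shell
# is only acceptable when 'command' is fully trusted, as in the notebook labs.
```

The trade-off is that list-based invocation loses shell features (pipes, globbing), which is usually the right default for anything that could ever see untrusted input.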

Stats

Interest Score: 82
Security Score: 7
Cost Class: Medium
Avg Tokens: 150
Stars: 812
Forks: 352
Last Update: 2025-12-02

Tags

Model Context Protocol, AI Gateway, Azure API Management, LLMs, AI Agents