azure-openai-mcp-server
Verified Safe by harshv2013
Overview
A Gradio chatbot that provides an interactive interface to Azure OpenAI, enhanced with read-only file system access and calculator tools.
Installation
Run `python app.py` to start the server.
Environment Variables
- AZURE_OPENAI_API_KEY
- AZURE_OPENAI_API_VERSION
- AZURE_OPENAI_ENDPOINT
- AZURE_OPENAI_DEPLOYMENT
- MAX_TOKENS_PER_REQUEST
- MAX_HISTORY
- FILE_SERVER_PATH
- APP_PORT
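The variables above can be loaded at startup before constructing the Azure OpenAI client. The sketch below shows one way to do this; the variable names come from the list above, but the defaults and the `load_config` helper are illustrative assumptions, not values taken from the project.

```python
import os

def load_config() -> dict:
    """Read the server's configuration from environment variables.

    The four Azure variables are treated as required (no default);
    the numeric defaults here are assumptions for illustration only.
    """
    return {
        "api_key": os.environ["AZURE_OPENAI_API_KEY"],
        "api_version": os.environ["AZURE_OPENAI_API_VERSION"],
        "endpoint": os.environ["AZURE_OPENAI_ENDPOINT"],
        "deployment": os.environ["AZURE_OPENAI_DEPLOYMENT"],
        "max_tokens": int(os.environ.get("MAX_TOKENS_PER_REQUEST", "1024")),
        "max_history": int(os.environ.get("MAX_HISTORY", "20")),
        "file_server_path": os.environ.get("FILE_SERVER_PATH", "test_files"),
        "port": int(os.environ.get("APP_PORT", "7860")),
    }
```

Keeping the key out of the source tree this way means a missing `AZURE_OPENAI_API_KEY` fails fast with a `KeyError` at startup rather than surfacing later as an authentication error.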
Security Notes
The server reads sensitive API keys from environment variables, avoiding hardcoded secrets. File system operations prevent path traversal by resolving requested paths with `pathlib.Path.resolve()` and checking the result against the allowed root via `startswith()`; this is generally adequate for its intended use, though plain string-prefix checks can match sibling directories (e.g. `/files` vs `/files-extra`) unless a trailing separator is enforced. Crucially, although a `write_file` tool is defined in `fs_server.py`, it is NOT exposed to the AI through the `MCPToolRegistry` used by the main application, so file system interaction is limited to read-only operations within the designated `test_files` directory. The AI-driven tool execution pattern relies on the inherent safety of the registered Python functions, which appear well-contained.
Similar Servers
hf-mcp-server
The Hugging Face MCP Server acts as a universal adapter, allowing various LLM clients (like Claude, Gemini, VSCode, Cursor) to interact with the Hugging Face Hub, Gradio applications, and other Hugging Face services through a standardized Model Context Protocol (MCP) interface.
PowerShell.MCP
Enables AI assistants to execute arbitrary PowerShell commands and CLI tools for system automation, development tasks, and data analysis in a persistent, shared console environment.
RAMIE-RAD_AI_Messing_In_Earthworks
A local AI-powered wheeled robot (RAMIE) capable of listening, speaking, and executing commands via a Gradio web interface, designed for local compute and real-time interaction.
tiny_chat
A RAG-enabled chat application that integrates with various LLM backends (OpenAI, Ollama, vLLM) and a Qdrant vector database, offering web search capabilities and an OpenAI-compatible API.