CloudUxMCP
by pandiaaman
Overview
Provides an AI-powered assistant for Avid MediaCentral CTMS, enabling natural language interaction with and browsing of Production Asset Management (PAM) and Media Asset Management (MAM) systems.
Installation
Run the FastAPI backend with uvicorn:
```bash
uvicorn main:app --reload --host 0.0.0.0 --port 8000
```
Environment Variables
- MCS_HOST
- MCS_USER
- MCS_PASSWORD
- BEARER_TOKEN
- OPENAI_API_KEY
- SERVER_HOST
- SERVER_PORT
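These variables are read by the backend at startup. A minimal sketch of how they might be loaded, assuming plain `os.environ` access; the `get_settings` helper and the default values shown are illustrative and not taken from the repository:

```python
import os

def get_settings() -> dict:
    """Collect the configuration documented above (illustrative helper)."""
    return {
        "mcs_host": os.environ["MCS_HOST"],          # MediaCentral host
        "mcs_user": os.environ["MCS_USER"],          # MediaCentral username
        "mcs_password": os.environ["MCS_PASSWORD"],  # MediaCentral password
        "bearer_token": os.getenv("BEARER_TOKEN"),   # optional pre-issued token
        "openai_api_key": os.environ["OPENAI_API_KEY"],
        "server_host": os.getenv("SERVER_HOST", "0.0.0.0"),   # assumed default
        "server_port": int(os.getenv("SERVER_PORT", "8000")), # assumed default
    }
```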
Security Notes
Critical security risks identified:
- Hardcoded MediaCentral credentials (host, username, password, bearer_token) are present in frontend source (`avidmcpui/src/components/LoginScreen.jsx`) and backend test scripts (`avidmcpserver/test_auth.py`, `avidmcpserver/test_complete.py`), making them easy to discover and exploit.
- SSL certificate verification is explicitly disabled (`verify=False`) for the `httpx` clients in `auth_service.py` and `ctms_service.py`. This is unsafe in production and leaves traffic open to man-in-the-middle (MITM) attacks.
- The backend's CORS middleware allows requests from any origin (`allow_origins=["*"]`), which is overly permissive and risky when sensitive data is involved.
- The OpenAI API key is configured via environment variables but still requires careful handling to prevent exposure.
A hardened configuration addressing the TLS and CORS issues is sketched below.
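A minimal sketch of the safer configuration, using the stock `httpx` and FastAPI APIs; the `ALLOWED_ORIGIN` and `MCS_CA_BUNDLE` variables and the middleware placement are illustrative assumptions, not code from the repository:

```python
import os
import httpx
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware

app = FastAPI()

# Restrict CORS to an explicit, configurable origin instead of "*".
allowed_origin = os.getenv("ALLOWED_ORIGIN", "http://localhost:5173")  # hypothetical variable
app.add_middleware(
    CORSMiddleware,
    allow_origins=[allowed_origin],
    allow_credentials=True,
    allow_methods=["GET", "POST"],
    allow_headers=["Authorization", "Content-Type"],
)

# Keep TLS verification on; point httpx at a CA bundle if MediaCentral uses a self-signed cert.
ctms_client = httpx.AsyncClient(
    base_url=f"https://{os.environ['MCS_HOST']}",
    verify=os.getenv("MCS_CA_BUNDLE", True),  # hypothetical: path to a CA file, or True for system trust
)
```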
Similar Servers
gemini-cli-desktop
A cross-platform desktop and web application providing a modern UI for various AI CLIs (Gemini, Qwen, LLxprt), enabling structured interaction with AI models, visual tool confirmation, real-time thought processes, code diff viewing, chat history management, and file system integration.
MediaWiki-MCP-Server
An MCP server that enables Large Language Model (LLM) clients to interact with any MediaWiki wiki.
contentful-mcp-server
Enables AI assistants to manage Contentful content, assets, and content models via natural language commands, facilitating content creation, organization, and workflow automation.
llms
A centralized configuration and documentation management system for LLMs, providing tools for building skills, commands, agents, prompts, and managing MCP servers across multiple LLM providers.