greenroom
Verified Safe · by chrisbrickey
Overview
Provides entertainment recommendations and analysis utilities to agents via a Model Context Protocol (MCP) server, integrating with TMDB and leveraging LLMs.
Installation
uv run greenroom
Environment Variables
- TMDB_API_KEY
- OLLAMA_BASE_URL
Security Notes
The server correctly reads API keys via os.getenv and makes external API calls with httpx, setting explicit timeouts to prevent hanging connections. Pydantic models validate data from TMDB API responses, which helps prevent processing of malformed data. No direct eval or os.system calls are present.

The LLM interaction (ctx.sample and Ollama calls) sends user-provided prompts to external or local LLMs, which carries inherent risks such as prompt injection and the generation of harmful or biased content; this is a general risk of LLM applications rather than a vulnerability specific to this server's implementation. The ctx.sample feature lets the client's LLM process data, which the project itself notes is security-sensitive and requires client support.
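The safeguards described above (environment-sourced keys, request timeouts, response validation) can be sketched with the standard library. The real server uses httpx and Pydantic; this stdlib version shows the same pattern, and the Movie fields are illustrative rather than the server's actual schema.

```python
import json
import os
from dataclasses import dataclass
from urllib.request import Request, urlopen

# Secrets come from the environment, never from source code.
TMDB_API_KEY = os.getenv("TMDB_API_KEY")


@dataclass
class Movie:
    """Illustrative subset of a TMDB movie record."""
    id: int
    title: str


def parse_movie(payload: dict) -> Movie:
    # Validate field types before use, rejecting malformed responses
    # (the role Pydantic models play in the actual server).
    if not isinstance(payload.get("id"), int) or not isinstance(payload.get("title"), str):
        raise ValueError(f"malformed TMDB payload: {payload!r}")
    return Movie(id=payload["id"], title=payload["title"])


def fetch_json(url: str, timeout: float = 10.0) -> dict:
    # An explicit timeout prevents a hung connection from
    # blocking the server indefinitely.
    with urlopen(Request(url), timeout=timeout) as resp:
        return json.loads(resp.read())
```

Rejecting a malformed payload at the boundary keeps untrusted API output from propagating into tool results handed to an LLM.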
Similar Servers
spotify-streamable-mcp-server
Provides an LLM-friendly interface to control Spotify playback, search music, and manage playlists/saved songs, enabling voice control and smart-home automations.
zeromcp
A minimal, pure Python Model Context Protocol (MCP) server for exposing tools, resources, and prompts via HTTP/SSE and Stdio transports.
jotsu-mcp
General-purpose library for implementing the Model Context Protocol (MCP) and creating workflows that use MCP tools, resources, and prompts.
MCP-Student-Recommendation-Server
An AI-powered recommendation system for students, providing personalized content (forums, courses, events, scholarships, etc.) and integrating with an AI assistant via the Model Context Protocol (MCP).