GENIE
Verified Safe · by Sidharth-e
Overview
GENIE enables the development and deployment of intelligent full-stack AI applications by orchestrating multiple AI agents and leveraging a rich set of built-in tools via the Model Context Protocol (MCP).
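As a rough illustration of how a built-in tool is exposed over MCP (a minimal sketch using the official MCP Python SDK's FastMCP helper, not GENIE's actual code; the tool name and body are invented for the example):

```python
# Hedged sketch: registering a simple string-manipulation tool with FastMCP.
# The server name and tool are illustrative, not taken from genie_server.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("genie-example")

@mcp.tool()
def word_count(text: str) -> int:
    """Count the words in a string."""
    return len(text.split())

if __name__ == "__main__":
    mcp.run()  # serves the tool over stdio by default
```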
Installation
python genie_server/server.py
Environment Variables
- MONGO_URI
- MONGO_DEFAULT_DB
- LOG_LEVEL
- DATABASE_URL
- GOOGLE_CLIENT_ID
- GOOGLE_CLIENT_SECRET
- AZURE_AD_CLIENT_ID
- AZURE_AD_CLIENT_SECRET
- AZURE_AD_TENANT_ID
- NEXT_PUBLIC_API_BASE_URL
- GOOGLE_API_KEY
- OPENAI_API_KEY
- AZURE_OPENAI_API_KEY
- AZURE_OPENAI_ENDPOINT
- AZURE_OPENAI_DEPLOYMENT_NAME
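The repository does not document defaults for these variables; below is a hedged sketch of how the Python backend might read a few of them with `os.environ` (the fallback values are assumptions, not GENIE's actual configuration code):

```python
# Illustrative only: variable names match the list above, defaults are assumed.
import os

MONGO_URI = os.environ["MONGO_URI"]                              # required connection string
MONGO_DEFAULT_DB = os.environ.get("MONGO_DEFAULT_DB", "genie")   # assumed default DB name
LOG_LEVEL = os.environ.get("LOG_LEVEL", "INFO")
OPENAI_API_KEY = os.environ.get("OPENAI_API_KEY")                # optional if Azure OpenAI is configured
AZURE_OPENAI_ENDPOINT = os.environ.get("AZURE_OPENAI_ENDPOINT")
```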
Security Notes
The Python backend (genie_server) appears reasonably secure: it consists primarily of pre-defined utility functions, math operations, and string manipulations, and it reads sensitive database connection strings (MONGO_URI) from environment variables. The client-side (genie_client) likewise relies on environment variables for API keys and database connections. Document processing with `officeparser` involves writing temporary files, a common pattern for such libraries, but one that could become a vulnerability vector if the parsing library itself has flaws. The overall architecture supports connecting to external MCP servers, so the client *could* be configured to connect to a malicious third-party server; however, this is an inherent extensibility feature of MCP rather than a direct vulnerability in the provided server's code. No direct `eval` calls or obvious hardcoded secrets were found.
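For context on the temporary-file pattern mentioned above, here is a general sketch of handling untrusted uploads with a short-lived temporary file and automatic cleanup. This is not GENIE's code (GENIE's client uses the Node.js `officeparser` library); the Python snippet and the `parse_fn` callback are purely illustrative:

```python
# General-pattern sketch: write untrusted upload bytes to a temp file,
# parse it, and rely on the context manager to delete the file afterwards.
import tempfile
from pathlib import Path

def parse_upload(data: bytes, parse_fn) -> str:
    # delete=True removes the file even if parse_fn raises.
    with tempfile.NamedTemporaryFile(suffix=".docx", delete=True) as tmp:
        tmp.write(data)
        tmp.flush()
        return parse_fn(Path(tmp.name))  # parse_fn is a placeholder parser callback
```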
Similar Servers
mcp-use
Develop full-stack AI applications and agents using the Model Context Protocol (MCP), providing tools, resources, prompts, and UI widgets in both Python and TypeScript.
mcpstore
MCPStore acts as an orchestration layer for managing Model Context Protocol (MCP) services and adapting them as tools for AI frameworks like LangChain, AutoGen, and others.
mesh
A full-stack AI-native platform for building, deploying, and managing AI agents, workflows, and applications with integrated context management, access control, and observability.
agentor
Deploy a scalable AI Agent server with tool integration and async streaming capabilities using LitServe, compatible with the Celesto AI platform.