tsai-s10-multi-agent-orchestration
by RoyRushreeta
Overview
Orchestrates a multi-agent loop to answer user queries by leveraging Google Gemini models, MCP tool servers, and a retrieval pipeline.
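The query-answering loop described above can be sketched as a simple perceive → decide → act cycle. This is a hypothetical illustration, not the project's actual API: the function names (`agent_loop`, `decide`), the action dictionary shape, and the tool dispatch are all assumptions, with a stub standing in for the Gemini model and an MCP tool server.

```python
# Hypothetical sketch of the multi-agent loop; names and structures here
# are illustrative assumptions, not the project's actual code.
from typing import Callable

def agent_loop(query: str,
               decide: Callable[[str], dict],
               tools: dict[str, Callable[[str], str]],
               max_steps: int = 5) -> str:
    """Repeatedly ask the model for the next action until it answers."""
    context = query
    for _ in range(max_steps):
        action = decide(context)                 # model picks a tool or a final answer
        if action["type"] == "final":
            return action["text"]
        tool = tools[action["tool"]]             # dispatch to an MCP-style tool
        context += "\n" + tool(action["input"])  # append the observation to context
    return "max steps reached"

# Stub "model": calls the lookup tool once, then answers from the retrieved context.
def fake_decide(ctx: str) -> dict:
    if "42" in ctx:
        return {"type": "final", "text": "The answer is 42."}
    return {"type": "tool", "tool": "lookup", "input": "meaning of life"}

print(agent_loop("What is the meaning of life?", fake_decide,
                 {"lookup": lambda q: "retrieved: 42"}))
# → The answer is 42.
```

In the real system the `decide` step would be a Gemini call and `tools` would be backed by MCP servers and the retrieval pipeline; the control flow is the same.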
Installation
python main.py
Environment Variables
- GEMINI_API_KEY
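Assuming `GEMINI_API_KEY` is read from the environment at startup (the variable name is from the list above; the exact consumption point is an assumption), a typical run might look like:

```shell
# Provide the required Gemini API key (assumed to be read from the environment)
export GEMINI_API_KEY="your-key-here"

# Launch the orchestration loop
python main.py
```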
Security Notes
The `action/executor.py` module, which runs user-generated Python code in a sandbox, includes `__import__` in the `__builtins__` dictionary it passes to `exec`. User-provided code can therefore import arbitrary modules (e.g., `os`, `subprocess`), bypassing the intended sandbox entirely and enabling full system compromise. Hardcoded absolute Windows paths in `config/mcp_server_config.yaml` also pose a minor risk in multi-user environments, but the `__import__` vulnerability is the critical issue.
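A minimal standalone demonstration of why this matters (independent of the project's actual executor code, whose allow-list is assumed here for illustration): the `import` statement in sandboxed code is resolved through `__builtins__["__import__"]`, so exposing it hands the code every installed module.

```python
# Sketch of the sandbox flaw: exposing __import__ in exec's __builtins__
# lets untrusted code import any module, e.g. os.
SAFE_BUILTINS = {"print": print, "len": len}                   # intended allow-list (assumed)
UNSAFE_BUILTINS = {**SAFE_BUILTINS, "__import__": __import__}  # the flaw

untrusted = "import os\nresult = os.getcwd()"  # attacker-supplied code

# With __import__ available, the import succeeds and os is fully reachable.
scope = {"__builtins__": UNSAFE_BUILTINS}
exec(untrusted, scope)
print("escaped sandbox, cwd =", scope["result"])

# Without __import__, the identical code fails at the import statement.
try:
    exec(untrusted, {"__builtins__": SAFE_BUILTINS})
except ImportError as e:
    print("blocked:", e)
```

Note that removing `__import__` alone is not a complete sandbox (escapes via object introspection remain well known); restricting `__builtins__` only raises the bar.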
Similar Servers
gemini-mcp-server
An MCP server providing a suite of 7 AI-powered tools (Image Gen/Edit, Chat, Audio Transcribe, Code Execute, Video/Image Analysis) powered by Google Gemini, featuring a self-learning "Smart Tool Intelligence" system for prompt enhancement and user preference adaptation.
thinkingcap
A multi-agent research MCP server that runs multiple LLM providers in parallel and synthesizes their responses to a given query.
mcp-gemini-prompt-enhancer
A Model Context Protocol (MCP) server that provides a prompt optimization service for Large Language Models (LLMs) using Google Gemini, with advanced prompt engineering support and automatic PDF asset management.
nova-llm
A full-stack LLM agent workflow with custom tool-calling capabilities and configurable Model Context Protocol (MCP) servers, supporting multiple Gemini models.