# GeminiHydra
Verified Safe by EPS-AI-SOLUTIONS
## Overview
Lightweight MCP server for integration with Ollama and Gemini CLI, providing AI text generation, multi-agent task execution, task queuing, caching, and prompt optimization.
## Installation

`pnpm start`

## Environment Variables
- OLLAMA_HOST
- DEFAULT_MODEL
- FAST_MODEL
- CODER_MODEL
- CACHE_ENCRYPTION_KEY
- HYDRA_YOLO
- HYDRA_RISK_BLOCKING
- GOOGLE_API_KEY
- GEMINI_API_KEY
- LOG_LEVEL
- NODE_ENV
- CACHE_ENABLED
- CACHE_TTL
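A minimal sketch of how these variables might be read at startup. The variable names mirror the list above, but every default value, type, and unit below is an illustrative assumption, not the server's actual behavior:

```typescript
// Hypothetical config loader: names mirror the env vars above,
// but all defaults here are assumptions for illustration.
interface HydraConfig {
  ollamaHost: string;
  defaultModel: string;
  cacheEnabled: boolean;
  cacheTtlSeconds: number;
  logLevel: string;
}

function loadConfig(env: Record<string, string | undefined>): HydraConfig {
  return {
    ollamaHost: env.OLLAMA_HOST ?? "http://localhost:11434", // Ollama's default port
    defaultModel: env.DEFAULT_MODEL ?? "llama3",             // assumed default model
    cacheEnabled: (env.CACHE_ENABLED ?? "true") !== "false", // cache on unless disabled
    cacheTtlSeconds: Number(env.CACHE_TTL ?? "3600"),        // assumed unit: seconds
    logLevel: env.LOG_LEVEL ?? "info",
  };
}
```

Reading configuration through one typed loader, rather than scattering `process.env` lookups, keeps defaults in a single place and makes the config testable.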
## Security Notes
The server includes a `run_shell_command` tool, which inherently carries security risk if exposed to untrusted input. These risks are mitigated through a `CommandSanitizer`, `SecurityEnforcer`, `AuditLogger`, and a set of `DANGEROUS_PATTERNS` that block or warn about potentially malicious commands and path traversals. The `HYDRA_RISK_BLOCKING` environment variable lets administrators control whether risky commands are blocked outright. Cache encryption is optional via `CACHE_ENCRYPTION_KEY`; a warning is issued if it is not configured. No obvious hardcoded secrets were found; API keys are expected via environment variables or standard config files.
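The block-or-warn behavior described above can be sketched as a pattern check. The pattern list and the exact semantics of `HYDRA_RISK_BLOCKING` below are assumptions based on this description, not the server's actual implementation:

```typescript
// Hypothetical sanitizer: the DANGEROUS_PATTERNS entries are illustrative,
// not the server's real list.
const DANGEROUS_PATTERNS: RegExp[] = [
  /rm\s+-rf\s+\//,        // recursive delete starting at the filesystem root
  /\.\.\//,               // path traversal
  /curl\s+[^|]*\|\s*sh/,  // pipe-to-shell download
];

type Verdict = "allow" | "warn" | "block";

// riskBlocking mirrors the HYDRA_RISK_BLOCKING toggle (assumed semantics):
// when true, a pattern match blocks the command; when false, it only warns.
function checkCommand(cmd: string, riskBlocking: boolean): Verdict {
  const hit = DANGEROUS_PATTERNS.some((pattern) => pattern.test(cmd));
  if (!hit) return "allow";
  return riskBlocking ? "block" : "warn";
}
```

In warn-only mode the match would typically still be recorded, e.g. by an audit logger, so that risky invocations remain visible even when they are not blocked.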
## Similar Servers
- **claude-flow**: AI agent orchestration and development platform for Claude Code.
- **mcp-client-for-ollama**: An interactive terminal client for connecting local Ollama LLMs to Model Context Protocol (MCP) servers, enabling advanced tool use and workflow automation for local LLMs.
- **ultimate_mcp_server**: A comprehensive AI agent operating system that gives advanced AI agents access to a rich ecosystem of tools, cognitive systems, and specialized services via the Model Context Protocol.
- **claude-prompts-mcp**: Enhances AI assistant behavior through structured prompt management, multi-step chains, quality gates, and autonomous verification loops, primarily for development tasks.