Claude Code Gemini Fallback Providers: Full Setup Guide (Windows & Linux)
Verified Safe by samade747
Overview
A local AI model router and orchestration server for Claude Code, integrating multiple LLM providers (Gemini, OpenAI, etc.) with fallback capabilities and supporting Model Context Protocol (MCP) servers for tool execution.
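The fallback behavior described above can be sketched as follows. This is an illustrative outline only, assuming a simple priority-ordered provider list; the `Provider` type and `route_with_fallback` function are hypothetical names, not the project's actual API.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Provider:
    name: str
    call: Callable[[str], str]  # prompt -> completion

def route_with_fallback(providers: List[Provider], prompt: str) -> str:
    """Try each provider in priority order, falling back on failure."""
    errors = []
    for p in providers:
        try:
            return p.call(prompt)
        except Exception as exc:  # e.g. rate limit, timeout, auth error
            errors.append(f"{p.name}: {exc}")
    raise RuntimeError("all providers failed: " + "; ".join(errors))
```

A router like this keeps Claude Code working when the primary provider (e.g. Gemini) errors out, by transparently retrying the same prompt against the next configured provider.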
Installation
python mcp_server.py
Environment Variables
- GOOGLE_API_KEY
- OPENAI_API_KEY
- QWEN_API_KEY
- XAI_API_KEY
- OPENROUTER_API_KEY
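A router reading these variables might detect which providers are usable like this. The mapping of provider names to variables is taken from the list above; the helper function itself is a hypothetical sketch, not code from the project.

```python
import os

# Provider name -> environment variable holding its API key.
PROVIDER_ENV_VARS = {
    "gemini": "GOOGLE_API_KEY",
    "openai": "OPENAI_API_KEY",
    "qwen": "QWEN_API_KEY",
    "xai": "XAI_API_KEY",
    "openrouter": "OPENROUTER_API_KEY",
}

def available_providers() -> list:
    """Providers whose API key is set in the environment (read at call time)."""
    return [name for name, var in PROVIDER_ENV_VARS.items()
            if os.environ.get(var)]
```

Reading keys from the environment at call time, rather than hardcoding them, is exactly what the security notes below recommend.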
Security Notes
The project explicitly warns against hardcoding API keys, recommends using environment variables, and advises keeping the router bound to '127.0.0.1'. The optional MCP server example demonstrates a 'list_files' tool, which could expose local file system information if misused by an LLM or if the MCP server's 'stdio' transport were redirected insecurely; the provided setup, however, is local and relatively contained. No critical vulnerabilities such as 'eval' usage or code obfuscation were found.
Similar Servers
claude-code-mcp
Acts as an MCP server to enable LLMs to run Claude Code CLI in one-shot mode, bypassing permissions for complex coding, file system, Git, and terminal operations.
claude-codex-settings
A comprehensive toolkit and configuration for developing Claude Code plugins, integrating various external services and APIs, and enhancing AI-assisted coding workflows.
consult-llm-mcp
An MCP server that allows AI agents like Claude Code to consult stronger, more capable AI models (e.g., GPT-5.2, Gemini 3.0 Pro) for complex code analysis, debugging, and architectural advice.
local-mcp-gateway
Orchestrates local AI tools by acting as a middleware layer and central hub to manage multiple Model Context Protocol (MCP) servers, offering profile-based tool access, OAuth 2.1, and observability for AI clients.