consult-llm-mcp
Verified Safe by raine
Overview
Enables Claude Code to consult powerful external AI models for complex code analysis, debugging, and review tasks.
Installation
npx -y consult-llm-mcp
Environment Variables
- OPENAI_API_KEY
- GEMINI_API_KEY
- DEEPSEEK_API_KEY
- CONSULT_LLM_DEFAULT_MODEL
- GEMINI_MODE
- OPENAI_MODE
- CODEX_REASONING_EFFORT
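A minimal configuration sketch, assuming the standard mcpServers JSON format used by Claude Code and similar MCP clients; the server name, the example model value, and which API keys you actually need depend on the models you intend to consult (check the project README for authoritative values):
// Sketch of an MCP client entry (e.g. in Claude Code's .mcp.json). The inner
// object mirrors the JSON you would place under "mcpServers"; values below
// are illustrative assumptions, not documented defaults.
const consultLlmEntry = {
  mcpServers: {
    "consult-llm": {
      command: "npx",
      args: ["-y", "consult-llm-mcp"],
      env: {
        OPENAI_API_KEY: "sk-...",           // or GEMINI_API_KEY / DEEPSEEK_API_KEY
        CONSULT_LLM_DEFAULT_MODEL: "o3",    // example value only
      },
    },
  },
};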
Security Notes
The server uses `child_process.spawn` to execute external CLI tools (`gemini`, `codex`). While arguments are passed as arrays to mitigate shell injection, the security relies on the trustworthiness of these external executables and their robust argument parsing. The tool has broad file system access to read user-specified files and git repositories, which is expected for its functionality but requires careful usage to avoid exposing sensitive data. No hardcoded secrets were found; API keys are expected via environment variables.
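As a rough illustration of the argument-array pattern described above (not the project's actual code; the gemini CLI flag shown is an assumption), a Node-style invocation might look like this:
import { spawn } from "node:child_process";

// Illustrative sketch only: the actual flags and wiring inside consult-llm-mcp
// may differ. Passing arguments as an array keeps the prompt out of a shell,
// so shell metacharacters in user input are never interpreted.
const prompt = "Review this diff for potential concurrency bugs";
const child = spawn("gemini", ["--prompt", prompt], {
  env: process.env, // API keys are forwarded via environment variables
});

let output = "";
child.stdout.on("data", (chunk) => (output += chunk));
child.on("close", (code) => {
  if (code === 0) {
    console.log(output);
  } else {
    console.error(`gemini exited with code ${code}`);
  }
});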
Similar Servers
claude-code-mcp
Provides an MCP server to allow LLMs to directly invoke Claude Code CLI for complex coding, file system, and Git operations, bypassing interactive permission prompts.
Delphi-MCP-Server
Provides a Model Context Protocol (MCP) server implementation in Delphi to integrate AI agents like Claude Code with Delphi development workflows via an extensible tool and resource system.
tenets
Serves as a Model Context Protocol (MCP) server for AI coding assistants, automatically finding, ranking, and aggregating relevant codebase files for AI prompts, and providing code intelligence tools.
mcp-devtools-server
Standardizes development tool interaction and enhances AI code generation, autocorrection, and workflow automation.