binja-codemode-mcp
Verified Safe by akrutsinger
Overview
Enables LLM-assisted reverse engineering in Binary Ninja by executing Python code directly against its API.
Installation
python3 ~/.binaryninja/plugins/repositories/community/plugins/akrutsinger_binja_codemode_mcp/bridge/mcp_bridge.py
Environment Variables
- BINJA_MCP_URL
- BINJA_MCP_KEY
- BINJA_MCP_LOG_LEVEL
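As a rough illustration of how the bridge might consume these variables, here is a minimal configuration-loading sketch. The default URL and log level shown are assumptions for the example, not values taken from the project:

```python
import os

def load_config(env=os.environ):
    """Read bridge settings from the environment.

    The fallback values below are hypothetical defaults for illustration;
    the actual bridge may use different ones.
    """
    return {
        "url": env.get("BINJA_MCP_URL", "http://127.0.0.1:9090"),  # assumed default
        "key": env.get("BINJA_MCP_KEY"),        # required; no sensible default
        "log_level": env.get("BINJA_MCP_LOG_LEVEL", "INFO"),
    }
```

In practice you would export these variables in the shell before launching `mcp_bridge.py`, with `BINJA_MCP_KEY` matching the key configured in the Binary Ninja plugin.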
Security Notes
The server executes arbitrary Python code provided by the LLM. However, it implements strong security measures:
- Code is validated using an AST parser (`CodeValidator`) to block forbidden modules (e.g., `os`, `subprocess`, `socket`, `importlib`, `sys`, `shutil`) and dangerous built-ins/attributes (e.g., `eval`, `exec`, `open`, `__import__`, `__subclasses__`).
- Execution occurs in a restricted global environment, exposing only safe built-ins and the `binja` API object.
- A 30-second execution timeout prevents resource exhaustion.
- The HTTP server binds only to localhost (`127.0.0.1`).
- API key authentication is required for all requests.

While `exec` is used, this layered sandboxing significantly mitigates risk. The README explicitly warns users to "only use with trusted MCP clients and LLMs," which is appropriate for a tool that inherently deals with arbitrary code execution for analysis.
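To make the AST-validation approach concrete, the following is a minimal sketch of the technique, not the plugin's actual `CodeValidator`: it parses the submitted source and walks the tree, rejecting imports of deny-listed modules and references to deny-listed names or attributes before any `exec` would run.

```python
import ast

# Illustrative deny-lists mirroring the ones described above;
# the real CodeValidator's lists and checks may differ.
FORBIDDEN_MODULES = {"os", "subprocess", "socket", "importlib", "sys", "shutil"}
FORBIDDEN_NAMES = {"eval", "exec", "open", "__import__", "__subclasses__"}

def validate(source: str) -> list:
    """Return a list of violations found in the code; empty means it passed."""
    violations = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Import):
            for alias in node.names:
                if alias.name.split(".")[0] in FORBIDDEN_MODULES:
                    violations.append(f"import of forbidden module {alias.name!r}")
        elif isinstance(node, ast.ImportFrom):
            if node.module and node.module.split(".")[0] in FORBIDDEN_MODULES:
                violations.append(f"import from forbidden module {node.module!r}")
        elif isinstance(node, ast.Name) and node.id in FORBIDDEN_NAMES:
            violations.append(f"use of forbidden name {node.id!r}")
        elif isinstance(node, ast.Attribute) and node.attr in FORBIDDEN_NAMES:
            violations.append(f"access to forbidden attribute {node.attr!r}")
    return violations
```

Only code that passes a check like this would then be run with `exec` against a restricted globals dictionary containing safe built-ins and the `binja` object. Note that deny-list validation of this kind is defense-in-depth, not a complete sandbox, which is why the README's "trusted clients only" warning still matters.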
Similar Servers
ida-pro-mcp
This project provides an MCP (Model Context Protocol) server that integrates with IDA Pro, enabling AI assistants to perform reverse engineering tasks like binary analysis, decompilation, memory manipulation, and debugging within the IDA Pro environment.
cclsp
Integrates LLM-based coding agents with Language Server Protocol (LSP) servers, enabling robust code navigation, symbol resolution, and refactoring across various programming languages.
jadx-mcp-server
Facilitates live, LLM-driven reverse engineering and vulnerability analysis of Android APKs by integrating JADX with the Model Context Protocol.
mcp-server-code-execution-mode
This server enables LLM agents to execute Python code in a highly secure, isolated container environment, facilitating complex multi-tool orchestration and data analysis with minimal LLM context token usage.