jinni
Verified Safe · by smat-dev
Overview
A tool to efficiently provide Large Language Models with structured project context for code comprehension and generation tasks.
Installation
uvx jinni-server
Environment Variables
- JINNI_MAX_SIZE_MB
- JINNI_NO_WSL_TRANSLATE
- JINNI_ASSUME_WSL_DISTRO
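A hypothetical launch script showing how these variables might be set; the values and the described effects are assumptions inferred from the variable names, not documented defaults (only the `--root` flag is mentioned elsewhere in this listing):

```shell
# Illustrative values only; consult the jinni docs for real defaults.
export JINNI_MAX_SIZE_MB=10          # assumed: cap on context size read from the project
export JINNI_NO_WSL_TRANSLATE=1      # assumed: disable wslpath-based WSL path translation

# Start the server confined to a single project root.
uvx jinni-server --root /path/to/project
```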
Security Notes
The tool executes external commands (`wslpath`, `wsl`) via `subprocess` to translate WSL paths, using paths supplied by the user. Safeguards are implemented, including NUL byte checks (`ensure_no_nul`), explicit end-of-options markers (`--`), and server-side `--root` path confinement; even so, a sophisticated path traversal attack or an OS-level vulnerability in the WSL environment could pose a residual risk. No direct `eval`, obfuscation, or hardcoded secrets were found.
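The safeguard pattern described above can be sketched as follows. This is a minimal illustration, not jinni's actual implementation: the function names mirror those mentioned in the note, but the signatures and behavior here are assumptions.

```python
import subprocess


def ensure_no_nul(path: str) -> str:
    """Reject paths containing NUL bytes, which can silently truncate
    arguments when they cross into C-level APIs."""
    if "\x00" in path:
        raise ValueError("path contains NUL byte")
    return path


def translate_wsl_path(path: str) -> str:
    """Translate a path via the external `wslpath` command.

    Sketch of the pattern only: the path is validated first, and `--`
    ends option parsing so a path beginning with `-` cannot be
    interpreted as a flag by the subprocess.
    """
    ensure_no_nul(path)
    result = subprocess.run(
        ["wslpath", "--", path],  # explicit argv list: no shell, no injection
        capture_output=True,
        text=True,
        check=True,
    )
    return result.stdout.strip()
```

Passing an argument list (rather than a shell string) is what keeps user-supplied paths from being interpreted by a shell; the `--` separator and NUL check close the remaining argument-parsing edge cases.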
Similar Servers
mcp-language-server
Serves as an MCP (Model Context Protocol) gateway, enabling LLMs to interact with Language Servers (LSPs) for codebase navigation, semantic analysis, and code editing operations.
mcp-interviewer
A Python CLI tool that evaluates Model Context Protocol (MCP) servers for agentic use cases by inspecting capabilities, running functional tests, and providing LLM-as-a-judge evaluations.
mcp-use-cli
An interactive command-line interface (CLI) for connecting to and interacting with Model Context Protocol (MCP) servers using natural language; it acts as an AI client that orchestrates LLM responses with external tools.
tenets
Provides intelligent, token-optimized code context and automatically injects guiding principles to AI coding assistants for enhanced understanding and consistent interactions.