
long-context-mcp

Verified Safe

by wx-b

Overview

A Model Context Protocol (MCP) server that implements Recursive Language Models (RLM) for solving long-context problems by programmatically probing, recursing, and synthesizing over large codebases or documents.

Installation

Run Command
uv run rlm_mcp_server/server.py

Environment Variables

  • OPENROUTER_API_KEY
  • OPENAI_API_KEY
  • RLM_DEFAULT_MODEL
  • RLM_DEFAULT_RECURSION_MODEL
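A minimal sketch of how a server like this might read the variables above at startup. The variable names come from the listing; the `load_config` helper and the `<model-id>` placeholder defaults are illustrative assumptions, not the project's actual code.

```python
import os

def load_config() -> dict:
    """Read server settings from environment variables.

    Hypothetical helper: the variable names match the listing above,
    but the fallback values are placeholders, not real defaults.
    """
    return {
        "openrouter_api_key": os.environ.get("OPENROUTER_API_KEY"),
        "openai_api_key": os.environ.get("OPENAI_API_KEY"),
        "default_model": os.environ.get("RLM_DEFAULT_MODEL", "<model-id>"),
        "recursion_model": os.environ.get("RLM_DEFAULT_RECURSION_MODEL", "<model-id>"),
    }
```

Keeping keys out of the source tree and reading them only from the environment is what lets the project claim "no hardcoded secrets" below.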

Security Notes

The server's core functionality involves executing LLM-generated code (`rlm.core.rlm.RLM`) to probe context. This inherently carries execution risk, but the project is transparent about it: it strongly recommends and defaults to a Docker sandbox for execution (`environment: docker`), and it manages API keys via environment variables, with no hardcoded secrets. File ingestion (`rlm_mcp_server/ingest.py`) enforces repository boundaries to prevent path traversal. LLM output is parsed with `ast.literal_eval`, which is safer than `eval` but still worth noting as part of output processing.
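To illustrate why `ast.literal_eval` is the safer choice for parsing LLM output: it only accepts Python literals (strings, numbers, lists, dicts, sets, booleans, `None`), so arbitrary expressions such as function calls are rejected rather than executed. The `parse_llm_output` wrapper below is a hypothetical sketch, not the project's actual parsing code.

```python
import ast

def parse_llm_output(text: str):
    """Parse a Python-literal string returned by an LLM.

    ast.literal_eval raises ValueError/SyntaxError on anything that is
    not a plain literal, so code like __import__(...) never runs.
    """
    try:
        return ast.literal_eval(text)
    except (ValueError, SyntaxError):
        return None

parse_llm_output("{'answer': 42}")                  # parsed as a dict
parse_llm_output("__import__('os').system('ls')")   # rejected, returns None
```

By contrast, passing the second string to `eval` would execute the shell command, which is exactly the class of risk the Docker sandbox exists to contain.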

Stats

  • Interest Score: 0
  • Security Score: 7
  • Cost Class: High
  • Avg Tokens: 53,304
  • Stars: 0
  • Forks: 0
  • Last Update: 2026-01-19

Tags

RLM, MCP, Long-context, LLM, Agentic Coding, Benchmarking