
rlm-mcp-server

Verified Safe

by delonsp

Overview

Manages large datasets outside of an LLM's context, providing a persistent Python REPL and tools for data analysis, PDF processing, and S3 integration to enable Recursive Language Models.

Installation

Run Command
docker-compose up -d --build

Environment Variables

  • RLM_MAX_MEMORY_MB
  • RLM_PORT
  • RLM_API_KEY
  • OPENAI_API_KEY
  • RLM_SUB_MODEL
  • RLM_MAX_SUB_CALLS
  • MISTRAL_API_KEY
  • MINIO_ENDPOINT
  • MINIO_ACCESS_KEY
  • MINIO_SECRET_KEY
  • MINIO_SECURE
  • RLM_PERSIST_DIR
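
A hypothetical `.env` fragment showing how these variables might be wired up for `docker-compose` (every value below is a placeholder for illustration, not a default documented by the project):

```
# Server
RLM_PORT=8080                  # placeholder port
RLM_API_KEY=change-me          # auth key for the RLM server
RLM_MAX_MEMORY_MB=2048         # REPL memory cap
RLM_PERSIST_DIR=/data/persist  # persistent storage location

# Sub-model calls (Recursive Language Models)
OPENAI_API_KEY=...
RLM_SUB_MODEL=gpt-4o-mini      # placeholder model name
RLM_MAX_SUB_CALLS=10

# PDF OCR
MISTRAL_API_KEY=...

# S3 / MinIO
MINIO_ENDPOINT=minio:9000
MINIO_ACCESS_KEY=...
MINIO_SECRET_KEY=...
MINIO_SECURE=false
```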

Security Notes

The server runs user code in a sandboxed Python REPL built on `exec()`, mitigating the risks with `ast.parse`-based static analysis, an import whitelist (`ALLOWED_IMPORTS`), and a blocklist of dangerous built-in functions (`BLOCKED_BUILTINS`). File access is restricted to a read-only `/data/` volume with path-traversal checks, and API keys are supplied via environment variables. However, the `README.md` misleadingly claims 'Container em rede isolada (sem acesso à internet)' ('Container on an isolated network, with no internet access') under its security section, while the `llm_client`, `s3_client`, and `pdf_parser` (Mistral OCR) components explicitly require external network access. The internal sandbox is robust for the intended use, but the dependency on outbound network access to external APIs should be stated clearly.
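
The sandboxing approach described above can be sketched as follows. This is a minimal illustration, not the server's actual implementation: the whitelist and blocklist contents are assumptions, and the real server presumably adds resource limits and file-path checks on top.

```python
import ast
import builtins

# Assumed example values; the server's real ALLOWED_IMPORTS and
# BLOCKED_BUILTINS are not reproduced here.
ALLOWED_IMPORTS = {"math", "json", "statistics"}
BLOCKED_BUILTINS = {"open", "eval", "exec", "compile", "input", "__import__"}


def _guarded_import(name, *args, **kwargs):
    """Runtime import hook: only whitelisted top-level modules load."""
    if name.split(".")[0] not in ALLOWED_IMPORTS:
        raise ImportError(f"import of {name!r} is not allowed")
    return __import__(name, *args, **kwargs)


def run_sandboxed(code: str) -> dict:
    """Statically vet `code` with ast.parse, then exec() it against a
    reduced builtins table. Returns the resulting namespace."""
    tree = ast.parse(code)
    # Static check: reject any import of a non-whitelisted module.
    for node in ast.walk(tree):
        if isinstance(node, ast.Import):
            roots = [alias.name.split(".")[0] for alias in node.names]
        elif isinstance(node, ast.ImportFrom):
            roots = [(node.module or "").split(".")[0]]
        else:
            continue
        for root in roots:
            if root not in ALLOWED_IMPORTS:
                raise ImportError(f"import of {root!r} is not allowed")
    # Dynamic check: strip blocked builtins, gate __import__.
    safe_builtins = {name: getattr(builtins, name)
                     for name in dir(builtins)
                     if name not in BLOCKED_BUILTINS}
    safe_builtins["__import__"] = _guarded_import
    scope = {"__builtins__": safe_builtins}
    exec(code, scope)
    return scope


ns = run_sandboxed("import math\nresult = math.sqrt(16)")
print(ns["result"])  # 4.0
```

Note the layered design: `ast.parse` catches disallowed imports before execution, while the reduced builtins table and guarded `__import__` catch attempts (e.g. via `eval`) that static analysis alone would miss.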


Stats

  • Interest Score: 0
  • Security Score: 8
  • Cost Class: Low
  • Avg Tokens: 150
  • Stars: 0
  • Forks: 0
  • Last Update: 2026-01-19

Tags

LLM Context Management, Data Analysis, Python REPL, PDF Processing, S3 Integration