
memory-lane

Verified Safe

by robbgatica

Overview

AI-powered memory forensics analysis using Volatility 3 and an LLM.

Installation

Run Command
cd /path/to/memory-forensics-mcp/examples && export MCP_LLM_PROFILE=llama70b && python ollama_client.py

Environment Variables

  • VOLATILITY_PATH
  • DUMPS_DIR
  • MCP_LLM_PROFILE
  • OLLAMA_MODEL
  • MCP_SERVER_PATH
  • OPENAI_API_KEY
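Before the Run Command above, the variables listed here must be set for your environment. A minimal sketch follows; every value is a hypothetical placeholder, not part of the memory-lane distribution, and should be substituted with your own paths and model tags.

```shell
# Hypothetical values -- adjust all paths and names to your installation.
export VOLATILITY_PATH=/opt/volatility3/vol.py    # Volatility 3 entry point
export DUMPS_DIR=/var/forensics/dumps             # directory holding memory dumps
export MCP_LLM_PROFILE=llama70b                   # profile named in the Run Command
export OLLAMA_MODEL=llama3.3:70b                  # example model tag served by local Ollama
export MCP_SERVER_PATH=/path/to/memory-forensics-mcp/server.py  # hypothetical server script path

# Only needed when an OpenAI-backed profile is selected:
# export OPENAI_API_KEY=...
```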

Security Notes

The server integrates Volatility 3 for memory analysis and exposes its capabilities via the Model Context Protocol (MCP). Key observations:

  • The server processes memory dumps, which are potentially malicious data.
  • No direct 'eval' or 'exec' calls were found in the provided source.
  • Network communication from the Ollama client to the Ollama server targets localhost by default, reducing external network risks.
  • Sensitive configuration such as API keys is supplied via environment variables.
  • Provenance tracking (`provenance.py`) provides an audit trail.

The primary security considerations are the inherent risks of handling potentially malicious memory dumps in a forensic context, and the security of the underlying Volatility 3 framework. Parsing LLM responses for tool calls introduces a potential, though mitigated, risk if the JSON parsing or tool-argument mapping is flawed. The documentation recommends running the server in an isolated environment.
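The tool-call parsing risk noted above is typically mitigated with strict validation of the LLM's output before dispatch. A minimal sketch in Python follows; the tool names, argument schema, and JSON shape are assumptions for illustration, not memory-lane's actual API.

```python
import json

# Hypothetical allowlist -- the real server's tool names and parameters may differ.
ALLOWED_TOOLS = {
    "list_plugins": set(),
    "run_plugin": {"dump", "plugin"},
}

def parse_tool_call(llm_response: str):
    """Defensively parse an LLM response into a (tool, args) pair.

    Rejects responses that are not valid JSON, name an unknown tool,
    or pass arguments outside the tool's declared parameter set.
    """
    try:
        call = json.loads(llm_response)
    except json.JSONDecodeError as exc:
        raise ValueError("LLM response is not valid JSON") from exc
    if not isinstance(call, dict):
        raise ValueError("tool call must be a JSON object")
    tool = call.get("tool")
    args = call.get("arguments", {})
    if tool not in ALLOWED_TOOLS:
        raise ValueError(f"unknown tool: {tool!r}")
    if not isinstance(args, dict):
        raise ValueError("arguments must be a JSON object")
    unexpected = set(args) - ALLOWED_TOOLS[tool]
    if unexpected:
        raise ValueError(f"unexpected arguments: {sorted(unexpected)}")
    return tool, args
```

The allowlist ensures the model can only invoke tools the server explicitly exposes, so a malformed or adversarial response fails closed instead of reaching Volatility.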


Stats

Interest Score: 0
Security Score: 9
Cost Class: High
Avg Tokens: 4000
Stars: 0
Forks: 0
Last Update: 2025-12-14

Tags

  • Memory Forensics
  • Volatility 3
  • AI Analysis
  • Incident Response
  • LLM-agnostic