remembrances-mcp
Verified Safe · by madeindigio
Overview
Provides long-term memory capabilities to AI agents through key-value, vector/RAG, and graph database layers, with advanced code indexing for semantic search and navigation.
Installation
docker run -d --name remembrances-mcp --gpus all -p 8080:8080 -v /path/to/data:/data -v /path/to/knowledge-base:/knowledge-base ghcr.io/madeindigio/remembrances-mcp:cuda
Environment Variables
- GOMEM_GGUF_MODEL_PATH
- GOMEM_OPENAI_KEY
- GOMEM_OLLAMA_MODEL
- GOMEM_DB_PATH
- GOMEM_SURREALDB_URL
- GOMEM_SURREALDB_USER
- GOMEM_SURREALDB_PASS
- GOMEM_KNOWLEDGE_BASE
- GOMEM_GGUF_GPU_LAYERS
- GOMEM_LOG_LEVEL
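The variables above configure the embedding backend, database, and knowledge-base locations. As a minimal sketch of a CUDA run with a local GGUF embedding model (the model filename, data paths, layer count, and log level below are placeholder assumptions, not documented defaults; only the variable names and the /data and /knowledge-base mount points come from this page):
docker run -d --name remembrances-mcp --gpus all -p 8080:8080 \
  -v /path/to/data:/data \
  -v /path/to/knowledge-base:/knowledge-base \
  -v /path/to/models:/models \
  -e GOMEM_GGUF_MODEL_PATH=/models/embedding-model.gguf \
  -e GOMEM_GGUF_GPU_LAYERS=99 \
  -e GOMEM_DB_PATH=/data \
  -e GOMEM_KNOWLEDGE_BASE=/knowledge-base \
  -e GOMEM_LOG_LEVEL=info \
  ghcr.io/madeindigio/remembrances-mcp:cuda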
Security Notes
The underlying `go-llama.cpp` bindings explicitly disable `eval` and text-generation functionality, focusing solely on embeddings, which mitigates common LLM execution risks. Default SurrealDB credentials (`root`/`root`) are used, but they can be overridden via environment variables or a YAML configuration file. The `GOMEM_SURREALDB_START_CMD` environment variable lets an administrator run an external command to start SurrealDB; this could pose a risk if pointed at an untrusted command or executed in an insecure environment, but it is an administrative setup option, not a runtime vulnerability reachable through the MCP API.
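As an illustrative hardening sketch, the default credentials can be replaced and an already-running external SurrealDB instance supplied instead of relying on GOMEM_SURREALDB_START_CMD. Only the variable names come from the list above; the URL scheme, hostname, and credential values are placeholder assumptions:
docker run -d --name remembrances-mcp -p 8080:8080 \
  -v /path/to/data:/data \
  -e GOMEM_SURREALDB_URL=ws://surrealdb.internal:8000 \
  -e GOMEM_SURREALDB_USER=remembrances \
  -e GOMEM_SURREALDB_PASS=change-me \
  ghcr.io/madeindigio/remembrances-mcp:cuda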
Similar Servers
mcp_massive
An AI agent orchestration server, likely interacting with LLMs and managing multi-agent workflows.
simplenote-mcp-server
A lightweight Model Context Protocol (MCP) server that integrates Simplenote with Claude Desktop to provide note management, search, and organization capabilities for AI assistants.
simple-memory-mcp-server
A Python server designed to manage and serve memory for AI agents, facilitating their interaction with external Large Language Models or data sources.
mcpserve
Serves deep learning models and provides shell execution capabilities, with Docker containerization and Ngrok connectivity.