mem0
by Pintaro
Overview
OpenMemory is a personal AI memory management server with a web dashboard, allowing users to store, retrieve, and manage facts and preferences for AI agents and applications. It provides a centralized knowledge base for various client applications.
Installation
uvicorn app.main:app --host 0.0.0.0 --port 8765 --reload
Environment Variables
- NEXT_PUBLIC_USER_ID
- NEXT_PUBLIC_API_URL
- OPENAI_API_KEY
- ANTHROPIC_API_KEY
- GROQ_API_KEY
- DATABASE_URL
- MCP_AUTH_TOKEN
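As a rough sketch, a client process might pull this configuration from the environment along these lines. The variable names are taken from the list above; the helper name, the default URL, and the one-provider-key assumption are illustrative guesses, not part of the server's documented API.

```python
import os

# Hypothetical helper: gather the server's settings from the environment.
# Variable names match the list above; defaults are assumptions.
def load_config() -> dict:
    config = {
        "user_id": os.environ.get("NEXT_PUBLIC_USER_ID"),
        # Port 8765 matches the uvicorn command shown above; the URL default
        # itself is an assumption.
        "api_url": os.environ.get("NEXT_PUBLIC_API_URL", "http://localhost:8765"),
        "database_url": os.environ.get("DATABASE_URL"),
        "mcp_auth_token": os.environ.get("MCP_AUTH_TOKEN"),
    }
    # Pick the first LLM provider key that is set (assumed behavior).
    for provider in ("OPENAI_API_KEY", "ANTHROPIC_API_KEY", "GROQ_API_KEY"):
        if os.environ.get(provider):
            config["llm_key"] = os.environ[provider]
            break
    return config
```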
Security Notes
The underlying `mem0` library used by this server appears to use `eval()` to parse tool-call arguments in some contexts (observed in `mem0-ts/src/oss/src/llms/langchain.ts` and `mem0/proxy/main.py`). This is a critical vulnerability if attacker-controlled input can reach those `eval` calls, as it could allow arbitrary code execution. Even if the immediate server code sanitizes inputs, the dependency introduces high risk. Other aspects, such as reading API keys from environment variables and communicating over a local API, follow good practice.
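To illustrate the risk in general terms (this is not code from mem0 itself): tool-call arguments typically arrive as a JSON string, and parsing that string with `eval()` executes whatever Python expression it contains, whereas a strict JSON parser rejects anything that is not a literal. The `parse_tool_args` helper below is hypothetical.

```python
import json

# Stand-in for attacker-controlled input. Passing this to eval() would
# execute the expression (here a harmless call, but it could be anything):
#   eval("__import__('os').getcwd()")
malicious = "__import__('os').getcwd()"

def parse_tool_args(raw: str) -> dict:
    """Safely parse tool-call arguments: accept only a JSON object."""
    args = json.loads(raw)          # raises ValueError on non-JSON input
    if not isinstance(args, dict):  # tool arguments should be an object
        raise ValueError("tool arguments must be a JSON object")
    return args
```

`json.loads` raises `json.JSONDecodeError` (a `ValueError` subclass) on the malicious string instead of running it, which is why JSON parsing is the usual fix for this class of bug.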
Similar Servers
context-sync
Context Sync provides AI systems with persistent, queryable memory across all development tools, sessions, and projects, allowing AI to remember codebase details, architectural decisions, and conversation history.
memory-mcp
Provides persistent memory and intelligent context window caching for LLM conversations within AI coding environments.
mcp-structured-memory
Provides structured, domain-specific memory management for AI agents to use in ongoing projects, storing accumulated context in local markdown files.
remind
Generalization-capable memory layer for LLMs that extracts, stores, and retrieves semantic concepts from raw episodic experiences, mimicking human memory consolidation.