mcp-memory-server
Verified Safe · by AerionDyseti
Overview
Provides semantic memory storage for AI assistants, enabling them to store and retrieve decisions, patterns, and context across sessions using vector embeddings.
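For example, an MCP client can launch the server over stdio and call its tools programmatically. The sketch below uses the official TypeScript MCP SDK; the client name, the placeholder UUID, and the shape of the result are assumptions, while the `get_memories` tool name and its `ids` argument come from the security notes below.

```ts
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Launch the memory server as a child process speaking MCP over stdio.
const transport = new StdioClientTransport({
  command: "bunx",
  args: ["--bun", "@aeriondyseti/vector-memory-mcp"],
});

const client = new Client(
  { name: "example-client", version: "1.0.0" },
  { capabilities: {} },
);
await client.connect(transport);

// Fetch previously stored memories by ID. The UUID here is a placeholder;
// the `ids` argument is the one described in the security notes.
const result = await client.callTool({
  name: "get_memories",
  arguments: { ids: ["00000000-0000-0000-0000-000000000000"] },
});
console.log(result.content);

await client.close();
```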
Installation
bunx --bun @aeriondyseti/vector-memory-mcp
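Most MCP clients are pointed at the server through their configuration file. A typical entry, assuming the standard `mcpServers` format (the `vector-memory` key is just an illustrative name), might look like:

```json
{
  "mcpServers": {
    "vector-memory": {
      "command": "bunx",
      "args": ["--bun", "@aeriondyseti/vector-memory-mcp"]
    }
  }
}
```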
Security Notes

The server binds to localhost by default (127.0.0.1:3271), which reduces external network exposure, and it uses `randomUUID` for internal ID generation. However, several database queries in `src/db/memory.repository.ts` build `where` clauses by interpolating user-provided IDs directly into the filter string (e.g., `id = '${id}'`). IDs are normally UUIDs generated internally or by trusted clients, but a malicious MCP client could send specially crafted strings (for example, containing single quotes) as tool-call arguments (`ids` in `delete_memories`, `get_memories`, `update_memories`), which amounts to SQL injection into the LanceDB filter predicates. The local-first, client-controlled deployment model limits the practical impact, but the pattern remains a vulnerability.
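To illustrate the pattern (a minimal sketch, not the project's actual code): LanceDB filter predicates are plain SQL-like strings, so interpolating a raw ID lets a crafted value break out of the quoted literal. Validating that each ID is a well-formed UUID (or escaping single quotes) before building the predicate closes that path. The `deleteMemories` helper and the `memories` table name below are hypothetical.

```ts
import * as lancedb from "@lancedb/lancedb";

// Strict UUID check so only well-formed IDs ever reach the filter string.
const UUID_RE =
  /^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$/i;

// Vulnerable pattern: `id = '${id}'` lets an ID like
// `x' OR id != 'x` rewrite the predicate so it matches every row.
function unsafePredicate(ids: string[]): string {
  return ids.map((id) => `id = '${id}'`).join(" OR ");
}

// Hardened variant: reject anything that is not a UUID before interpolating.
function safePredicate(ids: string[]): string {
  const valid = ids.filter((id) => UUID_RE.test(id));
  if (valid.length === 0) throw new Error("no valid memory IDs supplied");
  return valid.map((id) => `id = '${id}'`).join(" OR ");
}

// Hypothetical delete helper; LanceDB's table.delete takes a SQL-like
// predicate string, so the predicate must be built from validated IDs.
async function deleteMemories(dbPath: string, ids: string[]): Promise<void> {
  const db = await lancedb.connect(dbPath);
  const table = await db.openTable("memories"); // table name assumed
  await table.delete(safePredicate(ids));
}
```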
Similar Servers
MemoryMesh
A local knowledge graph server for AI models, focusing on structured memory for text-based RPGs and interactive storytelling.
context-sync
Context Sync provides AI systems with persistent, queryable memory across all development tools, sessions, and projects, allowing AI to remember codebase details, architectural decisions, and conversation history.
knowns
A CLI-first knowledge layer and task/documentation management tool that provides AI agents with persistent project context.
rag-server-mcp
Provides Retrieval Augmented Generation (RAG) capabilities to Model Context Protocol (MCP) clients by indexing project documents and retrieving relevant content for LLMs.