savecontext
Verified Safe · by greenfieldlabs-inc
Overview
Provides persistent memory, issue tracking, and project planning for AI coding assistants via the Model Context Protocol (MCP).
Installation
bunx @savecontext/mcp
Environment Variables
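For MCP clients that register servers through a JSON configuration file (the exact file location and top-level key vary by client), registration might look like the following sketch; the server name `savecontext` and the shape shown are assumptions based on common MCP client conventions, not documented settings:

```json
{
  "mcpServers": {
    "savecontext": {
      "command": "bunx",
      "args": ["@savecontext/mcp"]
    }
  }
}
```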
- OLLAMA_ENDPOINT
- OLLAMA_MODEL
- HF_TOKEN
- HF_MODEL
- HF_ENDPOINT
- TRANSFORMERS_MODEL
- SAVECONTEXT_COMPACTION_THRESHOLD
- SAVECONTEXT_COMPACTION_MODE
- SAVECONTEXT_AGENT_ID
- HUGGINGFACE_TOKEN
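The variables above can be supplied through the environment the server is launched in. A minimal sketch for enabling local embeddings via Ollama follows; the endpoint and model values are illustrative examples, not defaults documented by savecontext:

```shell
# Illustrative values only -- adjust to your local Ollama setup.
export OLLAMA_ENDPOINT="http://localhost:11434"
export OLLAMA_MODEL="nomic-embed-text"
export SAVECONTEXT_AGENT_ID="my-agent"
```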
Security Notes
The server runs locally and stores data in SQLite, which keeps project data on the user's machine. It communicates over stdio (standard input/output), a transport that is generally safe for local inter-process communication. Optional semantic-search features make network calls to Ollama (a local server) or the HuggingFace cloud API, configured through the environment variables above. The CLI setup scripts modify local user configuration files and execute system commands (e.g., `git`, `python`, `bunx`); these operations are standard but perform system-level changes and therefore require user trust. No hardcoded secrets or obviously malicious patterns were found. The local web UI (`dashboard`) is intended for single-user local access; if it is exposed beyond localhost, securing the network is the user's responsibility.
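The stdio transport mentioned above carries JSON-RPC 2.0 messages between client and server, one JSON object per line. A minimal sketch of how a client would frame the initial `initialize` request follows; `make_initialize_request` is a hypothetical helper, and the exact fields a given server expects may differ by MCP protocol version:

```python
import json

def make_initialize_request(request_id: int = 1) -> str:
    """Build the JSON-RPC `initialize` request an MCP client sends first.

    The method and parameter names follow the MCP specification; the
    protocolVersion shown is one published revision, used here as an example.
    """
    msg = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            "protocolVersion": "2024-11-05",
            "capabilities": {},
            "clientInfo": {"name": "example-client", "version": "0.1.0"},
        },
    }
    # One JSON object per line on the server's stdin.
    return json.dumps(msg)

line = make_initialize_request()
print(json.loads(line)["method"])  # initialize
```

In practice an MCP client writes this line to the server process's stdin and reads the newline-delimited JSON-RPC response from its stdout.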
Similar Servers
In-Memoria
Provides persistent intelligence infrastructure for AI agents, enabling them to understand codebases, detect patterns, predict coding approaches, and generate context-aware insights.
memory-graph
A graph-based MCP server that provides intelligent memory capabilities for Claude Code, enabling persistent knowledge tracking, relationship mapping, and contextual development assistance.
mcp-memory-keeper
Provides persistent context management for Claude AI coding assistants, ensuring work history, decisions, and progress are preserved across sessions and context limits.
tenets
Provides intelligent, token-optimized code context and automatically injects guiding principles to AI coding assistants for enhanced understanding and consistent interactions.