letta-mcp-server
Verified Safe by SNYCFIRE-CORE
Overview
Universal MCP server to bridge any AI client (e.g., Claude, GitHub Copilot, Cursor) with Letta.ai's powerful stateful agents, enabling seamless interaction and tool orchestration.
Installation
python -m letta_mcp.server
Environment Variables
- LETTA_API_KEY
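A minimal launch sketch under two assumptions: the package is installed and importable as `letta_mcp` (as the module path above suggests), and the server reads `LETTA_API_KEY` from the environment at startup. Adjust the key value and any install step to match the actual repository.

```shell
# Provide the Letta API key via the environment (never hardcode it)
export LETTA_API_KEY="sk-..."   # replace with your actual key

# Launch the MCP server as a Python module
python -m letta_mcp.server
```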
Security Notes
API keys are handled securely via environment variables; when configuration is loaded from YAML, keys are fetched from the environment rather than stored in the file, preventing hardcoding. Input validation (e.g., for agent IDs) and error formatting guard against common vulnerabilities and avoid leaking sensitive internal details. The project uses `httpx` with retry logic for robust API communication. No `eval` or obvious obfuscation patterns were found, and a `SECURITY.md` outlines reporting procedures and best practices.
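The agent-ID validation mentioned above could look something like the following sketch. The `agent-<uuid>` pattern and the `validate_agent_id` helper are assumptions for illustration, not the project's actual code; the point is rejecting malformed input early and returning a generic error that leaks no internal detail.

```python
import re

# Hypothetical pattern: assumes Letta agent IDs look like
# "agent-<uuid4>"; the real project's rules may differ.
AGENT_ID_RE = re.compile(
    r"^agent-[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}"
    r"-[0-9a-f]{4}-[0-9a-f]{12}$"
)

def validate_agent_id(agent_id: str) -> str:
    """Reject malformed IDs before they reach the API layer."""
    if not AGENT_ID_RE.fullmatch(agent_id):
        # Generic message: avoid echoing attacker-controlled
        # input or internal details back to the caller.
        raise ValueError("invalid agent ID")
    return agent_id
```

Validating at the boundary like this keeps path-traversal strings and other junk out of downstream HTTP requests.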
Similar Servers
mcp-client-for-ollama
An interactive terminal client for connecting local Ollama LLMs to Model Context Protocol (MCP) servers, enabling advanced tool use and workflow automation for local LLMs.
Letta-MCP-server
A Model Context Protocol (MCP) server that provides comprehensive tools for agent management, memory operations, and integration with the Letta system.
consult-llm-mcp
An MCP server that allows AI agents like Claude Code to consult stronger, more capable AI models (e.g., GPT-5.2, Gemini 3.0 Pro) for complex code analysis, debugging, and architectural advice.
zeromcp
A minimal, pure Python Model Context Protocol (MCP) server for exposing tools, resources, and prompts via HTTP/SSE and Stdio transports.