workflowy
by mholzen
Overview
This MCP server and CLI tool lets AI assistants interact with Workflowy outlines, supporting operations such as search, bulk replacement, usage reports, and basic CRUD, and includes offline access to outline data.
Installation
docker run --rm -i -e WORKFLOWY_API_KEY="your_workflowy_api_key" ghcr.io/mholzen/workflowy:latest mcp --expose=read --log-file=/tmp/workflowy-mcp.log
Environment Variables
- WORKFLOWY_API_KEY: your Workflowy API key, passed to the container as shown above
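To check that the container starts and exposes its tools, you can pipe a minimal stdio exchange through it. This is only a sketch, not part of the project's documented workflow; it assumes the standard MCP JSON-RPC handshake (initialize, initialized notification, tools/list) over newline-delimited JSON:

# Smoke test (sketch): send the MCP handshake and a tools/list request on stdin,
# then read the JSON-RPC responses from stdout.
printf '%s\n' \
  '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"smoke-test","version":"0.1.0"}}}' \
  '{"jsonrpc":"2.0","method":"notifications/initialized"}' \
  '{"jsonrpc":"2.0","id":2,"method":"tools/list"}' |
  docker run --rm -i -e WORKFLOWY_API_KEY="your_workflowy_api_key" ghcr.io/mholzen/workflowy:latest mcp --expose=read

The returned tool list depends on the `--expose` setting.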
Security Notes
The `workflowy_transform` MCP tool (and the CLI `--exec` flag) executes arbitrary shell commands, which presents a critical command-injection risk if the AI assistant's output or user input is not fully trusted and sanitized, or if the server environment is not properly sandboxed. Tool exposure is configurable via the `--expose` flag, but enabling this tool specifically, or passing `--expose=all`, permits arbitrary code execution. API keys are handled securely via a file or environment variables.
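As a sketch of that trade-off, using only the `--expose` values mentioned above:

# Safer default: expose only read-oriented tools, as in the installation example.
docker run --rm -i -e WORKFLOWY_API_KEY="your_workflowy_api_key" ghcr.io/mholzen/workflowy:latest mcp --expose=read --log-file=/tmp/workflowy-mcp.log

# Riskier: --expose=all also enables workflowy_transform, which can execute arbitrary
# shell commands; run it only in a sandboxed environment with trusted inputs.
docker run --rm -i -e WORKFLOWY_API_KEY="your_workflowy_api_key" ghcr.io/mholzen/workflowy:latest mcp --expose=all --log-file=/tmp/workflowy-mcp.log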
Similar Servers
mcp-devtools
A unified, high-performance Go-based MCP server providing access to a comprehensive suite of developer tools, with robust OAuth 2.1 authentication and observability features.
mcp-cli
A command-line interface tool for managing Model Context Protocol (MCP) server configuration files across various AI tools.
mcp
This server provides a Model Context Protocol (MCP) interface that lets Large Language Models (LLMs) interact with Teamwork.com's project management and customer support functionality.
mcp-cli-ent
Acts as a CLI client and daemon to interact with Model Context Protocol (MCP) servers without loading their definitions into an agent's context window, enabling on-demand tool calls and persistent sessions.