linear-streamable-mcp-server
Verified Safe · by iceener
Overview
Provides an LLM-friendly Model Context Protocol (MCP) server for managing Linear issues, projects, teams, cycles, and comments via AI agents.
Installation
bun dev
Environment Variables
- PROVIDER_CLIENT_ID
- PROVIDER_CLIENT_SECRET
- OAUTH_SCOPES
- OAUTH_REDIRECT_URI
- OAUTH_REDIRECT_ALLOWLIST
- RS_TOKENS_ENC_KEY
- PORT
- AUTH_STRATEGY
- BEARER_TOKEN
- LINEAR_ACCESS_TOKEN
- AUTH_ENABLED
- AUTH_REQUIRE_RS
- AUTH_ALLOW_DIRECT_BEARER
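As an illustration, a local `.env` covering these variables might look like the sketch below. Every value is a placeholder, and the accepted values for each variable (for instance, what `AUTH_STRATEGY` may be set to) should be taken from the repository's own documentation, not from this example.

```
# Hypothetical .env sketch — all values are placeholders, not documented defaults.
PORT=3000
AUTH_STRATEGY=oauth
PROVIDER_CLIENT_ID=your-linear-client-id
PROVIDER_CLIENT_SECRET=your-linear-client-secret
OAUTH_SCOPES=read,write
OAUTH_REDIRECT_URI=http://localhost:3000/callback
OAUTH_REDIRECT_ALLOWLIST=http://localhost:3000/callback
# Without RS_TOKENS_ENC_KEY, stored OAuth tokens are kept in plaintext (see
# Security Notes). The expected key format is an assumption here; one option is
# a random hex string, e.g. from `openssl rand -hex 32`.
RS_TOKENS_ENC_KEY=replace-with-a-long-random-secret
```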
Security Notes
The server employs strong authentication (OAuth 2.1 PKCE, encrypted token storage) and includes rate limiting. However, a critical vulnerability exists in the default production configuration where origin validation (`isAllowedOrigin` in `src/shared/mcp/security.ts`) is a placeholder that always returns `true`, allowing any origin to connect. The `README` warns about manual hardening, but this default behavior is a significant risk. Additionally, `RS_TOKENS_ENC_KEY` is crucial for encrypting stored OAuth tokens in KV; without it, they are stored in plaintext.
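To harden the placeholder described above, `isAllowedOrigin` can be replaced with an explicit allowlist check. The sketch below is illustrative only: the actual signature and call sites in `src/shared/mcp/security.ts` may differ, and the origins listed are placeholders you would replace with your real clients.

```typescript
// Hedged sketch of a stricter origin check, assuming the placeholder
// isAllowedOrigin(origin) -> boolean signature; verify against
// src/shared/mcp/security.ts before adopting.
const ALLOWED_ORIGINS = new Set<string>([
  "https://app.example.com", // placeholder — list your real client origins
]);

export function isAllowedOrigin(origin: string | null): boolean {
  // Fail closed: a missing Origin header is rejected instead of allowed.
  if (!origin) return false;
  return ALLOWED_ORIGINS.has(origin);
}
```

The key design change from the default is failing closed: unknown or absent origins are rejected, so forgetting to configure the allowlist breaks connections loudly rather than silently accepting every origin.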
Similar Servers
tmcp
Build Model Context Protocol (MCP) servers for AI agents, providing schema-agnostic tools, resources, and prompts, with optional OAuth 2.1 authentication and distributed session management.
backlog-mcp-server
Integrate Backlog API with AI agents (e.g., Claude) to manage projects, issues, wikis, and Git repositories through natural language commands.
zeromcp
A minimal, pure Python Model Context Protocol (MCP) server for exposing tools, resources, and prompts via HTTP/SSE and Stdio transports.
1xn-vmcp
An open-source platform for composing, customizing, and extending multiple Model Context Protocol (MCP) servers into a single logical, virtual MCP server, enabling fine-grained context engineering for AI workflows and agents.