subspace-api
Verified Safe by subtype-space
Overview
An Express-based RESTful API and Model Context Protocol (MCP) server that aggregates data from external services like WMATA, weather, and stock markets, also supporting TRMNL plugin integrations.
Installation
docker compose pull && docker compose up -d
Environment Variables
- ACTIVE_VERSION
- ACTIVITY_DISCORD_CLIENT_ID
- ACTIVITY_DISCORD_CLIENT_SECRET
- API_CLIENT_ID
- API_CLIENT_SECRET
- AUTH_SERVER_URL
- AUTH_REALM
- LOG_LEVEL
- MCP_SERVER_URL
- PORT
- WMATA_PRIMARY_KEY
- TZ
- TRMNL_DB_PATH
- TRMNL_CLIENT_ID
- TRMNL_CLIENT_SECRET
- TRMNL_IP_AUTH_ALLOW_PRIVATE
- API_VERSION
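A minimal `.env` sketch covering the variables above. All values are placeholders, and which variables are required versus optional is an assumption; check the repository's documentation for the authoritative list.

```ini
# Core server settings
PORT=3000
TZ=America/New_York
LOG_LEVEL=info
API_VERSION=v1
ACTIVE_VERSION=v1

# External auth server used for OAuth token introspection
AUTH_SERVER_URL=https://auth.example.com
AUTH_REALM=example-realm
API_CLIENT_ID=your-client-id
API_CLIENT_SECRET=your-client-secret
MCP_SERVER_URL=https://mcp.example.com

# WMATA transit data
WMATA_PRIMARY_KEY=your-wmata-api-key

# TRMNL plugin integration
TRMNL_DB_PATH=/data/trmnl.db
TRMNL_CLIENT_ID=your-trmnl-client-id
TRMNL_CLIENT_SECRET=your-trmnl-client-secret
TRMNL_IP_AUTH_ALLOW_PRIVATE=false

# Discord activity integration
ACTIVITY_DISCORD_CLIENT_ID=your-discord-client-id
ACTIVITY_DISCORD_CLIENT_SECRET=your-discord-client-secret
```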
Security Notes
The server uses `helmet` for security headers, `express-rate-limit` to throttle abusive clients, and `better-sqlite3` with prepared statements to prevent SQL injection. The OAuth implementation for MCP endpoints relies on an external authentication server for token introspection, with client secrets stored in environment variables. TRMNL integrations use hashed tokens and can optionally enforce IP allowlisting based on an external service (`usetrmnl.com/api/ips`); that service is a potential single point of failure if it is compromised or unavailable, and the check has an explicit bypass (`TRMNL_IP_AUTH_ALLOW_PRIVATE`) that should not be enabled in production. Overall, the project follows good practices for environment variable usage and input validation.
Similar Servers
Weather-MCP-Server
Provides comprehensive weather information and tools via a Model Context Protocol (MCP) server using FastMCP and WeatherAPI.com.
mcp-local-server
A Model Context Protocol (MCP) server that provides real-time weather data, basic mathematical calculations, and mock alert details to AI agents.
mcp-weather
Serves weather information via the Model Context Protocol (MCP) as a tool for AI agents.
mcpinabox
Builds a Model Context Protocol (MCP) server to expose weather data as tools for AI models like OpenAI's Responses API.