stelae
by Dub1n
Overview
Stelae transforms a local WSL workspace into a single, extensible MCP endpoint for desktop agents, ChatGPT Connectors, and other HTTP/SSE clients, enabling declarative tool overrides and aggregation for AI agent tooling.
Installation
```shell
source ~/.nvm/nvm.sh && make up
```
Environment Variables
- STELAE_DIR
- STELAE_CONFIG_HOME
- PUBLIC_BASE_URL
- CF_TUNNEL_NAME
- CF_CREDENTIALS_FILE
- OPENAI_API_KEY
- GITHUB_TOKEN
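The variables above can be exported before running `make up`. The values below are illustrative placeholders, not defaults shipped with the project:

```shell
# Illustrative values only -- substitute your own paths and secrets.
export STELAE_DIR="$HOME/stelae"                 # assumed workspace root
export STELAE_CONFIG_HOME="$HOME/.config/stelae" # assumed config location
export PUBLIC_BASE_URL="https://mcp.example.com" # public HTTPS endpoint
export CF_TUNNEL_NAME="stelae-tunnel"
export CF_CREDENTIALS_FILE="$HOME/.cloudflared/tunnel.json"
export OPENAI_API_KEY="sk-..."                   # keep out of version control
export GITHUB_TOKEN="ghp_..."
```

Keeping the two API keys in the environment (rather than in config files) matches the project's stated avoidance of hardcoded secrets.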
Security Notes
The server is designed to execute local commands, manage filesystems, and install/remove other MCP servers, which exposes a significant attack surface if not properly secured at the deployment and access control layers. It relies heavily on `subprocess.run` and `urllib.request.urlopen` for core functionalities and external interactions. Configuration changes via `manage_stelae` allow the server to self-modify its operational capabilities. While the code includes some path validation and environment variable expansion, robust external authentication and authorization are critical, especially when exposed publicly via Cloudflare. Hardcoded secrets are generally avoided, with `OPENAI_API_KEY` and `GITHUB_TOKEN` being passed via environment variables.
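A minimal sketch of the kind of guardrails the notes describe, assuming a workspace-confined path check and an allow-listed, shell-free `subprocess.run` wrapper. The names (`WORKSPACE`, `ALLOWED_BINARIES`, both functions) are illustrative, not the project's actual API:

```python
import shlex
import subprocess
from pathlib import Path

WORKSPACE = Path.home() / "stelae"        # hypothetical workspace root
ALLOWED_BINARIES = {"git", "make", "ls"}  # hypothetical command allow-list

def resolve_inside_workspace(user_path: str) -> Path:
    """Reject paths that escape the workspace after symlink/.. resolution."""
    resolved = (WORKSPACE / user_path).resolve()
    if not resolved.is_relative_to(WORKSPACE.resolve()):
        raise PermissionError(f"path escapes workspace: {user_path}")
    return resolved

def run_tool(command: str) -> subprocess.CompletedProcess:
    """Run a command without a shell, only if its binary is allow-listed."""
    argv = shlex.split(command)
    if not argv or argv[0] not in ALLOWED_BINARIES:
        raise PermissionError(f"binary not allowed: {argv[:1]}")
    # shell=False (the default for a list argv) avoids shell injection;
    # a timeout bounds runaway tool invocations.
    return subprocess.run(argv, capture_output=True, text=True, timeout=30)
```

Checks like these limit what a compromised or over-eager agent can reach, but they complement, rather than replace, the external authentication and authorization the notes call for on the Cloudflare-exposed endpoint.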
Similar Servers
jetski
Jetski is an open-source platform providing analytics, authentication, and simplified client setup for Model Context Protocol (MCP) servers by acting as a proxy.
mcpproxy-go
MCPProxy super-charges AI agents with intelligent tool discovery, massive token savings, and built-in security quarantine against malicious Model Context Protocol (MCP) servers.
emceepee
A proxy server enabling AI agents to dynamically connect to and interact with multiple Model Context Protocol (MCP) backend servers, exposing the full MCP protocol via a simplified tool interface or a sandboxed JavaScript execution environment.
docker-mcp-server
A Model Context Protocol (MCP) server for containerized execution and file operations, enabling AI assistants to interact with a Docker environment via HTTP.