# bubble-ai-backend

by Aadhavan-Pachauri
## Overview
An enterprise-grade Model Context Protocol (MCP) server for Bubble AI, providing intelligent web search, a secured LLM proxy, deep-research streaming, caching, and analytics.
## Installation

```shell
docker-compose up
```

## Environment Variables
- TAVILY_API_KEY
- FIRECRAWL_API_KEY
- GEMINI_API_KEY
- OPENROUTER_API_KEY
- JWT_SECRET
- SUPABASE_URL
- SUPABASE_ANON_KEY
- SUPABASE_SERVICE_KEY
- WEB_SEARCH_MCP_URL
- BUBBLE_SEARCH_URL
- LOG_LEVEL
- ENABLE_METRICS
- METRICS_PORT
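As a starting point, a `.env` file for `docker-compose` might look like the sketch below. All values are placeholders, and the per-variable comments infer each variable's purpose from its name rather than from the project's documentation.

```shell
# Sketch of a .env file; every value is a placeholder and each comment is an
# inference from the variable name, not confirmed project behavior.
TAVILY_API_KEY=your-tavily-key          # web search provider
FIRECRAWL_API_KEY=your-firecrawl-key    # crawling/scraping provider
GEMINI_API_KEY=your-gemini-key          # LLM provider
OPENROUTER_API_KEY=your-openrouter-key  # LLM routing provider
JWT_SECRET=change-me                    # secret for signing/verifying tokens
SUPABASE_URL=https://your-project.supabase.co
SUPABASE_ANON_KEY=your-anon-key
SUPABASE_SERVICE_KEY=your-service-key
WEB_SEARCH_MCP_URL=http://localhost:3001   # placeholder port
BUBBLE_SEARCH_URL=http://localhost:3002    # placeholder port
LOG_LEVEL=info                          # e.g. debug | info | warn | error
ENABLE_METRICS=true
METRICS_PORT=9090
```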
## Security Notes
The `child_process.exec` call in `api/mcp-orchestrator.js` starts dependent services. The commands and working directories are derived from internal configuration, which mitigates direct injection from user input, but a supply-chain risk remains if the `package.json` scripts of the forked services (`web-search-mcp`, `bubble-search`) were ever to become malicious. Additionally, CORS is configured with `Access-Control-Allow-Origin: '*'`, which is broadly permissive; in production it is more secure to restrict this header to known origins.
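As one hedged illustration of the CORS point, the sketch below reflects the `Access-Control-Allow-Origin` header off an explicit allowlist instead of using `'*'`. The origin list and the middleware wiring are assumptions for the example, not the project's actual code.

```javascript
// Sketch: restrict CORS to an explicit allowlist rather than '*'.
// The origins below are illustrative placeholders.
const ALLOWED_ORIGINS = new Set([
  'https://app.example.com',
  'http://localhost:3000',
]);

// Return the value to set on Access-Control-Allow-Origin,
// or null to omit the header entirely for unknown origins.
function corsOriginFor(requestOrigin) {
  return ALLOWED_ORIGINS.has(requestOrigin) ? requestOrigin : null;
}

// Express-style middleware wiring (hypothetical, for illustration).
function corsMiddleware(req, res, next) {
  const origin = corsOriginFor(req.headers.origin);
  if (origin) {
    res.setHeader('Access-Control-Allow-Origin', origin);
    res.setHeader('Vary', 'Origin'); // caches must key responses on Origin
  }
  next();
}
```

Echoing the request's own origin (when allowlisted) rather than a fixed value keeps the header valid for multiple trusted origins, since the spec permits only one origin per response.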