deep-research
by u14app
Overview
Generates comprehensive, AI-powered deep research reports by leveraging various LLMs and web search engines, with local knowledge base integration and report artifact editing.
Installation
docker compose up
Environment Variables
- ACCESS_PASSWORD
- GOOGLE_GENERATIVE_AI_API_KEY
- OPENROUTER_API_KEY
- OPENAI_API_KEY
- ANTHROPIC_API_KEY
- DEEPSEEK_API_KEY
- XAI_API_KEY
- MISTRAL_API_KEY
- AZURE_API_KEY
- GOOGLE_CLIENT_EMAIL
- GOOGLE_PRIVATE_KEY
- GOOGLE_PRIVATE_KEY_ID
- OPENAI_COMPATIBLE_API_KEY
- TAVILY_API_KEY
- FIRECRAWL_API_KEY
- EXA_API_KEY
- BOCHA_API_KEY
- MCP_AI_PROVIDER
- MCP_SEARCH_PROVIDER
- MCP_THINKING_MODEL
- MCP_TASK_MODEL
- NODE_ENV
- HEAD_SCRIPTS
- NEXT_PUBLIC_DISABLED_AI_PROVIDER
- NEXT_PUBLIC_DISABLED_SEARCH_PROVIDER
- NEXT_PUBLIC_MODEL_LIST
- NEXT_PUBLIC_BUILD_MODE
- NEXT_PUBLIC_VERSION
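In a self-hosted deployment these variables are typically supplied through the compose environment or a `.env` file. As a rough illustration only (not the project's actual code), a startup check in TypeScript using Zod, which the Security Notes below confirm the project already uses for input validation, could look like the following; the variable names come from the list above, while the schema shape and `loadEnv` helper are hypothetical.

```ts
import { z } from "zod";

// Hypothetical startup-validation sketch; the real project may organize
// its configuration differently. Variable names match the list above.
const envSchema = z.object({
  ACCESS_PASSWORD: z.string().min(1, "ACCESS_PASSWORD must not be empty"),
  // AI provider keys are individually optional; at least one is expected in practice.
  GOOGLE_GENERATIVE_AI_API_KEY: z.string().optional(),
  OPENROUTER_API_KEY: z.string().optional(),
  OPENAI_API_KEY: z.string().optional(),
  ANTHROPIC_API_KEY: z.string().optional(),
  // Search provider keys (optional).
  TAVILY_API_KEY: z.string().optional(),
  FIRECRAWL_API_KEY: z.string().optional(),
  EXA_API_KEY: z.string().optional(),
  NODE_ENV: z.enum(["development", "production", "test"]).default("production"),
});

export type Env = z.infer<typeof envSchema>;

export function loadEnv(): Env {
  const parsed = envSchema.safeParse(process.env);
  if (!parsed.success) {
    // Fail fast with a readable summary of what is missing or malformed.
    throw new Error(`Invalid environment: ${parsed.error.message}`);
  }
  return parsed.data;
}
```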
Security Notes
The project uses Zod for input validation, which is good practice. API keys are managed via environment variables with a multi-key polling mechanism, and proxy endpoints are protected by signature verification based on `ACCESS_PASSWORD`. However, the `/api/crawler` endpoint, once authenticated, fetches an arbitrary URL supplied in the request body, which poses a Server-Side Request Forgery (SSRF) risk if `ACCESS_PASSWORD` is compromised. Processing user-uploaded files (Office, PDF) also adds attack surface for parsing vulnerabilities, and interactions with LLMs are susceptible to prompt injection, a general risk in AI applications. Overall, the application's security relies heavily on the secrecy of `ACCESS_PASSWORD` and the robustness of the third-party libraries used for file parsing and the AI SDKs.
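As one way to reduce the SSRF exposure described above, a crawler-style route can resolve the requested hostname and refuse private, loopback, or link-local targets before fetching. The sketch below is illustrative only and is not the project's `/api/crawler` implementation; the `safeCrawl` function, `isPrivateAddress` helper, and range list are hypothetical.

```ts
import { lookup } from "node:dns/promises";
import { isIP } from "node:net";

// Hypothetical hardening sketch for a crawler-style endpoint.
const PRIVATE_RANGES = [
  /^127\./, /^10\./, /^192\.168\./, /^169\.254\./,
  /^172\.(1[6-9]|2\d|3[01])\./, // IPv4 private and link-local ranges
  /^::1$/, /^f[cd]/i,           // IPv6 loopback and unique-local addresses
];

function isPrivateAddress(address: string): boolean {
  return PRIVATE_RANGES.some((range) => range.test(address));
}

export async function safeCrawl(rawUrl: string): Promise<string> {
  const url = new URL(rawUrl); // throws on malformed input
  if (url.protocol !== "http:" && url.protocol !== "https:") {
    throw new Error("Only http(s) URLs are allowed");
  }
  // Resolve the hostname, then reject private or loopback targets.
  const target = isIP(url.hostname)
    ? url.hostname
    : (await lookup(url.hostname)).address;
  if (isPrivateAddress(target)) {
    throw new Error("Refusing to fetch a private or loopback address");
  }
  const response = await fetch(url, { redirect: "error" }); // block redirect-based bypasses
  return response.text();
}
```

Checks like this narrow the attack surface but do not eliminate it; keeping `ACCESS_PASSWORD` secret remains the primary control, as noted above.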
Similar Servers
gpt-researcher
The GPT Researcher MCP Server enables AI assistants to conduct comprehensive web research and generate detailed, factual, and unbiased reports. It supports multi-agent workflows, local document analysis, and integration with external tools via the Model Context Protocol (MCP) for various research tasks.
modelcontextprotocol
Provides AI assistants with real-time web search, reasoning, and research capabilities through Perplexity's Sonar models and Search API.
kindly-web-search-mcp-server
Provides web search with robust, LLM-optimized content retrieval from various sources (StackExchange, GitHub, Wikipedia, arXiv, and general webpages) for AI coding assistants.
academia_mcp
An MCP server providing tools for searching, fetching, analyzing, and reporting on scientific papers and datasets, often powered by LLMs.