doccura
by chironsb
Overview
Doccura is a local RAG system providing a terminal interface and an MCP server for document-based question answering and general chat with Ollama.
Installation
Run the MCP server with `bun run src/mcp/server.ts`.
Environment Variables
- OLLAMA_ENDPOINT
- OLLAMA_MODEL
- ENABLE_THINKING
- RAG_CHUNK_SIZE
- RAG_CHUNK_OVERLAP
- RAG_MAX_RESULTS
- RAG_SIMILARITY_THRESHOLD
- EMBEDDING_MODEL
- CHROMA_URL
- CHROMA_PATH
- DOCUMENTS_PATH
- EMBEDDINGS_CACHE_PATH
- MAX_FILE_SIZE_MB
- PERSONALITY_FILE
- RAG_PERSONALITY_FILE
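A minimal sketch of how these variables might be set before launching the server. Every value below is an illustrative assumption (placeholder endpoints, model names, and tuning numbers), not a documented Doccura default:

```shell
# Illustrative values only -- substitute your own endpoints, models, and paths.
export OLLAMA_ENDPOINT="http://localhost:11434"   # assumed local Ollama URL
export OLLAMA_MODEL="llama3"                      # placeholder chat model name
export EMBEDDING_MODEL="nomic-embed-text"         # placeholder embedding model name
export RAG_CHUNK_SIZE=1000                        # assumed characters per chunk
export RAG_CHUNK_OVERLAP=200                      # assumed overlap between chunks
export RAG_MAX_RESULTS=5                          # assumed retrieval result count
export CHROMA_URL="http://localhost:8000"         # assumed Chroma server URL
export DOCUMENTS_PATH="./documents"               # assumed documents directory
```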
Security Notes
The MCP server exposes an `upload_document` tool that takes a `filePath` argument directly from the client. This allows an external MCP client (which could be malicious) to specify any file path on the server's local filesystem, which could lead to:
1. Information Disclosure: an attacker could force the server to read and process sensitive system files (e.g., configuration files, credentials).
2. Denial of Service: uploading very large or malformed files could consume excessive resources or crash the PDF/TXT processing components.
The reliance on `fs.existsSync` and `fs.readFileSync` with client-provided paths is a critical vulnerability. While `child_process.execSync` is used, it is primarily for internal startup checks and script execution, not directly exposed to user input.
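One common mitigation for this class of vulnerability is to resolve the client-supplied path and reject anything that escapes a configured documents root. The sketch below is illustrative, not Doccura's actual code; `isPathAllowed` and `documentsRoot` are hypothetical names:

```typescript
import { resolve, sep } from "node:path";

// Hypothetical guard: allow only paths that resolve inside the documents root.
// Relative inputs are resolved against the root; absolute inputs and `..`
// traversal that land outside it are rejected.
function isPathAllowed(filePath: string, documentsRoot: string): boolean {
  const root = resolve(documentsRoot);
  const target = resolve(root, filePath);
  return target === root || target.startsWith(root + sep);
}

// A file inside the root passes; traversal and absolute system paths do not.
isPathAllowed("notes/readme.txt", "/srv/docs");  // inside root -> allowed
isPathAllowed("../../etc/passwd", "/srv/docs");  // escapes root -> rejected
```

Pairing a check like this with a file-size limit (cf. `MAX_FILE_SIZE_MB`) would also narrow the denial-of-service surface.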
Similar Servers
mcp-client-for-ollama
An interactive Python client for connecting local Ollama LLMs to Model Context Protocol (MCP) servers, enabling advanced tool use and workflow automation.
mcp-local-rag
Provides a local RAG-like web search capability for LLMs through the Model Context Protocol without external APIs.
mcp-local-rag
A privacy-first, local document search server that leverages semantic search for Model Context Protocol (MCP) clients.
rag-server-mcp
Provides Retrieval Augmented Generation (RAG) capabilities to Model Context Protocol (MCP) clients by indexing local project documents and retrieving relevant information for LLMs.