langchain-research-assistant
by phisad
Overview
A system where an LLM can query local research papers/notes through a custom MCP server.
Installation
resi --start-server
Environment Variables
- agent_model
Security Notes
The `read_pdf` tool builds paths as `Path.cwd() / file_name` without sanitizing `file_name`. This is a path traversal vulnerability: an attacker, for example via LLM prompt injection, could supply a `file_name` such as `../sensitive_file.txt` to read any file the running process has permission to access. The only guard is a simple suffix check, which could be bypassed, and is irrelevant if the goal is to read non-PDF files.
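A common mitigation is to resolve the requested path and verify it stays inside a fixed base directory before opening it. The sketch below illustrates the idea; `BASE_DIR` and `safe_resolve` are hypothetical names, not part of this project's code.

```python
from pathlib import Path

# Hypothetical base directory where the server stores its PDFs.
BASE_DIR = Path.cwd() / "papers"

def safe_resolve(file_name: str) -> Path:
    """Resolve file_name inside BASE_DIR, rejecting path traversal."""
    candidate = (BASE_DIR / file_name).resolve()
    # resolve() collapses any ../ components, so the result must still
    # sit under BASE_DIR; otherwise the request escaped the sandbox.
    if BASE_DIR.resolve() not in candidate.parents:
        raise ValueError(f"path traversal attempt: {file_name!r}")
    # Enforce the PDF restriction explicitly rather than as an afterthought.
    if candidate.suffix.lower() != ".pdf":
        raise ValueError(f"not a PDF: {file_name!r}")
    return candidate
```

Checking the resolved path (rather than string-matching on `..`) also catches traversal hidden inside nested segments such as `sub/../../etc/passwd`.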
Similar Servers
gpt-researcher
The GPT Researcher MCP Server enables AI assistants to conduct comprehensive web research and generate detailed, factual, and unbiased reports. It supports multi-agent workflows, local document analysis, and integration with external tools via the Model Context Protocol (MCP) for various research tasks.
blz
Provides fast, local documentation search and retrieval for AI agents, using llms.txt files for line-accurate citations.
Docker_MCPGUIApp
This repository provides a starter template for building full-stack AI assistants that integrate with real-world tools using Docker MCP Gateway and a Large Language Model.
thinkingcap
A multi-agent research MCP server that runs multiple LLM providers in parallel and synthesizes their responses to a given query.