mcp-local-assistant
Verified Safe · by Prakashbishal
Overview
Automates the extraction of multiple-choice questions from university exam PDF/PPTX files and generates answers using a local Ollama LLM.
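In rough terms, the extract-then-answer pipeline could be sketched as below. The function names, the `Q1.`/`Q2.` question-marker pattern, the model name, and the prompt are illustrative assumptions, not the server's actual API; only the use of a local Ollama instance comes from this listing.

```python
# Hedged sketch: split exam text into MCQ blocks, then ask a local Ollama
# model (http://localhost:11434) to answer each one.
import json
import re
import urllib.request

def extract_mcqs(text: str) -> list[str]:
    """Split raw page text into question blocks (assumes 'Q1.', 'Q2.' markers)."""
    blocks = re.split(r"\n(?=Q\d+\.)", text)
    return [b.strip() for b in blocks if re.match(r"Q\d+\.", b.strip())]

def ask_ollama(question: str, model: str = "llama3") -> str:
    """Send one MCQ to a local Ollama instance via its /api/generate endpoint."""
    payload = json.dumps({
        "model": model,
        "prompt": f"Answer with the letter of the correct option only:\n{question}",
        "stream": False,
    }).encode()
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"].strip()

sample = "Q1. What is 2+2?\nA) 3\nB) 4\n\nQ2. Capital of France?\nA) Paris\nB) Rome"
print(extract_mcqs(sample))
```

In a real run, text would come from `pypdf`'s page `extract_text()` or `python-pptx` shape text rather than a hard-coded sample.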
Installation
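Before starting the server, a typical local setup might look like the following sketch. The model name is an assumption; the package names come from the document-parsing libraries mentioned in the security notes.

```shell
# Hedged setup sketch, assuming Ollama is already installed locally.
pip install pypdf python-pptx   # document-parsing libraries named below
ollama pull llama3              # any model available to the local Ollama instance
```
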
python mcq_answer_mcp_server.py

Security Notes
All file operations use strict path sanitization to prevent directory traversal. The server interacts with a local Ollama instance for LLM inference, a controlled local-network dependency. No arbitrary code execution functions (such as eval or exec) were found in the core mcq_answer_mcp.py implementation. The reliance on standard document-parsing libraries (pypdf, python-pptx) is typical and considered reasonably safe.
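The strict path sanitization described above usually means resolving a requested path and rejecting it if it escapes an allowed base directory. A minimal sketch, assuming a hypothetical BASE_DIR and helper name (not taken from the server's code):

```python
# Hedged sketch of directory-traversal protection: resolve the requested
# path and verify it stays inside an allowed base directory.
from pathlib import Path

BASE_DIR = Path("/tmp/exam_files").resolve()  # illustrative base directory

def sanitize_path(user_path: str) -> Path:
    """Reject any path that escapes BASE_DIR (e.g. via '..' segments)."""
    candidate = (BASE_DIR / user_path).resolve()
    if not candidate.is_relative_to(BASE_DIR):  # Python 3.9+
        raise ValueError(f"path escapes base directory: {user_path}")
    return candidate

print(sanitize_path("exam1.pdf"))       # resolves inside BASE_DIR
try:
    sanitize_path("../../etc/passwd")   # traversal attempt
except ValueError as e:
    print("blocked:", e)
```

Resolving both the base and the candidate (rather than string-prefix checks) handles `..` segments and symlinked parents uniformly.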
Similar Servers
5ire
A desktop AI assistant client that integrates with various LLM providers and connects to Model Context Protocol (MCP) servers for extended tool-use and knowledge base capabilities.
MCP-buddy
A local desktop or web application to manage and interact with multiple MCP (Model Context Protocol) servers, offering optional AI orchestration and enhancement for responses.
mcp-file-assistant-workshop
Builds an AI-powered file assistant server using Model Context Protocol for intelligent interaction with local files.
photons
A comprehensive demonstration MCP server showcasing various functionalities of the Photon runtime, including basic data handling, streaming responses, progress reporting, in-memory state management, and interactive UI elements. It serves as a reference for developers building new photons.