Local-Qwen-2.5-with-DeepSeek-OCR-vLLM-MCP
Verified Safe by WayBob
Overview
An AI agent system that integrates a local or remote Qwen 2.5 LLM with DeepSeek OCR, exposed as an MCP server for multimodal tasks.
Installation
```
uv run src/deepseek_ocr/server.py --port 3002
```

Environment Variables
- OPENAI_API_KEY
- ANTHROPIC_API_KEY
- GOOGLE_API_KEY
- DEEPSEEK_API_KEY
- XAI_API_KEY
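For local development, these variables can be supplied via a `.env` file. A minimal sketch (the placeholder values are illustrative, not keys or defaults from this project):

```shell
# .env -- keep out of version control (add to .gitignore)
OPENAI_API_KEY=sk-your-openai-key
ANTHROPIC_API_KEY=your-anthropic-key
GOOGLE_API_KEY=your-google-key
DEEPSEEK_API_KEY=your-deepseek-key
XAI_API_KEY=your-xai-key
```

Only the keys for the providers you actually use need to be set.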
Security Notes
API keys are loaded from environment variables (e.g., a `.env` file), which is good practice. The system spans multiple networked components (the vLLM servers, the MCP server, and the client), so each should be protected by appropriate firewall rules. The `process_local_image_for_ocr` tool accepts a local file path; a malicious LLM prompt could supply an arbitrary path and cause the server to attempt to open arbitrary local files. However, the `Image.open` function used for processing only parses image formats, which limits the risk of arbitrary code execution or significant information disclosure from non-image files. Finally, the MCP server launch commands configured in `mcp_config.json` could pose a risk if not controlled by a trusted administrator, though this is a deployment-time configuration concern.
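The file-path risk described above can be reduced further by validating the path before it ever reaches `Image.open`. A minimal sketch (the function name, the allow-listed directory, and the extension set are illustrative assumptions, not part of this project's code):

```python
from pathlib import Path

# Hypothetical allow-listed directory for OCR input images.
ALLOWED_DIR = Path("/tmp/ocr_inbox")

def is_safe_image_path(raw_path: str, allowed_dir: Path = ALLOWED_DIR) -> bool:
    """Accept a path only if it resolves inside allowed_dir and looks like an image."""
    candidate = Path(raw_path).resolve()
    try:
        # Raises ValueError if candidate is not contained in allowed_dir.
        candidate.relative_to(allowed_dir.resolve())
    except ValueError:
        return False
    return candidate.suffix.lower() in {".png", ".jpg", ".jpeg", ".webp", ".bmp"}
```

Because `Path.resolve()` normalizes `..` components and symlinks before the containment check, this also defeats simple path-traversal inputs such as `/tmp/ocr_inbox/../../etc/passwd`.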
Similar Servers
lemonade
Lemonade Server is a high-performance C++ HTTP server that provides local, OpenAI-compatible API endpoints for AI inference tasks, including large language models (LLMs), embeddings, reranking, and audio transcription, with a focus on AMD Ryzen AI hardware acceleration.
remembrances-mcp
Provides long-term memory, knowledge base, and semantic code indexing capabilities for AI agents.
mcpserve
A server for deploying AI/ML models, providing shell access and containerization features for development and remote access.
mcp-ocr
Provides an OCR server leveraging Kimi API for image text extraction via the Model Context Protocol (MCP).