tiny_chat
Verified Safe
by to-aoki
Overview
A RAG-enabled chat application with a web interface and OpenAI-compatible APIs, allowing users to converse with LLMs, manage document collections in Qdrant, and perform web searches.
Installation
tiny-chat-mcp
Environment Variables
- DB_CONFIG
- LLM_CONFIG
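The expected values of these variables are not documented on this page; a hypothetical sketch, assuming each one points at a configuration file (the file names and the YAML format are assumptions):

```shell
# Hypothetical setup for tiny_chat; consult the project docs for the
# actual file names and schema.
export DB_CONFIG=./db_config.yaml    # assumed: Qdrant connection settings
export LLM_CONFIG=./llm_config.yaml  # assumed: LLM endpoint and API key
```

Keeping API keys in files referenced this way, rather than hardcoding them, matches the approach described in the Security Notes below.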
Security Notes
The server employs Pydantic for API input validation, uses standard libraries for file processing, and explicitly handles API keys via configuration files rather than hardcoding. It uses `streamlit.components.v1.html` and JavaScript for UI features, with attempts to sanitize user-provided text for safe embedding. However, as with any application processing user input and integrating with external LLMs and databases, continuous vigilance against prompt injection and other application-level vulnerabilities is recommended. No direct arbitrary code execution via `eval` or critical hardcoded secrets were identified.
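The sanitize-before-embedding idea mentioned above can be sketched with Python's standard `html` module; this is a hypothetical illustration, not tiny_chat's actual implementation:

```python
import html

def sanitize_for_embedding(user_text: str) -> str:
    """Escape user-provided text before splicing it into an HTML snippet.

    Hypothetical sketch of the sanitization described above; the real
    tiny_chat code may use a different mechanism.
    """
    return html.escape(user_text, quote=True)

# A chat message containing markup is neutralized before it would reach
# something like streamlit.components.v1.html:
message = '<img src=x onerror="alert(1)">'
snippet = f"<div>{sanitize_for_embedding(message)}</div>"
# snippet now contains "&lt;img ..." instead of a live <img> tag.
```

Escaping on output like this addresses HTML/JS injection in the UI, but not prompt injection against the LLM itself, which is why the note above still recommends application-level vigilance.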
Similar Servers
Context-Engine
Context-Engine is a plug-and-play MCP retrieval stack that unifies code indexing, hybrid search, and optional LLM decoding to enable product teams to quickly ship context-aware AI agents for large or fast-changing codebases.
flexible-graphrag
The Flexible GraphRAG MCP Server provides a Model Context Protocol (MCP) interface for AI assistants (like Claude Desktop) to interact with a sophisticated RAG and GraphRAG system for document processing, knowledge graph auto-building, hybrid search, and AI Q&A.
qdrant-loader
Provides advanced Retrieval-Augmented Generation (RAG) capabilities to AI development tools by bridging a Qdrant knowledge base for intelligent search, context understanding, and seamless integration with MCP-compatible tools like Cursor and Claude Desktop. It offers specialized search tools for semantic, hierarchy-aware, and attachment-focused queries.
RagThisCode
Set up a RAG (Retrieval-Augmented Generation) system to chat with the code of any public or private GitHub repository.