MCP_Catalog
Verified Safe, by Swissbit92
Overview
A local, persona-driven chat interface for interacting with various Model Context Protocol (MCP) servers (e.g., GraphRAG, Knowledge Graph) through LLM inference powered by Ollama.
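For context, Ollama serves models over a local HTTP API. A minimal sketch of a single persona-driven chat turn might look like the following; the persona text, model name, and prompt are illustrative and not taken from this project's code:

```python
import requests

# Sketch of one chat turn against a local Ollama instance.
# The persona string and model name are assumptions for illustration;
# the app's actual prompt assembly may differ.
OLLAMA_URL = "http://127.0.0.1:11434/api/chat"

persona = "You are a concise research assistant."  # e.g. loaded from PERSONA_DIR

response = requests.post(
    OLLAMA_URL,
    json={
        "model": "llama3.1",  # e.g. the model named by PERSONA_MODEL
        "messages": [
            {"role": "system", "content": persona},
            {"role": "user", "content": "Summarize the indexed project docs."},
        ],
        "stream": False,
    },
    timeout=120,
)
print(response.json()["message"]["content"])
```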
Installation
`python run_react.py`

Environment Variables
- COORD_URL
- PERSONA_MODEL
- PERSONA_DIR
- PERSONA_TEMPERATURE
- APP_LOGO_PATH
- USER_AVATAR
- COORDINATOR_DB_PATH
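A sketch of how these variables might be read at startup follows. Every fallback value below is an assumption for illustration (only the `COORD_URL` default is stated elsewhere in this page), not a documented default of MCP_Catalog:

```python
import os

# Illustrative startup configuration; the fallbacks are assumptions,
# except COORD_URL, whose localhost default is noted under Security Notes.
COORD_URL = os.environ.get("COORD_URL", "http://127.0.0.1:8000")
PERSONA_MODEL = os.environ.get("PERSONA_MODEL", "llama3.1")
PERSONA_DIR = os.environ.get("PERSONA_DIR", "./personas")
PERSONA_TEMPERATURE = float(os.environ.get("PERSONA_TEMPERATURE", "0.7"))
APP_LOGO_PATH = os.environ.get("APP_LOGO_PATH", "./assets/logo.png")
USER_AVATAR = os.environ.get("USER_AVATAR", "user")
COORDINATOR_DB_PATH = os.environ.get("COORDINATOR_DB_PATH", "./coordinator.db")
```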
Security Notes
The application is designed for local-first use, which reduces typical web-security concerns. Backend API calls are restricted to localhost by default (`http://127.0.0.1:8000`), and database interactions use parameterized queries, mitigating SQL injection. The UI injects JavaScript via `streamlit.components.v1.html`; this would be a risk in a multi-user, web-hosted deployment, but is acceptable for a local single-user tool whose source the user controls. LLM outputs are processed, but the general risks of LLM-generated content (e.g., unintended instructions such as prompt injection) remain.
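As an illustration of the parameterized-query pattern mentioned above, here is a minimal sketch; the table and column names are hypothetical, not the project's actual schema:

```python
import sqlite3

# Hypothetical example of the parameterized-query style described above.
# The "messages" table and its columns are illustrative only.
conn = sqlite3.connect("coordinator.db")  # e.g. the file named by COORDINATOR_DB_PATH
conn.execute("CREATE TABLE IF NOT EXISTS messages (role TEXT, content TEXT)")
conn.execute(
    "INSERT INTO messages (role, content) VALUES (?, ?)",  # placeholders, not string formatting
    ("user", "Hello, coordinator!"),
)
conn.commit()
conn.close()
```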
Similar Servers
osaurus
Osaurus is a native macOS server for running local language models, offering OpenAI- and Ollama-compatible APIs, tool calling, and a plugin ecosystem for AI agents.
mcp-client-for-ollama
An interactive Python client for connecting local Ollama LLMs to Model Context Protocol (MCP) servers, enabling advanced tool use and workflow automation.
rag-server-mcp
Provides Retrieval Augmented Generation (RAG) capabilities to Model Context Protocol (MCP) clients by indexing local project documents and retrieving relevant information for LLMs.
ollama-fastmcp-wrapper
A proxy service that bridges Ollama models with FastMCP servers, enabling tool-augmented reasoning with local LLMs and persistent conversational history via an API or CLI.