MCP-Servers
by ankitpokhrel08
Overview
A multi-tool chatbot connecting specialized servers for math, expense management, and animation creation, driven by an LLM.
Installation
uv run streamlit run streamlit_app.py
Environment Variables
- GOOGLE_API_KEY
- GEMINI_API_KEY
- MANIM_EXECUTABLE
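These variables can be exported in the shell before launching the app. A minimal setup sketch assuming a POSIX shell; the values are placeholders, not real credentials or paths:

```shell
# Placeholder values — substitute your own credentials and paths.
export GOOGLE_API_KEY="your-google-api-key"
export GEMINI_API_KEY="your-gemini-api-key"
export MANIM_EXECUTABLE="/path/to/manim"   # full path to the Manim binary

uv run streamlit run streamlit_app.py
```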
Security Notes
- The application relies on hardcoded absolute paths for local executables (uv, python3) and server scripts (math_server.py, manim_server.py). These paths are specific to the developer's system, which causes setup friction and poses a risk if they resolve to missing or malicious executables on another machine.
- It depends on a separately cloned Manim server (manim-mcp-server) whose code is not included here for audit.
- The remote 'expense' server introduces an external dependency with its own security profile.
- The provided math_server.py tools appear to handle inputs safely via type hints, but spawning external processes in general warrants careful review.
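One way to reduce the hardcoded-path risk is to resolve executables from environment variables (the README already lists MANIM_EXECUTABLE) with a fallback to a PATH lookup. The sketch below is not the repository's actual code; the helper name and the UV_PATH variable are hypothetical:

```python
import os
import shutil

def resolve_executable(env_var: str, default_name: str) -> str:
    """Resolve a tool path from an environment variable, falling back to PATH."""
    # Prefer an explicit override (e.g. UV_PATH=/usr/local/bin/uv).
    override = os.environ.get(env_var)
    if override and os.path.isfile(override) and os.access(override, os.X_OK):
        return override
    # Otherwise search the user's PATH instead of hardcoding an absolute path.
    found = shutil.which(default_name)
    if found is None:
        raise FileNotFoundError(
            f"Could not locate '{default_name}'; set {env_var} to its full path."
        )
    return found
```

Usage would look like `uv_path = resolve_executable("UV_PATH", "uv")`, which fails loudly with an actionable message rather than silently invoking whatever sits at a baked-in path.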
Similar Servers
Docker_MCPGUIApp
This repository provides a starter template for building full-stack AI assistants that integrate with real-world tools using Docker MCP Gateway and a Large Language Model.
tiny_chat
A RAG-enabled chat application that integrates with various LLM backends (OpenAI, Ollama, vLLM) and a Qdrant vector database, offering web search capabilities and an OpenAI-compatible API.
fastchat-mcp
A Python client for integrating Language Models with Model Context Protocol (MCP) servers, allowing natural language interaction with external tools, resources, and prompts.
Enterprise-Multi-AI-Agent-Systems-
Orchestrates multiple AI agents for complex reasoning and real-time information retrieval, integrating large language models with web search capabilities.