pysisense-mcp-chatbot
by hnegi01
Overview
Automates Sisense deployment interactions and migrations via an AI assistant with tool-calling capabilities.
Installation
docker compose up --build --force-recreate
Environment Variables
- LLM_PROVIDER
- AZURE_OPENAI_ENDPOINT
- AZURE_OPENAI_DEPLOYMENT
- AZURE_OPENAI_API_KEY
- DATABRICKS_HOST
- DATABRICKS_TOKEN
- LLM_ENDPOINT
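The variables above can be supplied via a `.env` file that Docker Compose picks up automatically. The sketch below is illustrative only; every value is a placeholder, and which variables are required depends on the chosen LLM_PROVIDER (Azure OpenAI vs. Databricks):

```shell
# Hypothetical .env sketch -- all values are placeholders, not real credentials.
LLM_PROVIDER=azure                # assumption: 'azure' or 'databricks' selects the backend

# Used when LLM_PROVIDER targets Azure OpenAI
AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com/
AZURE_OPENAI_DEPLOYMENT=your-deployment-name
AZURE_OPENAI_API_KEY=replace-with-your-key

# Used when LLM_PROVIDER targets Databricks
DATABRICKS_HOST=https://your-workspace.cloud.databricks.com
DATABRICKS_TOKEN=replace-with-your-token
LLM_ENDPOINT=your-serving-endpoint-name
```

Keep this file out of version control (e.g. via `.gitignore`), since it carries API keys.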
Security Notes
The system uses LLMs, which introduces inherent prompt-injection risk. This is partially mitigated by a confirmation loop for mutating actions and an optional 'no summarization' privacy mode. The CORS middleware in both the `mcp_server` and `backend` uses `allow_origins=["*"]`, which is generally unsafe in production without external access controls; the README, however, explicitly advises deploying behind an organization's authentication/SSO and restricting network access. Sensitive Sisense API tokens and LLM API keys are handled via environment variables and session state rather than persisted in the UI, reducing the risk of hardcoded credentials. No direct `eval` calls or obvious malicious code patterns were found.
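One way to harden the `allow_origins=["*"]` setting is to replace the wildcard with an explicit allow-list. The sketch below is a minimal illustration of that idea; the names `ALLOWED_ORIGINS` and `origin_allowed` are hypothetical, not taken from the repo:

```python
# Hypothetical sketch: an explicit CORS origin allow-list instead of "*".
# Both origins below are example values (assumptions, not from the repo).
ALLOWED_ORIGINS = {
    "https://chatbot.example.com",   # example: deployed UI origin
    "http://localhost:8501",         # example: local development origin
}

def origin_allowed(origin: str) -> bool:
    """Return True only for origins on the explicit allow-list."""
    return origin in ALLOWED_ORIGINS

# In a FastAPI/Starlette app, the same set would be passed to the middleware:
#   app.add_middleware(CORSMiddleware, allow_origins=sorted(ALLOWED_ORIGINS), ...)

print(origin_allowed("https://chatbot.example.com"))  # True
print(origin_allowed("https://evil.example.net"))     # False
```

This keeps browsers from granting cross-origin access to arbitrary sites while still serving the known front ends; it complements, rather than replaces, the SSO/network restrictions the README recommends.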
Similar Servers
NiFiMCP
Provides a natural language chat interface for interacting with Apache NiFi instances, enabling users to retrieve information, document flows, and perform creation, modification, and operational actions on NiFi components using Large Language Models and custom tools.
Docker_MCPGUIApp
This repository provides a starter template for building full-stack AI assistants that integrate with real-world tools using Docker MCP Gateway and a Large Language Model.
tiny_chat
A RAG-enabled chat application that integrates with various LLM backends (OpenAI, Ollama, vLLM) and a Qdrant vector database, offering web search capabilities and an OpenAI-compatible API.
Enterprise-Multi-AI-Agent-Systems-
Orchestrates multiple AI agents for complex reasoning and real-time information retrieval, integrating large language models with web search capabilities.