datalake-mcp-server-client
Verified Safe by BERDataLakehouse
Overview
A Python client library for programmatic interaction with the BERDL Datalake MCP Server, enabling operations like health checks, database/table listing, schema retrieval, and data querying for Delta Lake tables.
Installation
No command provided
Environment Variables
- SERVICE_ROOT_PATH (used only during client generation to influence OpenAPI spec generation from the server's application)
Security Notes
The provided code is for a client library. The generation process (via `run.sh` and `generate_spec_from_local.py`) involves dynamically importing the `create_application` function from the *server's* local repository (`../datalake-mcp-server`) to generate the OpenAPI specification. This implies a trust relationship with the server's source code during the client build step. The generated client code itself uses standard Python practices and the `httpx` library, which is well-regarded for HTTP requests. No 'eval', obfuscation, hardcoded secrets, or unusual network risks are evident within the client library's runtime code. The documentation for `table_select_request` explicitly states that the server's backend builds queries safely, preventing SQL injection.
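The generation workflow described above can be sketched as the following shell fragment. The example value for `SERVICE_ROOT_PATH` and the relative checkout layout are assumptions; `run.sh` and `generate_spec_from_local.py` are the scripts named in the notes above, and the server source must be checked out alongside the client for the import to succeed.

```shell
# Hypothetical regeneration sketch (not runnable outside the repo):
# the server checkout is expected at ../datalake-mcp-server, and
# SERVICE_ROOT_PATH influences the OpenAPI spec emitted from the
# server's create_application function.
export SERVICE_ROOT_PATH=/api/v1   # assumed example value
./run.sh                           # invokes generate_spec_from_local.py
```

Because this step imports and executes the server's code, it should only be run against a server checkout you trust.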
Similar Servers
fastmcp
FastMCP is an ergonomic interface for the Model Context Protocol (MCP), providing a comprehensive framework for building and interacting with AI agents, tools, resources, and prompts across various transports and authentication methods.
zeromcp
A minimal, pure Python Model Context Protocol (MCP) server for exposing tools, resources, and prompts via HTTP/SSE and Stdio transports.
lex
Provides a UK legal research API for AI agents, offering capabilities to search legislation, caselaw, amendments, and explanatory notes using semantic and keyword search, and includes a Model Context Protocol (MCP) server for integration with AI assistants.
sap-datasphere-mcp
AI-powered data exploration, integration, and management for SAP Datasphere environments, enabling natural language interaction for data discovery, metadata exploration, analytics, ETL, and database user management.