newline-mcp-server
Verified Safe by newline53
Overview
Enables AI agents to interact with the Newline Banking API for managing synthetic accounts, transfers, customers, transactions, and other banking operations via the Model Context Protocol (MCP).
Installation
No command provided
Environment Variables
- NEWLINE_HMAC_KEY
- NEWLINE_PROGRAM_ID
- NEWLINE_BASE_URL
- HTTPS_PROXY
- https_proxy
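As a rough sketch of how `config.ts` might load these variables (names of the function and placeholder values are assumptions, not taken from the repository), falling back to placeholder strings when a variable is unset:

```typescript
// Hypothetical config loader mirroring the documented environment variables.
// Placeholder fallback strings are illustrative; the real config.ts may differ.
interface NewlineConfig {
  hmacKey: string;
  programId: string;
  baseUrl: string;
  proxyUrl?: string;
}

function loadConfig(
  env: Record<string, string | undefined> = process.env
): NewlineConfig {
  return {
    hmacKey: env.NEWLINE_HMAC_KEY ?? "placeholder-hmac-key",
    programId: env.NEWLINE_PROGRAM_ID ?? "placeholder-program-id",
    baseUrl: env.NEWLINE_BASE_URL ?? "placeholder-base-url",
    // Either casing of the proxy variable is honored.
    proxyUrl: env.HTTPS_PROXY ?? env.https_proxy,
  };
}
```

Note that a fallback like this fails at authentication time rather than at startup; a stricter loader would throw immediately when `NEWLINE_HMAC_KEY` is missing.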
Security Notes
The server uses JWT-based authentication with HMAC signing, relying on a secret key (`NEWLINE_HMAC_KEY`). Sensitive keys are loaded from environment variables as recommended, but `config.ts` falls back to default placeholder strings when these variables are unset; this causes authentication failures rather than a direct security compromise, yet is still a configuration risk worth noting. The project explicitly states that external pull requests are not accepted because it operates in a regulated banking environment, and it directs vulnerability reports to a HackerOne program, indicating a strong internal security focus. Proxy support is implemented via `https-proxy-agent`. Running via `npx git+...` executes remote code, which is common practice but carries inherent supply-chain risk.
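To make the auth mechanism concrete, here is a minimal sketch of HS256 JWT signing with Node's built-in `crypto` module. The claim names (`sub`, `iat`) and the idea of signing with `NEWLINE_HMAC_KEY` are assumptions for illustration; the server's actual token shape is not documented here.

```typescript
import { createHmac } from "node:crypto";

// base64url encoding without padding, as required by the JWT spec (RFC 7519).
function base64url(input: string): string {
  return Buffer.from(input).toString("base64url");
}

// Hypothetical token builder: signs header.payload with HMAC-SHA256.
function signJwt(programId: string, hmacKey: string): string {
  const header = base64url(JSON.stringify({ alg: "HS256", typ: "JWT" }));
  const payload = base64url(
    JSON.stringify({ sub: programId, iat: Math.floor(Date.now() / 1000) })
  );
  const signature = createHmac("sha256", hmacKey)
    .update(`${header}.${payload}`)
    .digest("base64url");
  return `${header}.${payload}.${signature}`;
}
```

Because the signature is keyed on the shared secret, a placeholder `NEWLINE_HMAC_KEY` produces syntactically valid tokens that the API will simply reject, which matches the failure mode described above.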
Similar Servers
mcp-server-typescript
This server acts as a Model Context Protocol (MCP) gateway, enabling AI assistants to interact with DataForSEO APIs for various SEO data, including keyword research, SERP analysis, backlink monitoring, and on-page optimization.
chuk-mcp-server
A zero-configuration framework for building high-performance MCP (Model Context Protocol) servers, designed to host tools, resources, and prompts for AI agents (e.g., Claude Desktop). It features automatic cloud detection, multi-server composition, and robust transport options.
mcp-server
An unidentified server-side component, possibly for network communication or management; little can be inferred beyond its name 'mcp-server'.
remote-mcp-server
An MCP server for tracking, listing, and summarizing personal or business expenses, designed to be interacted with via an LLM tool-calling interface.