ncp
Verified Safe by portel-dev
Overview
NCP acts as a universal adapter and orchestrator for Model Context Protocol (MCP) servers and tools. It provides a unified interface for discovery, execution, and management of diverse tools (local CLI, HTTP APIs, internal plugins/Photons, AI skills) through natural language and structured code interaction, enabling AI agents to interact with the broader digital ecosystem.
Installation
node dist/index-mcp.js
Environment Variables
- NCP_PROFILE
- NCP_WORKING_DIR
- NCP_DEBUG
- NCP_ENABLE_GLOBAL_CLI
- NCP_CLI_AUTOSCAN
- NCP_DISABLE_BACKGROUND_INIT
- NCP_ENABLE_SCHEDULE_MCP
- NCP_ENABLE_MCP_MANAGEMENT
- NCP_ENABLE_SKILLS
- NCP_ENABLE_PHOTON_RUNTIME
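For reference, an MCP client configuration launching NCP with the command above might look like the following sketch. The entry name, working-directory path, and variable values are all illustrative; which environment variables you actually need depends on your setup, and the `args` path assumes you run from the project root after building:

```json
{
  "mcpServers": {
    "ncp": {
      "command": "node",
      "args": ["dist/index-mcp.js"],
      "env": {
        "NCP_PROFILE": "default",
        "NCP_WORKING_DIR": "/path/to/workspace",
        "NCP_DEBUG": "false"
      }
    }
  }
}
```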
Security Notes
NCP features a multi-layered security architecture: a `CodeAnalyzer` and `SemanticValidator` detect dangerous patterns in user-provided code; sandboxed execution environments (`IsolatedVMSandbox`, `SubprocessSandbox`) isolate code execution; a `NetworkPolicyManager` requires elicitation before granting network access; a `SecureCredentialStore` provides secure credential storage; and input validation guards against dangerous shell commands. User consent is elicited for potentially sensitive operations.
Similar Servers
mcphub
An orchestration hub that aggregates, manages, and routes Model Context Protocol (MCP) servers and their tools, providing a centralized interface, user management, OAuth 2.0 authorization server capabilities, and AI-powered tool discovery and routing.
aicode-toolkit
An MCP proxy server that aggregates multiple Model Context Protocol (MCP) servers, enabling on-demand tool discovery and execution, thereby significantly reducing AI agent token usage and improving context window efficiency by loading tools progressively.
1xn-vmcp
An open-source platform for composing, customizing, and extending multiple Model Context Protocol (MCP) servers into a single logical, virtual MCP server, enabling fine-grained context engineering for AI workflows and agents.
modular-mcp
A proxy server that efficiently manages and loads large tool collections from multiple Model Context Protocol (MCP) servers on-demand for LLMs, reducing context overhead.