openops
Verified Safe by openops-cloud
Overview
OpenOps is a No-Code FinOps automation platform that helps organizations reduce cloud costs, streamline financial operations, and automate key FinOps processes through customizable workflows and AI capabilities.
Installation
```shell
npm i --no-save && docker compose up -d --wait && npm run dev
```
Environment Variables
- OPS_PUBLIC_URL
- OPS_FRONTEND_URL
- OPS_ENCRYPTION_KEY
- OPS_JWT_SECRET
- OPS_OPENOPS_ADMIN_EMAIL
- OPS_OPENOPS_ADMIN_PASSWORD
- OPS_POSTGRES_USERNAME
- OPS_POSTGRES_PASSWORD
- OPS_POSTGRES_DATABASE
- OPS_ANALYTICS_ADMIN_PASSWORD
- ANALYTICS_POWERUSER_PASSWORD
- OPS_REDIS_HOST
- OPS_REDIS_PORT
- OPS_ENGINE_URL
- OPS_OPENOPS_TABLES_PUBLIC_URL
- OPS_OPENOPS_TABLES_API_URL
- OPS_REQUEST_BODY_LIMIT
- OPS_ENABLE_HOST_SESSION
- HOST_AZURE_CONFIG_DIR
- HOST_CLOUDSDK_CONFIG
- AZURE_API_VERSION
- LANGFUSE_SECRET_KEY
- LANGFUSE_PUBLIC_KEY
- LANGFUSE_HOST
- OPS_EXECUTION_MODE
- OPS_BLOCKS_SOURCE
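A minimal `.env` sketch covering a few of the variables above. The variable names come from the list; every value shown is a placeholder of our own invention and must be replaced with real credentials and endpoints for your deployment.

```shell
# Placeholder values only -- replace before use.
OPS_PUBLIC_URL=http://localhost:80
OPS_FRONTEND_URL=http://localhost:80
OPS_ENCRYPTION_KEY=replace-with-a-random-hex-string
OPS_JWT_SECRET=replace-with-a-random-secret
OPS_OPENOPS_ADMIN_EMAIL=admin@example.com
OPS_OPENOPS_ADMIN_PASSWORD=replace-me
OPS_POSTGRES_USERNAME=openops
OPS_POSTGRES_PASSWORD=replace-me
OPS_POSTGRES_DATABASE=openops
OPS_REDIS_HOST=localhost
OPS_REDIS_PORT=6379
```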
Security Notes
The platform inherently executes external commands for integrations (e.g., `gcloud`, `az`, `aws`) via `child_process.spawn` and `execFile`. `eval` is used for dynamic module loading of blocks, but only in development mode, which limits the risk. The hardcoded secrets in `deploy/helm/openops/values.yaml` are explicitly marked as placeholders (`please-change-this-secret`) that users must replace, which is good practice. `docker-entrypoint.sh` and `tools/link-packages.sh` perform sensitive actions such as `npm install` and `rm -rf node_modules`, but these typically run in trusted build/deployment environments.
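As an illustration of the pattern described above (not OpenOps source code), `execFile` takes the binary and its arguments as a separate array, so untrusted input is never interpolated through a shell. The `runCli` helper below is a hypothetical sketch:

```typescript
// Illustrative sketch, not OpenOps code: invoking an external CLI the way
// the platform does for integrations. execFile passes arguments as an
// array, avoiding shell interpolation of untrusted input.
import { execFile } from "node:child_process";
import { promisify } from "node:util";

const execFileAsync = promisify(execFile);

// Hypothetical helper: run a CLI command and return its trimmed stdout.
async function runCli(binary: string, args: string[]): Promise<string> {
  const { stdout } = await execFileAsync(binary, args, { timeout: 30_000 });
  return stdout.trim();
}

// Example usage (assumes `node` is on PATH):
runCli("node", ["--version"]).then(console.log).catch(console.error);
```

With real integrations, `binary` would be something like `aws` or `az`, and a failure (non-zero exit code or timeout) rejects the promise so the workflow can surface the error.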
Similar Servers
n8n
AI-powered workflow automation platform, enabling users to build and run workflows using various integrations, with a focus on AI models and tools for task execution and conversational agents.
activepieces
An all-in-one AI automation platform designed to be extensible, serving as an open-source replacement for Zapier. It enables users to build AI-driven workflows and integrations using a type-safe TypeScript framework, and functions as a comprehensive MCP toolkit for connecting LLMs to various services.
flux-operator
The Flux Operator MCP Server acts as a bridge for AI assistants, allowing them to manage and troubleshoot GitOps pipelines and Kubernetes resources controlled by FluxCD through natural language interactions.
ironmanus-mcp
Orchestrates AI workflows with an 8-phase control flow and specialized tools, serving as a Model Context Protocol (MCP) server.