
nginx-ui

by 0xJacky

Overview

Nginx Log Analysis and Management UI with AI Assistant features for configuration, monitoring, and debugging across clustered Nginx instances.

Installation

Run Command
docker-compose -f docker-compose-demo.yml up

Environment Variables

  • NGINX_UI_NODE_DEMO
  • NGINX_UI_INIT_PASSWORD
  • NGINX_UI_DB_DSN
  • NGINX_UI_LOGROTATE_ENABLED
  • NGINX_UI_LOGROTATE_INTERVAL
  • NGINX_UI_OPENAI_API_KEY
  • NGINX_UI_OPENAI_MODEL
  • NGINX_UI_NGINX_CONFIG_DIR
  • NGINX_UI_NGINX_ACCESS_LOG_PATH
  • NGINX_UI_NGINX_ERROR_LOG_PATH
  • NGINX_UI_NGINX_PID_PATH
  • NGINX_UI_NODE_ID
  • NGINX_UI_NODE_NAME
  • NGINX_UI_NODE_URL
  • NGINX_UI_NODE_TOKEN
  • NGINX_UI_CASDOOR_ENDPOINT
  • NGINX_UI_CASDOOR_CLIENT_ID
  • NGINX_UI_CASDOOR_CLIENT_SECRET
  • NGINX_UI_CASDOOR_CERTIFICATE_PATH
  • NGINX_UI_CASDOOR_ORGANIZATION
  • NGINX_UI_CASDOOR_APPLICATION
  • NGINX_UI_CERT_EMAIL
  • NGINX_UI_CERT_CA_DIR
  • NGINX_UI_HTTP_HOST
  • NGINX_UI_HTTP_PORT
  • NGINX_UI_HTTP_ENABLE_HTTPS
  • NGINX_UI_HTTP_SSL_CERT
  • NGINX_UI_HTTP_SSL_KEY
  • NGINX_UI_WEBAUTHN_RP_ID
  • NGINX_UI_WEBAUTHN_RP_ORIGINS
  • NGINX_UI_TERMINAL_WS_ORIGIN
  • NGINX_UI_BACKUP_S3_REGION
  • NGINX_UI_BACKUP_S3_ACCESS_KEY_ID
  • NGINX_UI_BACKUP_S3_SECRET_ACCESS_KEY
  • NGINX_UI_BACKUP_S3_ENDPOINT
  • LEGO_DISABLE_CNAME_SUPPORT
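As a sketch of how these settings can be supplied, docker-compose reads variables from a `.env` file in the project directory. The variable names below come from the list above; the values are placeholders, not recommendations:

```shell
# Write a minimal .env file for docker-compose (values are illustrative only).
cat > .env <<'EOF'
NGINX_UI_HTTP_HOST=0.0.0.0
NGINX_UI_HTTP_PORT=9000
NGINX_UI_LOGROTATE_ENABLED=true
NGINX_UI_LOGROTATE_INTERVAL=1440
EOF

# Sanity check: count the nginx-ui variables we just defined.
grep -c '^NGINX_UI_' .env   # prints 4
```

The same variables can equally be passed with `-e` flags to `docker run` or under `environment:` in the compose file itself.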

Security Notes

CRITICAL VULNERABILITY: The application hashes a secret key with MD5 and uses the digest to derive the AES key for encrypting sensitive data such as OTP secrets. MD5 is cryptographically broken and must never be used for key derivation, leaving the encrypted data highly vulnerable.

Additional Concerns:

  • Path validation is implemented for file operations (config, logs, certs), but the robustness of `helper.CopyFile` in `api/streams/duplicate.go` and of `internal/backup.Restore` is critical. Without a deeper audit, there is a potential risk of path traversal or arbitrary file overwrites during these sensitive operations.
  • Remote node synchronization relies on secure token management; a compromised token would grant control over remote Nginx instances.
  • The LLM integration carries inherent prompt-injection risk, even though system prompts are used to guide the AI.
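To illustrate the MD5 concern above (an assumption about the shape of the flaw; the project's exact derivation code is not reproduced here): MD5 produces a fixed 128-bit digest that is extremely fast to compute, so an attacker can brute-force the underlying secret cheaply. A deliberate, iterated KDF such as PBKDF2 raises that cost by orders of magnitude. The salt and iteration count below are illustrative; the PBKDF2 command requires OpenSSL 3:

```shell
# Weak: a single unsalted MD5 of the secret as key material.
# 128 bits, computable at billions of guesses per second on commodity GPUs.
printf '%s' 'example-secret' | md5sum | awk '{print $1}'

# Stronger: PBKDF2 with a salt and a high iteration count (OpenSSL 3 syntax).
# Produces a 256-bit key and makes each brute-force guess ~600,000x costlier.
openssl kdf -keylen 32 -kdfopt digest:SHA2-256 -kdfopt pass:example-secret \
  -kdfopt salt:random-salt -kdfopt iter:600000 PBKDF2
```

In-application code would use the language's native PBKDF2/scrypt/Argon2 bindings with a per-installation random salt rather than shelling out to OpenSSL.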

Stats

Interest Score: 100
Security Score: 4
Cost Class: High
Avg Tokens: 500
Stars: 10098
Forks: 729
Last Update: 2025-12-06

Tags

Nginx, Log Analysis, Web UI, AI Assistant, Cluster Management, Performance Monitoring