ai_workflows
by shawnmcrowley
Overview
A comprehensive system for building, managing, and executing AI workflows, intelligent agents, and document processing pipelines leveraging Langflow's visual builder, PostgreSQL for vector database capabilities, and local Ollama models for privacy-focused AI processing.
Installation
npm start
Environment Variables
- LANGFLOW_DATABASE_URL
- LANGFLOW_CONFIG_DIR
- POSTGRES_USER
- POSTGRES_PASSWORD
- POSTGRES_DB
- OLLAMA_HOST
- OLLAMA_BASE_URL
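A minimal startup check over the variables listed above can fail loudly instead of producing confusing errors deep inside the app. This is a sketch, not part of the project: the `missingEnvVars` helper is hypothetical, and only the variable names are taken from the list.

```javascript
// Hypothetical startup check (sketch): verify the variables listed above
// are present before the app starts talking to Postgres, Langflow, or Ollama.
const REQUIRED_VARS = [
  "LANGFLOW_DATABASE_URL",
  "LANGFLOW_CONFIG_DIR",
  "POSTGRES_USER",
  "POSTGRES_PASSWORD",
  "POSTGRES_DB",
  "OLLAMA_HOST",
  "OLLAMA_BASE_URL",
];

// Returns the names of required variables that are unset or empty.
function missingEnvVars(env = process.env) {
  return REQUIRED_VARS.filter((name) => !env[name]);
}

// Example usage: warn (without crashing) when something is missing.
const missing = missingEnvVars();
if (missing.length > 0) {
  console.warn(`Missing environment variables: ${missing.join(", ")}`);
}
```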
Security Notes
The source code contains critical security vulnerabilities:
- A Langflow API key (`sk-AEDsSFO3Lg3H85crq64Co1hmezOIhraCVCvxO8LKeZU`) is hardcoded in `src/app/scripts/index.js`. If this script is served client-side, the key is immediately exposed.
- The `executeWorkflowAction` server action in `src/app/actions/workflow-actions.js` passes a user-controlled `endpoint` into server-side `fetch` requests (`executeGenericRequest`). This is a severe Server-Side Request Forgery (SSRF) vulnerability: an attacker can make the server request internal network resources, scan the internal network, or trigger unintended actions on other internal services.
- The Langflow `APIRequest` component in `flows/External API.json` likewise uses `url_input` directly for `httpx` requests, a second SSRF vector if that flow can be triggered with malicious input.
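One common way to mitigate this kind of SSRF risk is to let clients send only an endpoint *name*, which the server resolves against an allowlist, so user input never becomes a URL. The sketch below is illustrative only: the `resolveEndpoint` helper and the allowlist entries are hypothetical, not the project's actual routes.

```javascript
// Hypothetical SSRF mitigation (sketch): map client-supplied endpoint names
// to server-side URLs. A Map is used (rather than a plain object) so lookups
// like "constructor" cannot hit Object.prototype properties.
const ALLOWED_ENDPOINTS = new Map([
  // Placeholder routes; the real URLs would stay server-side.
  ["run-flow", "http://127.0.0.1:7860/api/v1/run"],
  ["health", "http://127.0.0.1:7860/health"],
]);

function resolveEndpoint(name) {
  const url = ALLOWED_ENDPOINTS.get(name);
  if (url === undefined) {
    // Reject anything not explicitly allowlisted, including raw URLs.
    throw new Error(`Endpoint not allowed: ${name}`);
  }
  return url;
}
```

The hardcoded API key should be handled the same way: read it server-side from an environment variable (via `process.env`) instead of embedding it in `src/app/scripts/index.js`.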
Similar Servers
MaxKB
MaxKB (Max Knowledge Brain) is an enterprise-grade intelligent agent platform designed to lower the technical barrier and deployment cost of adopting AI. It helps businesses quickly integrate mainstream large language models, build proprietary knowledge bases, and follow a progressive upgrade path from RAG to complex workflow automation and advanced agents, for scenarios such as smart customer service and office assistants.
flexible-graphrag
The Flexible GraphRAG MCP Server integrates document processing, knowledge graph building, hybrid search, and AI query capabilities via the Model Context Protocol (MCP) for clients like Claude Desktop and MCP Inspector.
Matryoshka
Processes large documents beyond LLM context windows using a Recursive Language Model (RLM) that executes symbolic commands for iterative document analysis.
flowllm
FlowLLM is a configuration-driven framework for building LLM-powered applications, encapsulating LLM, Embedding, and vector store capabilities as HTTP/MCP services. It's designed for AI assistants, RAG applications, and complex workflow orchestration, minimizing boilerplate code.