
flowllm

by FlowLLM-AI

Overview

FlowLLM is a configuration-driven framework for building LLM-powered applications. It exposes LLM, embedding, and vector-store capabilities as HTTP/MCP services and is aimed at AI assistants, RAG applications, and complex workflow orchestration, with minimal boilerplate code.
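As a rough illustration of the configuration-driven approach, a flow might be declared in YAML along these lines. The keys, operator names, and `>>` composition syntax below are assumptions for illustration only, not the framework's documented schema:

```yaml
# Hypothetical sketch of a flowllm YAML config; key and op names are assumptions.
backend: http
http:
  port: 8002
flow:
  rag_answer:
    # flow_content composes named ops into a pipeline (syntax assumed)
    flow_content: retrieve_op >> rerank_op >> llm_answer_op
```

The idea is that application behavior lives in configuration rather than code, so swapping a retriever or model is a config edit rather than a code change.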

Installation

Run Command
flowllm backend=http http.port=8002

Environment Variables

  • FLOW_LLM_API_KEY
  • FLOW_LLM_BASE_URL
  • FLOW_EMBEDDING_API_KEY
  • FLOW_EMBEDDING_BASE_URL
  • FLOW_DASHSCOPE_API_KEY
  • FLOW_TAVILY_API_KEY
  • FLOW_PGVECTOR_CONNECTION_STRING
  • FLOW_PGVECTOR_ASYNC_CONNECTION_STRING
  • FLOW_QDRANT_HOST
  • FLOW_QDRANT_PORT
  • FLOW_QDRANT_API_KEY
  • FLOW_ES_HOSTS
  • FLOW_APP_NAME
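These variables would typically be exported before launching the service. The values below are placeholders, not real credentials, and the example base URL is an assumption:

```shell
# Placeholder values only; substitute your own keys and endpoints.
export FLOW_LLM_API_KEY="your-llm-key"
export FLOW_LLM_BASE_URL="https://your-provider.example.com/v1"  # assumed OpenAI-compatible endpoint
export FLOW_EMBEDDING_API_KEY="your-embedding-key"
export FLOW_APP_NAME="my-flowllm-app"

# Then start the HTTP backend (command from the Run Command section):
# flowllm backend=http http.port=8002
```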

Security Notes

The framework uses `exec()` and `eval()` internally (via `parse_flow_expression`) to process the `flow_content` defined in YAML configuration. Although the documentation states that execution happens in a 'restricted environment', arbitrary code execution remains a critical risk if the YAML input does not come from an absolutely trusted source or if the sandbox is insufficient. The default CORS settings also allow all origins and should be restricted in production environments.
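To see why `eval()`-based parsing of configuration strings is dangerous, compare it with `ast.literal_eval`, which accepts only literal syntax. This is an illustrative sketch of the risk class, not flowllm's actual code:

```python
import ast

# Attacker-controlled string, e.g. smuggled into flow_content in YAML:
untrusted = "__import__('os').system('echo pwned')"

# eval(untrusted) would execute the shell command. Passing a "restricted"
# globals dict does not make this safe: well-known sandbox-escape gadgets
# can reach builtins from inside the evaluated expression.

# ast.literal_eval evaluates only literals (numbers, strings, tuples,
# lists, dicts, sets) and raises ValueError on anything else:
try:
    ast.literal_eval(untrusted)
except ValueError:
    print("rejected")  # → rejected
```

If flow expressions must support operators beyond literals, a small purpose-built parser (e.g. over `ast.parse` with an explicit node whitelist) is a safer design than `eval()` with a restricted namespace.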

Stats

  • Interest Score: 41
  • Security Score: 2
  • Cost Class: High
  • Avg Tokens: 8000
  • Stars: 29
  • Forks: 1
  • Last Update: 2026-01-07

Tags

LLM Framework, AI Agents, Workflow Orchestration, Configuration-Driven, HTTP API, MCP Service, RAG, Microservices, Python, Vector Database