
flowllm

by FlowLLM-AI

Overview

A configuration-driven framework for rapidly building and deploying LLM-powered applications, AI agents, RAG systems, and workflow services, automatically generating HTTP APIs and Model Context Protocol (MCP) tools.

Installation

Run Command
flowllm config=my_mcp_config backend=mcp mcp.transport=sse mcp.port=8001 llm.default.model_name=qwen3-30b-a3b-thinking-2507

Environment Variables

  • FLOW_LLM_API_KEY
  • FLOW_LLM_BASE_URL
  • FLOW_EMBEDDING_API_KEY
  • FLOW_EMBEDDING_BASE_URL
  • FLOW_DASHSCOPE_API_KEY
  • FLOW_TAVILY_API_KEY
  • FLOW_ES_HOSTS
  • FLOW_QDRANT_HOST
  • FLOW_QDRANT_PORT
  • FLOW_QDRANT_API_KEY
  • HF_ENDPOINT
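Which of these variables are actually required depends on the providers you enable (LLM, embeddings, DashScope, Tavily, Elasticsearch, Qdrant), so the grouping below is an assumption. A minimal pre-flight check script, not part of flowllm itself, might look like:

```python
import os

# Variable names are taken from the list above; treating only the core
# LLM credentials as required is an assumption for this sketch.
LLM_VARS = ["FLOW_LLM_API_KEY", "FLOW_LLM_BASE_URL"]
OPTIONAL_VARS = [
    "FLOW_EMBEDDING_API_KEY", "FLOW_EMBEDDING_BASE_URL",
    "FLOW_DASHSCOPE_API_KEY", "FLOW_TAVILY_API_KEY",
    "FLOW_ES_HOSTS", "FLOW_QDRANT_HOST", "FLOW_QDRANT_PORT",
    "FLOW_QDRANT_API_KEY", "HF_ENDPOINT",
]

def missing_vars(names):
    """Return the subset of names that are unset or empty."""
    return [n for n in names if not os.environ.get(n)]

if __name__ == "__main__":
    missing = missing_vars(LLM_VARS)
    if missing:
        print("Missing LLM credentials:", ", ".join(missing))
    else:
        print("Core LLM environment looks configured.")
```

Running a check like this before launching the server gives a clearer error than a failed API call at request time.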

Security Notes

The `parse_flow_expression` function in `flowllm/core/utils/common_utils.py` uses `exec()` and `eval()` to dynamically execute Python code defined in YAML configurations. This poses a severe remote code execution vulnerability if configuration files or their sources are not fully trusted. Additionally, `ShellOp` in `flowllm/extensions/file_tool/shell_op.py` allows arbitrary shell command execution, which could lead to system compromise if user inputs or agent decisions are not rigorously sanitized in an untrusted environment. These features, while potentially intended for highly controlled agentic use cases, present significant risks in a general server deployment.
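To make the risk concrete, here is a generic illustration of why calling `eval()` on strings taken from a configuration file amounts to remote code execution. This is not flowllm's actual `parse_flow_expression` code; the config dict and function names are invented for the sketch.

```python
import ast

# A plain dict standing in for a parsed YAML config (PyYAML omitted to
# keep this self-contained). The "expression" value is effectively
# attacker-controlled if the config file comes from an untrusted source.
config = {"flow": "op_a >> op_b", "expression": "1 + 1"}

def run_expression(expr: str):
    # Mirrors the risky pattern: eval() executes arbitrary Python, so an
    # expr like "__import__('os').system('rm -rf /')" would run with the
    # server's privileges.
    return eval(expr)  # noqa: S307 -- shown only to illustrate the risk

# Benign input behaves as expected...
print(run_expression(config["expression"]))  # 2

# ...but so would malicious input. For configs that only need literals
# or arithmetic, ast.literal_eval is a safer choice: it rejects calls.
try:
    ast.literal_eval("__import__('os').system('echo pwned')")
except ValueError:
    print("literal_eval rejected the payload")
```

In practice the mitigation is to load configs only from trusted sources, run the server in a sandboxed environment, or restrict the expression language rather than evaluating raw Python.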

Stats

  • Interest Score: 42
  • Security Score: 3
  • Cost Class: Medium
  • Avg Tokens: 1000
  • Stars: 18
  • Forks: 1
  • Last Update: 2025-12-05

Tags

LLM Framework, AI Agents, RAG Systems, Workflow Orchestration, Configuration-driven, API Generation