Langflow Component

Low-code AI agent and workflow platform with support for RAG, vector databases, and LLM integration.

Architecture

Langflow App - Web UI + API server
PostgreSQL   - Flow storage (via CNPG link)
Vector Store - RAG storage (via Qdrant link)
LLM Backend  - Inference (via vLLM or Ollama link)

Quick Reference

REQ = must be defined by the user

Attribute                Example                    Default       Effect
namespace (REQ)          langflow                   -             K8s namespace
port (REQ)               7860                       7860          Web UI port
version                  langflowai/langflow:1.7.1  1.7.1         Full image with tag
replicas                 1                          1             Pod replicas
cpu_request / cpu_limit  500m / 2000m               500m / 2000m  CPU resources
mem_request / mem_limit  1Gi / 4Gi                  1Gi / 4Gi     Memory resources
pvc_enabled              true                       true          Persistent storage
pvc_size                 10Gi                       10Gi          Storage size
auto_login               false                      false         Skip auth (dev only)
superuser                admin                      admin         Admin username
superuser_password       changeme                   -             Admin password
secret_key               prod-secret                changeme      Session security key
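Pulling the required attributes together, a minimal component definition might look like the sketch below. The exact attribute syntax depends on your platform's component format; values are illustrative:

```yaml
langflow:
  namespace: langflow            # REQ
  port: 7860                     # REQ
  version: langflowai/langflow:1.7.1
  pvc_enabled: true
  pvc_size: 10Gi
  superuser: admin
  superuser_password: s3cret     # set your own; no default
  secret_key: prod-secret        # change from "changeme" in production
```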

Link Variables

Variable             Link Type            Generated ENV
__vllm               langflow-vllm        OPENAI_API_BASE (router)
__vllm_model         langflow-model       OPENAI_API_BASE_[MODEL]
__ollama             langflow-ollama      OLLAMA_BASE_URL
__qdrant             langflow-qdrant      QDRANT_URL
__qdrant_collection  langflow-collection  QDRANT_COLLECTION, QDRANT_VECTOR_SIZE
__cnpg               langflow-db          LANGFLOW_DATABASE_URL
__valkey             langflow-valkey      LANGFLOW_REDIS_HOST/PORT/PASSWORD
__prometheus         prometheus-langflow  ServiceMonitor
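Code running inside the Langflow pod consumes these link-generated variables as plain environment variables. A minimal sketch of backend selection, assuming only the variable names from the table above:

```python
import os

def llm_base_url() -> str:
    """Pick the LLM endpoint from the link-generated environment.

    Prefers the vLLM router (OPENAI_API_BASE, from __vllm) and falls
    back to Ollama (OLLAMA_BASE_URL, from __ollama). Which variables
    exist depends on which links are configured.
    """
    base = os.environ.get("OPENAI_API_BASE")
    if base:
        return base
    ollama = os.environ.get("OLLAMA_BASE_URL")
    if ollama:
        return ollama
    raise RuntimeError("no LLM backend linked (__vllm or __ollama)")
```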

Dynamic ENV Import

LANGFLOW_VARIABLES_TO_GET_FROM_ENVIRONMENT

Auto-populated, comma-separated list of every generated ENV var.
At startup, Langflow reads the listed variables into its variable store, making them available to flow components.

Example: OPENAI_API_BASE,QDRANT_URL,LANGFLOW_DATABASE_URL
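The import mechanism amounts to splitting that comma-separated list and looking each name up in the environment. A minimal sketch (helper name is hypothetical, not part of Langflow):

```python
import os

def env_variables_to_import() -> dict[str, str]:
    """Resolve LANGFLOW_VARIABLES_TO_GET_FROM_ENVIRONMENT.

    The variable holds a comma-separated list of ENV var names.
    Returns the subset that is actually set, as name -> value.
    """
    raw = os.environ.get("LANGFLOW_VARIABLES_TO_GET_FROM_ENVIRONMENT", "")
    names = [n.strip() for n in raw.split(",") if n.strip()]
    return {n: os.environ[n] for n in names if n in os.environ}
```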

Generated Files

File              Condition         Contains
deployment.yaml   Always            Deployment + Service
rbac.yaml         Always            ServiceAccount
pvc.yaml          pvc_enabled=true  PersistentVolumeClaim
secret/cloud.env  Always            All auto-configured ENVs
doc/flows/*.json  Always            Example flow templates
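For debugging, it can be handy to inspect secret/cloud.env outside the cluster. Assuming it is plain dotenv-style KEY=VALUE lines (comments and blank lines ignored), a small parser:

```python
def parse_cloud_env(text: str) -> dict[str, str]:
    """Parse a KEY=VALUE env file such as secret/cloud.env.

    Skips blank lines and '#' comments; splits on the first '='
    so values may themselves contain '=' (e.g. database URLs).
    """
    env: dict[str, str] = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env
```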

Ports

Port  Purpose       Service
7860  Web UI + API  [name]:80 (external)
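Other workloads in the cluster reach Langflow through the Service on port 80, which forwards to the container's 7860. A sketch of the in-cluster URL, assuming the standard Kubernetes Service DNS scheme and the default cluster.local suffix:

```python
def service_url(name: str, namespace: str = "langflow") -> str:
    """In-cluster URL for the Langflow Service ([name]:80).

    name and namespace are whatever the component was deployed
    with; the cluster.local DNS suffix is the Kubernetes default.
    """
    return f"http://{name}.{namespace}.svc.cluster.local:80"
```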