Click each milestone to see what it unlocked for LLM workflow design.
SequentialChain - a wrapper that connected multiple chain steps, mapping each step's output variables to the next step's inputs. It solved the boilerplate problem but imposed rigid linearity. No branching, no conditions, no loops.
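The pattern is easy to see without the framework: a shared dict of variables is threaded through a fixed list of steps. A minimal, framework-free sketch (hypothetical `run_sequential` helper and toy step functions, not LangChain's actual API):

```python
# Each step declares its input variable names, one output variable name,
# and a function. The runner threads a shared state dict through the
# steps in fixed order - which is exactly why there is no branching.
def run_sequential(steps, inputs):
    """Run steps in order, merging each step's output into shared state."""
    state = dict(inputs)
    for input_keys, output_key, fn in steps:
        args = {k: state[k] for k in input_keys}
        state[output_key] = fn(**args)
    return state

# Two toy "chains": outline a topic, then expand the outline.
steps = [
    (["topic"], "outline", lambda topic: f"outline of {topic}"),
    (["outline"], "draft", lambda outline: f"draft based on {outline}"),
]

result = run_sequential(steps, {"topic": "graphs"})
print(result["draft"])  # -> draft based on outline of graphs
```

Note that the only control flow available is "next step in the list" - the rigidity the text describes.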
chain1 | chain2 | chain3. Elegant syntax, still fundamentally linear. RunnablePassthrough | prompt | llm | parser reads like a Unix pipeline. Streaming, batching, and async came for free. But the pipe is a line, not a graph. Conditional routing required escape hatches like RunnableBranch.
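The pipe syntax itself is just operator overloading. A toy sketch of the idea (a hypothetical `Pipe` class, not the real LCEL Runnable interface) shows both the elegance and the limitation - composition is always left to right:

```python
class Pipe:
    """Toy composable stage: callable, and `|` chains stages in order."""
    def __init__(self, fn):
        self.fn = fn

    def __call__(self, x):
        return self.fn(x)

    def __or__(self, other):
        # (self | other)(x) == other(self(x)): left to right, still a line.
        return Pipe(lambda x: other(self.fn(x)))

# Toy stand-ins for prompt / llm / parser stages.
prompt = Pipe(lambda q: f"Q: {q}")
llm = Pipe(lambda p: p.upper())
parser = Pipe(lambda s: s.strip())

pipeline = prompt | llm | parser
print(pipeline("why graphs?"))  # -> Q: WHY GRAPHS?
```

Whatever you put between the pipes, `__or__` can only produce another straight line - routing to one of several downstream stages needs something outside the operator, which is the role RunnableBranch played.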
QueryPipeline allowed defining modules and linking them as a DAG. But it was deprecated in favor of Workflow and then AgentWorkflow. As of 2026, LlamaIndex Core 0.14 has no stable, supported graph primitive for custom conditional routing.
StateGraph with a TypedDict state schema. add_node(), add_conditional_edges(), compile(). Support for cycles (retry loops), parallel branches, checkpointing to SQLite/Redis, and event streaming. This was the inflection point - workflows became declarative graphs instead of imperative code.
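The shape of that API can be sketched in a few dozen lines. This is a deliberately tiny re-implementation of the node/edge/compile pattern described above - illustrative only, omitting everything that makes the real LangGraph useful (checkpointing, streaming, parallelism) - but it shows how conditional edges plus cycles give you retry loops:

```python
END = "__end__"

class StateGraph:
    """Toy declarative graph: named nodes, fixed and conditional edges."""
    def __init__(self):
        self.nodes = {}
        self.edges = {}     # node -> fixed next node
        self.branches = {}  # node -> router(state) -> next node
        self.entry = None

    def add_node(self, name, fn):
        self.nodes[name] = fn

    def add_edge(self, src, dst):
        self.edges[src] = dst

    def add_conditional_edges(self, src, router):
        self.branches[src] = router

    def set_entry_point(self, name):
        self.entry = name

    def compile(self):
        def run(state):
            current = self.entry
            while current != END:
                # Each node returns a partial update merged into state.
                state = {**state, **self.nodes[current](state)}
                if current in self.branches:
                    current = self.branches[current](state)
                else:
                    current = self.edges.get(current, END)
            return state
        return run

# A retry loop: keep generating until the check passes.
graph = StateGraph()
graph.add_node("generate", lambda s: {"attempts": s["attempts"] + 1})
graph.add_node("check", lambda s: {"ok": s["attempts"] >= 3})
graph.add_edge("generate", "check")
graph.add_conditional_edges("check", lambda s: END if s["ok"] else "generate")
graph.set_entry_point("generate")

app = graph.compile()
print(app({"attempts": 0}))  # -> {'attempts': 3, 'ok': True}
```

The cycle from "check" back to "generate" is exactly what the linear chain and pipe abstractions could not express.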
StateGraph mirrors LangGraph's API: add_node(), add_conditional_edges(), set_entry_point(), compile(). The key addition: StateField reducers that define how concurrent writes merge (append, last-write-wins, custom). This prevents bugs in fan-out/fan-in patterns where multiple branches write to the same state key.
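The reducer idea is simple to demonstrate in isolation. A sketch with a hypothetical `merge_writes` helper (illustrating the concept, not the library's actual API): each state key gets a reducer, and when parallel branches write to the same key, the reducer decides how the writes combine.

```python
def append(old, new):
    """Accumulate writes into a list instead of overwriting."""
    return (old or []) + [new]

def last_write_wins(old, new):
    """Default behavior: the later write replaces the earlier one."""
    return new

def merge_writes(state, writes, reducers):
    """Apply a list of {key: value} writes using each key's reducer."""
    merged = dict(state)
    for write in writes:
        for key, value in write.items():
            reduce_fn = reducers.get(key, last_write_wins)
            merged[key] = reduce_fn(merged.get(key), value)
    return merged

reducers = {"notes": append, "status": last_write_wins}

# Two parallel branches both write "notes"; the append reducer keeps
# both writes instead of one branch silently clobbering the other.
state = merge_writes(
    {"notes": [], "status": "running"},
    [{"notes": "branch A done"}, {"notes": "branch B done", "status": "done"}],
    reducers,
)
print(state)  # -> {'notes': ['branch A done', 'branch B done'], 'status': 'done'}
```

Without the per-key reducer, fan-in would default to last-write-wins everywhere, and one branch's output would simply vanish - the exact bug class the text describes.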