

Procedural Child Workflow Spawning

Hatchet supports the dynamic creation of child workflows during a parent workflow’s execution. This powerful feature enables:

  • Complex, reusable workflow hierarchies - Break down complex workflows into simpler, reusable components
  • Fan-out parallelism - Scale out to multiple parallel tasks dynamically
  • Dynamic workflow behavior - Create loops and conditional branches at runtime
  • Agent-based workflows - Support AI agents that can create new workflows based on analysis results or loop until a condition is met

Creating Parent and Child Workflows

To implement child workflow spawning, you first need to create both parent and child workflow definitions.

First, we’ll declare a couple of workflows for the parent and child:
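The code tabs from the original page aren't reproduced here, so below is a minimal Python sketch of the declarations, assuming the Hatchet Python SDK. The workflow names and the `ParentInput`/`ChildInput` models are illustrative, chosen to match the snippets later in this section.

```python
# A minimal sketch of the parent and child workflow declarations, assuming
# the Hatchet Python SDK; names and input models are illustrative.
from hatchet_sdk import Hatchet
from pydantic import BaseModel

hatchet = Hatchet()


class ParentInput(BaseModel):
    n: int  # how many child workflows to spawn


class ChildInput(BaseModel):
    n: int


parent = hatchet.workflow(name="FanoutParent", input_validator=ParentInput)
child = hatchet.workflow(name="FanoutChild", input_validator=ChildInput)
```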

We've also created a task on the parent workflow that spawns the child workflows. Next, we'll add a couple of tasks to the child workflow:
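The original tasks aren't shown here, so the sketch below illustrates the shape they take, assuming the declarations above. The task names (`process`, `spawn`) and bodies are hypothetical; the spawning logic mirrors the bulk-run snippet later in this section.

```python
# Illustrative tasks on the child and parent workflows declared above;
# `process` and `spawn` are hypothetical names.
from hatchet_sdk import Context, TriggerWorkflowOptions


@child.task()
def process(input: ChildInput, ctx: Context) -> dict:
    # Do the child's unit of work and return a result for the parent
    return {"status": "success", "n": input.n}


@parent.task()
async def spawn(input: ParentInput, ctx: Context) -> dict:
    # Spawn one child workflow per unit of work and gather their results
    results = await child.aio_run_many(
        [
            child.create_bulk_run_item(
                options=TriggerWorkflowOptions(input=ChildInput(n=i))
            )
            for i in range(input.n)
        ]
    )
    return {"results": results}
```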

And that’s it! The fanout parent will run, spawn the child workflows, and then collect the results from their tasks.

Running Child Workflows

To spawn and run a child workflow from a parent task, use the appropriate method for your language (Python shown here):

# Inside a parent task
child_result = child_workflow.run(child_input)

Parallel Child Workflow Execution

As shown in the examples above, you can spawn multiple child workflows in parallel:

# Run multiple child workflows concurrently with asyncio
from typing import Any

# `child` and `ChildInput` come from the workflow declarations above
from hatchet_sdk import TriggerWorkflowOptions

async def run_child_workflows(n: int) -> list[dict[str, Any]]:
    # Build one bulk-run item per child, then await all of them at once
    return await child.aio_run_many(
        [
            child.create_bulk_run_item(
                options=TriggerWorkflowOptions(
                    input=ChildInput(n=i),
                )
            )
            for i in range(n)
        ]
    )

# In your parent task
child_results = await run_child_workflows(input.n)

Use Cases for Child Workflows

Child workflows are ideal for:

  1. Dynamic fan-out processing - When the number of parallel tasks is determined at runtime
  2. Reusable workflow components - Create modular workflows that can be reused across different parent workflows
  3. Resource-intensive operations - Spread computation across multiple workers
  4. Agent-based systems - Allow AI agents to spawn new workflows based on their reasoning
  5. Long-running operations - Break down long operations into smaller, trackable units of work

Error Handling with Child Workflows

When working with child workflows, it’s important to handle errors properly. In Python, a simple pattern is to wrap the child run in a try/except:

try:
    child_result = child.run(ChildInput(a="foobar"))
except Exception as e:
    # Handle error from child workflow
    print(f"Child workflow failed: {e}")
    # Decide how to proceed - retry, skip, or fail the parent