# Parallelization
Parallelization spawns multiple independent tasks at the same time and aggregates results before continuing. Inside an agent loop, this typically means running several tool calls concurrently when they don’t depend on each other. At a system level, it can mean running the same input through multiple evaluators and picking the best (or majority) result.
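As a minimal sketch of the idea, using plain asyncio rather than the Hatchet SDK, with made-up tool functions standing in for real API calls:

```python
import asyncio

# Hypothetical tool calls; in practice these would hit real APIs.
async def fetch_weather(city: str) -> str:
    await asyncio.sleep(0.1)  # simulate network latency
    return f"{city}: sunny"

async def fetch_news(topic: str) -> str:
    await asyncio.sleep(0.1)
    return f"{topic}: 3 headlines"

async def fetch_stock(ticker: str) -> str:
    await asyncio.sleep(0.1)
    return f"{ticker}: $101.50"

async def main() -> list[str]:
    # The three calls are independent, so total latency is roughly
    # the slowest single call rather than the sum of all three.
    return await asyncio.gather(
        fetch_weather("Oslo"),
        fetch_news("ai"),
        fetch_stock("ACME"),
    )

print(asyncio.run(main()))
```

`asyncio.gather` preserves argument order in its result list, so aggregation after the parallel step stays deterministic.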
Hatchet distributes child runs across all running workers where the task is registered. The parent’s slot is freed while children execute, so you don’t hold resources during parallel work.
There are two common variants:
- Sectioning: different tasks handle different concerns in parallel (e.g., content generation + safety check).
- Voting: the same task runs N times and results are aggregated by majority vote or best score.
## When to use
| Scenario | Fit |
|---|---|
| Agent calls 3 independent APIs (weather, news, stock) | Good: no dependencies between calls, latency drops to the max of the three |
| Content generation + safety guardrail in parallel | Good: sectioning, both run at once, block if unsafe |
| Multiple evaluators vote on content quality | Good: voting, aggregate for more reliable decisions |
| Processing a batch of items (100+ documents) | Good: see Batch Processing for large-scale fanout |
| Steps depend on each other (output of A feeds B) | Skip: run sequentially |
| Provider rate limits are tight | Careful: parallel calls may hit limits; see Rate Limits |
## How it maps to Hatchet

The parent task spawns children (see Child Spawning), and each child runs on any available worker where the task is registered. The parent's slot is evicted while children execute (see Task Eviction), so the parent holds no resources during the parallel work. When all children complete, the parent resumes and aggregates their results.
## Step-by-step walkthrough
### Define the parallel tasks
Create separate tasks for each concern. These run independently and can be composed in different patterns.
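A plain-Python sketch of what those task bodies might look like. The function names and logic are illustrative; in Hatchet, each would be registered as its own task so it can run on any available worker:

```python
import asyncio

# Hypothetical task bodies; each concern is a separate function so
# the compositions below can run them independently.
async def generate_content(prompt: str) -> str:
    # Placeholder for an LLM generation call.
    return f"draft for: {prompt}"

async def safety_check(prompt: str) -> bool:
    # Placeholder guardrail; flags a hypothetical blocked term.
    return "forbidden" not in prompt.lower()

async def evaluate(content: str) -> int:
    # Placeholder evaluator returning a 0-10 quality score.
    return min(10, len(content) // 5)
```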
### Sectioning (parallel concerns)
Sectioning runs different concerns in parallel. The example generates content and checks safety at the same time. If the safety check fails, the content is blocked even though generation succeeded.
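A minimal sketch of the sectioning flow, again in plain asyncio with hypothetical generation and guardrail stubs:

```python
import asyncio

# Hypothetical stand-ins for the two parallel concerns.
async def generate_content(prompt: str) -> str:
    return f"draft for: {prompt}"

async def safety_check(prompt: str) -> bool:
    return "forbidden" not in prompt.lower()

async def sectioned(prompt: str) -> dict:
    # Run generation and the guardrail concurrently; neither
    # waits on the other.
    content, is_safe = await asyncio.gather(
        generate_content(prompt),
        safety_check(prompt),
    )
    # Block the result if the safety check failed, even though
    # generation already produced a draft.
    if not is_safe:
        return {"blocked": True, "content": None}
    return {"blocked": False, "content": content}

print(asyncio.run(sectioned("weekly update")))
```

Note the trade-off: sectioning spends the generation cost even when the guardrail rejects the input, in exchange for lower end-to-end latency on the happy path.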
### Voting (parallel consensus)
Voting runs the same evaluation N times and aggregates by majority or average score. This produces more reliable decisions than a single evaluation.
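A sketch of the voting aggregation, assuming a hypothetical evaluator whose repeated runs can disagree (a real one would call an LLM, so a `seed` parameter stands in for that nondeterminism):

```python
import asyncio
from collections import Counter

# Hypothetical evaluator; the seed makes runs disagree deterministically.
async def evaluate(content: str, seed: int) -> str:
    return "pass" if (len(content) + seed) % 4 != 0 else "fail"

async def vote(content: str, n: int = 5) -> str:
    # Run the same evaluation n times in parallel...
    verdicts = await asyncio.gather(
        *(evaluate(content, seed) for seed in range(n))
    )
    # ...then aggregate by majority vote.
    winner, _ = Counter(verdicts).most_common(1)[0]
    return winner

print(asyncio.run(vote("some draft")))
```

An odd `n` avoids ties for binary verdicts; for numeric scores, swap the `Counter` for a mean or median.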
### Run the worker
Register all tasks and start the worker.
For large-scale parallelism (hundreds or thousands of items), see the Batch Processing guide, which covers fan-out with concurrency control.
## Related Patterns

- Batch Processing: large-scale fan-out with concurrency limits and progress tracking.
- Reasoning Loop: parallelization applies within one iteration of an agent loop when multiple tools are independent.
- Evaluator-Optimizer: combine voting (parallel evaluators) with optimization (feedback loop) for higher-quality iteration.
- Fanout: the Hatchet concept: spawn children in parallel, parent waits for all.

## Next Steps
- Child Spawning: spawn parallel children from a parent task
- Task Eviction: free the parent’s slot while children execute
- Rate Limits: throttle parallel calls to external APIs
- Concurrency Control: limit how many children run simultaneously