Task.WhenAll vs Parallel.ForEachAsync vs Channels in C#

Mastering High-Performance Concurrency Patterns in .NET 10

Modern C# gives developers multiple concurrency models – each powerful, but each suited to very different workloads. With the introduction of Parallel.ForEachAsync in .NET 6 and the maturing of System.Threading.Channels, developers now have three top-tier concurrency primitives for writing high-throughput, scalable systems.

But most developers struggle to choose between them.

  • Task.WhenAll for orchestrating many asynchronous operations
  • Parallel.ForEachAsync for controlled concurrency on CPU-bound or mixed workloads
  • Channels for high-throughput pipelines and producer/consumer systems

This tutorial breaks down each model using concepts, diagrams, examples, internals, performance, and real-world use cases.


πŸ” The Problem: Developers Frequently Misuse Async Concurrency Tools

Before modern .NET, concurrency solutions were limited:

  • ThreadPool.QueueUserWorkItem
  • Task.Run loops
  • Hard-coded throttling logic
  • BlockingCollection
  • Manual locking
  • SemaphoreSlim throttling hacks

This caused several recurring issues:

❌ Too much concurrency

Flooding the thread pool with hundreds or thousands of tasks.

❌ Too little concurrency

Sequential execution through poorly written await loops.

❌ Incorrect tool selection

Using Task.WhenAll for CPU-heavy loops or using Parallel loops for I/O tasks.

❌ Deadlocks and contention

Because of poor design of shared state, locks, or blocking calls.
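The "too little concurrency" trap is worth seeing in code. Below is a minimal sketch using Task.Delay as a stand-in for real async I/O (FakeIoAsync is an illustrative placeholder, not a real API): awaiting each item inside a loop serialises the work, while starting all tasks up front overlaps it.

```csharp
// Sketch of the "too little concurrency" antipattern, with Task.Delay
// standing in for real async I/O. All names here are illustrative.
using System;
using System.Diagnostics;
using System.Linq;
using System.Threading.Tasks;

static async Task<int> FakeIoAsync(int i)
{
    await Task.Delay(100); // simulate a 100 ms I/O call
    return i;
}

var sw = Stopwatch.StartNew();
foreach (var i in Enumerable.Range(0, 10))
    await FakeIoAsync(i);                         // one at a time: ~1,000 ms
Console.WriteLine($"Sequential: {sw.ElapsedMilliseconds} ms");

sw.Restart();
var results = await Task.WhenAll(
    Enumerable.Range(0, 10).Select(FakeIoAsync)); // all at once: ~100 ms
Console.WriteLine($"Concurrent: {sw.ElapsedMilliseconds} ms");
```

Ten 100 ms operations take roughly one second sequentially but roughly 100 ms when fanned out.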

.NET 10 gives us three clear tools to solve concurrency problems – but knowing when to use each one is the real skill.


⚑ The Three Models: Snapshot Overview

| Pattern | Best For | Concurrency Control? | Work Type | Typical Use Case |
| --- | --- | --- | --- | --- |
| Task.WhenAll | Fan-out async I/O | ❌ No | I/O-bound | Fetch 50 URLs |
| Parallel.ForEachAsync | Controlled parallel loops | ✔ Degree of parallelism | CPU / mixed | Image processing |
| Channels | Pipelines & queues | ✔ Full | CPU / I/O / mixed | Jobs, background workers, ingestion |

Before diving deep, here are the guiding rules:

βœ” Use Task.WhenAll when you already have tasks

βœ” Use Parallel.ForEachAsync when you have a collection and want parallelism

βœ” Use Channels when you need structured, streaming pipelines or producer/consumer architectures

Now let’s go deep.


🧠 1. Task.WhenAll β€” The Fan-Out, High-Concurrency Optimiser

Best for Firehose-Style Parallel Asynchronous I/O

πŸ”§ Concept

Task.WhenAll takes a set of tasks that are already running, gathers their results, and returns a single Task that completes when all of them complete.

Example: Making 100 API calls simultaneously

var urls = Enumerable.Range(1, 100).Select(i => $"https://api.example.com/{i}");
using var http = new HttpClient();

// Each GetStringAsync call starts its request immediately; WhenAll awaits them all.
var tasks = urls.Select(url => http.GetStringAsync(url));
var results = await Task.WhenAll(tasks);

Why it’s fast:

  • No limitation on parallelism
  • Zero orchestration overhead
  • Perfect for I/O β€” the thread is released during waiting

Why it can be dangerous:

  • If each task uses a socket/file/db connection, you may overwhelm the system.
  • If tasks are CPU-bound, you may create thousands of competing tasks.
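When the fan-out would overwhelm a downstream resource, a common fix is to keep Task.WhenAll but cap in-flight work with a SemaphoreSlim. A minimal sketch, where DoWorkAsync is a placeholder for a real async call (HTTP, database, file I/O):

```csharp
// Sketch: capping a Task.WhenAll fan-out with SemaphoreSlim so that at most
// 8 operations are in flight at once. DoWorkAsync is illustrative.
using System;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

using var gate = new SemaphoreSlim(8); // at most 8 concurrent operations

static async Task<int> DoWorkAsync(int i)
{
    await Task.Delay(10); // stand-in for real I/O
    return i * 2;
}

async Task<int> ThrottledAsync(int i)
{
    await gate.WaitAsync();            // wait for a free slot
    try { return await DoWorkAsync(i); }
    finally { gate.Release(); }        // always return the slot
}

var results = await Task.WhenAll(Enumerable.Range(0, 100).Select(ThrottledAsync));
Console.WriteLine(results.Length); // 100
```

All 100 tasks still start eagerly, but only 8 ever pass the gate at a time, so sockets and connections stay bounded.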

Under the Hood

WhenAll:

  • Inspects each Task
  • Attaches a continuation that monitors when all complete
  • Returns a composite Task that completes when all child Tasks are done
  • Aggregates exceptions from faulted tasks (awaiting rethrows the first; the rest are available via the returned Task’s Exception property)
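The exception behaviour is easy to demonstrate with pre-faulted tasks: awaiting the composite task rethrows only the first exception, while the full set sits on its Exception property as an AggregateException.

```csharp
// Sketch: how Task.WhenAll surfaces multiple failures.
using System;
using System.Threading.Tasks;

var all = Task.WhenAll(
    Task.FromException(new InvalidOperationException("a")),
    Task.FromException(new TimeoutException("b")),
    Task.CompletedTask);

try
{
    await all;
}
catch (Exception first)
{
    Console.WriteLine(first.Message);                        // "a" — only the first is rethrown
    Console.WriteLine(all.Exception!.InnerExceptions.Count); // 2 — both faults are kept here
}
```

Keeping a reference to the composite task (rather than awaiting the WhenAll call inline) is what lets you inspect every failure afterwards.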

πŸ”₯ Summary:

Use WhenAll for:

  • Network calls
  • Database queries
  • Disk I/O
  • Anything β€œasync all the way down”
  • Massive-scale I/O fan-out (50–10,000 tasks)

🧠 2. Parallel.ForEachAsync β€” Modern Controlled Parallelism

Best for CPU-intensive work or mixed CPU/I/O loops

Introduced in .NET 6, this method combines the classic Parallel.ForEach loop with the modern async/await model.

πŸ”§ Concept

You supply:

  • an enumerable
  • a delegate that returns a ValueTask
  • an optional degree of parallelism

.NET manages the execution of loop bodies concurrently using worker threads and async scheduling.

Example: Processing 500 images with limited concurrency

await Parallel.ForEachAsync(images, new ParallelOptions
{
    MaxDegreeOfParallelism = Environment.ProcessorCount
}, async (image, token) =>
{
    var data = await File.ReadAllBytesAsync(image, token);
    var processed = ProcessImage(data); // CPU-bound
    await SaveAsync(processed);
});

Why it’s powerful:

  • Built-in throttling
  • Integrates async + CPU-bound work
  • Zero boilerplate concurrency management
  • Smart scheduling based on thread pool heuristics

Why it can be dangerous:

  • Not ideal for pure async I/O (WhenAll is faster)
  • Work must be per-item and independent
  • Channel pipelines can outperform it for complex flows

Under the Hood

Parallel.ForEachAsync:

  • Spins up a set of worker loops (up to the degree of parallelism) that pull items from the source
  • Schedules those workers across thread pool threads
  • Balances CPU work with async continuations
  • Uses ValueTask-returning bodies to reduce allocation overhead
  • Supports cancellation and throttling out of the box
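Cancellation deserves a quick illustration: the token you pass through ParallelOptions is the same one handed to each loop body, so honouring it inside the body stops work promptly. A sketch with Task.Delay standing in for per-item work:

```csharp
// Sketch: cooperative cancellation with Parallel.ForEachAsync. The token the
// body receives is the one supplied through ParallelOptions.
using System;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

using var cts = new CancellationTokenSource(TimeSpan.FromMilliseconds(50));
var processed = 0;

try
{
    await Parallel.ForEachAsync(
        Enumerable.Range(0, 1_000),
        new ParallelOptions { MaxDegreeOfParallelism = 4, CancellationToken = cts.Token },
        async (item, token) =>
        {
            await Task.Delay(10, token); // honour the token inside the body
            Interlocked.Increment(ref processed);
        });
}
catch (OperationCanceledException)
{
    Console.WriteLine($"Cancelled after {processed} items");
}
```

With a 50 ms timeout and four workers, only a handful of the 1,000 items complete before the loop throws OperationCanceledException.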

πŸ”₯ Summary:

Use Parallel.ForEachAsync for:

  • Per-item CPU processing
  • Combination I/O + CPU workloads
  • Limited concurrency scenarios
  • Batch job processing
  • Data transformations

🧠 3. Channels – High-Throughput Producer/Consumer Pipelines

Best for real-time processing, streaming, pipelines, and background workers

System.Threading.Channels gives you:

  • In-memory message queues
  • Backpressure
  • Bounded channels
  • High-performance pipelines
  • Built-in asynchronous coordination
  • Minimal locking (fine-grained internal synchronisation)

This is the tool you use when:

  • Tasks produce data
  • Tasks consume data
  • Processing is multi-stage
  • Data flows in streams
  • You need maximum throughput

Example: Multi-stage pipeline

Input β†’ Transform β†’ Save

var channel = Channel.CreateBounded<int>(100);

var writer = channel.Writer;
var reader = channel.Reader;

// Producer
var producer = Task.Run(async () =>
{
    for (int i = 0; i < 1000; i++)
        await writer.WriteAsync(i);

    writer.Complete();
});

// Consumer
var consumer = Task.Run(async () =>
{
    await foreach (var value in reader.ReadAllAsync())
    {
        var transformed = value * 2;
        await SaveAsync(transformed);
    }
});

await Task.WhenAll(producer, consumer);
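The consuming side scales naturally: several consumers can read from the same channel, and ReadAllAsync coordinates them so each item is consumed exactly once. A sketch of that variation:

```csharp
// Sketch: one producer, three competing consumers on a single channel.
using System;
using System.Linq;
using System.Threading;
using System.Threading.Channels;
using System.Threading.Tasks;

var channel = Channel.CreateBounded<int>(100);

var producer = Task.Run(async () =>
{
    for (int i = 0; i < 1_000; i++)
        await channel.Writer.WriteAsync(i);
    channel.Writer.Complete();
});

// Each item is handed to exactly one of the three consumers.
var consumed = 0;
var consumers = Enumerable.Range(0, 3).Select(_ => Task.Run(async () =>
{
    await foreach (var value in channel.Reader.ReadAllAsync())
        Interlocked.Increment(ref consumed);
})).ToArray();

await producer;
await Task.WhenAll(consumers);
Console.WriteLine(consumed); // 1000
```

This is the standard way to widen a slow stage of a pipeline without any manual locking or partitioning.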

Why it’s powerful:

  • Works for queues, pipelines, streams
  • Naturally handles fast producers/slow consumers
  • Ideal for high-scale ingestion or processing
  • Superior to BlockingCollection
  • Best tool for real-time server workloads

Under the Hood

Channels:

  • Use fine-grained, low-contention synchronisation (unbounded channels build on lock-free ConcurrentQueue storage)
  • Use async/await internally instead of blocking waits
  • Use optimised internal buffers
  • Support single/multi-producer and consumer
  • Provide built-in backpressure (bounded channels)
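Backpressure is observable directly: with a bounded channel at capacity and the default Wait full mode, a write simply does not complete until the reader frees a slot. A small sketch:

```csharp
// Sketch: bounded-channel backpressure. With capacity 2 and the default Wait
// full mode, the third write stays pending until the reader frees a slot.
using System;
using System.Threading.Channels;
using System.Threading.Tasks;

var channel = Channel.CreateBounded<int>(new BoundedChannelOptions(2)
{
    FullMode = BoundedChannelFullMode.Wait // writers asynchronously wait when full
});

await channel.Writer.WriteAsync(1);
await channel.Writer.WriteAsync(2);

var third = channel.Writer.WriteAsync(3); // channel full: this write is pending
Console.WriteLine(third.IsCompleted);     // False — the producer is throttled

channel.Reader.TryRead(out _);            // consume one item, freeing a slot
await third;                              // the pending write now completes
```

This is the mechanism that lets a fast producer slow down to match a slow consumer without dropping data or buffering unboundedly.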

πŸ”₯ Summary:

Use Channels for:

  • Multi-stage pipelines
  • Background workers
  • Stream processing
  • Telemetry ingestion
  • Real-time servers
  • Large job queues
  • Anywhere you’d previously consider Kafka-lite or in-memory queues

🧩 Performance Comparison

I/O-bound tasks

Winner: Task.WhenAll

CPU-bound per-item loops

Winner: Parallel.ForEachAsync

Pipeline processing

Winner: Channels

Dynamic workloads / producers and consumers

Winner: Channels

Single-stage fan-out

Winner: Task.WhenAll


🧱 Real-World Scenarios: Which One Should You Use?

πŸ“Œ Scenario 1: Download 200 URLs

➑ Task.WhenAll
Because they are independent and purely I/O.

πŸ“Œ Scenario 2: Resize 1,000 photos

➑ Parallel.ForEachAsync
Because it controls CPU load.

πŸ“Œ Scenario 3: Stream telemetry, parse it, aggregate it, and save it

➑ Channels
Because the work flows through sequential pipeline stages.

πŸ“Œ Scenario 4: Run 50 background Tasks

➑ Channels or Parallel.ForEachAsync
Depending on job shape.

πŸ“Œ Scenario 5: Execute 10 SQL queries in parallel

➑ Task.WhenAll


🧰 Best Practices

βœ” Don’t use Task.WhenAll for CPU-heavy loops

It will spawn too many tasks.

βœ” Don’t use Parallel.ForEachAsync for massive I/O

WhenAll is more efficient.

βœ” Don’t use Channels for one-step work

They shine in pipelines.

βœ” Favor ValueTask in high-throughput loops (like Channels consumers)

βœ” Always throttle concurrency when accessing databases or file I/O

(Parallel.ForEachAsync makes this trivial)

βœ” Avoid Task.Wait and .Result β€” they block threads


🧠 Summary Table (Executable Cheat Sheet)

| Scenario | Best Tool | Reason |
| --- | --- | --- |
| 100s–1,000s of independent async operations | Task.WhenAll | No overhead, pure async |
| CPU-bound processing of a collection | Parallel.ForEachAsync | Controlled parallelism |
| Mixed CPU/I/O per item | Parallel.ForEachAsync | Balance & throttling |
| Producer/consumer | Channels | Backpressure + pipelines |
| Multi-stage processing | Channels | Throughput scaling |
| Message passing | Channels | Async queue, no manual locking |
| Dynamic workloads | Channels | Flexible pattern |

Final Thoughts

The concurrency landscape in .NET 10 is powerful but nuanced.

Using the wrong tool leads to:

  • poor scaling
  • thread starvation
  • unnecessary memory use
  • difficult debugging

Using the right tool leads to:

  • maximum throughput
  • clean, readable code
  • predictable performance
  • easier maintenance

Task.WhenAll, Parallel.ForEachAsync, and System.Threading.Channels each solve a different class of concurrency problem β€” and mastering them is essential for creating fast, modern, production-grade C# applications.