JavaScript Async Generators: Complete Tutorial

A complete tutorial on JavaScript async generators. Covers async generator syntax with async function*, yielding promises, consuming with for-await-of, paginated API iteration, real-time event streaming, combining async generators with AbortController, lazy data pipelines, error handling in async iteration, and async generator vs ReadableStream comparison.


Async generators combine the lazy evaluation of generators with the asynchronous power of promises. They produce values on demand, one at a time, where each value can involve an async operation like a network request or timer. This makes them ideal for paginated APIs, real-time event streams, and lazy data pipelines.

Syntax: async function*

async function* countAsync(limit) {
  for (let i = 0; i < limit; i++) {
    // Simulate async work
    await new Promise((resolve) => setTimeout(resolve, 100));
    yield i;
  }
}
 
// Consume with for-await-of
for await (const value of countAsync(5)) {
  console.log(value); // 0, 1, 2, 3, 4 (each 100ms apart)
}

The async function* declaration creates a function that returns an async generator object, which implements the async iterator protocol. Each yield pauses execution until the consumer requests the next value.

Async Iterator Protocol

| Method | Returns | Description |
| --- | --- | --- |
| `iterator.next()` | `Promise<{value, done}>` | Get the next value |
| `iterator.return(value)` | `Promise<{value, done: true}>` | Signal early termination |
| `iterator.throw(error)` | `Promise<{value, done}>` | Inject an error into the generator |
const gen = countAsync(3);
 
const first = await gen.next();   // { value: 0, done: false }
const second = await gen.next();  // { value: 1, done: false }
const third = await gen.next();   // { value: 2, done: false }
const end = await gen.next();     // { value: undefined, done: true }

Paginated API Iterator

async function* fetchPages(baseUrl, pageSize = 20) {
  let page = 1;
  let hasMore = true;
 
  while (hasMore) {
    const response = await fetch(`${baseUrl}?page=${page}&limit=${pageSize}`);
    if (!response.ok) throw new Error(`HTTP ${response.status}`);
 
    const data = await response.json();
 
    for (const item of data.items) {
      yield item;
    }
 
    hasMore = data.items.length === pageSize;
    page++;
  }
}
 
// Consume: processes items one at a time across all pages
for await (const user of fetchPages("/api/users", 50)) {
  console.log(user.name);
 
  // Stop early if needed
  if (user.name === "target") break;
}

When you break out of the loop, the generator's return() method is called automatically, so no more pages are fetched.

Real-Time Event Stream

async function* serverSentEvents(url) {
  const response = await fetch(url);
  const reader = response.body
    .pipeThrough(new TextDecoderStream())
    .getReader();
 
  let buffer = "";
 
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
 
    buffer += value;
    const lines = buffer.split("\n\n");
    buffer = lines.pop() || "";
 
    for (const block of lines) {
      const dataLine = block.split("\n").find((l) => l.startsWith("data: "));
      if (dataLine) {
        yield JSON.parse(dataLine.slice(6));
      }
    }
  }
}
 
// Usage
for await (const event of serverSentEvents("/api/stream")) {
  console.log("Event:", event);
  updateUI(event);
}

Timer-Based Generator

async function* interval(ms) {
  let count = 0;
 
  while (true) {
    await new Promise((resolve) => setTimeout(resolve, ms));
    yield count++;
  }
}
 
// Tick every second, stop after 10
for await (const tick of interval(1000)) {
  console.log(`Tick ${tick}`);
  if (tick >= 9) break;
}

Combining With AbortController

async function* fetchWithAbort(url, signal) {
  const response = await fetch(url, { signal });
  const reader = response.body.getReader();
 
  try {
    while (true) {
      if (signal?.aborted) return;
 
      const { done, value } = await reader.read();
      if (done) break;
      yield value;
    }
  } finally {
    reader.releaseLock();
  }
}
 
// Usage
const controller = new AbortController();
 
setTimeout(() => controller.abort(), 5000); // Cancel after 5s
 
try {
  for await (const chunk of fetchWithAbort("/api/large-data", controller.signal)) {
    process(chunk);
  }
} catch (error) {
  if (error.name === "AbortError") {
    console.log("Stream canceled");
  }
}

See Using AbortController in JS: Complete Tutorial for more cancellation patterns.

Composing Async Generators

Map

async function* asyncMap(source, fn) {
  for await (const item of source) {
    yield await fn(item);
  }
}

Filter

async function* asyncFilter(source, predicate) {
  for await (const item of source) {
    if (await predicate(item)) {
      yield item;
    }
  }
}

Take

async function* asyncTake(source, count) {
  let taken = 0;
  for await (const item of source) {
    yield item;
    if (++taken >= count) return;
  }
}

Composing a Pipeline

const pipeline = asyncTake(
  asyncFilter(
    asyncMap(
      fetchPages("/api/articles"),
      async (article) => ({
        ...article,
        wordCount: article.content.split(" ").length,
      })
    ),
    async (article) => article.wordCount > 500
  ),
  10
);
 
// Get first 10 articles with >500 words across all pages
const results = [];
for await (const article of pipeline) {
  results.push(article);
}

Batch Processing

async function* asyncBatch(source, batchSize) {
  let batch = [];
 
  for await (const item of source) {
    batch.push(item);
 
    if (batch.length >= batchSize) {
      yield batch;
      batch = [];
    }
  }
 
  if (batch.length > 0) {
    yield batch;
  }
}
 
// Process users in batches of 100
for await (const batch of asyncBatch(fetchPages("/api/users"), 100)) {
  await bulkInsert(batch);
  console.log(`Inserted batch of ${batch.length} users`);
}

Error Handling

async function* resilientFetch(urls) {
  for (const url of urls) {
    try {
      const response = await fetch(url);
      if (!response.ok) throw new Error(`HTTP ${response.status}`);
      yield await response.json();
    } catch (error) {
      yield { error: error.message, url };
    }
  }
}
 
for await (const result of resilientFetch(endpoints)) {
  if (result.error) {
    console.warn(`Failed: ${result.url} - ${result.error}`);
  } else {
    processData(result);
  }
}

Async Generator vs ReadableStream

| Feature | Async Generator | ReadableStream |
| --- | --- | --- |
| Syntax | `async function*` + `for await` | Constructor + `getReader()` |
| Piping | Manual composition | Built-in `pipeThrough`/`pipeTo` |
| Backpressure | Implicit (pull-based) | Explicit (`desiredSize`) |
| Teeing | Manual | Built-in `tee()` |
| Browser support | ES2018+ | Chrome 89+, Firefox 102+ |
| Best for | Application logic, pagination | Low-level data processing |

Key Insights

  • async function* yields promises lazily: Values are only produced when the consumer requests them via .next() or for-await-of
  • break cleanly terminates generators: The generator's finally block runs, resources are released, and no more async work is initiated
  • Compose with map/filter/take: Chain async generator helpers to build declarative data pipelines that process items one at a time
  • Natural backpressure: The producer pauses at yield until the consumer is ready, preventing memory overflow without explicit coordination
  • Ideal for paginated APIs: Fetch pages on demand, yield items one at a time, and stop early when the target is found

Frequently Asked Questions

When should I use async generators vs Promise.all?

Use async generators when items must be processed sequentially, when the total count is unknown (pagination), or when you want to start processing before all data arrives. Use `Promise.all` when all items are independent and you want maximum parallelism. See [advanced JS promise patterns complete tutorial](/tutorials/programming-languages/javascript/advanced-js-promise-patterns-complete-tutorial) for promise combinators.
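To make the trade-off concrete, here is the same set of simulated tasks consumed both ways (a sketch; the task delays are arbitrary):

```javascript
const makeTask = (n) => () =>
  new Promise((resolve) => setTimeout(() => resolve(n), 20));

const tasks = [1, 2, 3].map(makeTask);

// Async generator: sequential and lazy — the next task only starts
// after the previous result has been consumed
async function* runSequentially(fns) {
  for (const fn of fns) {
    yield await fn();
  }
}

for await (const result of runSequentially(tasks)) {
  console.log("sequential:", result); // arrives one at a time
}

// Promise.all: parallel and eager — all tasks start immediately,
// and results arrive together
const all = await Promise.all(tasks.map((fn) => fn()));
console.log("parallel:", all); // [1, 2, 3]
```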

Can I use async generators in older browsers?

Async generators require ES2018 support. All modern browsers support them. For older environments, use a Babel transform or polyfill.

How do async generators handle backpressure?

Naturally. The generator pauses at each `yield` until the consumer calls `.next()`. If the consumer is slow, the producer automatically waits. No data is produced until it is requested.
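You can watch this pull-based behavior by logging on both sides (a minimal sketch; the consumer delay is arbitrary):

```javascript
async function* producer() {
  for (let i = 0; i < 3; i++) {
    console.log(`produced ${i}`); // only runs when the consumer asks
    yield i;
  }
}

for await (const value of producer()) {
  console.log(`consumed ${value}`);
  await new Promise((resolve) => setTimeout(resolve, 50)); // slow consumer
}
// Logs interleave strictly: produced 0, consumed 0, produced 1, consumed 1, ...
```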

What happens if I do not break or return from the loop?

The generator runs until it returns (exits the function body) or throws. For infinite generators like `interval()`, you must `break` or the loop runs forever.

Can I yield from another async generator?

Yes, use `yield*`: `yield* anotherAsyncGenerator()`. This delegates to the inner generator, yielding all its values before continuing.
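A minimal sketch of delegation:

```javascript
async function* inner() {
  yield 1;
  yield 2;
}

async function* outer() {
  yield 0;
  yield* inner(); // delegates: all of inner's values are yielded here
  yield 3;
}

const values = [];
for await (const v of outer()) {
  values.push(v);
}
console.log(values); // [0, 1, 2, 3]
```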

Conclusion

Async generators produce async values lazily on demand, making them perfect for paginated APIs, event streams, and data pipelines. They compose with asyncMap, asyncFilter, and asyncTake for declarative data transformation. The for-await-of loop handles the async iteration protocol automatically, and break cleanly terminates the generator. For the stream-based alternative, see JavaScript Web Streams API: A Complete Tutorial. For the event loop that schedules these async operations, see the JS Event Loop Architecture: Complete Guide.