JavaScript Async Generators: Complete Tutorial
A complete tutorial on JavaScript async generators. Covers async generator syntax with async function*, yielding promises, consuming with for-await-of, paginated API iteration, real-time event streaming, combining async generators with AbortController, lazy data pipelines, error handling in async iteration, and async generator vs ReadableStream comparison.
Async generators combine the lazy evaluation of generators with the asynchronous power of promises. They produce values on demand, one at a time, where each value can involve an async operation like a network request or timer. This makes them ideal for paginated APIs, real-time event streams, and lazy data pipelines.
Syntax: async function*
async function* countAsync(limit) {
for (let i = 0; i < limit; i++) {
// Simulate async work
await new Promise((resolve) => setTimeout(resolve, 100));
yield i;
}
}
// Consume with for-await-of
for await (const value of countAsync(5)) {
console.log(value); // 0, 1, 2, 3, 4 (each 100ms apart)
}
The async function* declaration creates a function that returns an async iterator. Each yield pauses execution until the consumer requests the next value.
Async Iterator Protocol
| Method | Returns | Description |
|---|---|---|
| iterator.next() | Promise<{value, done}> | Get the next value |
| iterator.return(value) | Promise<{value, done: true}> | Signal early termination |
| iterator.throw(error) | Promise<{value, done}> | Inject an error into the generator |
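The return() method from the table can be exercised directly. A minimal sketch, reusing countAsync from above (redefined here so the snippet is self-contained):

```javascript
async function* countAsync(limit) {
  for (let i = 0; i < limit; i++) {
    await new Promise((resolve) => setTimeout(resolve, 10));
    yield i;
  }
}

(async () => {
  const it = countAsync(10);
  console.log(await it.next());     // { value: 0, done: false }
  // return() finishes the generator early; its argument becomes `value`
  console.log(await it.return(99)); // { value: 99, done: true }
  // Once finished, further next() calls report done
  console.log(await it.next());     // { value: undefined, done: true }
})();
```

This is the same mechanism for-await-of uses under the hood when you break out of a loop.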
const gen = countAsync(3);
const first = await gen.next(); // { value: 0, done: false }
const second = await gen.next(); // { value: 1, done: false }
const third = await gen.next(); // { value: 2, done: false }
const end = await gen.next(); // { value: undefined, done: true }
Paginated API Iterator
async function* fetchPages(baseUrl, pageSize = 20) {
let page = 1;
let hasMore = true;
while (hasMore) {
const response = await fetch(`${baseUrl}?page=${page}&limit=${pageSize}`);
if (!response.ok) throw new Error(`HTTP ${response.status}`);
const data = await response.json();
for (const item of data.items) {
yield item;
}
hasMore = data.items.length === pageSize;
page++;
}
}
// Consume: processes items one at a time across all pages
for await (const user of fetchPages("/api/users", 50)) {
console.log(user.name);
// Stop early if needed
if (user.name === "target") break;
}
When you break out of the loop, the generator's return() method is called automatically, so no more pages are fetched.
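The cleanup guarantee can be made visible with a try/finally inside the generator. A minimal sketch with the network request simulated by a timer:

```javascript
async function* pages() {
  try {
    let page = 1;
    while (true) {
      await new Promise((resolve) => setTimeout(resolve, 10)); // simulated fetch
      yield `page ${page++}`;
    }
  } finally {
    // Runs when the consumer breaks: return() resumes the generator here
    console.log("cleanup: no more pages will be fetched");
  }
}

(async () => {
  for await (const page of pages()) {
    console.log(page);
    if (page === "page 2") break; // triggers pages()'s return() automatically
  }
})();
```

The finally block is where you would release resources such as open readers or database cursors.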
Real-Time Event Stream
async function* serverSentEvents(url) {
const response = await fetch(url);
const reader = response.body
.pipeThrough(new TextDecoderStream())
.getReader();
let buffer = "";
while (true) {
const { done, value } = await reader.read();
if (done) break;
buffer += value;
const lines = buffer.split("\n\n");
buffer = lines.pop() || "";
for (const block of lines) {
const dataLine = block.split("\n").find((l) => l.startsWith("data: "));
if (dataLine) {
yield JSON.parse(dataLine.slice(6));
}
}
}
}
// Usage
for await (const event of serverSentEvents("/api/stream")) {
console.log("Event:", event);
updateUI(event);
}
Timer-Based Generator
async function* interval(ms) {
let count = 0;
while (true) {
await new Promise((resolve) => setTimeout(resolve, ms));
yield count++;
}
}
// Tick every second, stop after 10
for await (const tick of interval(1000)) {
console.log(`Tick ${tick}`);
if (tick >= 9) break;
}
Combining With AbortController
async function* fetchWithAbort(url, signal) {
const response = await fetch(url, { signal });
const reader = response.body.getReader();
try {
while (true) {
if (signal?.aborted) return;
const { done, value } = await reader.read();
if (done) break;
yield value;
}
} finally {
reader.releaseLock();
}
}
// Usage
const controller = new AbortController();
setTimeout(() => controller.abort(), 5000); // Cancel after 5s
try {
for await (const chunk of fetchWithAbort("/api/large-data", controller.signal)) {
process(chunk);
}
} catch (error) {
if (error.name === "AbortError") {
console.log("Stream canceled");
}
}
See the companion tutorial on using AbortController in JS for more cancellation patterns.
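The same idea applies to the timer-based generator above. A sketch of an interval generator that stops when an AbortSignal fires, clearing the pending timer so no stray callbacks remain (abortableInterval is an illustrative name, not a standard API):

```javascript
async function* abortableInterval(ms, signal) {
  let count = 0;
  while (!signal?.aborted) {
    await new Promise((resolve) => {
      const onAbort = () => { clearTimeout(id); resolve(); };
      const id = setTimeout(() => {
        signal?.removeEventListener("abort", onAbort);
        resolve();
      }, ms);
      // { once: true } removes the listener after it fires
      signal?.addEventListener("abort", onAbort, { once: true });
    });
    if (signal?.aborted) return; // finish cleanly instead of yielding a stale tick
    yield count++;
  }
}

// Usage: tick every 30ms, abort after ~120ms
(async () => {
  const controller = new AbortController();
  setTimeout(() => controller.abort(), 120);
  for await (const tick of abortableInterval(30, controller.signal)) {
    console.log(`tick ${tick}`);
  }
  console.log("stopped");
})();
```

Unlike breaking from the consumer side, this lets any holder of the controller stop the stream externally.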
Composing Async Generators
Map
async function* asyncMap(source, fn) {
for await (const item of source) {
yield await fn(item);
}
}
Filter
async function* asyncFilter(source, predicate) {
for await (const item of source) {
if (await predicate(item)) {
yield item;
}
}
}
Take
async function* asyncTake(source, count) {
let taken = 0;
for await (const item of source) {
yield item;
if (++taken >= count) return;
}
}
Composing a Pipeline
const pipeline = asyncTake(
asyncFilter(
asyncMap(
fetchPages("/api/articles"),
async (article) => ({
...article,
wordCount: article.content.split(" ").length,
})
),
async (article) => article.wordCount > 500
),
10
);
// Get first 10 articles with >500 words across all pages
const results = [];
for await (const article of pipeline) {
results.push(article);
}
Batch Processing
async function* asyncBatch(source, batchSize) {
let batch = [];
for await (const item of source) {
batch.push(item);
if (batch.length >= batchSize) {
yield batch;
batch = [];
}
}
if (batch.length > 0) {
yield batch;
}
}
// Process users in batches of 100
for await (const batch of asyncBatch(fetchPages("/api/users"), 100)) {
await bulkInsert(batch);
console.log(`Inserted batch of ${batch.length} users`);
}
Error Handling
async function* resilientFetch(urls) {
for (const url of urls) {
try {
const response = await fetch(url);
if (!response.ok) throw new Error(`HTTP ${response.status}`);
yield await response.json();
} catch (error) {
yield { error: error.message, url };
}
}
}
for await (const result of resilientFetch(endpoints)) {
if (result.error) {
console.warn(`Failed: ${result.url} - ${result.error}`);
} else {
processData(result);
}
}
Async Generator vs ReadableStream
| Feature | Async Generator | ReadableStream |
|---|---|---|
| Syntax | async function* + for await | Constructor + getReader() |
| Piping | Manual composition | Built-in pipeThrough/pipeTo |
| Backpressure | Implicit (pull-based) | Explicit (desiredSize) |
| Teeing | Manual | Built-in tee() |
| Browser support | ES2018+ | Chrome 89+, Firefox 102+ |
| Best for | Application logic, pagination | Low-level data processing |
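The two models also interoperate. A minimal sketch wrapping an async generator in a ReadableStream via the underlying source's pull() and cancel() hooks (generatorToStream is an illustrative helper, not a built-in):

```javascript
function generatorToStream(gen) {
  const iterator = gen[Symbol.asyncIterator]();
  return new ReadableStream({
    // pull() is called when the stream's internal queue has room,
    // so the generator's pull-based laziness is preserved
    async pull(controller) {
      const { value, done } = await iterator.next();
      if (done) controller.close();
      else controller.enqueue(value);
    },
    // cancel() propagates consumer cancellation back to the generator,
    // running its finally blocks
    async cancel(reason) {
      await iterator.return?.(reason);
    },
  });
}

// Usage
async function* letters() {
  yield "a";
  yield "b";
}
const stream = generatorToStream(letters());
```

This gives a generator-backed source access to built-in stream plumbing like pipeThrough and tee().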
Key Insights
- async function* yields promises lazily: Values are only produced when the consumer requests them via .next() or for-await-of
- break cleanly terminates generators: The generator's finally block runs, resources are released, and no more async work is initiated
- Compose with map/filter/take: Chain async generator helpers to build declarative data pipelines that process items one at a time
- Natural backpressure: The producer pauses at yield until the consumer is ready, preventing memory overflow without explicit coordination
- Ideal for paginated APIs: Fetch pages on demand, yield items one at a time, and stop early when the target is found
Conclusion
Async generators produce async values lazily on demand, making them ideal for paginated APIs, event streams, and data pipelines. They compose with asyncMap, asyncFilter, and asyncTake for declarative data transformation. The for-await-of loop handles the async iteration protocol automatically, and break cleanly terminates the generator. For the stream-based alternative, see the JavaScript Web Streams API complete tutorial. For the event loop that schedules these async operations, see the JS event loop architecture complete guide.