JavaScript Web Streams API: A Complete Tutorial

A complete tutorial on the JavaScript Web Streams API. Covers ReadableStream, WritableStream, TransformStream, piping streams together, creating custom readable sources, backpressure management, streaming fetch responses, text decoding streams, and building a real-time log viewer with chunked data.

JavaScript · Intermediate · 16 min read

The Web Streams API processes data incrementally as it arrives, rather than waiting for the entire payload. This is critical for large files, real-time data feeds, and network responses where memory efficiency and time-to-first-byte matter. Streams let you start processing data before the full download completes.

Stream Types Overview

| Stream Type | Purpose | Key Class |
| --- | --- | --- |
| Readable | Source of data you consume | `ReadableStream` |
| Writable | Destination for data you produce | `WritableStream` |
| Transform | Modifies data passing through | `TransformStream` |

Reading a Fetch Response as a Stream

```javascript
async function streamFetchResponse(url) {
  const response = await fetch(url);
  const reader = response.body.getReader();
  const decoder = new TextDecoder();

  let result = "";

  while (true) {
    const { done, value } = await reader.read();

    if (done) break;

    // value is a Uint8Array chunk
    const text = decoder.decode(value, { stream: true });
    result += text;
    console.log(`Received ${value.length} bytes`);
  }

  return result;
}
```

The `response.body` is a ReadableStream. Each `reader.read()` call returns a promise that resolves with the next chunk as it arrives from the network. See how to use the JS Fetch API complete tutorial for Fetch fundamentals.

Creating a Custom ReadableStream

```javascript
function createCounterStream(limit) {
  let count = 0;

  return new ReadableStream({
    start(controller) {
      console.log("Stream started");
    },

    pull(controller) {
      if (count >= limit) {
        controller.close();
        return;
      }

      const data = { count: count++, timestamp: Date.now() };
      controller.enqueue(JSON.stringify(data) + "\n");
    },

    cancel(reason) {
      console.log("Stream canceled:", reason);
    },
  });
}

// Read the stream
const stream = createCounterStream(5);
const reader = stream.getReader();

while (true) {
  const { done, value } = await reader.read();
  if (done) break;
  console.log(value); // '{"count":0,"timestamp":...}\n'
}
```

ReadableStream Controller Methods

| Method | Description |
| --- | --- |
| `controller.enqueue(chunk)` | Push a chunk into the stream |
| `controller.close()` | Signal no more data will be produced |
| `controller.error(e)` | Signal a stream error |
| `controller.desiredSize` | How much data the consumer wants (backpressure) |
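The error path deserves a quick illustration, since the later examples only show `enqueue` and `close`. A minimal sketch (the function names here are illustrative, not part of the tutorial's running examples): `controller.error()` moves the stream to an errored state, discards any queued chunks, and rejects pending and future `read()` calls, so consumers can handle source failures with an ordinary `try/catch`.

```javascript
// Sketch: a source that fails. controller.error() errors the stream,
// so every subsequent reader.read() rejects with the given reason.
function createFailingStream() {
  return new ReadableStream({
    pull(controller) {
      controller.error(new Error("source failed"));
    },
  });
}

async function consumeWithErrorHandling() {
  const reader = createFailingStream().getReader();
  try {
    await reader.read(); // rejects once the source errors
  } catch (err) {
    return `caught: ${err.message}`;
  }
}
```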

WritableStream

```javascript
function createLogWriter() {
  const logs = [];

  const stream = new WritableStream({
    write(chunk) {
      const entry = `[${new Date().toISOString()}] ${chunk}`;
      logs.push(entry);
      console.log(entry);
    },

    close() {
      console.log(`Log closed. Total entries: ${logs.length}`);
    },

    abort(reason) {
      console.error("Log aborted:", reason);
    },
  });

  return { stream, getLogs: () => [...logs] };
}

// Write to the stream
const { stream, getLogs } = createLogWriter();
const writer = stream.getWriter();

await writer.write("Application started");
await writer.write("User logged in");
await writer.write("Data loaded");
await writer.close();

console.log(getLogs());
```

TransformStream

A TransformStream sits between a readable and writable stream, modifying data as it passes through:

```javascript
function createUpperCaseTransform() {
  return new TransformStream({
    transform(chunk, controller) {
      controller.enqueue(chunk.toUpperCase());
    },
  });
}

function createJsonParseTransform() {
  return new TransformStream({
    transform(chunk, controller) {
      try {
        const parsed = JSON.parse(chunk);
        controller.enqueue(parsed);
      } catch {
        // Skip invalid JSON lines
      }
    },
  });
}
```

Piping Streams Together

```javascript
// Pipe: ReadableStream -> TransformStream -> WritableStream
async function processStream() {
  const source = createCounterStream(10);
  const transform = createUpperCaseTransform();
  const { stream: sink, getLogs } = createLogWriter();

  await source
    .pipeThrough(transform)
    .pipeTo(sink);

  console.log("Pipeline complete:", getLogs().length, "entries");
}
```

Pipeline Methods

| Method | Description |
| --- | --- |
| `readable.pipeThrough(transform)` | Pipe through a TransformStream; returns the readable side |
| `readable.pipeTo(writable)` | Pipe directly to a WritableStream; returns a promise |
| `readable.tee()` | Split a stream into two independent branches |

Streaming a Large File Download With Progress

```javascript
async function downloadWithProgress(url, onProgress) {
  const response = await fetch(url);
  const contentLength = parseInt(response.headers.get("Content-Length") || "0", 10);
  let receivedBytes = 0;

  const progressStream = new TransformStream({
    transform(chunk, controller) {
      receivedBytes += chunk.length;
      if (contentLength > 0) {
        const percent = Math.round((receivedBytes / contentLength) * 100);
        onProgress({ receivedBytes, contentLength, percent });
      }
      controller.enqueue(chunk);
    },
  });

  const reader = response.body
    .pipeThrough(progressStream)
    .getReader();

  const chunks = [];
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    chunks.push(value);
  }

  // Combine chunks into a single Uint8Array
  const totalLength = chunks.reduce((sum, c) => sum + c.length, 0);
  const result = new Uint8Array(totalLength);
  let offset = 0;
  for (const chunk of chunks) {
    result.set(chunk, offset);
    offset += chunk.length;
  }

  return result;
}

// Usage
const data = await downloadWithProgress("/api/large-file", (progress) => {
  console.log(`${progress.percent}% (${progress.receivedBytes}/${progress.contentLength})`);
});
```

Line-by-Line Text Stream Processing

```javascript
function createLineTransform() {
  let buffer = "";

  return new TransformStream({
    transform(chunk, controller) {
      buffer += chunk;
      const lines = buffer.split("\n");

      // Keep the last incomplete line in the buffer
      buffer = lines.pop() || "";

      for (const line of lines) {
        if (line.trim()) {
          controller.enqueue(line);
        }
      }
    },

    flush(controller) {
      // Emit any remaining buffered content
      if (buffer.trim()) {
        controller.enqueue(buffer);
      }
    },
  });
}

// Stream a text file line by line
async function processLines(url) {
  const response = await fetch(url);
  const reader = response.body
    .pipeThrough(new TextDecoderStream())
    .pipeThrough(createLineTransform())
    .getReader();

  let lineNumber = 0;
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    console.log(`Line ${++lineNumber}: ${value}`);
  }
}
```

Tee: Splitting a Stream

```javascript
async function streamAndCache(url) {
  const response = await fetch(url);
  const [branch1, branch2] = response.body.tee();

  // Branch 1: display to user
  const displayPromise = branch1
    .pipeThrough(new TextDecoderStream())
    .pipeTo(new WritableStream({
      write(chunk) {
        document.getElementById("output").textContent += chunk;
      },
    }));

  // Branch 2: cache the raw bytes
  const cachePromise = (async () => {
    const reader = branch2.getReader();
    const chunks = [];
    while (true) {
      const { done, value } = await reader.read();
      if (done) break;
      chunks.push(value);
    }
    return new Blob(chunks);
  })();

  await Promise.all([displayPromise, cachePromise]);
}
```

Key Insights

  • Streams process data incrementally: No need to buffer the entire payload in memory before processing begins
  • Three stream types compose into pipelines: ReadableStream produces, TransformStream modifies, WritableStream consumes, all connected via pipeThrough and pipeTo
  • Backpressure prevents memory overflow: The consumer controls the pace; producers automatically slow down when the consumer falls behind
  • tee() enables fan-out: Split one stream into two independent readers for parallel processing like display + cache
  • TextDecoderStream handles encoding: Use it as the first transform in text pipelines to convert raw bytes to strings

Frequently Asked Questions

When should I use streams instead of regular Fetch?

Use streams when processing large files (>10MB), when you need progress tracking, when displaying data incrementally (chat, logs), or when piping data between services without buffering the entire payload in memory.

Do all browsers support the Web Streams API?

All modern browsers (Chrome 89+, Firefox 102+, Safari 14.1+) support ReadableStream and TransformStream. WritableStream support is similarly broad. Node.js supports web streams since v16.5.0 (experimental) and v18+ (stable).

What is backpressure in streams?

Backpressure is the mechanism that slows down the producer when the consumer cannot keep up. The `controller.desiredSize` property indicates how much data the consumer wants. When it reaches 0 or negative, the producer should stop enqueuing until the consumer catches up.
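To make this concrete, here is a small sketch (the function names are illustrative): with a queuing strategy of `highWaterMark: 2`, the stream buffers at most two chunks ahead of the reader, and it only invokes `pull()` while `desiredSize` is positive, so a pull-based producer is throttled automatically.

```javascript
// Sketch of pull-based backpressure. The queuing strategy's highWaterMark
// caps how many chunks are buffered ahead of the consumer; pull() is only
// called while desiredSize > 0, so the producer pauses by itself.
function createThrottledSource(limit) {
  let count = 0;
  return new ReadableStream(
    {
      pull(controller) {
        if (count >= limit) {
          controller.close();
          return;
        }
        controller.enqueue(count++);
      },
    },
    { highWaterMark: 2 } // buffer at most 2 chunks ahead of the reader
  );
}

async function drain(stream) {
  const reader = stream.getReader();
  const values = [];
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    values.push(value);
  }
  return values;
}
```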

Can I cancel a stream pipeline?

Yes. Call `reader.cancel()` on the readable side or `writer.abort()` on the writable side. Cancellation propagates through the entire pipeline. See [using AbortController in JS complete tutorial](/tutorials/programming-languages/javascript/using-abortcontroller-in-js-complete-tutorial) for abort patterns.
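As a sketch (the endless source and helper name here are made up for illustration), a consumer can stop reading early with `reader.cancel()`; for whole pipelines, `pipeTo()` also accepts an AbortSignal via its `{ signal }` option.

```javascript
// Sketch: take the first n chunks, then cancel. cancel() tells the source
// no more data is needed and runs the source's cancel(reason) hook.
async function readFirstN(stream, n) {
  const reader = stream.getReader();
  const out = [];
  while (out.length < n) {
    const { done, value } = await reader.read();
    if (done) break;
    out.push(value);
  }
  await reader.cancel("got enough");
  return out;
}

// Endless source used for the demo; without cancel() it never finishes
const endlessTicks = new ReadableStream({
  pull(controller) {
    controller.enqueue("tick");
  },
});
```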

How do streams relate to async generators?

Both process data incrementally. Async generators use `for await...of` syntax and are simpler for sequential processing. Streams support piping, teeing, and backpressure natively. See [JavaScript async generators complete tutorial](/tutorials/programming-languages/javascript/javascript-async-generators-complete-tutorial) for the generator approach.
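The two models also meet in the middle: in runtimes where ReadableStream implements the async-iterable protocol (Node.js 18+ and Firefox; Chrome added it more recently), `for await...of` can replace the manual `read()` loop. A minimal sketch, with feature support as the stated assumption:

```javascript
// Sketch: consuming a ReadableStream with for await...of. This relies on
// the stream being async iterable, which older browsers do not support.
async function sumStream(stream) {
  let total = 0;
  for await (const chunk of stream) {
    total += chunk;
  }
  return total;
}

// Small numeric source for the demo
const numbers = new ReadableStream({
  start(controller) {
    [1, 2, 3].forEach((n) => controller.enqueue(n));
    controller.close();
  },
});
```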

Conclusion

The Web Streams API enables incremental data processing with ReadableStream (source), WritableStream (sink), and TransformStream (processor). Piping chains streams into pipelines, teeing splits a stream into two branches, and backpressure prevents memory overflow. For the Fetch API that produces these streams, see how to use the JS Fetch API complete tutorial. For the event loop scheduling stream reads, see the JS event loop architecture complete guide.