The JS Event Loop Architecture Complete Guide
A complete guide to the JavaScript event loop architecture. Covers the call stack, task queue, microtask queue, how setTimeout and Promises are scheduled, requestAnimationFrame timing, event loop phases in Node.js, blocking the main thread, and real-world performance implications.
JavaScript is single-threaded, yet it handles thousands of concurrent operations. The event loop is the mechanism that makes this possible. It coordinates the call stack, task queue, and microtask queue to execute synchronous code, process callbacks, and resolve Promises in a predictable order.
The Core Components
| Component | Role | Examples |
|---|---|---|
| Call Stack | Executes function calls, LIFO | Function invocations, main() |
| Web APIs / Node APIs | Handle async operations in the background | setTimeout, fetch, DOM events |
| Task Queue (Macrotask) | Queues callbacks from Web APIs | setTimeout, setInterval, I/O |
| Microtask Queue | Queues high-priority callbacks | Promise.then, queueMicrotask, MutationObserver |
How the Event Loop Works
The event loop follows this cycle on every iteration (called a "tick"):
- Execute all synchronous code on the call stack until it is empty
- Drain the entire microtask queue (all microtasks, including ones added during processing)
- Pick one task from the task queue and push it onto the call stack
- After that task completes, drain the microtask queue again
- Optionally render (browser: requestAnimationFrame, repaint)
- Repeat
```javascript
console.log("1 - sync");
setTimeout(() => {
  console.log("4 - macrotask");
}, 0);
Promise.resolve().then(() => {
  console.log("3 - microtask");
});
console.log("2 - sync");
// Output order: 1, 2, 3, 4
```
Why This Order?
- "1" and "2" are synchronous and run immediately on the call stack
- The Promise.then() callback is a microtask, processed before the next macrotask
- The setTimeout callback is a macrotask, processed after all microtasks are drained
The Call Stack in Detail
The call stack tracks function execution context in LIFO (last-in, first-out) order:
```javascript
function multiply(a, b) {
  return a * b; // 3. runs, then pops
}
function square(n) {
  return multiply(n, n); // 2. pushes multiply
}
function printSquare(n) {
  const result = square(n); // 1. pushes square
  console.log(result);
}
printSquare(5);
// Stack growth: printSquare -> square -> multiply
// Stack shrinks: multiply pops -> square pops -> printSquare pops
```
If the call stack gets too deep (infinite recursion), you get a "Maximum call stack size exceeded" error.
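As a minimal sketch of that failure mode, a function that recurses with no base case overflows the stack and throws a catchable RangeError (the exact frame limit varies by engine and stack size):

```javascript
// Unbounded recursion: every call pushes a new frame until the engine's limit.
// The "+ 1" keeps the call non-tail so no engine can optimize it into a loop.
function recurse(n) {
  return recurse(n + 1) + 1; // no base case, so the stack never shrinks
}

try {
  recurse(0);
} catch (err) {
  // V8 reports: "Maximum call stack size exceeded"
  console.log(err instanceof RangeError, err.message);
}
```

Because the error is a regular RangeError, it can be caught and handled, but the real fix is adding a base case or converting the recursion into iteration.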
Microtasks vs Macrotasks
```javascript
setTimeout(() => console.log("timeout 1"), 0);
setTimeout(() => console.log("timeout 2"), 0);
Promise.resolve()
  .then(() => console.log("promise 1"))
  .then(() => console.log("promise 2"));
queueMicrotask(() => console.log("microtask 1"));
console.log("sync");
// Output:
// sync
// promise 1
// microtask 1
// promise 2
// timeout 1
// timeout 2
```
All microtasks run before the next macrotask, including microtasks scheduled by other microtasks (like promise 2 from promise 1's .then() chain).
Classification Table
| Macrotasks | Microtasks |
|---|---|
| setTimeout | Promise.then/catch/finally |
| setInterval | queueMicrotask() |
| setImmediate (Node) | MutationObserver |
| I/O callbacks | process.nextTick (Node, even higher priority) |
| UI rendering events | Async/await continuations |
| MessageChannel | |
setTimeout Is Not Precise
setTimeout(fn, 0) does not mean "execute immediately." It means "schedule a macrotask as soon as possible, but after microtasks and rendering":
```javascript
const start = performance.now();
setTimeout(() => {
  console.log(`Actual delay: ${performance.now() - start}ms`);
}, 0);
// Actual delay: ~1-4ms (browser minimum clamp)
```
Browsers enforce a minimum delay (typically ~1ms for the first few calls, 4ms once nesting depth exceeds 4). Node.js treats setTimeout(fn, 0) as setTimeout(fn, 1).
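A rough sketch of the nesting clamp: chaining setTimeout calls and logging the gap between ticks makes the clamp visible in a browser console (the exact thresholds are an assumption about typical browser behavior; Node.js does not apply the 4ms nesting clamp):

```javascript
// Watch the delay grow as setTimeout nesting deepens.
// In browsers, the gap typically jumps to ~4ms once nesting depth exceeds 4.
let depth = 0;
let last = performance.now();

function tick() {
  const now = performance.now();
  console.log(`depth ${depth}: +${(now - last).toFixed(1)}ms`);
  last = now;
  if (++depth < 8) setTimeout(tick, 0); // reschedule, increasing nesting depth
}

setTimeout(tick, 0);
```

If you need sub-millisecond scheduling without the clamp, a common workaround is posting to a MessageChannel port, which enqueues a macrotask with no minimum delay.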
requestAnimationFrame
requestAnimationFrame runs before the browser paints the next frame, after microtasks:
```javascript
console.log("1 - sync");
requestAnimationFrame(() => {
  console.log("3 - rAF (before paint)");
});
setTimeout(() => {
  console.log("4 - macrotask");
}, 0);
Promise.resolve().then(() => {
  console.log("2 - microtask");
});
// Typical order: 1, 2, 3, 4
// (rAF timing can vary; it runs before the next paint, which is usually before the next macrotask)
```
Use requestAnimationFrame for visual updates to ensure smooth 60fps rendering.
Blocking the Main Thread
Long synchronous operations block everything, including rendering and event handling:
```javascript
// BAD: blocks the event loop for ~5 seconds
function heavyComputation() {
  const start = Date.now();
  while (Date.now() - start < 5000) {
    // spin
  }
  console.log("done");
}
heavyComputation(); // UI frozen for 5 seconds
```
Solutions for Heavy Work
```javascript
// Option 1: Break work into chunks with setTimeout
function processChunked(items, chunkSize = 100) {
  let index = 0;
  function processChunk() {
    const end = Math.min(index + chunkSize, items.length);
    for (; index < end; index++) {
      // process items[index]
    }
    if (index < items.length) {
      setTimeout(processChunk, 0); // yield to the event loop
    }
  }
  processChunk();
}

// Option 2: Use a Web Worker for true parallelism
const worker = new Worker("heavy-task.js");
worker.postMessage({ data: largeDataSet });
worker.onmessage = (e) => console.log("Result:", e.data);
```
Node.js Event Loop Phases
Node.js has a more granular event loop with distinct phases:
```
   ┌───────────────────────────┐
┌─>│           timers          │  setTimeout, setInterval callbacks
│  └─────────────┬─────────────┘
│  ┌─────────────┴─────────────┐
│  │     pending callbacks     │  I/O callbacks deferred from prev cycle
│  └─────────────┬─────────────┘
│  ┌─────────────┴─────────────┐
│  │       idle, prepare       │  internal use only
│  └─────────────┬─────────────┘
│  ┌─────────────┴─────────────┐
│  │           poll            │  retrieve new I/O events
│  └─────────────┬─────────────┘
│  ┌─────────────┴─────────────┐
│  │           check           │  setImmediate callbacks
│  └─────────────┬─────────────┘
│  ┌─────────────┴─────────────┐
│  │      close callbacks      │  socket.on('close', ...)
│  └─────────────┬─────────────┘
└────────────────┘
```
Between phases (and, since Node 11, after each individual callback), Node drains the process.nextTick queue first, then the Promise microtask queue.
Async/Await and the Event Loop
async/await is syntactic sugar over Promises. The code after await runs as a microtask:
```javascript
async function example() {
  console.log("1 - before await");
  await Promise.resolve();
  console.log("3 - after await (microtask)");
}
example();
console.log("2 - sync after call");
// Output: 1, 2, 3
```
Everything after await is equivalent to code inside .then().
Common Pitfalls
Microtask Starvation
If microtasks keep scheduling more microtasks, macrotasks never run:
```javascript
// WARNING: This starves the macrotask queue
function infinite() {
  Promise.resolve().then(infinite);
}
infinite(); // setTimeout callbacks will never execute
```
Assuming setTimeout Order at Same Delay
```javascript
setTimeout(() => console.log("a"), 0);
setTimeout(() => console.log("b"), 0);
// Usually "a" then "b", but not strictly guaranteed
// across different engines
```
Key Insights
- JavaScript is single-threaded: One call stack, one line of code at a time; concurrency comes from the event loop, not parallelism
- Microtasks always run before the next macrotask: Promise callbacks and queueMicrotask have priority over setTimeout and I/O callbacks
- The entire microtask queue drains every tick: Including microtasks scheduled by other microtasks, which can cause starvation if unbounded
- setTimeout(fn, 0) is not instant: Browsers clamp minimum delay to 1-4ms, and the callback waits for all microtasks to complete first
- Long synchronous code blocks everything: Break heavy work into chunks with setTimeout or offload to Web Workers for true parallelism
Frequently Asked Questions
What is the difference between process.nextTick and queueMicrotask?
process.nextTick is Node-only and has its own queue that drains before the standard microtask queue; queueMicrotask is a web standard and shares a queue with Promise callbacks. Prefer queueMicrotask in new code, since heavy nextTick use can starve I/O.
Does the event loop exist in Web Workers?
Yes. Each worker runs its own event loop with its own call stack and queues, processing messages independently of the main thread.
Can I observe the event loop directly?
Not directly, but you can infer its behavior: compare timestamps with performance.now(), watch for long tasks with PerformanceObserver in the browser, or use monitorEventLoopDelay from Node's perf_hooks module.
Why do Promises resolve before setTimeout even with a 0ms delay?
Promise callbacks are microtasks, and the entire microtask queue drains before the next macrotask; setTimeout callbacks are macrotasks and are additionally subject to the minimum-delay clamp.
Conclusion
The event loop is JavaScript's concurrency mechanism. Synchronous code runs to completion on the call stack. Microtasks (Promises, queueMicrotask) run between macrotasks. Macrotasks (setTimeout, I/O) run one per loop iteration. Understanding this order is essential for debugging async behavior and avoiding UI freezes. For the fetch API that relies on this async architecture, see how to use the JS fetch API complete tutorial. For optional chaining patterns used in async callbacks, see advanced JS optional chaining complete guide.