Call Stack vs Task Queue vs Microtask Queue in JS
Master the three core mechanisms of JavaScript's concurrency model. Covers the call stack execution model, macrotask queue scheduling, microtask queue priority, their interaction with the event loop, visualizing execution order, and debugging async timing issues.
JavaScript's single-threaded execution model relies on three interconnected mechanisms: the call stack processes synchronous code, the task queue holds deferred callbacks, and the microtask queue handles promise continuations with higher priority. Understanding how these interact is essential for predicting async execution order.
For the full event loop algorithm that coordinates these queues, see JavaScript Event Loop Internals Full Guide.
The Call Stack
// The call stack is a LIFO (Last In, First Out) data structure
// that tracks function execution context
// CALL STACK MODEL
class CallStack {
  #frames = [];
  #maxDepth;
  constructor(maxDepth = 15000) {
    this.#maxDepth = maxDepth;
  }
  push(frame) {
    if (this.#frames.length >= this.#maxDepth) {
      throw new RangeError("Maximum call stack size exceeded");
    }
    this.#frames.push(frame);
    return this;
  }
  pop() {
    return this.#frames.pop();
  }
  peek() {
    return this.#frames[this.#frames.length - 1];
  }
  get depth() {
    return this.#frames.length;
  }
  get isEmpty() {
    return this.#frames.length === 0;
  }
  dump() {
    console.log("--- Call Stack ---");
    for (let i = this.#frames.length - 1; i >= 0; i--) {
      console.log(`  ${this.#frames[i].name} (${this.#frames[i].file}:${this.#frames[i].line})`);
    }
    console.log("------------------");
  }
}
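To see the model in action, here is a standalone sketch: a pared-down stack with the same push/pop/depth contract as the class above (the name MiniStack and the tiny maxDepth are illustrative choices so the overflow is easy to trigger).

```javascript
// Condensed copy of the call-stack model: push on call, pop on return,
// throw once depth exceeds the configured limit.
class MiniStack {
  #frames = [];
  #maxDepth;
  constructor(maxDepth = 15000) { this.#maxDepth = maxDepth; }
  push(frame) {
    if (this.#frames.length >= this.#maxDepth) {
      throw new RangeError("Maximum call stack size exceeded");
    }
    this.#frames.push(frame);
    return this;
  }
  pop() { return this.#frames.pop(); }
  get depth() { return this.#frames.length; }
}

// Simulate main() -> first() -> helper(): one frame per active call.
const stack = new MiniStack(3);
stack.push({ name: "main" }).push({ name: "first" }).push({ name: "helper" });
const depthAtPeak = stack.depth; // 3 frames deep at the innermost call
stack.pop(); stack.pop(); stack.pop(); // each return pops one frame
const depthAfter = stack.depth; // 0: the stack is empty again

// Pushing one frame past maxDepth reproduces the overflow error.
stack.push({ name: "a" }).push({ name: "b" }).push({ name: "c" });
let overflowed = false;
try {
  stack.push({ name: "d" });
} catch (e) {
  overflowed = e instanceof RangeError;
}
console.log(depthAtPeak, depthAfter, overflowed); // 3 0 true
```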
// EXECUTION TRACE EXAMPLE
function main() {       // Stack: [main]
  const a = first();    // Stack: [main, first]
  const b = second();   // Stack: [main, second]
  return a + b;         // Stack: [main] -> returns, stack empty
}
function first() {      // Stack: [main, first]
  return helper(10);    // Stack: [main, first, helper]
}                       // helper returns -> [main, first] -> first returns -> [main]
function second() {     // Stack: [main, second]
  return 20;            // Stack: [main, second] -> second returns -> [main]
}
function helper(n) {    // Stack: [main, first, helper]
  return n * 2;         // Returns 20, pops helper
}
main(); // Returns 40
// STACK OVERFLOW
function infiniteRecursion() {
  return infiniteRecursion(); // RangeError: Maximum call stack size exceeded
}
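A common workaround for deep recursion (not covered in the listing above, so treat this as a supplementary sketch) is a trampoline: each step returns a thunk instead of recursing, and a loop evaluates the thunks one at a time, so stack depth stays constant. The names trampoline and countDown are illustrative.

```javascript
// Trampoline: run a function, and as long as it returns another function,
// call that in a LOOP rather than growing the call stack.
function trampoline(fn) {
  let result = fn();
  while (typeof result === "function") {
    result = result(); // one bounded-depth call per step
  }
  return result;
}

function countDown(n) {
  // Instead of calling itself (one stack frame per level),
  // return a thunk describing the next step.
  return n === 0 ? "done" : () => countDown(n - 1);
}

// A million "recursive" steps without a RangeError.
const result = trampoline(() => countDown(1_000_000));
console.log(result); // "done"
```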
// RUN-TO-COMPLETION
// Once a function starts, it runs until it returns (or throws)
// No other code can interrupt it on the same thread
// This is why long synchronous operations freeze the UI
function blockingOperation() {
  const start = Date.now();
  while (Date.now() - start < 5000) {
    // Blocks the call stack for 5 seconds
    // No callbacks, no rendering, no I/O during this time
  }
}
The Task Queue (Macrotask Queue)
// The task queue holds callbacks from async APIs
// Only ONE task is dequeued per event loop iteration
// COMMON MACROTASK SOURCES:
// - setTimeout(callback, delay)
// - setInterval(callback, interval)
// - setImmediate(callback) (Node.js)
// - I/O callbacks (fs.readFile, net.connect, etc.)
// - UI rendering events (click, scroll, resize)
// - MessageChannel.port.onmessage
class TaskQueue {
  #queue = [];
  enqueue(task) {
    this.#queue.push({
      callback: task.callback,
      source: task.source,
      scheduledAt: Date.now()
    });
  }
  dequeueOne() {
    return this.#queue.shift() || null;
  }
  get length() {
    return this.#queue.length;
  }
  get isEmpty() {
    return this.#queue.length === 0;
  }
}
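The one-task-per-iteration contract can be exercised directly. The sketch below uses a condensed copy of the queue (named MiniTaskQueue here; it stores bare callbacks instead of wrapped records) to show that dequeueOne hands back exactly one task per loop turn, in FIFO order.

```javascript
// Condensed task queue: FIFO, one dequeue per event-loop iteration.
class MiniTaskQueue {
  #queue = [];
  enqueue(task) { this.#queue.push(task); }
  dequeueOne() { return this.#queue.shift() || null; }
  get isEmpty() { return this.#queue.length === 0; }
}

const queue = new MiniTaskQueue();
const ran = [];
queue.enqueue(() => ran.push("timeout A"));
queue.enqueue(() => ran.push("timeout B"));

// One loop iteration runs exactly ONE macrotask...
queue.dequeueOne()();
const afterOneTurn = [...ran]; // ["timeout A"]

// ...the next iteration runs the next, preserving enqueue order.
queue.dequeueOne()();
console.log(ran, queue.isEmpty); // ["timeout A", "timeout B"] true
```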
// TASK QUEUE BEHAVIOR
console.log("1: sync start");
setTimeout(() => console.log("2: timeout A"), 0);
setTimeout(() => console.log("3: timeout B"), 0);
setTimeout(() => console.log("4: timeout C"), 0);
console.log("5: sync end");
// OUTPUT:
// 1: sync start (call stack - synchronous)
// 5: sync end (call stack - synchronous)
// 2: timeout A (task queue - first macrotask)
// [microtask checkpoint - empty]
// 3: timeout B (task queue - second macrotask)
// [microtask checkpoint - empty]
// 4: timeout C (task queue - third macrotask)
// EACH setTimeout callback is a SEPARATE macrotask
// Between each macrotask, ALL microtasks are drained
// This means a microtask scheduled inside timeout A
// runs BEFORE timeout B
setTimeout(() => {
  console.log("A: macrotask 1");
  Promise.resolve().then(() => console.log("B: microtask from A"));
}, 0);
setTimeout(() => {
  console.log("C: macrotask 2");
}, 0);
// OUTPUT:
// A: macrotask 1
// B: microtask from A (drained before next macrotask)
// C: macrotask 2
The Microtask Queue
// The microtask queue is drained COMPLETELY after each macrotask
// and after the initial script execution
// MICROTASK SOURCES:
// - Promise.prototype.then / catch / finally
// - queueMicrotask(callback)
// - MutationObserver (browser)
// - process.nextTick (Node.js; a separate queue drained even before promise microtasks)
class MicrotaskQueue {
  #queue = [];
  enqueue(microtask) {
    this.#queue.push({
      callback: microtask.callback,
      source: microtask.source
    });
  }
  // ALL microtasks are drained, including ones added during draining
  drainAll(callStack) {
    while (this.#queue.length > 0) {
      const task = this.#queue.shift();
      callStack.push({ name: task.source, file: "microtask", line: 0 });
      task.callback();
      callStack.pop();
    }
  }
  get length() {
    return this.#queue.length;
  }
}
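The key property of drainAll is that the while condition re-checks the queue length on every pass, so microtasks enqueued during the drain still run in the same pass. A minimal standalone sketch (plain arrays instead of the class; the names pending and seen are illustrative):

```javascript
// Minimal drain loop with the same semantics as drainAll above.
const pending = [];
const seen = [];

function drainAll() {
  // Re-checking length each pass means items enqueued
  // DURING the drain are also processed now.
  while (pending.length > 0) {
    pending.shift()();
  }
}

pending.push(() => {
  seen.push("microtask 1");
  pending.push(() => seen.push("nested")); // queued mid-drain, runs this pass
});
pending.push(() => seen.push("microtask 2"));

drainAll();
console.log(seen.join(", ")); // microtask 1, microtask 2, nested
```

Note the ordering: the nested microtask is appended to the back of the queue, so it runs after "microtask 2" but still before the drain returns, exactly as in the queueMicrotask demo below.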
// MICROTASK DRAINING DEMO
console.log("1: script start");
queueMicrotask(() => {
  console.log("2: microtask 1");
  queueMicrotask(() => {
    console.log("3: nested microtask");
    queueMicrotask(() => {
      console.log("4: deeply nested microtask");
    });
  });
});
queueMicrotask(() => console.log("5: microtask 2"));
console.log("6: script end");
// OUTPUT:
// 1: script start (synchronous)
// 6: script end (synchronous)
// 2: microtask 1 (first microtask)
// 5: microtask 2 (second microtask, queued before nested)
// 3: nested microtask (queued during microtask 1)
// 4: deeply nested microtask (queued during nested microtask)
//
// ALL microtasks drain before ANY macrotask runs
// Nested microtasks are appended to the same queue and processed immediately
// STARVATION RISK
// Microtasks can prevent macrotasks from ever running
function microtaskStarvation() {
  let count = 0;
  function recurseMicrotask() {
    count++;
    if (count < 1_000_000) {
      queueMicrotask(recurseMicrotask);
    }
  }
  queueMicrotask(recurseMicrotask);
  setTimeout(() => console.log("This runs after 1M microtasks"), 0);
}
Complete Execution Model
// FULL MODEL: Call Stack + Task Queue + Microtask Queue
class JSRuntime {
  #callStack = [];
  #taskQueue = [];
  #microtaskQueue = [];
  // Execute synchronous code
  execute(fn) {
    this.#callStack.push(fn.name || "anonymous");
    fn();
    this.#callStack.pop();
    // After each call stack empties, drain microtasks
    this.#drainMicrotasks();
  }
  // Schedule a macrotask
  scheduleTask(callback) {
    this.#taskQueue.push(callback);
  }
  // Schedule a microtask
  scheduleMicrotask(callback) {
    this.#microtaskQueue.push(callback);
  }
  // The event loop
  runEventLoop(maxIterations = 1000) {
    let iterations = 0;
    while (this.#taskQueue.length > 0 && iterations < maxIterations) {
      // Step 1: Pick ONE macrotask
      const task = this.#taskQueue.shift();
      this.#callStack.push("macrotask");
      task();
      this.#callStack.pop();
      // Step 2: Drain ALL microtasks
      this.#drainMicrotasks();
      iterations++;
    }
  }
  #drainMicrotasks() {
    while (this.#microtaskQueue.length > 0) {
      const micro = this.#microtaskQueue.shift();
      this.#callStack.push("microtask");
      micro();
      this.#callStack.pop();
    }
  }
}
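The whole policy can be condensed into a few lines and driven synchronously, which makes the predicted ordering checkable without real timers. In this sketch (plain arrays instead of the class; labels are illustrative), promise chaining is modeled by having one microtask enqueue the next, since a chained .then is only queued once the previous handler settles.

```javascript
// Condensed runtime: two queues, a drain helper, and a log of what ran when.
const log = [];
const tasks = [];
const micro = [];
const drainMicrotasks = () => { while (micro.length > 0) micro.shift()(); };

// "Script" phase: synchronous code runs and schedules deferred work.
log.push("A: script start");
tasks.push(() => log.push("B: setTimeout 1"));
tasks.push(() => {
  log.push("C: setTimeout 2");
  micro.push(() => log.push("D: promise in setTimeout 2"));
});
micro.push(() => {
  log.push("E: promise 1");
  micro.push(() => log.push("F: promise 2")); // chained .then queues only after E runs
});
micro.push(() => {
  log.push("G: queueMicrotask");
  micro.push(() => log.push("H: nested queueMicrotask"));
});
log.push("I: script end");

// Event loop: drain microtasks once the script's stack empties,
// then one macrotask followed by a full microtask drain per iteration.
drainMicrotasks();
while (tasks.length > 0) {
  tasks.shift()();
  drainMicrotasks();
}

const order = log.map((entry) => entry[0]).join(""); // "AIEGFHBCD"
console.log(order);
```

The resulting order matches what a real engine produces for the equivalent script: sync code first, then the full microtask drain, then one macrotask at a time with a drain after each.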
// COMPREHENSIVE EXECUTION ORDER EXAMPLE
console.log("A: script start"); // 1. sync
setTimeout(() => console.log("B: setTimeout 1"), 0); // -> task queue
setTimeout(() => { // -> task queue
  console.log("C: setTimeout 2");
  Promise.resolve().then(() => console.log("D: promise in setTimeout 2"));
}, 0);
Promise.resolve()
  .then(() => console.log("E: promise 1"))  // -> microtask queue
  .then(() => console.log("F: promise 2")); // -> microtask queue (later)
queueMicrotask(() => { // -> microtask queue
  console.log("G: queueMicrotask");
  queueMicrotask(() => console.log("H: nested queueMicrotask"));
});
console.log("I: script end"); // 2. sync
// EXECUTION ORDER:
// A: script start (sync - call stack)
// I: script end (sync - call stack)
// --- call stack empty, drain microtasks ---
// E: promise 1 (microtask)
// G: queueMicrotask (microtask)
// F: promise 2 (microtask - chained from E)
// H: nested queueMicrotask (microtask - queued during G)
// --- all microtasks drained, pick macrotask ---
// B: setTimeout 1 (macrotask)
// --- drain microtasks (none) ---
// C: setTimeout 2 (macrotask)
// --- drain microtasks ---
// D: promise in setTimeout 2 (microtask from C)
Practical Patterns and Pitfalls
// PATTERN 1: Yielding to the event loop
// Use setTimeout(fn, 0) to let pending I/O and rendering happen
async function processLargeArray(items) {
  const results = [];
  for (let i = 0; i < items.length; i++) {
    results.push(heavyComputation(items[i]));
    // Yield every 100 items via macrotask
    if (i % 100 === 99) {
      await new Promise((resolve) => setTimeout(resolve, 0));
      // This creates a macrotask, allowing:
      // - Pending I/O callbacks to fire
      // - Browser to render a frame
      // - Other macrotasks to run
    }
  }
  return results;
}
function heavyComputation(item) {
  let result = 0;
  for (let i = 0; i < 10000; i++) result += Math.sqrt(item * i);
  return result;
}
// PITFALL 1: await does NOT yield to macrotasks
async function awaitPitfall() {
  console.log("A: start");
  // await creates a MICROTASK, not a macrotask
  await Promise.resolve();
  console.log("B: after await");
  // B runs as a microtask, BEFORE any setTimeout callbacks
}
setTimeout(() => console.log("C: timeout"), 0);
awaitPitfall();
// Output: A: start -> B: after await -> C: timeout
// PITFALL 2: Promise.resolve() vs new Promise()
// Promise.resolve(value) schedules .then as microtask immediately
// new Promise((resolve) => resolve(value)) does too, but the
// executor runs synchronously
console.log("1");
new Promise((resolve) => {
  console.log("2"); // This is SYNCHRONOUS (executor runs immediately)
  resolve();
}).then(() => console.log("3")); // Microtask
console.log("4");
// Output: 1, 2, 4, 3
// PITFALL 3: Thenable objects add extra microtask ticks
const thenable = {
  then(resolve) {
    resolve(42);
  }
};
Promise.resolve(thenable).then((val) => console.log("thenable:", val));
Promise.resolve(100).then((val) => console.log("direct:", val));
// Output: direct: 100 -> thenable: 42
// Thenables add an extra microtask tick for unwrapping
// PATTERN 2: Immediate vs deferred execution
function scheduleWork(callback) {
  // Runs BEFORE I/O callbacks (microtask)
  queueMicrotask(callback);
}
function deferWork(callback) {
  // Runs AFTER I/O callbacks (macrotask)
  setTimeout(callback, 0);
}
// PATTERN 3: Batching DOM updates with microtasks
class DOMBatcher {
  #pendingUpdates = [];
  #scheduled = false;
  update(element, property, value) {
    this.#pendingUpdates.push({ element, property, value });
    if (!this.#scheduled) {
      this.#scheduled = true;
      // Batch all updates and apply in one microtask
      queueMicrotask(() => this.#flush());
    }
  }
  #flush() {
    for (const { element, property, value } of this.#pendingUpdates) {
      element.style[property] = value;
    }
    this.#pendingUpdates = [];
    this.#scheduled = false;
  }
}
| Feature | Call Stack | Task Queue (Macrotask) | Microtask Queue |
|---|---|---|---|
| Structure | LIFO stack | FIFO queue | FIFO queue |
| Processing | Run to completion | One per loop iteration | All drained per cycle |
| Priority | Immediate | Low | High (after call stack) |
| Sources | Function calls | setTimeout, I/O, events | Promise.then, queueMicrotask |
| Can starve | Blocks everything | Can be starved by microtasks | Can starve macrotasks |
| Overflow | RangeError (stack) | Memory limit (unbounded) | Memory limit (unbounded) |
Key Insights
- The call stack runs synchronous code to completion before anything else can execute: No callbacks, microtasks, or macrotasks can interrupt a running call stack frame
- The microtask queue is drained completely after each macrotask and after the initial script execution: Promise chains resolve fully before timers or I/O callbacks fire
- Only one macrotask is processed per event loop iteration, followed by a full microtask drain: This ensures microtasks always get priority over pending macrotasks
- Recursive microtasks can starve macrotasks and block rendering because the queue must empty before proceeding: Use setTimeout to yield to the event loop during heavy microtask work
- Understanding the priority hierarchy (call stack, nextTick, microtasks, macrotasks) is essential for debugging async timing bugs: Most unexpected execution ordering stems from confusing microtask and macrotask scheduling
Frequently Asked Questions
Why are microtasks processed before the next macrotask?
The event loop specification requires a microtask checkpoint after each task and whenever the call stack empties. This guarantees promise continuations observe a consistent state before any timer or I/O callback gets a turn.
Can microtasks block rendering in the browser?
Yes. Rendering opportunities only occur between macrotasks, and the microtask queue must drain completely first, so a self-rescheduling microtask chain can starve both rendering and timers.
What is the maximum call stack depth in JavaScript?
It is not specified by the language. The limit depends on the engine, the platform, and the size of each stack frame; typical limits are on the order of ten thousand to several tens of thousands of frames before a RangeError is thrown.
How does process.nextTick differ from queueMicrotask in Node.js?
process.nextTick callbacks go into a separate queue that Node drains before the promise/queueMicrotask queue, giving them even higher priority. Recursive nextTick scheduling carries the same starvation risk as recursive microtasks.
Conclusion
The call stack, task queue, and microtask queue form the foundation of JavaScript's concurrency model. Synchronous code runs on the call stack until it completes. Microtasks (promises) are processed with high priority after each call stack frame and between each macrotask. Macrotasks (timers, I/O) run one at a time with microtask checkpoints between them. For the event loop that orchestrates these queues, see JavaScript Event Loop Internals Full Guide. For how libuv provides the underlying I/O mechanism, explore Understanding libuv and JS Asynchronous I/O.