Advanced JS Promise Patterns: Complete Tutorial
A complete tutorial on advanced JavaScript promise patterns. Covers Promise.allSettled for resilient parallel execution, Promise.race for timeouts, Promise.any for fastest success, sequential promise chains, promise pooling for concurrency control, retry with promises, promise memoization, and composing complex async workflows.
Beyond basic async/await, JavaScript promises enable powerful patterns for parallel execution, timeout management, concurrency control, and resilient error handling. This guide covers production-grade patterns that solve real-world async coordination problems.
Promise Combinators Overview
| Method | Resolves When | Rejects When | Use Case |
|---|---|---|---|
| Promise.all | All promises fulfill | Any promise rejects | Parallel tasks where all must succeed |
| Promise.allSettled | All promises settle | Never rejects | Parallel tasks with partial failure tolerance |
| Promise.race | First settled promise fulfills | First settled promise rejects | Timeouts, fastest response |
| Promise.any | First promise fulfills | All promises reject (AggregateError) | Fastest success from multiple sources |
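The differences are easiest to see side by side. A minimal sketch with local promises standing in for real async work (the `ok`/`fail` helpers are hypothetical, just for illustration):

```javascript
// Hypothetical local promises standing in for real async work
const ok = (value, ms) => new Promise((res) => setTimeout(() => res(value), ms));
const fail = (msg, ms) => new Promise((_, rej) => setTimeout(() => rej(new Error(msg)), ms));

async function compareCombinators() {
  // all: rejects with the first rejection, discarding the other results
  const all = await Promise.all([ok("a", 10), fail("boom", 20)]).catch((e) => e.message);
  // allSettled: always fulfills, with one status record per input
  const settled = (await Promise.allSettled([ok("a", 10), fail("boom", 20)]))
    .map((r) => r.status);
  // race: settles with whichever input settles first (the 10ms fulfillment here)
  const race = await Promise.race([ok("fast", 10), fail("slow", 50)]);
  // any: ignores rejections and fulfills with the first success
  const any = await Promise.any([fail("boom", 10), ok("recovered", 30)]);
  return { all, settled, race, any };
}

compareCombinators().then(console.log);
// { all: "boom", settled: ["fulfilled", "rejected"], race: "fast", any: "recovered" }
```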
Promise.allSettled for Resilient Loading
async function loadDashboard(userId) {
const results = await Promise.allSettled([
fetch(`/api/users/${userId}`).then((r) => r.json()),
fetch(`/api/users/${userId}/projects`).then((r) => r.json()),
fetch(`/api/notifications`).then((r) => r.json()),
fetch(`/api/analytics/summary`).then((r) => r.json()),
]);
const [userResult, projectsResult, notificationsResult, analyticsResult] = results;
return {
user: userResult.status === "fulfilled" ? userResult.value : null,
projects: projectsResult.status === "fulfilled" ? projectsResult.value : [],
notifications: notificationsResult.status === "fulfilled" ? notificationsResult.value : [],
analytics: analyticsResult.status === "fulfilled" ? analyticsResult.value : null,
errors: results
.filter((r) => r.status === "rejected")
.map((r) => r.reason.message),
};
}
The dashboard loads even if analytics or notifications fail. Promise.all would have rejected the entire operation.
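When many call sites repeat those status checks, a small helper can split the results once. This `partitionSettled` function is a hypothetical convenience, not part of any standard API:

```javascript
// Hypothetical helper: split Promise.allSettled results into values and errors
function partitionSettled(results) {
  const values = [];
  const errors = [];
  for (const r of results) {
    if (r.status === "fulfilled") values.push(r.value);
    else errors.push(r.reason);
  }
  return { values, errors };
}

// Usage with local promises standing in for fetches
Promise.allSettled([
  Promise.resolve({ id: 1 }),
  Promise.reject(new Error("projects unavailable")),
  Promise.resolve([]),
]).then((results) => {
  const { values, errors } = partitionSettled(results);
  console.log(values.length, errors.length); // 2 1
});
```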
Promise.race for Timeouts
function withTimeout(promise, ms, errorMessage = "Operation timed out") {
  let timer;
  const timeout = new Promise((_, reject) => {
    timer = setTimeout(() => reject(new Error(errorMessage)), ms);
  });
  // Clear the timer once the race settles so it cannot keep a Node process alive
  return Promise.race([promise, timeout]).finally(() => clearTimeout(timer));
}
// Usage
async function getUser(id) {
try {
const user = await withTimeout(
fetch(`/api/users/${id}`).then((r) => r.json()),
5000,
`User ${id} request timed out`
);
return user;
} catch (error) {
console.error(error.message);
return null;
}
}
Timeout With Cleanup
function withTimeoutAndAbort(fetchFn, ms) {
  const controller = new AbortController();
  let timer;
  const timeout = new Promise((_, reject) => {
    timer = setTimeout(() => {
      controller.abort();
      reject(new Error(`Request timed out after ${ms}ms`));
    }, ms);
  });
  // Clear the timer on success so a completed request is not aborted later
  return Promise.race([fetchFn(controller.signal), timeout]).finally(() => clearTimeout(timer));
}
// Usage
const data = await withTimeoutAndAbort(
(signal) => fetch("/api/heavy-data", { signal }).then((r) => r.json()),
10000
);
See the companion tutorial on using AbortController in JS for comprehensive cancellation patterns.
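The race-based timeout can be exercised without any network at all. A sketch with local promises, with the timer cleared once the race settles:

```javascript
// Sketch: the race-based timeout exercised with local promises (no network)
function withTimeout(promise, ms, errorMessage = "Operation timed out") {
  let timer;
  const timeout = new Promise((_, reject) => {
    timer = setTimeout(() => reject(new Error(errorMessage)), ms);
  });
  // Clear the timer either way so it cannot keep a Node process alive
  return Promise.race([promise, timeout]).finally(() => clearTimeout(timer));
}

const fast = new Promise((r) => setTimeout(() => r("done"), 5));
const slow = new Promise((r) => setTimeout(() => r("done"), 100));

withTimeout(fast, 50).then((v) => console.log("fast:", v));          // fast: done
withTimeout(slow, 50).catch((e) => console.log("slow:", e.message)); // slow: Operation timed out
```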
Promise.any for Fastest Success
async function fetchFromMirrors(resource) {
const mirrors = [
`https://cdn1.example.com/${resource}`,
`https://cdn2.example.com/${resource}`,
`https://cdn3.example.com/${resource}`,
];
try {
const response = await Promise.any(
mirrors.map((url) => fetch(url).then((r) => {
if (!r.ok) throw new Error(`${url} returned ${r.status}`);
return r;
}))
);
return response.json();
} catch (error) {
// AggregateError: all mirrors failed
console.error("All mirrors failed:", error.errors.map((e) => e.message));
throw error;
}
}
Promise.any resolves as soon as the first promise fulfills. If all reject, it rejects with an AggregateError containing every rejection reason.
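The shape of that AggregateError is easy to inspect with local rejections. A short self-contained sketch:

```javascript
// Sketch: what Promise.any's AggregateError looks like when every input rejects
async function demoAggregate() {
  try {
    await Promise.any([
      Promise.reject(new Error("mirror 1 down")),
      Promise.reject(new Error("mirror 2 down")),
    ]);
  } catch (err) {
    // err.errors preserves the input order of the rejection reasons
    return {
      isAggregate: err instanceof AggregateError,
      messages: err.errors.map((e) => e.message),
    };
  }
}

demoAggregate().then(console.log);
// { isAggregate: true, messages: ["mirror 1 down", "mirror 2 down"] }
```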
Sequential Promise Chain
async function processSequentially(items, asyncFn) {
const results = [];
for (const item of items) {
const result = await asyncFn(item);
results.push(result);
}
return results;
}
// Usage: process one at a time (order matters)
const users = await processSequentially(
[1, 2, 3, 4, 5],
async (id) => {
const res = await fetch(`/api/users/${id}`);
return res.json();
}
);
Sequential With Reduce
async function processWithReduce(items, asyncFn) {
return items.reduce(async (prevPromise, item) => {
const results = await prevPromise;
const result = await asyncFn(item);
return [...results, result];
}, Promise.resolve([]));
}
Concurrency Pool
Run N promises at a time from a larger set:
async function pool(tasks, concurrency) {
const results = [];
const executing = new Set();
for (const [index, task] of tasks.entries()) {
const promise = task().then((result) => {
executing.delete(promise);
results[index] = { status: "fulfilled", value: result };
}).catch((error) => {
executing.delete(promise);
results[index] = { status: "rejected", reason: error };
});
executing.add(promise);
if (executing.size >= concurrency) {
await Promise.race(executing);
}
}
await Promise.all(executing);
return results;
}
// Usage: fetch 100 URLs, 5 at a time
const urls = Array.from({ length: 100 }, (_, i) => `https://api.example.com/items/${i}`);
const tasks = urls.map((url) => () => fetch(url).then((r) => r.json()));
const results = await pool(tasks, 5);
Pool Size Guidelines
| Pool Size | Use Case | Trade-off |
|---|---|---|
| 1 | Sequential processing | Slowest, least server load |
| 3-5 | API calls with rate limits | Good balance |
| 10-20 | Internal microservices | Fast, moderate load |
| 50+ | Local file operations | Maximum throughput |
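The concurrency bound can be sanity-checked without a server by letting the tasks themselves count how many run at once. This sketch restates the pool from above in compact form so it is self-contained:

```javascript
// Compact restatement of the pool above, instrumented for a self-contained check
async function pool(tasks, concurrency) {
  const results = [];
  const executing = new Set();
  for (const [index, task] of tasks.entries()) {
    const promise = task()
      .then((value) => { results[index] = { status: "fulfilled", value }; })
      .catch((reason) => { results[index] = { status: "rejected", reason }; })
      .finally(() => executing.delete(promise));
    executing.add(promise);
    if (executing.size >= concurrency) await Promise.race(executing);
  }
  await Promise.all(executing);
  return results;
}

// Instrumented tasks record the peak number running simultaneously
let active = 0;
let peak = 0;
const tasks = Array.from({ length: 12 }, (_, i) => async () => {
  active += 1;
  peak = Math.max(peak, active);
  await new Promise((r) => setTimeout(r, 20));
  active -= 1;
  return i;
});

pool(tasks, 4).then((results) => {
  console.log("peak:", peak); // never above 4
  console.log("all fulfilled:", results.every((r) => r.status === "fulfilled"));
});
```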
Retry With Promises
async function retry(fn, options = {}) {
const maxRetries = options.maxRetries || 3;
const baseDelay = options.baseDelay || 1000;
const shouldRetry = options.shouldRetry || (() => true);
let lastError;
for (let attempt = 0; attempt <= maxRetries; attempt++) {
try {
return await fn(attempt);
} catch (error) {
lastError = error;
if (attempt === maxRetries || !shouldRetry(error, attempt)) {
throw error;
}
const delay = baseDelay * Math.pow(2, attempt) + Math.random() * 500;
console.warn(`Attempt ${attempt + 1} failed. Retrying in ${Math.round(delay)}ms`);
await new Promise((resolve) => setTimeout(resolve, delay));
}
}
throw lastError;
}
// Usage
const data = await retry(
() => fetch("/api/data").then((r) => {
if (!r.ok) throw new Error(`HTTP ${r.status}`);
return r.json();
}),
{
maxRetries: 3,
baseDelay: 1000,
shouldRetry: (error) => !error.message.includes("HTTP 4"),
}
);
See the companion tutorial on API retry patterns in JavaScript for advanced retry strategies.
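The retry loop can be tested deterministically against a simulated flaky operation. A compact restatement of the pattern (jitter dropped so the demo is reproducible):

```javascript
// Compact restatement of retry, exercised against a simulated flaky operation
async function retry(fn, { maxRetries = 3, baseDelay = 1000, shouldRetry = () => true } = {}) {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn(attempt);
    } catch (error) {
      if (attempt === maxRetries || !shouldRetry(error, attempt)) throw error;
      // Exponential backoff without jitter, to keep the demo deterministic
      await new Promise((r) => setTimeout(r, baseDelay * 2 ** attempt));
    }
  }
}

let calls = 0;
const flaky = async () => {
  calls += 1;
  if (calls < 3) throw new Error("transient failure");
  return "ok";
};

retry(flaky, { maxRetries: 5, baseDelay: 5 }).then((result) => {
  console.log(result, "after", calls, "calls"); // "ok" after 3 calls
});
```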
Promise Memoization
function memoizeAsync(fn, options = {}) {
const cache = new Map();
const ttl = options.ttl || 60000; // 1 minute default
return async function (...args) {
const key = options.keyFn ? options.keyFn(...args) : JSON.stringify(args);
const cached = cache.get(key);
if (cached && Date.now() - cached.timestamp < ttl) {
return cached.value;
}
// Store the promise itself to prevent duplicate in-flight requests
const promise = fn.apply(this, args);
cache.set(key, { value: promise, timestamp: Date.now() });
try {
const result = await promise;
cache.set(key, { value: Promise.resolve(result), timestamp: Date.now() });
return result;
} catch (error) {
cache.delete(key); // Remove failed entries
throw error;
}
};
}
// Usage
const getUser = memoizeAsync(
async (id) => {
const res = await fetch(`/api/users/${id}`);
return res.json();
},
{ ttl: 30000 }
);
// First call fetches, second call returns cached
const user1 = await getUser(42);
const user2 = await getUser(42); // instant, no network
Composing Async Workflows
function pipe(...fns) {
return async (input) => {
let result = input;
for (const fn of fns) {
result = await fn(result);
}
return result;
};
}
// Define steps
const fetchUser = async (userId) => {
const res = await fetch(`/api/users/${userId}`);
return res.json();
};
const enrichWithProjects = async (user) => {
const res = await fetch(`/api/users/${user.id}/projects`);
const projects = await res.json();
return { ...user, projects };
};
const calculateStats = async (user) => {
return {
...user,
stats: {
totalProjects: user.projects.length,
activeProjects: user.projects.filter((p) => p.status === "active").length,
},
};
};
// Compose the pipeline
const getUserWithStats = pipe(fetchUser, enrichWithProjects, calculateStats);
const result = await getUserWithStats(42);
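The same pipe combinator works with any async steps, so it can be verified with local functions instead of fetches. A small sketch (the `double`/`addOne`/`label` steps are hypothetical):

```javascript
// Sketch: the pipe combinator driven by local async steps instead of fetches
function pipe(...fns) {
  return async (input) => {
    let result = input;
    for (const fn of fns) {
      result = await fn(result);
    }
    return result;
  };
}

const double = async (n) => n * 2;
const addOne = async (n) => n + 1;
const label = async (n) => `value: ${n}`;

pipe(double, addOne, label)(20).then(console.log); // "value: 41"
```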
Key Insights
- Promise.allSettled never rejects: It waits for every promise to settle and returns statuses, making it ideal for independent parallel operations
- Promise.race does not cancel losers: Combine with AbortController to actually stop losing requests and free resources
- Concurrency pools prevent server overload: Run N operations at a time from a larger set using a Set of executing promises and Promise.race
- Memoize the promise, not just the result: Storing the pending promise deduplicates in-flight requests from concurrent callers
- Retry only transient errors: Network failures, 429, and 5xx responses are retryable; 4xx responses are permanent client errors
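The in-flight deduplication point deserves a concrete demonstration. A minimal sketch (TTL and error eviction omitted to keep it short):

```javascript
// Minimal sketch: caching the pending promise deduplicates concurrent callers
function memoizeAsync(fn) {
  const cache = new Map();
  return (...args) => {
    const key = JSON.stringify(args);
    if (!cache.has(key)) cache.set(key, fn(...args));
    return cache.get(key);
  };
}

let fetches = 0;
const load = memoizeAsync(async (id) => {
  fetches += 1;
  await new Promise((r) => setTimeout(r, 10));
  return { id };
});

// Three concurrent callers while the first request is still in flight
Promise.all([load(1), load(1), load(1)]).then(([a, b, c]) => {
  console.log(fetches);            // 1 — only one underlying call
  console.log(a === b && b === c); // true — all callers share the resolved object
});
```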
Conclusion
Advanced promise patterns solve real coordination problems: Promise.allSettled for partial-failure tolerance, Promise.race for timeouts, concurrency pools for controlled parallelism, retry for transient failures, and memoization for deduplication. The key insight is that promises are values you can store, race, combine, and chain. For cancellation with AbortController, see using AbortController in JS complete tutorial. For the event loop that schedules promise microtasks, see the JS event loop architecture complete guide.