Removing Duplicates from JavaScript Arrays: A Complete Guide

Learn every method to remove duplicates from JavaScript arrays. Covers Set, filter with indexOf, reduce, Map for objects, performance comparisons, and real-world deduplication patterns with practical code examples.

JavaScript · Beginner
14 min read

Duplicate values in arrays cause incorrect counts, redundant API calls, and broken UI displays. Removing them is a task you will face repeatedly when working with user input, merged datasets, API responses, and search results. JavaScript provides multiple approaches, from the one-line Set method to custom logic for complex objects.

This guide covers every deduplication technique, explains when to use each one, compares performance, and shows real-world patterns for handling arrays of primitive values and arrays of objects.

Method 1: Set

A Set stores only unique values. Converting an array to a Set and back to an array removes all duplicates in one line:

```javascript
const numbers = [1, 2, 2, 3, 3, 3, 4, 4, 5];

const unique = [...new Set(numbers)];
console.log(unique); // [1, 2, 3, 4, 5]

// Works with strings too
const tags = ["react", "javascript", "react", "node", "javascript"];
const uniqueTags = [...new Set(tags)];
console.log(uniqueTags); // ["react", "javascript", "node"]
```

You can also use Array.from() instead of the spread operator:

```javascript
const unique = Array.from(new Set(numbers));
```

| Aspect | Set Approach |
| --- | --- |
| Syntax | `[...new Set(arr)]` |
| Preserves order | Yes (insertion order) |
| Works with primitives | Yes (numbers, strings, booleans) |
| Works with objects | No (compares by reference, not value) |
| Performance | Excellent, O(n) |
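One detail worth knowing: Set uses the SameValueZero equality algorithm, which means `NaN` is treated as equal to itself (unlike `===`), and `0` and `-0` are treated as the same value. A short sketch:

```javascript
// Set uses SameValueZero equality: NaN equals NaN, and 0 equals -0,
// so both are correctly collapsed to a single entry.
const values = [NaN, NaN, 0, -0, "0"];
const deduped = [...new Set(values)];
console.log(deduped); // [NaN, 0, "0"]
```

This makes Set more robust than equality-based methods like `indexOf` when the data may contain `NaN`.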

Method 2: filter() with indexOf()

The filter method keeps only the first occurrence of each value by comparing each element's index with its first index:

```javascript
const colors = ["red", "blue", "green", "red", "blue", "yellow"];

const unique = colors.filter((color, index) => {
  return colors.indexOf(color) === index;
});

console.log(unique); // ["red", "blue", "green", "yellow"]
```

How It Works

For each element, indexOf() returns the index of its first occurrence. If the current index matches, it is the first occurrence and passes the filter. If the current index is higher, it is a duplicate and gets filtered out.

```javascript
// "red" at index 0: indexOf("red") = 0 === 0 -> keep
// "blue" at index 1: indexOf("blue") = 1 === 1 -> keep
// "green" at index 2: indexOf("green") = 2 === 2 -> keep
// "red" at index 3: indexOf("red") = 0 !== 3 -> remove (duplicate)
// "blue" at index 4: indexOf("blue") = 1 !== 4 -> remove (duplicate)
// "yellow" at index 5: indexOf("yellow") = 5 === 5 -> keep
```

Method 3: reduce() with Accumulator

Use reduce to build a new array, adding each element only if it is not already present:

```javascript
const words = ["hello", "world", "hello", "javascript", "world"];

const unique = words.reduce((acc, word) => {
  if (!acc.includes(word)) {
    acc.push(word);
  }
  return acc;
}, []);

console.log(unique); // ["hello", "world", "javascript"]
```

Reduce with Case-Insensitive Deduplication

```javascript
const emails = [
  "alice@example.com",
  "Alice@Example.com",
  "BOB@example.com",
  "bob@example.com"
];

const uniqueEmails = emails.reduce((acc, email) => {
  const normalized = email.toLowerCase();
  if (!acc.some((e) => e.toLowerCase() === normalized)) {
    acc.push(email); // Keep the first occurrence
  }
  return acc;
}, []);

console.log(uniqueEmails);
// ["alice@example.com", "BOB@example.com"]
```

Method 4: Map for Objects (Deduplicate by Key)

For arrays of objects, Set will not work because it compares references, not property values. Use a Map keyed by the unique property:

```javascript
const users = [
  { id: 1, name: "Alice", role: "admin" },
  { id: 2, name: "Bob", role: "editor" },
  { id: 1, name: "Alice", role: "admin" },
  { id: 3, name: "Charlie", role: "viewer" },
  { id: 2, name: "Bob", role: "editor" }
];

const uniqueUsers = [
  ...new Map(users.map((user) => [user.id, user])).values()
];

console.log(uniqueUsers);
// [
//   { id: 1, name: "Alice", role: "admin" },
//   { id: 2, name: "Bob", role: "editor" },
//   { id: 3, name: "Charlie", role: "viewer" }
// ]
```

How It Works

  1. users.map(user => [user.id, user]) creates key-value pairs: [[1, {id:1,...}], [2, {id:2,...}], ...]
  2. new Map(...) builds a Map; duplicate keys overwrite, keeping the last occurrence
  3. .values() extracts the unique objects
  4. [...] converts the Map iterator back to an array

Keeping the First Occurrence Instead

```javascript
// Reverse a copy of the array before creating the Map.
// Note: calling users.reverse() directly would mutate the original array,
// so spread into a new array first.
const uniqueFirst = [
  ...new Map([...users].reverse().map((user) => [user.id, user])).values()
].reverse();
```
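An equivalent approach avoids the double reverse entirely: only set a key if the Map does not already contain it. As a bonus, this also preserves first-occurrence order. A sketch using a small sample array (`team` and `byId` are illustrative names):

```javascript
// Keep the first occurrence per id by skipping keys already in the Map.
const team = [
  { id: 1, name: "Alice" },
  { id: 2, name: "Bob" },
  { id: 1, name: "Alicia" } // duplicate id; should be ignored
];

const byId = new Map();
for (const member of team) {
  if (!byId.has(member.id)) byId.set(member.id, member);
}

const uniqueFirstAlt = [...byId.values()];
console.log(uniqueFirstAlt); // [{ id: 1, name: "Alice" }, { id: 2, name: "Bob" }]
```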

Using reduce for Object Deduplication

```javascript
const products = [
  { sku: "A001", name: "Widget", price: 9.99 },
  { sku: "B002", name: "Gadget", price: 24.99 },
  { sku: "A001", name: "Widget", price: 11.99 },
  { sku: "C003", name: "Gizmo", price: 14.99 }
];

const uniqueProducts = products.reduce((acc, product) => {
  const exists = acc.find((p) => p.sku === product.sku);
  if (!exists) {
    acc.push(product);
  }
  return acc;
}, []);

console.log(uniqueProducts.length); // 3
console.log(uniqueProducts[0].price); // 9.99 (first occurrence kept)
```
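If you deduplicate objects in several places, it can be worth extracting the pattern into a small helper that takes a key function. This is a sketch, not a built-in; `uniqueBy` is an illustrative name. It uses a Set for O(n) lookups instead of the O(n^2) `find` scan:

```javascript
// Deduplicate by any key function, keeping the first match per key.
function uniqueBy(arr, keyFn) {
  const seen = new Set();
  return arr.filter((item) => {
    const key = keyFn(item);
    if (seen.has(key)) return false;
    seen.add(key);
    return true;
  });
}

const catalog = [
  { sku: "A001", price: 9.99 },
  { sku: "B002", price: 24.99 },
  { sku: "A001", price: 11.99 }
];

const uniqueCatalog = uniqueBy(catalog, (p) => p.sku);
console.log(uniqueCatalog.length); // 2
console.log(uniqueCatalog[0].price); // 9.99 (first occurrence kept)
```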

Method 5: for Loop (Manual Approach)

A manual for loop with explicit tracking works well when you need maximum control:

```javascript
const input = [5, 3, 8, 3, 1, 5, 8, 2, 1];
const seen = new Set();
const unique = [];

for (let i = 0; i < input.length; i++) {
  if (!seen.has(input[i])) {
    seen.add(input[i]);
    unique.push(input[i]);
  }
}

console.log(unique); // [5, 3, 8, 1, 2]
```

This approach has O(n) performance (same as the Set-spread method) but gives you a place to add custom logic during the deduplication process.
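The same seen-Set pattern also works lazily. If the input is a stream or a very large iterable and you do not want an intermediate array, a generator can yield unique values as they arrive (a sketch; `dedupe` is an illustrative name):

```javascript
// Lazily yield each value the first time it is seen.
function* dedupe(iterable) {
  const seen = new Set();
  for (const value of iterable) {
    if (!seen.has(value)) {
      seen.add(value);
      yield value;
    }
  }
}

const out = [...dedupe([5, 3, 8, 3, 1, 5])];
console.log(out); // [5, 3, 8, 1]
```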

Performance Comparison

| Method | Time Complexity | Space Complexity | Best For |
| --- | --- | --- | --- |
| `[...new Set(arr)]` | O(n) | O(n) | Primitives, simplicity |
| `filter` + `indexOf` | O(n^2) | O(n) | Small arrays, readability |
| `reduce` + `includes` | O(n^2) | O(n) | Custom logic, transformations |
| `Map` (by key) | O(n) | O(n) | Arrays of objects |
| `for` + `Set` | O(n) | O(n) | Maximum control |
```javascript
// Benchmark: 100,000 elements with ~50% duplicates
const large = Array.from({ length: 100000 }, () =>
  Math.floor(Math.random() * 50000)
);

console.time("Set");
const r1 = [...new Set(large)];
console.timeEnd("Set"); // ~3ms

console.time("filter+indexOf");
const r2 = large.filter((v, i) => large.indexOf(v) === i);
console.timeEnd("filter+indexOf"); // ~2500ms

console.time("for+Set");
const seen = new Set();
const r3 = [];
for (const v of large) {
  if (!seen.has(v)) { seen.add(v); r3.push(v); }
}
console.timeEnd("for+Set"); // ~4ms
```

The O(n^2) methods (filter+indexOf and reduce+includes) become unusable on arrays over 10,000 elements.

Real-World Patterns

Deduplicating Merged API Responses

```javascript
async function fetchAllUsers() {
  const [page1, page2, page3] = await Promise.all([
    fetch("/api/users?page=1").then((r) => r.json()),
    fetch("/api/users?page=2").then((r) => r.json()),
    fetch("/api/users?page=3").then((r) => r.json())
  ]);

  // Merge and deduplicate by ID
  const allUsers = [...page1, ...page2, ...page3];
  const uniqueUsers = [
    ...new Map(allUsers.map((u) => [u.id, u])).values()
  ];

  return uniqueUsers;
}
```

Deduplicating User Tags

```javascript
const userTags = ["React", "react", "JavaScript", "REACT", "javascript", "Node"];

const normalized = [...new Set(userTags.map((tag) => tag.toLowerCase()))];
console.log(normalized); // ["react", "javascript", "node"]

// If you need to preserve original casing (first occurrence)
const uniqueOriginal = userTags.reduce((acc, tag) => {
  if (!acc.some((t) => t.toLowerCase() === tag.toLowerCase())) {
    acc.push(tag);
  }
  return acc;
}, []);
console.log(uniqueOriginal); // ["React", "JavaScript", "Node"]
```
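The reduce version above is O(n^2) because `some` rescans the accumulator for every tag. For large tag lists, the same result can be produced in O(n) by keying a Map with the lowercased value and keeping the first original-cased entry (a sketch; `rawTags` and `byLower` are illustrative names):

```javascript
// Case-insensitive dedupe in O(n): key by lowercase, keep first casing seen.
const rawTags = ["React", "react", "JavaScript", "REACT", "javascript", "Node"];

const byLower = new Map();
for (const tag of rawTags) {
  const key = tag.toLowerCase();
  if (!byLower.has(key)) byLower.set(key, tag);
}

const firstCased = [...byLower.values()];
console.log(firstCased); // ["React", "JavaScript", "Node"]
```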

Deduplicating by Multiple Properties

```javascript
const orders = [
  { userId: 1, product: "Laptop", date: "2026-01-15" },
  { userId: 1, product: "Laptop", date: "2026-01-15" },
  { userId: 2, product: "Phone", date: "2026-01-16" },
  { userId: 1, product: "Phone", date: "2026-01-17" }
];

// Create a composite key from multiple properties
const uniqueOrders = [
  ...new Map(
    orders.map((o) => [`${o.userId}-${o.product}-${o.date}`, o])
  ).values()
];

console.log(uniqueOrders.length); // 3
```
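One caveat with delimiter-joined keys: if a property value can itself contain the delimiter, two distinct records can produce the same key. When that is a risk, `JSON.stringify` over the selected fields produces an unambiguous composite key (a sketch; `sampleOrders` is an illustrative name):

```javascript
// JSON.stringify the key fields so values containing "-" cannot collide.
const sampleOrders = [
  { userId: 1, product: "Laptop", date: "2026-01-15" },
  { userId: 1, product: "Laptop", date: "2026-01-15" }
];

const uniqueSafe = [
  ...new Map(
    sampleOrders.map((o) => [JSON.stringify([o.userId, o.product, o.date]), o])
  ).values()
];

console.log(uniqueSafe.length); // 1
```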

Common Mistakes

Using Set on Object Arrays

```javascript
const items = [{ id: 1 }, { id: 1 }, { id: 2 }];

// Bug: Set compares references, not values
const unique = [...new Set(items)];
console.log(unique.length); // 3 -- all kept because different references

// Fix: use Map with a key property
const uniqueById = [...new Map(items.map((i) => [i.id, i])).values()];
console.log(uniqueById.length); // 2
```

Forgetting Case Sensitivity

```javascript
const tags = ["React", "react", "REACT"];

// Bug: Set treats different cases as different values
const unique = [...new Set(tags)];
console.log(unique); // ["React", "react", "REACT"]

// Fix: normalize before deduplication
const uniqueNormalized = [...new Set(tags.map((t) => t.toLowerCase()))];
console.log(uniqueNormalized); // ["react"]
```

Best Practices

  1. Default to Set for primitives. [...new Set(arr)] is the fastest and most readable one-liner.
  2. Use Map for object deduplication. Key by the unique identifier (ID, SKU, email) for O(n) performance.
  3. Normalize data before deduplication. Lowercase strings, trim whitespace, and standardize formats before comparing.
  4. Avoid O(n^2) methods on large arrays. filter+indexOf and reduce+includes should only be used on arrays under 1,000 elements.
  5. Decide which occurrence to keep. Map keeps the last occurrence by default; reverse the array first if you need the first.

Key Insights

  • Set is the default choice for primitives: [...new Set(arr)] removes duplicates in O(n) time with one line of code.
  • Map handles object arrays efficiently: key by the unique property (id, sku, email) for O(n) deduplication.
  • Normalize before comparing: lowercase strings, trim whitespace, and create composite keys for multi-property deduplication.
  • Avoid O(n^2) methods at scale: filter+indexOf and reduce+includes are fine for small arrays but collapse on 10,000+ elements.
  • Set preserves insertion order: the first occurrence of each value is kept, and subsequent duplicates are silently ignored.

Frequently Asked Questions

What is the fastest way to remove duplicates from a JavaScript array?

The `Set` approach (`[...new Set(arr)]`) and the `for` loop with `Set` tracking are both O(n) and handle arrays with millions of elements in milliseconds. The `filter+indexOf` method is O(n^2) and becomes unusably slow on large arrays. For object arrays, `Map` keyed by a unique property is the fastest approach.

Does Set preserve the original order?

Yes. A JavaScript `Set` maintains insertion order. When you convert an array to a Set and back, the elements appear in the same order as their first occurrence in the original array. Duplicates are simply skipped when they are encountered again.

How do I remove duplicates from an array of objects?

Use a `Map` keyed by the unique property: `[...new Map(arr.map(item => [item.id, item])).values()]`. This is O(n) and keeps the last occurrence. For keeping the first occurrence, reverse the array before mapping, then reverse the result. For deduplication by multiple properties, create a composite string key.

Can I remove duplicates without creating a new array?

Not directly with Set or filter. You can use a `for` loop that iterates backward and uses [splice](/tutorials/programming-languages/javascript/javascript-array-splice-method-complete-tutorial) to remove duplicates in place, but this mutates the original array and is slower due to index shifting. Creating a new array with `Set` or `Map` is almost always preferred.
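For completeness, the in-place approach mentioned above can be sketched as follows. Iterating backward means each `splice` only shifts elements that have already been visited; note this is O(n^2) and mutates the array, so it is rarely the right choice:

```javascript
// In-place dedupe: walk backward, remove any element whose first
// occurrence is at an earlier index. Keeps first occurrences.
const arr = [1, 2, 2, 3, 1];

for (let i = arr.length - 1; i >= 0; i--) {
  if (arr.indexOf(arr[i]) !== i) {
    arr.splice(i, 1); // this element is a later duplicate
  }
}

console.log(arr); // [1, 2, 3]
```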

How do I handle case-insensitive deduplication?

Normalize the values before deduplication. For a simple unique list, use `[...new Set(arr.map(s => s.toLowerCase()))]`. If you need to preserve original casing, use `reduce` with a case-insensitive comparison to keep the first occurrence of each value.

Conclusion

Array deduplication in JavaScript ranges from one-line Set operations for primitive values to Map-based approaches for complex objects. The Set method handles the majority of cases with excellent performance and readability. For object arrays, Map keyed by a unique identifier provides O(n) deduplication. Always normalize your data (lowercase, trim, standardize) before comparing, and avoid O(n^2) methods like filter+indexOf on arrays larger than a few hundred elements. Choose the approach that best matches your data type and performance requirements.