Node.js Series Overview
- Increase the Memory Limit for Your Process
- Why You Should Add “node” in Your Travis Config
- Create a PDF from HTML with Puppeteer and Handlebars
- Create Your Own Custom Error
- Extend Multiple Classes (Multi Inheritance)
- Get a File’s Created Date
- Get a File’s Last Modified/Updated Date
- Write a JSON Object to a File
- How to Create an Empty File
- How to Merge Objects
- How to Run an Asynchronous Function in Array.map()
- How to Reset and Empty an Array
- for…of vs. for…in Loops
- Get an Array With Unique Values (Delete Duplicates)
- Callback and Promise Support in your Node.js Modules
- Run Async Functions/Promises in Sequence
- Run Async Functions/Promises in Parallel
- Run Async Functions in Batches
This is a welcome use case for the Set data structure.
Here's an example extracting the unique values from an array of items:
const items = [1, 2, 3, 2, 3, 1, 5]
const unique = [...new Set(items)]
// [1, 2, 3, 5]
Here's how it works: create a new Set instance from your items, which automatically removes the duplicates. Then use the spread operator to expand the set's items back into a new array.
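One detail worth knowing: Set compares items by value only for primitives; objects are compared by reference. A small sketch (with made-up sample values) to illustrate:

```javascript
// Primitives are deduplicated by value
const names = ['Ann', 'Ben', 'Ann']
const uniqueNames = [...new Set(names)]
// ['Ann', 'Ben']

// Objects are deduplicated by reference: two object literals with
// identical contents are still distinct items to a Set
const users = [{ name: 'Ann' }, { name: 'Ann' }]
const uniqueUsers = [...new Set(users)]
// still contains two entries
```

So for arrays of objects you'd need to dedupe by a key yourself rather than relying on Set alone.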
Another option is the Array.from() method, which creates an array from an iterable. The Set data structure is iterable and therefore a valid argument for this approach:
const items = [1, 2, 3, 2, 3, 1, 5]
const unique = Array.from(new Set(items))
// [1, 2, 3, 5]
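A nice bonus of Array.from: it accepts an optional mapping function as its second argument, applied to each item while the array is created. That lets you dedupe and transform in a single step:

```javascript
const items = [1, 2, 3, 2, 3, 1, 5]

// Dedupe via Set, then double each unique value in one pass
const doubledUnique = Array.from(new Set(items), item => item * 2)
// [2, 4, 6, 10]
```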
If you're an avid Node.js developer working a lot with collection pipelines, you may want to check out the Supercharge Collections package.
This package provides async/await-ready, chainable array utilities:
const Collect = require('@supercharge/collections')

const items = [1, 2, 3, 2, 3, 1, 5]

// inside an async function or ES module (top-level await)
await Collect(items).unique().all()
// [1, 2, 3, 5]
Which Way Should I Go?
All three approaches work fine. Choose whichever you prefer.
We like to use Array.from() in simple situations because it reads more explicitly, like "create an array from the set of items".
We use the collections package when we need a full collection pipeline: mapping, sorting, and filtering in addition to removing duplicates.
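If you'd rather not add a dependency, a similar pipeline can be sketched with native array methods plus a Set. This is just one possible composition, not the package's implementation:

```javascript
const items = [1, 2, 3, 2, 3, 1, 5]

// filter, dedupe via Set, then sort ascending — no external package
const result = [...new Set(items.filter(item => item > 1))]
  .sort((a, b) => a - b)
// [2, 3, 5]
```

The trade-off: native chains create an intermediate array per step and offer no async/await support in the callbacks, which is where the collections package shines.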