This is a welcoming use case for the Set data structure.
Here's an example extracting the unique values from an array of items:
```js
const items = [1, 2, 3, 2, 3, 1, 5]
const unique = [...new Set(items)]

// [1, 2, 3, 5]
```
The way it works: creating a new Set instance from your items automatically removes the duplicates. The spread operator then expands the set's values back into a new array.
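If you need this in more than one place, you can wrap the approach in a small helper. A minimal sketch (the `unique` function name is our own choice, not a built-in):

```js
// Remove duplicate values from an array using a Set.
// Works for primitive values; objects are compared by reference.
function unique (items) {
  return [...new Set(items)]
}

unique([1, 2, 3, 2, 3, 1, 5]) // [1, 2, 3, 5]
unique(['a', 'b', 'a'])       // ['a', 'b']
```

Sets preserve insertion order, so the first occurrence of each value determines its position in the result.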
A second way is the Array.from() method, which creates an array from an iterable. The Set data structure is an iterable object and therefore a valid argument for this approach:
```js
const items = [1, 2, 3, 2, 3, 1, 5]
const unique = Array.from(new Set(items))

// [1, 2, 3, 5]
```
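A nice side effect of Array.from(): it accepts a mapping function as a second argument, so you can deduplicate and transform the values in a single step. A small sketch (the doubling is made up for illustration):

```js
const items = [1, 2, 3, 2, 3, 1, 5]

// Deduplicate via the Set, then double each remaining value
// using Array.from's second (map function) argument
const doubledUnique = Array.from(new Set(items), value => value * 2)

// [2, 4, 6, 10]
```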
If you're an avid Node.js developer working a lot with collection pipelines, you may want to check out the Supercharge Collections package.
This package provides async/await-ready, chainable array utilities:
```js
const Collect = require('@supercharge/collections')

const items = [1, 2, 3, 2, 3, 1, 5]

await Collect(items).unique().all()

// [1, 2, 3, 5]
```
Which Way Should I Go?
All three approaches produce the same result for arrays of primitive values (keep in mind that a Set compares objects by reference, not by content). Choose whichever you like best.
We like to use Array.from() in simple situations because it reads more explicitly, like "create an array from the set of items".
We use the collections package when we need a collection pipeline, like mapping, sorting, and filtering in addition to removing duplicates.
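For comparison, you can build a similar pipeline with native array methods once the duplicates are gone. A sketch (the doubling and filtering steps are made up for illustration):

```js
const items = [1, 2, 3, 2, 3, 1, 5]

// Deduplicate first, then chain native map and filter calls
const result = [...new Set(items)]
  .map(value => value * 2)
  .filter(value => value > 2)

// [4, 6, 10]
```

The native chain works fine for synchronous steps; the collections package becomes more attractive once your callbacks are async.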