
chunkify

sindresorhus · 6.5k · MIT · 5.0.0 · TypeScript support: included

Split an iterable into evenly sized chunks

chunkify, chunk, chunks, chunked, chunking, array, iterable, iterator, generator, set, map, parts, split, size, partition, divide, segment, batch, slice, subarray, group, allocate, dissect, segregate, separate, section

Readme

chunkify

Split an iterable into evenly sized chunks

Install

npm install chunkify

Usage

import chunkify from 'chunkify';

console.log([...chunkify([1, 2, 3, 4], 2)]);
//=> [[1, 2], [3, 4]]

console.log([...chunkify([1, 2, 3, 4], 3)]);
//=> [[1, 2, 3], [4]]

API

chunkify(iterable, chunkSize)

Returns an iterable of the chunks. The last chunk may be smaller than chunkSize.

iterable

Type: Iterable (for example, Array)

The iterable to chunkify.

chunkSize

Type: number (integer)
Minimum: 1

The size of the chunks.
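The behavior can be illustrated with a minimal generator sketch that produces equivalent output (an illustration only, not the package's actual source):

```javascript
// Minimal sketch of chunkify-like behavior: collect values from any
// iterable into arrays of `chunkSize`, yielding a shorter final chunk
// if the values don't divide evenly.
function * chunkify(iterable, chunkSize) {
	if (!Number.isInteger(chunkSize) || chunkSize < 1) {
		throw new TypeError('Expected `chunkSize` to be an integer of at least 1');
	}

	let chunk = [];
	for (const value of iterable) {
		chunk.push(value);
		if (chunk.length === chunkSize) {
			yield chunk;
			chunk = [];
		}
	}

	if (chunk.length > 0) {
		yield chunk;
	}
}

// Works with any iterable, not just arrays — here, a Set:
console.log([...chunkify(new Set([1, 2, 3, 4, 5]), 2)]);
//=> [[1, 2], [3, 4], [5]]
```

Because it is a generator, chunks are produced one at a time as the result is iterated.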

Use-cases

Batch processing

When dealing with large datasets, breaking the data into manageable chunks can make batch processing more efficient.

import chunkify from 'chunkify';

const largeDataSet = [...Array(1000).keys()];
const chunkedData = chunkify(largeDataSet, 50);

for (const chunk of chunkedData) {
    // `processBatch` is a placeholder for your own batch handler.
    processBatch(chunk);
}

Parallel processing

Dividing data into chunks can be useful in parallel processing to distribute workload evenly across different threads or workers.

import {Worker} from 'node:worker_threads';
import chunkify from 'chunkify';

const data = [/* some large dataset */];
// Spread into an array: chunkify returns a plain iterable,
// which does not have the Array `.entries()` method.
const chunkedData = [...chunkify(data, 20)];

for (const [index, chunk] of chunkedData.entries()) {
    const worker = new Worker('./worker.js', {
        workerData: {
            chunk,
            index
        }
    });
}
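Collecting results back from the workers is often the harder part. One way is to wrap each worker's first message in a promise, sketched below with an inline worker script (`eval: true`) so the example is self-contained; the summing logic stands in for whatever per-chunk work your real worker.js would do:

```javascript
import {Worker} from 'node:worker_threads';
import {once} from 'node:events';

// Inline worker script for demonstration; `eval: true` runs this string
// as the worker. The summing logic is hypothetical — substitute your own work.
const workerCode = `
	const {parentPort, workerData} = require('node:worker_threads');
	const sum = workerData.chunk.reduce((total, n) => total + n, 0);
	parentPort.postMessage(sum);
`;

const chunks = [[1, 2], [3, 4], [5]];

// One worker per chunk; `once` resolves with the worker's first message.
Promise.all(chunks.map(async chunk => {
	const worker = new Worker(workerCode, {eval: true, workerData: {chunk}});
	const [sum] = await once(worker, 'message');
	return sum;
})).then(results => {
	console.log(results);
	//=> [3, 7, 5]
});
```

For CPU-bound work, capping the number of concurrent workers at roughly the CPU core count usually beats spawning one worker per chunk.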

Network requests

Splitting a large number of network requests into chunks can help in managing the load on the network and preventing rate limiting.

import chunkify from 'chunkify';

const urls = [/* Array of URLs */];

const chunkedUrls = chunkify(urls, 10);

for (const chunk of chunkedUrls) {
    await Promise.all(chunk.map(url => fetch(url)));
}
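One caveat with the loop above: Promise.all rejects as soon as any single request fails, abandoning the rest of the chunk. If partial failures should be tolerated, Promise.allSettled is an alternative. A sketch with a hypothetical stand-in for fetch, so it runs without a network:

```javascript
// Hypothetical stand-in for fetch so the sketch is self-contained;
// replace with real fetch calls in practice.
const fakeFetch = url =>
	url.includes('bad')
		? Promise.reject(new Error(`Request failed: ${url}`))
		: Promise.resolve({url, ok: true});

const chunk = ['https://example.com/a', 'https://example.com/bad', 'https://example.com/b'];

// Unlike Promise.all, allSettled never rejects; each result records its own outcome.
Promise.allSettled(chunk.map(url => fakeFetch(url))).then(results => {
	for (const result of results) {
		if (result.status === 'fulfilled') {
			console.log('ok:', result.value.url);
		} else {
			console.log('failed:', result.reason.message);
		}
	}
});
```

Each chunk still completes as a unit, so the chunk-by-chunk pacing that helps with rate limits is preserved.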