chunk
Splits an array into subarrays (“chunks”) of the specified size. The last chunk may be smaller if there are not enough elements.
/**
 * Splits an array into subarrays (“chunks”) of the specified size.
 * The last chunk may be smaller if there are not enough elements.
 *
 * @param arr - The input array to split.
 * @param size - The maximum size of each chunk (must be > 0).
 * @returns An array of chunked subarrays.
 * @throws Error if size is not greater than 0.
 */
export function chunk<T>(arr: T[], size: number): T[][] {
  if (size <= 0) {
    throw new Error('Chunk size must be greater than 0');
  }

  const result: T[][] = [];
  for (let i = 0; i < arr.length; i += size) {
    result.push(arr.slice(i, i + size));
  }
  return result;
}
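A quick usage example with arbitrary sample data:

const letters = ['a', 'b', 'c', 'd', 'e'];

chunk(letters, 2); // [['a', 'b'], ['c', 'd'], ['e']]
chunk(letters, 5); // [['a', 'b', 'c', 'd', 'e']]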
Efficient single-pass logic
Iterates through the array only once using a single for loop, making it time-efficient and memory-friendly even for large datasets.
Handles uneven divisions
Automatically handles the edge case where the last chunk is smaller than the specified size, preserving all elements without loss.
Built-in validation
Includes a guard clause (size <= 0) to catch invalid input early, helping prevent silent errors and misuse.
Immutability
Does not mutate the original array; it returns a new array of subarrays, adhering to functional programming principles.
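A small sketch of that last point: the input array is left untouched (note that slice copies element references, so the copies are shallow):

const original = [1, 2, 3, 4, 5];
const chunks = chunk(original, 2);

console.log(chunks);   // [[1, 2], [3, 4], [5]]
console.log(original); // [1, 2, 3, 4, 5] (unchanged)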
Tests | Examples
// Adjust this import to wherever chunk is defined in your project.
import { chunk } from './chunk';

test('chunk with exact division', () => {
  expect(chunk([1, 2, 3, 4], 2)).toEqual([[1, 2], [3, 4]]);
});

test('chunk with size greater than array', () => {
  expect(chunk([1, 2, 3], 10)).toEqual([[1, 2, 3]]);
});

test('chunk with size 1', () => {
  expect(chunk([1, 2, 3], 1)).toEqual([[1], [2], [3]]);
});

test('chunk with empty array', () => {
  expect(chunk([], 3)).toEqual([]);
});

test('chunk with invalid size', () => {
  expect(() => chunk([1, 2], 0)).toThrow('Chunk size must be greater than 0');
});
Common Use Cases
Paginating large datasets
Breaks data into manageable pages for UI rendering (e.g., list views, tables, carousels).
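For instance, a paging helper might look like this (getPage and the page size of 20 are illustrative, not part of the utility):

function getPage<T>(items: T[], page: number, pageSize: number): T[] {
  const pages = chunk(items, pageSize);
  return pages[page] ?? [];
}

const rows = Array.from({ length: 95 }, (_, i) => i + 1);
getPage(rows, 0, 20); // rows 1–20
getPage(rows, 4, 20); // rows 81–95 (the last, shorter page)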
Batch processing
Useful for throttling work such as network requests, database inserts, or file processing, where items are handled in groups rather than all at once.
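A sketch of throttled requests under that idea: each batch runs in parallel, but batches run one after another. The handler passed in (e.g. a sendRequest function) is assumed to exist elsewhere.

async function processInBatches<T>(
  items: T[],
  batchSize: number,
  handler: (item: T) => Promise<void>
): Promise<void> {
  for (const batch of chunk(items, batchSize)) {
    // Wait for the current batch to finish before starting the next one.
    await Promise.all(batch.map(handler));
  }
}

// Hypothetical usage: sendRequest is assumed to be defined elsewhere.
// await processInBatches(userIds, 10, sendRequest);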
Grid layout construction
Helps organize content into rows or columns of fixed size for responsive layouts or visual groupings.
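For a fixed-width grid, chunking a flat list into rows might look like this (the 3-column width is arbitrary):

const items = ['A', 'B', 'C', 'D', 'E', 'F', 'G'];
const rows = chunk(items, 3);
// [['A', 'B', 'C'], ['D', 'E', 'F'], ['G']]

// Each inner array can then be rendered as one row of the grid.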
Controlled memory usage
Enables step-wise processing of large datasets in constrained environments (e.g., mobile apps, Node.js scripts).
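A minimal sketch of step-wise processing, assuming a hypothetical processRecords helper that works on one slice at a time:

declare function processRecords(records: number[]): void; // assumed to exist elsewhere

function processLargeDataset(records: number[]): void {
  for (const slice of chunk(records, 1_000)) {
    // Only one slice is actively worked on at a time.
    processRecords(slice);
  }
}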
CSV and log parsing
When transforming large inputs (e.g., rows from a CSV), chunking lets you process lines in fixed-size blocks instead of all at once.
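For example, rows read from a CSV file could be handled in blocks of 500 (parseCsvLine and insertRows here are hypothetical helpers, not part of this utility):

// Hypothetical helpers, assumed to be defined elsewhere.
declare function parseCsvLine(line: string): string[];
declare function insertRows(rows: string[][]): void;

function importCsv(text: string): void {
  const lines = text.split('\n').filter((line) => line.trim() !== '');
  for (const block of chunk(lines, 500)) {
    // Parse and insert 500 rows at a time.
    insertRows(block.map(parseCsvLine));
  }
}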