Yevhen Klymentiev
y.klymentiev@gmail.com
Reusable Snippets | Practical utility code for everyday use — custom-built and ready to share

memoizeWithLRU

Memoizes a function using a Least Recently Used (LRU) caching strategy. The utility caches the results of a fixed number of recent calls; when the limit is reached, the least recently used entry is evicted to make room for new ones.

It also provides a .reset() method for clearing the internal cache manually.

TypeScript
type MemoizeOptions = {
  maxSize: number;
};

type MemoizedWithReset<T extends (...args: any[]) => any> = T & {
  reset: () => void;
};

/**
 * Memoizes a function using a Least Recently Used (LRU) caching strategy.
 *
 * The utility caches the results of up to `maxSize` recent calls. When the limit
 * is reached, the least recently used entry is evicted to make room for new ones.
 *
 * Also provides a `.reset()` method to clear the internal cache manually.
 *
 * @template T - The type of the function to memoize.
 * @param fn - The target function to memoize.
 * @param options - Configuration object.
 * @param options.maxSize - Maximum number of cached entries.
 * @returns A memoized version of the function with LRU behavior and a reset method.
 */
export function memoizeWithLRU<T extends (...args: any[]) => any>(
  fn: T,
  options: MemoizeOptions
): MemoizedWithReset<T> {
  const cache = new Map<string, ReturnType<T>>();
  const { maxSize } = options;

  const memoizedFn = (...args: Parameters<T>): ReturnType<T> => {
    const key = JSON.stringify(args);

    if (cache.has(key)) {
      const value = cache.get(key)!;
      cache.delete(key); // re-insert to mark the key as most recently used
      cache.set(key, value);
      return value;
    }

    const result = fn(...args);
    cache.set(key, result);

    if (cache.size > maxSize) {
      // Map preserves insertion order, so the first key is the least recently used
      const firstKey = cache.keys().next().value;
      if (firstKey !== undefined) {
        cache.delete(firstKey);
      }
    }

    return result;
  };

  // Attach the manual reset method for clearing the cache
  (memoizedFn as MemoizedWithReset<T>).reset = () => {
    cache.clear();
  };

  return memoizedFn as MemoizedWithReset<T>;
}
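
For quick orientation, here is a minimal usage sketch. The slowSquare function and the maxSize value are illustrative only and assume the memoizeWithLRU export above is in scope.

TypeScript
// Minimal usage sketch; slowSquare is a hypothetical stand-in for expensive work
const slowSquare = (n: number): number => n * n;

const fastSquare = memoizeWithLRU(slowSquare, { maxSize: 100 });

fastSquare(12); // computed and cached
fastSquare(12); // served from the cache
fastSquare.reset(); // cache cleared manually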
  • LRU Caching Strategy

    Implements Least Recently Used eviction so the cache retains only the most recently used entries, keeping memory usage bounded and limiting stale data.

  • Argument-Based Memoization

    Uses serialized function arguments (via JSON.stringify) as cache keys, so any arguments that serialize deterministically are supported; a short sketch of this key behavior follows this list.

  • Manual Cache Control

    Provides a .reset() method to clear the internal cache when needed — useful for lifecycle-aware cleanup.

  • Preserves Execution Semantics

    Fully preserves the original function’s return type and argument signature through TypeScript generics.

  • Performance Optimization

    Prevents redundant computation of expensive function calls when invoked repeatedly with the same arguments.
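
To make the key behavior concrete, here is a minimal sketch, assuming the memoizeWithLRU export above; the area function and its arguments are hypothetical. Keys are built with JSON.stringify(args), so cache hits depend on the serialized form of the arguments, including object property order.

TypeScript
// Hypothetical object-argument function memoized with the snippet above
const area = (rect: { w: number; h: number }) => rect.w * rect.h;
const memoArea = memoizeWithLRU(area, { maxSize: 10 });

memoArea({ w: 2, h: 3 }); // computed; key is '[{"w":2,"h":3}]'
memoArea({ w: 2, h: 3 }); // cache hit: identical serialized key
memoArea({ h: 3, w: 2 }); // cache miss: property order changes the serialized key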

Tests | Examples

TypeScript
test('memoizeWithLRU - caches and evicts correctly', () => {
  const spy = jest.fn((x: number) => x * 10);
  const memoized = memoizeWithLRU(spy, { maxSize: 2 });

  expect(memoized(1)).toBe(10); // compute
  expect(memoized(2)).toBe(20); // compute
  expect(memoized(1)).toBe(10); // cached
  expect(spy).toHaveBeenCalledTimes(2);

  expect(memoized(3)).toBe(30); // compute, evicts 2
  expect(spy).toHaveBeenCalledTimes(3);

  expect(memoized(2)).toBe(20); // recompute, 2 was evicted
  expect(spy).toHaveBeenCalledTimes(4);
});

test('memoizeWithLRU - reset clears cache', () => {
  const spy = jest.fn((x: number) => x * 5);
  const memoized = memoizeWithLRU(spy, { maxSize: 2 });

  expect(memoized(1)).toBe(5);  // compute
  expect(memoized(1)).toBe(5);  // cached
  expect(spy).toHaveBeenCalledTimes(1);

  memoized.reset();

  expect(memoized(1)).toBe(5);  // recompute after reset
  expect(spy).toHaveBeenCalledTimes(2);
});
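
One additional illustrative test, not part of the original suite, shows that object arguments serializing to the same JSON key also hit the cache; it assumes the same Jest setup as above.

TypeScript
test('memoizeWithLRU - structurally identical object args hit the cache (illustrative)', () => {
  const spy = jest.fn((opts: { a: number; b: number }) => opts.a + opts.b);
  const memoized = memoizeWithLRU(spy, { maxSize: 2 });

  expect(memoized({ a: 1, b: 2 })).toBe(3); // compute
  expect(memoized({ a: 1, b: 2 })).toBe(3); // cached: same JSON.stringify key
  expect(spy).toHaveBeenCalledTimes(1);
});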

Common Use Cases

  • Heavy Computation Caching

    Cache results of CPU-intensive calculations (e.g., math, image processing, or analytics queries).

  • API or Network Response Memoization

    Avoid duplicate requests by caching recent responses when the input (e.g., query) is the same.

  • UI Rendering Optimization

    Cache layout calculations or style resolutions in UI frameworks to improve performance.

  • Search and Filtering

    Store previous filtered or sorted results to speed up repeated search operations.

  • Reactive or Derived State Caching

    Memoize derived values in state management systems (e.g., selectors in Redux) to avoid unnecessary recomputation; a selector-style sketch follows this list.
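
As a sketch of the derived-state use case, the example below memoizes a selector-style computation; the State type, the sample data, and the selector itself are hypothetical and not part of the snippet.

TypeScript
// Hypothetical selector memoized with the snippet above
type State = { items: { price: number; inStock: boolean }[] };

const selectInStockTotal = memoizeWithLRU(
  (state: State) =>
    state.items.filter((i) => i.inStock).reduce((sum, i) => sum + i.price, 0),
  { maxSize: 20 }
);

const state: State = {
  items: [
    { price: 10, inStock: true },
    { price: 25, inStock: false },
  ],
};

selectInStockTotal(state); // computed: 10
selectInStockTotal(state); // cached: the state serializes to the same key

Because keys are built with JSON.stringify, this approach fits small, serializable state slices; for large stores, a reference-based memoizer is usually the better fit.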
