Understanding the Least Recently Used (LRU) Caching Algorithm

Milad Fahmy
6 min read · Oct 10, 2024


Caching is a crucial technique in computer science that improves application performance by temporarily storing frequently accessed data. Among various caching algorithms, the Least Recently Used (LRU) algorithm stands out for its effectiveness in managing limited cache space. This article will explore the LRU caching algorithm, its implementation, and its advantages through code examples.

What is LRU Caching?

The LRU caching algorithm evicts the least recently used items when the cache reaches its maximum size. This strategy is based on the principle that the data that has gone unused the longest is the least likely to be needed again soon.

Key Concepts

  1. Cache: A temporary storage area that holds frequently accessed data to reduce access time.
  2. Eviction: The process of removing items from the cache to free up space.
  3. Usage Order: The order in which items are accessed, which determines which items are considered for eviction.

How LRU Works

  • Insertion: When a new item is added to the cache, it is stored at the most recently used position.
  • Access: When an item is accessed, it is moved to the most recently used position.
  • Eviction: When the cache reaches its maximum capacity, the item that has not been used for the longest time is removed.
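
The three operations above can be sketched with a plain `Map`, whose guaranteed insertion-order iteration makes the least recently used entry the first key. This is a minimal illustration (the `MiniLRU` name is chosen here for the example), not the full implementation shown later:

```typescript
// Minimal LRU sketch: a Map remembers insertion order, so "most recently
// used" can be modeled as "most recently inserted".
class MiniLRU<K, V> {
  private map = new Map<K, V>();

  constructor(private maxSize: number) {}

  get(key: K): V | undefined {
    if (!this.map.has(key)) return undefined;
    const value = this.map.get(key)!;
    // Access: move the entry to the most-recently-used position.
    this.map.delete(key);
    this.map.set(key, value);
    return value;
  }

  set(key: K, value: V): void {
    if (this.map.has(key)) this.map.delete(key);
    // Eviction: drop the least recently used entry (the first key in iteration order).
    if (this.map.size >= this.maxSize) {
      const oldest = this.map.keys().next().value as K;
      this.map.delete(oldest);
    }
    // Insertion: new entries land at the most-recently-used position.
    this.map.set(key, value);
  }
}

const lru = new MiniLRU<string, number>(2);
lru.set('a', 1);
lru.set('b', 2);
lru.get('a');              // touching 'a' makes 'b' the least recently used
lru.set('c', 3);           // evicts 'b'
console.log(lru.get('b')); // undefined
console.log(lru.get('a')); // 1
```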

Implementation of LRU Caching

Here’s a basic implementation of the LRU caching algorithm using TypeScript, featuring an LRUCache class and a QueryCache class that builds on it.

LRUCache Class

The LRUCache class manages the cache, tracks the usage of each entry, and performs evictions based on the LRU policy.

type EvictionStrategy = 'LRU' | 'FIFO';

export interface LRUCacheOptions<K, V> {
  maxSize: number;
  ttl?: number;
  onEvict?: (key: K, value: V) => void;
  evictionStrategy?: EvictionStrategy;
  stayAlive?: boolean;
}

export class LRUCache<K, V> {
  private cache: Map<K, { value: V; expiry?: number }> = new Map();

  private maxSize: number;

  private ttl?: number;

  private onEvict?: (key: K, value: V) => void;

  private evictionStrategy: EvictionStrategy;

  private stayAlive?: boolean;

  private hitCount: number = 0;

  private missCount: number = 0;

  private cleanupInterval: number = 60000; // 1 minute cleanup interval

  // eslint-disable-next-line no-undef
  private cleanupTimer?: NodeJS.Timeout;

  constructor(options: LRUCacheOptions<K, V>) {
    const {
      maxSize,
      ttl,
      onEvict,
      evictionStrategy = 'LRU',
      stayAlive = false,
    } = options;

    if (maxSize <= 0) {
      throw new Error('maxSize must be greater than 0');
    }
    this.maxSize = maxSize;
    this.ttl = ttl;
    this.onEvict = onEvict;
    this.evictionStrategy = evictionStrategy;
    if (stayAlive) {
      this.startCleanup();
    }
  }

  private reorder(key: K): void {
    if (this.evictionStrategy === 'LRU') {
      const entry = this.cache.get(key);
      if (!entry) return;
      this.cache.delete(key); // Remove the entry
      this.cache.set(key, entry); // Re-insert to update the position
    }
  }

  // eslint-disable-next-line class-methods-use-this
  private isExpired(entry: { value: V; expiry?: number }): boolean {
    if (entry.expiry === undefined) return false;
    return performance.now() > entry.expiry;
  }

  private cleanupExpiredEntries(): void {
    const now = performance.now();
    Array.from(this.cache.entries()).forEach(([key, entry]) => {
      if (entry.expiry !== undefined && now > entry.expiry) {
        this.cache.delete(key);
        if (this.onEvict) {
          this.onEvict(key, entry.value);
        }
      }
    });
  }

  private startCleanup(): void {
    if (this.cleanupInterval) {
      this.cleanupTimer = setInterval(() => this.cleanupExpiredEntries(), this.cleanupInterval);
    }
  }

  stopCleanup(): void {
    if (this.cleanupTimer) {
      clearInterval(this.cleanupTimer);
    }
  }

  get(key: K): V | undefined {
    const entry = this.cache.get(key);
    if (!entry) {
      this.missCount += 1;
      return undefined;
    }

    if (this.isExpired(entry)) {
      this.cache.delete(key);
      this.missCount += 1;
      return undefined;
    }

    this.hitCount += 1;
    this.reorder(key);
    return entry.value;
  }

  set(key: K, value: V, ttl?: number): void {
    if (this.cache.has(key)) {
      this.reorder(key);
    } else if (this.cache.size === this.maxSize) {
      let oldestKey: K | undefined;

      if (this.evictionStrategy === 'LRU') {
        // Get the first key for LRU strategy
        oldestKey = this.cache.keys().next().value;
      } else if (this.evictionStrategy === 'FIFO') {
        // Get the oldest key for FIFO strategy
        oldestKey = [...this.cache.keys()].shift();
      }

      if (oldestKey !== undefined) {
        const oldestEntry = this.cache.get(oldestKey);
        this.cache.delete(oldestKey);
        if (this.onEvict && oldestEntry) {
          this.onEvict(oldestKey, oldestEntry.value);
        }
      }
    }

    // Avoid nested ternaries by using simple conditionals
    let expiry: number | undefined;
    if (ttl !== undefined) {
      expiry = performance.now() + ttl;
    } else if (this.ttl !== undefined) {
      expiry = performance.now() + this.ttl;
    }

    this.cache.set(key, { value, expiry });
  }

  delete(key: K): boolean {
    return this.cache.delete(key);
  }

  clear(): void {
    this.cache.clear();
  }

  get size(): number {
    return this.cache.size;
  }

  has(key: K): boolean {
    const entry = this.cache.get(key);
    if (!entry) return false;

    if (this.isExpired(entry)) {
      this.cache.delete(key);
      return false;
    }

    return true;
  }

  entries(): Array<[K, V]> {
    return Array.from(this.cache.entries())
      .filter(([, entry]) => !this.isExpired(entry))
      .map(([key, entry]) => [key, entry.value]);
  }

  serialize(): string {
    const entries = this.entries();
    return JSON.stringify(entries);
  }

  deserialize(serializedCache: string): void {
    const entries: [K, V][] = JSON.parse(serializedCache);
    this.cache.clear();
    entries.forEach(([key, value]) => {
      this.set(key, value);
    });
  }

  get hitRate(): number {
    const total = this.hitCount + this.missCount;
    return total === 0 ? 0 : this.hitCount / total;
  }

  get missRate(): number {
    const total = this.hitCount + this.missCount;
    return total === 0 ? 0 : this.missCount / total;
  }

  async getAsync(key: K, fetchFn: () => Promise<V>): Promise<V> {
    const cached = this.get(key);
    // Compare against undefined so falsy cached values (0, '', false) still count as hits.
    if (cached !== undefined) {
      return cached;
    }

    const result = await fetchFn();
    this.set(key, result);
    return result;
  }
}

export default LRUCache;
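
The TTL handling in `set` and `isExpired` boils down to storing an absolute deadline at insertion time and comparing it against the clock on each read. A standalone sketch of the same check (the `Entry` and `makeEntry` names here are illustrative, not part of the class above):

```typescript
// Each entry stores an absolute expiry timestamp computed when it is inserted.
interface Entry<V> {
  value: V;
  expiry?: number;
}

function makeEntry<V>(value: V, ttl?: number): Entry<V> {
  // No TTL means the entry never expires (expiry stays undefined).
  return { value, expiry: ttl !== undefined ? performance.now() + ttl : undefined };
}

function isExpired<V>(entry: Entry<V>): boolean {
  if (entry.expiry === undefined) return false;
  return performance.now() > entry.expiry;
}

const permanent = makeEntry('keep');
const shortLived = makeEntry('drop', -1); // deadline already in the past, for demonstration
console.log(isExpired(permanent));  // false
console.log(isExpired(shortLived)); // true
```

Storing an absolute deadline (rather than the remaining TTL) keeps reads cheap: expiry is a single comparison, with no bookkeeping needed between accesses.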

QueryCache Class

The QueryCache class wraps an LRUCache to store arrays of documents with optional time-to-live (TTL) settings.

import { LRUCache, LRUCacheOptions } from './lru-cache';
import { DocumentWithId } from '../types';

export interface QueryCacheOptions<T> extends LRUCacheOptions<string, DocumentWithId<T>[]> {
  ttl?: number;
  evictionStrategy?: 'LRU' | 'FIFO';
}

export class QueryCache<T> {
  // An LRUCache instance that stores string keys and arrays of DocumentWithId<T> as values
  private cache: LRUCache<string, DocumentWithId<T>[]>;

  // Metrics for cache hits and misses
  private cacheHits: number = 0;

  private cacheMisses: number = 0;

  /**
   * Initializes the QueryCache with optional maxSize, TTL, and eviction strategy.
   * @param options - Object containing maxSize, TTL, and eviction strategy.
   */
  constructor(options?: QueryCacheOptions<T>) {
    const { maxSize = 1000, ttl, evictionStrategy = 'LRU' } = options || {};
    this.cache = new LRUCache<string, DocumentWithId<T>[]>({
      maxSize,
      ttl,
      evictionStrategy,
    });
  }

  /**
   * Retrieves a list of documents by the given key from the cache.
   * Tracks cache hits and misses.
   * @param key - The key to retrieve.
   * @returns The array of DocumentWithId<T> associated with the key, or undefined if not found or expired.
   */
  public get(key: string): DocumentWithId<T>[] | undefined {
    const result = this.cache.get(key);
    if (result) {
      this.cacheHits += 1;
    } else {
      this.cacheMisses += 1;
    }
    return result;
  }

  /**
   * Adds or updates a list of documents in the cache.
   * Supports a custom TTL for each entry.
   * @param key - The key of the cache entry.
   * @param value - The array of DocumentWithId<T> to cache.
   * @param ttl - Optional TTL for this specific entry, overriding the global TTL.
   */
  public set(key: string, value: DocumentWithId<T>[], ttl?: number): void {
    this.cache.set(key, value, ttl); // Allow custom TTL per entry
  }

  /**
   * Invalidates (removes) a cache entry by key, or clears the entire cache if no key is provided.
   * @param key - Optional key to remove from the cache. If not provided, the entire cache is cleared.
   */
  public invalidate(key?: string): void {
    if (key) {
      this.cache.delete(key);
    } else {
      this.cache.clear();
    }
  }

  /**
   * Returns cache hit/miss statistics.
   * @returns An object with hit and miss counts.
   */
  public getStats(): { hits: number; misses: number } {
    return {
      hits: this.cacheHits,
      misses: this.cacheMisses,
    };
  }
}

export default QueryCache;

How to Use LRU Cache

To use the QueryCache class, instantiate it with desired options, and use the get, set, and invalidate methods to manage your cached data.

const queryCache = new QueryCache<MyDocumentType>({
  maxSize: 500,
  ttl: 60000, // 1 minute
  evictionStrategy: 'LRU',
});

// Adding data to the cache
queryCache.set('user:1', [{ id: '1', name: 'John Doe' }]);

// Retrieving data from the cache
const cachedUser = queryCache.get('user:1');

// Invalidating a specific cache entry
queryCache.invalidate('user:1');

// Getting cache stats
const stats = queryCache.getStats();
console.log(`Cache Hits: ${stats.hits}, Cache Misses: ${stats.misses}`);
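
The getAsync method from the LRUCache class implements a read-through pattern: check the cache first, and only invoke the (possibly expensive) fetch function on a miss. The same idea can be sketched standalone with a plain `Map` and a counting fetcher (`fetchUser` is a hypothetical stand-in for a real database or API call):

```typescript
const cache = new Map<string, string>();
let fetchCalls = 0;

// Simulated expensive fetch; in practice this would hit a database or API.
async function fetchUser(id: string): Promise<string> {
  fetchCalls += 1;
  return `user-${id}`;
}

async function getAsync(key: string, fetchFn: () => Promise<string>): Promise<string> {
  const cached = cache.get(key);
  if (cached !== undefined) return cached; // hit: skip the fetch entirely
  const result = await fetchFn();          // miss: fetch, then populate the cache
  cache.set(key, result);
  return result;
}

(async () => {
  await getAsync('1', () => fetchUser('1')); // miss → fetches
  await getAsync('1', () => fetchUser('1')); // hit → no fetch
  console.log(fetchCalls); // 1
})();
```

This pattern keeps cache-population logic in one place, so callers never have to decide whether to read the cache or the backing store.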

Advantages of LRU Caching

  1. Efficiency: LRU is efficient for workloads where recent data is more likely to be accessed again.
  2. Simplicity: The algorithm is straightforward to implement and understand.
  3. Adaptability: Works well in a variety of applications, from web servers to databases.

Use Cases

  • Web Browsers: Caching web pages and resources.
  • Database Query Results: Storing the results of expensive database queries to reduce load times.
  • In-memory Data Stores: Caching frequently accessed objects in applications.

Conclusion

The Least Recently Used (LRU) caching algorithm is an effective strategy for optimizing data retrieval and improving performance in various applications. With its straightforward implementation and proven effectiveness, LRU is a go-to choice for developers seeking to enhance their systems’ responsiveness.

By using the LRUCache and QueryCache classes outlined in this article, you can leverage LRU caching in your own applications and enjoy the benefits of faster data access.


Written by Milad Fahmy

I’m a JS Champion and I write about software engineering. I’m currently developing various open-source projects (like https://encrypt-rsa.js.org).
