
@Akuma-U1
Last active February 10, 2023 13:27
LRU Cache implementation in TypeScript
class LRU_Cache<T> extends Map<string, T> {
  private _limit: number;

  constructor(limit: number, entries?: [string, T][]) {
    super(entries);
    this._limit = limit;
  }

  get(key: string): T | undefined {
    if (!this.has(key)) return undefined;
    const value = super.get(key)!;
    // Re-insert so this entry becomes the most recently used.
    this.delete(key);
    super.set(key, value);
    return value;
  }

  set(key: string, value: T): this {
    if (this.has(key)) {
      // Updating an existing key should move it to the most-recent end,
      // not trigger an eviction.
      this.delete(key);
    } else if (this._limit !== undefined && this.size >= this._limit) {
      // Evict the least recently used entry (the oldest in insertion order).
      // Note: Map's constructor calls set() for `entries` before _limit is
      // assigned, hence the undefined check above.
      this.delete(this.keys().next().value!);
    }
    return super.set(key, value);
  }
}
export default LRU_Cache;
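A quick check of the eviction behaviour. This is a standalone sketch: the class is reproduced inline (without the optional `entries` parameter) so the snippet runs on its own, and the capacity of 2 and the keys are arbitrary.

```typescript
// Same shape as the gist's LRU_Cache, reproduced so this snippet is self-contained.
class LRU_Cache<T> extends Map<string, T> {
  constructor(private _limit: number) { super(); }

  get(key: string): T | undefined {
    if (!this.has(key)) return undefined;
    const value = super.get(key)!;
    this.delete(key);       // re-insert to mark as most recently used
    super.set(key, value);
    return value;
  }

  set(key: string, value: T): this {
    if (this.has(key)) this.delete(key);              // move to most-recent end
    else if (this.size >= this._limit) {
      this.delete(this.keys().next().value!);         // evict least recently used
    }
    return super.set(key, value);
  }
}

const cache = new LRU_Cache<number>(2);
cache.set("a", 1);
cache.set("b", 2);
cache.get("a");        // touch "a" so "b" becomes the least recently used
cache.set("c", 3);     // over capacity → "b" is evicted
console.log([...cache.keys()]); // → ["a", "c"]
```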
@Akuma-U1 (Author)

Description

An LRU (Least Recently Used) cache is a fixed-size cache that, when full, makes room for a new item by evicting the item that was accessed least recently.

According to MDN, Map has the following property:

The specification requires maps to be implemented "that, on average, provide access times that are sublinear on the number of elements in the collection". Therefore, it could be represented internally as a hash table (with O(1) lookup), a search tree (with O(log(N)) lookup), or any other data structure, as long as the complexity is better than O(N).

This property makes Map objects well suited to an LRU cache implementation, since lookup time in such scenarios needs to be as low as possible. Just as importantly, Map iterates its entries in insertion order, which is what lets the delete-then-set trick above move an entry to the "most recently used" end and makes the first key returned by keys() the eviction candidate.

Additional functionality such as Map.prototype.keys() and Map.prototype.clear() is also nice to have in a cache-like structure.
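The insertion-order guarantee is the part that makes the delete-then-set trick work; a quick illustration with a plain Map (the keys and values here are arbitrary):

```typescript
// Map iterates in insertion order, so delete-then-set moves a key to the end —
// exactly how the cache marks an entry as most recently used.
const m = new Map<string, number>([["a", 1], ["b", 2], ["c", 3]]);
const v = m.get("b")!;
m.delete("b");
m.set("b", v);
console.log([...m.keys()]); // → ["a", "c", "b"]
```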

Real-world uses include caching the results of expensive database queries in an Express server, or caching frequently used images and other media in your frontend implementation.
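A sketch of the database-query use case: memoizing an expensive lookup with a plain Map used LRU-style. `loadUser`, the `user:` key format, and the capacity of 3 are made up for illustration (and a real DB call would be async); only the eviction pattern mirrors the gist.

```typescript
// Hypothetical sketch: cache results of an expensive lookup, LRU-style.
const LIMIT = 3;
const userCache = new Map<string, string>();

function loadUser(id: string): string {
  const hit = userCache.get(id);
  if (hit !== undefined) {
    // Cache hit: re-insert so this entry becomes the most recently used.
    userCache.delete(id);
    userCache.set(id, hit);
    return hit;
  }
  const row = `user:${id}`; // stand-in for the expensive DB query
  if (userCache.size >= LIMIT) {
    // Evict the least recently used entry before inserting the new one.
    userCache.delete(userCache.keys().next().value!);
  }
  userCache.set(id, row);
  return row;
}
```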
