@dhsathiya
Created September 15, 2019 12:07

Let's consider a constant stream of cache requests with a cache capacity of 3, see below:

A, B, C, A, A, A, A, A, A, A, A, A, A, A, B, C, D

Consider a Least Recently Used (LRU) cache implemented with a HashMap plus a doubly linked list, giving O(1) eviction and O(1) load. While processing the request stream above, the cached elements evolve as follows:

[A]
[A, B]
[A, B, C]
[B, C, A] <- the stream of As keeps A at the most-recently-used end of the list.
[C, A, B]
[A, B, C]
[B, C, D] <- here we evict A. We can do better!
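The trace above can be reproduced with a minimal LRU sketch. This is an illustrative implementation, not the one from the original answer; it uses Python's `OrderedDict`, which keeps keys in insertion order, so moving a key to the end on each access gives the recency order shown above (least recently used at the front):

```python
from collections import OrderedDict

class LRUCache:
    """Minimal LRU cache: OrderedDict keeps keys in recency order,
    least recently used at the front, most recently used at the back."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()

    def get(self, key):
        if key not in self.data:
            return None
        self.data.move_to_end(key)  # mark as most recently used
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)  # evict least recently used

# Replay the request stream from the example with capacity 3:
cache = LRUCache(3)
for key in "ABCAAAAAAAAAAABCD":
    cache.put(key, key)
print(list(cache.data))  # ['B', 'C', 'D'] -- A was evicted when D arrived
```

Replaying the stream ends with `[B, C, D]`: the heavily requested A is gone, matching the last line of the trace.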

Looking at this example, it is easy to see that we can do better: given the high likelihood that A will be requested again, we should not evict it even though it is the least recently used entry. Counting requests per key makes this obvious:

A - 12
B - 2
C - 2
D - 1

A Least Frequently Used (LFU) cache takes advantage of this information by tracking how many times each entry has been requested and evicting the least frequently requested entry.
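A minimal LFU sketch, again illustrative rather than the original answer's code: it keeps a request count per key and evicts the key with the lowest count, breaking ties by least-recent use. The O(n) scan on eviction keeps the sketch short; production LFU implementations achieve O(1) with frequency buckets:

```python
from collections import Counter

class LFUCache:
    """Minimal LFU cache: evict the key with the lowest request count,
    breaking ties by least-recent use."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = {}
        self.freq = Counter()   # request count per key
        self.last_used = {}     # logical timestamp of last access per key
        self.clock = 0

    def _touch(self, key):
        self.clock += 1
        self.freq[key] += 1
        self.last_used[key] = self.clock

    def get(self, key):
        if key not in self.data:
            return None
        self._touch(key)
        return self.data[key]

    def put(self, key, value):
        if key not in self.data and len(self.data) >= self.capacity:
            # Evict the least frequently used key; ties go to the
            # least recently used among them.
            victim = min(self.data, key=lambda k: (self.freq[k], self.last_used[k]))
            del self.data[victim], self.freq[victim], self.last_used[victim]
        self.data[key] = value
        self._touch(key)

# Replay the same request stream with capacity 3:
cache = LFUCache(3)
for key in "ABCAAAAAAAAAAABCD":
    cache.put(key, key)
```

On the same stream, when D arrives the counts are A=12, B=2, C=2, so B (tied with C but used less recently) is evicted and A stays cached.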

Source: https://stackoverflow.com/questions/17759560/what-is-the-difference-between-lru-and-lfu
