Design and implement a Least Recently Used (LRU) cache

Before discussing the implementation of a least recently used cache, let's understand what a cache is. In plain computer-architecture terms, a cache is a small buffer of pages that the OS maintains in order to avoid more expensive main-memory accesses. A cache is faster than main memory, but it is also much smaller, so pages inevitably get swapped in and out of it. If a page is not found in the cache and main memory has to be accessed to fetch it, that is a cache miss. The page is then brought into the cache, and the next time it is accessed, it is served from the cache.

What if there is no space left in the cache when a cache miss occurs? The new page has to be swapped with one of the pages already in the cache. How do we decide which page goes out, so that the increase in cache misses is minimal? There are many approaches (eviction methods) for deciding which page leaves the cache to make space for the new one: First In First Out, Least Recently Used, Least Frequently Used, etc.

What is a least recently used cache?

In the ‘First In First Out’ approach, the OS selects the page that has been in the cache the longest and swaps it with the new page. In the ‘Least Recently Used’ approach, the OS selects the page that has not been accessed for the longest period of time. In the ‘Least Frequently Used’ approach, the OS selects the page that has been accessed the fewest number of times up to that point.

In this post, we will concentrate on the Least Recently Used approach and implement it.

An LRU cache is similar to First In First Out (FIFO) storage, where the pages that came in first are evicted first. The difference between FIFO storage and an LRU cache comes from the fact that when a page is accessed again, the LRU cache moves that page to the end.

If a page entered the cache first, it is the first candidate to go out, provided it is not accessed again before the cache fills up and a cache miss happens. FIFO is best implemented using a queue.
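To make the FIFO behavior concrete, here is a minimal Java sketch of a queue-backed FIFO cache. The class and method names (FIFOCache, access) are ours for illustration, not from the post, and the capacity is assumed fixed:

    import java.util.ArrayDeque;
    import java.util.Queue;

    class FIFOCache {
        private final int capacity;
        private final Queue<Integer> queue = new ArrayDeque<>();

        FIFOCache(int capacity) { this.capacity = capacity; }

        void access(int page) {
            if (queue.contains(page)) return; // hit: FIFO never reorders
            if (queue.size() == capacity) {
                queue.remove();               // evict the oldest page (front)
            }
            queue.add(page);                  // new page joins at the rear
        }
    }

Note that queue.contains scans the whole queue; the hash table discussed later in the post removes exactly this cost.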

Let’s take an example. We have a cache of size 4, and the order of page accesses is 1, 2, 3, 4, 5, 2, 6, 6.

The first four accesses are all misses and fill the cache:

[Figure: queue after pages 1, 2, 3 and 4 are inserted]

Now comes page 5. It's a cache miss, the page needs to be brought in from memory, and there is no space left in the cache, so we evict the least recently used page. That page sits at the front of the queue: we remove page 1, as it is the least recently used of the four pages in the cache.

[Figure: queue after page 1 is evicted and page 5 is inserted]
When page 2 is accessed, it's a cache hit, so we do not swap anything.

Then page 6 is accessed: a cache miss, and the cache is already full. If, as per the previous approach, we swap out the page at the front of the queue, the wrong page is evicted, because page 2 sits at the front but is the most recently used page.

How can the eviction of page 2 be avoided? Remove it from its present position and put it at the tail of the queue whenever it is accessed. That leaves page 3 at the front when page 6 is accessed, so the correct page is removed: page 3 is now the least recently used page.

[Figure: queue after page 2 moves to the tail and page 3 is evicted for page 6]
With a doubly linked list implementation of the queue, both insertion and removal are O(1). With a singly linked list, deletion can be O(n), because removing a node requires a pointer to its predecessor, which has to be found by scanning from the head.
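A sketch of that constant-time removal, assuming sentinel head and tail nodes so that prev and next are never null (the Node class and the unlink name are ours):

    class Node {
        int page;
        Node prev, next;
        Node(int page) { this.page = page; }
    }

    // Unlink a node from wherever it sits in the list, in O(1):
    // no scan is needed, because the node already knows its neighbours.
    static void unlink(Node node) {
        node.prev.next = node.next;
        node.next.prev = node.prev;
    }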

There is still a problem: how do we check for a cache hit or miss efficiently? Scanning the whole queue for a page would be O(n). How can we make lookup fast? Lookups are best optimized using hash tables.

Use a hash table with the page number as key and a pointer to that page's node in the queue as value. With this hash table, any page in the queue can be looked up in O(1).

[Figure: LRU cache design, queue plus hash table]

1. If the cache has a free entry, add the page entry to the end of the queue.
2. On a cache hit (whether the cache is full or not), remove the page from its present location and add it to the end of the queue.
3. If the cache is full and it's a cache miss, remove the page at the front and insert the new page at the end of the queue.
4. To check for a hit or a miss, use the hash table, as the sketch below shows.
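Putting the pieces together, here is a minimal Java sketch of the design: a HashMap for O(1) lookup and a doubly linked list (with sentinel head and tail, reusing the unlink idea above) as the queue. Class and method names (LRUCache, refer) are illustrative, not from the post:

    import java.util.HashMap;
    import java.util.Map;

    class LRUCache {
        private static class Node {
            int page;
            Node prev, next;
            Node(int page) { this.page = page; }
        }

        private final int capacity;
        private final Map<Integer, Node> map = new HashMap<>(); // page -> node in the queue
        private final Node head = new Node(-1); // sentinel: front, least recently used side
        private final Node tail = new Node(-1); // sentinel: rear, most recently used side

        LRUCache(int capacity) {
            this.capacity = capacity;
            head.next = tail;
            tail.prev = head;
        }

        // Access a page; returns true on a cache hit.
        boolean refer(int page) {
            Node node = map.get(page);
            if (node != null) {               // hit: move the page to the rear
                unlink(node);
                appendToTail(node);
                return true;
            }
            if (map.size() == capacity) {     // miss with a full cache: evict the front
                Node lru = head.next;
                unlink(lru);
                map.remove(lru.page);
            }
            Node fresh = new Node(page);      // miss: insert the new page at the rear
            appendToTail(fresh);
            map.put(page, fresh);
            return false;
        }

        private void unlink(Node node) {
            node.prev.next = node.next;
            node.next.prev = node.prev;
        }

        private void appendToTail(Node node) {
            node.prev = tail.prev;
            node.next = tail;
            tail.prev.next = node;
            tail.prev = node;
        }
    }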

Execution Trace

For the access order 1, 2, 3, 4, 5, 2, 6, 6 on a cache of size 4 (front of the queue on the left):

Access 1: miss; queue = [1]
Access 2: miss; queue = [1, 2]
Access 3: miss; queue = [1, 2, 3]
Access 4: miss; queue = [1, 2, 3, 4]
Access 5: miss, evict 1; queue = [2, 3, 4, 5]
Access 2: hit, move 2 to the end; queue = [3, 4, 5, 2]
Access 6: miss, evict 3; queue = [4, 5, 2, 6]
Access 6: hit, 6 already at the end; queue = [4, 5, 2, 6]
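The same trace can be replayed against the sketch above (a hypothetical driver, assuming the LRUCache class from earlier):

    LRUCache cache = new LRUCache(4);
    for (int page : new int[] {1, 2, 3, 4, 5, 2, 6, 6}) {
        System.out.println("page " + page + ": " + (cache.refer(page) ? "hit" : "miss"));
    }
    // prints miss for 1, 2, 3, 4, 5 and the first 6; hit for 2 and the second 6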
So, we saw how a least recently used cache is implemented using a queue and a hash table.

Please share if there is something wrong or missing. We would love to hear from you.

  • Ramesh Babu Y

very useful, thanks

    • http://algorithmsandme.com/ Jitendra Sangar

Glad that you liked it. Shall I add your Java implementation to the post?

  • Ramesh Babu Y

    import java.util.ArrayDeque;
    import java.util.Deque;

    public class LRUCache {

        private static final int CACHE_CAPACITY = 4;

        // Front of the deque holds the least recently used element,
        // the tail holds the most recently used one.
        private final Deque<Integer> cache = new ArrayDeque<>(CACHE_CAPACITY);

        public Integer addToCache(Integer element) {
            if (cache.size() >= CACHE_CAPACITY) {
                removeLRU();
            }
            boolean isAdded = cache.add(element); // new element joins the tail
            return isAdded ? element : Integer.valueOf(-1);
        }

        private void removeLRU() {
            cache.remove(); // head of the deque is the least recently used
        }

        public Integer getElement(Integer element) {
            // On a hit, move the element to the tail so it becomes the most
            // recently used. Note: remove(Object) scans the deque, so this
            // lookup is O(n); the hash table from the post avoids that.
            if (cache.remove(element)) {
                cache.add(element);
                return element; // cache hit
            }
            return null; // cache miss
        }

        public boolean removeElement(Integer element) {
            return cache.remove(element);
        }
    }