LRU Cache – Design and Implementation in Java

In this article we will learn how to design an LRU Cache and understand its cache replacement algorithm. We first describe the LRU Cache with some examples, then look at a Java implementation of this design along with its complexity analysis.

Caching is a method of keeping data in a faster memory, usually RAM, to serve future requests for the same data in an efficient way. It avoids repetitive main-memory accesses by storing frequently accessed data in the cache. However, compared to main memory the cache is usually not big enough to store large data sets, so entries need to be evicted once the cache becomes full. There are many algorithms to implement cache eviction; LRU caching is a commonly used cache replacement algorithm.

A Least Recently Used (LRU) Cache organizes data according to usage, allowing us to identify which data item hasn't been used for the longest amount of time. The main idea is to evict the oldest, least recently used data from the cache to accommodate new data. If requested data is already present in the cache (a cache hit), it is moved to the front or top of the cache so that the most recently used items are always the quickest to access.

LRU Cache

Example:

Let's consider a cache of capacity 4. Elements are added in the order 1, 2, 3 and 4, so the cache (front to rear, most recently used first) looks like this:

    4, 3, 2, 1

Suppose we now need to cache another element, 5. The cache is full, so following LRU caching we evict element 1, the least recently used, and place 5 at the front. The cache then looks like this:

    5, 4, 3, 2

So, element 5 is at the top of the cache, and element 2 is now the least recently used (the oldest) data in the cache. If we now access element 2 again, the cache becomes:

    2, 5, 4, 3

So, element 2 comes to the top of the cache, and element 3 is now the least recently used data and next in line for eviction.

LRU Cache Implementation

We follow these steps to implement an LRU Cache in our program:

  • We use two data structures: a Deque (double-ended queue), in which insertion and deletion can take place at both ends, and a Hash-Map. The Deque will act as our cache.
  • Unlike a plain queue, we insert each element at the front of the queue, which is why we need a Deque. If we need to access an element already present in our cache, we search for that element in the queue, remove it, and then add it to the front of the queue/cache. But searching for an element in a queue takes O(n) time in the worst case, which is a costly operation.
  • So, to avoid this, we use a Hash-Map, which provides O(1) look-up time for our keys. With it we can check for an element without scanning the cache: if our Hash-Map contains the key, we simply bring that element to the front of the queue, and we add each entry to the map for future look-ups.
  • If our capacity is full, we remove from the rear end of the Deque, which holds the least recently used data. Along with this, we remove the element's entry from our Map.
  • So we mainly need to implement two methods for our cache: one to get an element from the cache and one to put an element into the cache following the LRU algorithm and the steps above.
  • In the get method we simply return the value of a data item if it is present in our cache (and mark it as most recently used). In the put method we add the data to our cache and the Map, and update the order of the cache elements.

Why Hash-Map not Hash-Set?

Since we only need to check whether an element is present in our cache, we could also do that with a Hash-Set, so what is the purpose of the Hash-Map? The reason is that when accessing resources from the cache, a key is required to reach the data; the key is unique for each data item, and with the key we can access the actual data. In a real-life scenario, a product can have many attributes that we need to access through the product key. As we know, a Hash-Map stores data as key-value pairs: the key field holds the key of the data, and the value field holds the actual data or its attributes.
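As a quick illustration of the difference, here is a hypothetical product look-up; the names (ids, products, "Wireless Mouse") are made-up examples, not code from this article:

    import java.util.HashMap;
    import java.util.HashSet;
    import java.util.Map;
    import java.util.Set;

    class MapVsSetDemo {
        public static void main(String[] args) {
            // A Hash-Set can only answer "is this key present?"
            Set<Integer> ids = new HashSet<>();
            ids.add(42);
            System.out.println(ids.contains(42));   // true, but no way to get the data back

            // A Hash-Map returns the actual data stored against the key.
            Map<Integer, String> products = new HashMap<>();
            products.put(42, "Wireless Mouse");
            System.out.println(products.get(42));   // Wireless Mouse
        }
    }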

Now let's look at a Java implementation of the above:
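The listing below is a minimal sketch of the approach described above, using java.util.ArrayDeque as the Deque and a HashMap for look-ups. The class name LRUCache, the Integer value type and the demo values in main are illustrative choices, not a definitive implementation:

    import java.util.ArrayDeque;
    import java.util.Deque;
    import java.util.HashMap;
    import java.util.Map;

    class LRUCache {
        private final int capacity;
        // Deque holds keys in usage order: most recently used at the front.
        private final Deque<Integer> deque = new ArrayDeque<>();
        // Map gives O(1) look-up from key to value.
        private final Map<Integer, Integer> map = new HashMap<>();

        LRUCache(int capacity) {
            this.capacity = capacity;
        }

        // Returns the value for key, or -1 if absent; marks key as most recently used.
        public int get(int key) {
            if (!map.containsKey(key)) {
                return -1; // cache miss
            }
            // Move the key to the front of the deque (most recently used).
            deque.remove(key);  // note: O(n) with the library Deque (see comment below)
            deque.addFirst(key);
            return map.get(key);
        }

        // Inserts or updates key; evicts the least recently used entry if full.
        public void put(int key, int value) {
            if (map.containsKey(key)) {
                deque.remove(key); // key will be re-added at the front
            } else if (map.size() == capacity) {
                int lru = deque.removeLast(); // least recently used key sits at the rear
                map.remove(lru);
            }
            deque.addFirst(key);
            map.put(key, value);
        }

        @Override
        public String toString() {
            return "Cache (front to rear): " + deque;
        }

        public static void main(String[] args) {
            LRUCache cache = new LRUCache(4);
            for (int i = 1; i <= 4; i++) {
                cache.put(i, i * 10); // values are arbitrary; keys are 1..4
            }
            System.out.println(cache);
            cache.put(5, 50);                                 // evicts 1
            System.out.println(cache);
            System.out.println("get(2) = " + cache.get(2));   // moves 2 to the front
            System.out.println(cache);
        }
    }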

Output:
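Running the sketch's main method prints the cache state after each step of our earlier example:

    Cache (front to rear): [4, 3, 2, 1]
    Cache (front to rear): [5, 4, 3, 2]
    get(2) = 20
    Cache (front to rear): [2, 5, 4, 3]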

So this was the implementation of an LRU Cache in code. Let's now look at the time and space complexities of our approach.

Time Complexity: The cache is designed to work in O(1) time. With our Map, checking whether an element is present in the cache takes constant time, and in the put method we add elements to the front of the cache, which also takes constant time, so the overall time per operation is O(1). (As the comment below points out, this holds only if removal from the middle of the list is also constant time; the standard library Deque's remove(Object) is O(n).)

Space Complexity: We use a Deque that stores up to n keys and a Map that stores up to n key-value pairs, where n is the cache capacity, so the overall space complexity is O(n).

That's it for the article. You can post your doubts in the comment section below.

1 thought on “LRU Cache – Design and Implementation in Java”

  1. Since the code uses the Deque class from the standard Java library, and a call to Deque’s remove(someObject) method is an O(n) operation, the cost of both lookup and insertion in this implementation is O(n). Switching to a custom implementation where we have access directly to the list element’s pointers is an alternate approach that can achieve O(1).
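For reference, here is a minimal sketch of the node-based approach the comment describes; the names (LRUCacheO1, Node, unlink, addFirst) are assumptions for illustration:

    import java.util.HashMap;
    import java.util.Map;

    // The map points directly at list nodes, so unlinking a node is O(1)
    // instead of the O(n) of Deque.remove(someObject).
    class LRUCacheO1 {
        private static class Node {
            int key, value;
            Node prev, next;
            Node(int key, int value) { this.key = key; this.value = value; }
        }

        private final int capacity;
        private final Map<Integer, Node> map = new HashMap<>();
        private final Node head = new Node(0, 0); // sentinel: most recently used side
        private final Node tail = new Node(0, 0); // sentinel: least recently used side

        LRUCacheO1(int capacity) {
            this.capacity = capacity;
            head.next = tail;
            tail.prev = head;
        }

        private void unlink(Node n) {
            n.prev.next = n.next;
            n.next.prev = n.prev;
        }

        private void addFirst(Node n) {
            n.next = head.next;
            n.prev = head;
            head.next.prev = n;
            head.next = n;
        }

        public int get(int key) {
            Node n = map.get(key);
            if (n == null) return -1;
            unlink(n);      // O(1): we hold the node's pointers directly
            addFirst(n);
            return n.value;
        }

        public void put(int key, int value) {
            Node n = map.get(key);
            if (n != null) {
                n.value = value;
                unlink(n);
            } else {
                if (map.size() == capacity) {
                    Node lru = tail.prev;   // evict the least recently used node
                    unlink(lru);
                    map.remove(lru.key);
                }
                n = new Node(key, value);
                map.put(key, n);
            }
            addFirst(n);
        }
    }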
