
How to Solve LRU Cache

Written By
Adam Bhula

LRU Cache Introduction

The LRU Cache problem asks you to design a Least Recently Used (LRU) cache that supports two operations: get and put. The key to an efficient solution is achieving O(1) time for both operations by combining a doubly linked list with a hash map.

LRU Cache Problem

Implement an LRU Cache

LRU = Least recently used cache

LRU caches are often used when you want a cache that does not grow indefinitely. The cache has a maximum size, and when inserting a new key would push it past that size, the key that has not been accessed for the longest time is evicted to make space.

It should support these operations:

  • get(key): get the value of the key if the key exists in the cache.
  • put(key, value): update the value if the key is already present, or insert it if it is not. When the cache has reached its capacity, it should invalidate the least recently used item before inserting the new item.

What happens if you call get(key) on a key which doesn't exist in the cache is up to you to define.

Example usage:

cache = new LruCache(2) // capacity = 2

cache.put(1, "1")
cache.put(2, "2")
cache.get(1)      // returns "1"
cache.put(3, "3") // evicts key 2, because key 1 was retrieved more recently
cache.get(2)      // returns null, because key 2 was just evicted
cache.put(4, "4") // evicts key 1
cache.get(1)      // returns null, because key 1 was evicted earlier
cache.get(3)      // returns "3"
cache.get(4)      // returns "4"

LRU Cache Solutions

To solve the LRU Cache problem, we take an optimized approach by combining a doubly linked list and a hashmap. Think of the cache as a collection of items, where each item is a key-value pair. The doubly linked list helps us keep track of the items based on their usage history, while the hashmap allows us to quickly access and update the items.

When an item is accessed or inserted, it is moved to the front of the linked list to represent its recent usage. This ensures that the most recently used items are always at the front, while the least recently used ones are towards the rear. If the cache becomes full and we need to insert a new item, we can easily evict the least recently used item from the rear of the linked list and remove it from the hashmap.

By using the hashmap, we achieve fast retrieval and update of items with a time complexity of O(1) for both get() and put() operations. The doubly linked list helps maintain the order of the items, enabling efficient removal of the least recently used item when needed. This provides an efficient LRU Cache implementation, allowing us to handle large datasets while maintaining constant time operations for accessing and updating the cache.

class ListNode:
    def __init__(self, key=None, value=None):
        self.key = key
        self.value = value
        self.prev = None
        self.next = None

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.cache = {}
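        # self.cache maps each key to its ListNode for O(1) lookup.
        # Sentinel head and tail nodes simplify adds and removes
        # by eliminating None checks at the ends of the list.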
        self.head = ListNode()
        self.tail = ListNode()
        self.head.next = self.tail
        self.tail.prev = self.head

    def _add_node(self, node):
        # Add the node after the head node
        node.prev = self.head
        node.next = self.head.next
        self.head.next.prev = node
        self.head.next = node

    def _remove_node(self, node):
        # Remove the node from the linked list
        prev_node = node.prev
        next_node = node.next
        prev_node.next = next_node
        next_node.prev = prev_node

    def _move_to_head(self, node):
        # Move the node to the front (after the head)
        self._remove_node(node)
        self._add_node(node)

    def _pop_tail(self):
        # Remove and return the node before the tail
        tail_node = self.tail.prev
        self._remove_node(tail_node)
        return tail_node

    def get(self, key):
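        # A hit counts as a use, so the node moves to the front of the list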
        if key in self.cache:
            node = self.cache[key]
            self._move_to_head(node)
            return node.value
        return None

    def put(self, key, value):
        if key in self.cache:
            # If the key exists, update the value and move the node to the front
            node = self.cache[key]
            node.value = value
            self._move_to_head(node)
        else:
            if len(self.cache) >= self.capacity:
                # If the cache is full, remove the least recently used node (before the tail)
                tail_node = self._pop_tail()
                del self.cache[tail_node.key]
            # Add the new node to the front and update the cache
            new_node = ListNode(key, value)
            self._add_node(new_node)
            self.cache[key] = new_node
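
To sanity-check the implementation, here is the example sequence from above run against this class (note that this Python version returns None where the pseudo-code above returns null):

cache = LRUCache(2)    # capacity = 2
cache.put(1, "1")
cache.put(2, "2")
print(cache.get(1))    # "1"
cache.put(3, "3")      # evicts key 2, because key 1 was used more recently
print(cache.get(2))    # None, key 2 was just evicted
cache.put(4, "4")      # evicts key 1
print(cache.get(1))    # None, key 1 was evicted earlier
print(cache.get(3))    # "3"
print(cache.get(4))    # "4"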

Time/Space Complexity

  • Time complexity: O(1) for both get() and put(). The hashmap gives constant-time lookup of nodes, and the doubly linked list supports constant-time insertion, removal, and move-to-front operations for maintaining the order of recently used items.
  • Space complexity: O(n), where n is the maximum capacity of the cache specified at initialization. The cache stores at most n key-value pairs, along with the corresponding linked list nodes and hashmap entries, so space scales linearly with capacity.
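
As a side note, Python's collections.OrderedDict already pairs a hash map with a doubly linked list internally, so the same cache can be sketched on top of it. This is a minimal illustration rather than part of the solution above (the class name OrderedDictLRUCache is just a placeholder), and in an interview you may still be expected to build the linked list by hand:

from collections import OrderedDict

class OrderedDictLRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.cache = OrderedDict()

    def get(self, key):
        if key not in self.cache:
            return None
        self.cache.move_to_end(key)  # mark the key as most recently used
        return self.cache[key]

    def put(self, key, value):
        if key in self.cache:
            self.cache.move_to_end(key)
        elif len(self.cache) >= self.capacity:
            self.cache.popitem(last=False)  # evict the least recently used key
        self.cache[key] = value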

About interviewing.io

interviewing.io is a mock interview practice platform. We've hosted over 100K mock interviews, conducted by senior engineers from FAANG & other top companies. We've drawn on data from these interviews to bring you the best interview prep resource on the web.

We know exactly what to do and say to get the company, title, and salary you want.

Interview prep and job hunting are chaos and pain. We can help. Really.