How to Solve LRU Cache
LRU Cache Introduction
The LRU Cache problem asks you to design a Least Recently Used (LRU) cache that supports two operations: get and put. The key challenge is achieving O(1) time for both operations, which is done by combining a doubly linked list with a hashmap.
LRU Cache Problem
Implement an LRU (Least Recently Used) cache.
An LRU cache is often used when you do not want a cache to grow indefinitely. The cache has a max size, and when inserting a new key would make it grow larger than that size, the key which has not been accessed for the longest time is evicted to make space.
It should support these operations:
get(key): get the value of the key if the key exists in the cache.
put(key, value): insert the value, or update it if the key is already present. When the cache has reached its capacity, it should invalidate the least recently used item before inserting the new item.
What happens if you call get(key) on a key which doesn't exist in the cache is up to you to define.
Example usage:
cache = new LruCache(2) // capacity = 2
cache.put(1, "1")
cache.put(2, "2")
cache.get(1) // returns "1"
cache.put(3, "3") // evicts key 2, because key 1 was retrieved more recently
cache.get(2) // returns null, because 2 was just evicted
cache.put(4, "4") // evicts key 1
cache.get(1) // returns null, because 1 was evicted earlier
cache.get(3) // returns "3"
cache.get(4) // returns "4"
LRU Cache Solutions
To solve the LRU Cache problem, we take an optimized approach by combining a doubly linked list and a hashmap. Think of the cache as a collection of items, where each item is a key-value pair. The doubly linked list keeps track of the items by their usage history, while the hashmap allows us to quickly access and update any item.
When an item is accessed or inserted, it is moved to the front of the linked list to represent its recent usage. This ensures that the most recently used items are always at the front, while the least recently used ones are towards the rear. If the cache becomes full and we need to insert a new item, we can easily evict the least recently used item from the rear of the linked list and remove it from the hashmap.
By using the hashmap, we achieve fast retrieval and update of items with a time complexity of O(1) for both get() and put() operations. The doubly linked list helps maintain the order of the items, enabling efficient removal of the least recently used item when needed. This provides an efficient LRU Cache implementation, allowing us to handle large datasets while maintaining constant time operations for accessing and updating the cache.
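As an aside, before hand-rolling the linked list, it's worth knowing that Python's standard library can express the same idea compactly with collections.OrderedDict, whose move_to_end() and popitem(last=False) methods do the recency bookkeeping for us. This is a minimal sketch, not the solution this article builds (and OrderedDictLRUCache is just an illustrative name):

from collections import OrderedDict

class OrderedDictLRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.cache = OrderedDict()  # insertion order doubles as recency order

    def get(self, key):
        if key not in self.cache:
            return None
        self.cache.move_to_end(key)  # mark as most recently used
        return self.cache[key]

    def put(self, key, value):
        if key in self.cache:
            self.cache.move_to_end(key)
        self.cache[key] = value
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)  # evict the least recently used item

In an interview, though, you are usually expected to build the doubly linked list yourself, which is what the solution below does.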
class ListNode:
    def __init__(self, key=None, value=None):
        self.key = key
        self.value = value
        self.prev = None
        self.next = None

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.cache = {}  # key -> ListNode
        # Sentinel head and tail nodes simplify insertion and removal
        self.head = ListNode()
        self.tail = ListNode()
        self.head.next = self.tail
        self.tail.prev = self.head

    def _add_node(self, node):
        # Add the node right after the head node (most recently used position)
        node.prev = self.head
        node.next = self.head.next
        self.head.next.prev = node
        self.head.next = node

    def _remove_node(self, node):
        # Remove the node from the linked list
        prev_node = node.prev
        next_node = node.next
        prev_node.next = next_node
        next_node.prev = prev_node

    def _move_to_head(self, node):
        # Move the node to the front (after the head)
        self._remove_node(node)
        self._add_node(node)

    def _pop_tail(self):
        # Remove and return the node before the tail (least recently used)
        tail_node = self.tail.prev
        self._remove_node(tail_node)
        return tail_node

    def get(self, key):
        if key in self.cache:
            node = self.cache[key]
            self._move_to_head(node)
            return node.value
        return None

    def put(self, key, value):
        if key in self.cache:
            # If the key exists, update the value and move the node to the front
            node = self.cache[key]
            node.value = value
            self._move_to_head(node)
        else:
            if len(self.cache) >= self.capacity:
                # If the cache is full, evict the least recently used node (before the tail)
                tail_node = self._pop_tail()
                del self.cache[tail_node.key]
            # Add the new node to the front and record it in the hashmap
            new_node = ListNode(key, value)
            self._add_node(new_node)
            self.cache[key] = new_node
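To sanity-check the implementation, here is a short, hypothetical driver that replays the example from the problem statement against the LRUCache class above (the comments show the expected output):

cache = LRUCache(2)         # capacity = 2
cache.put(1, "1")
cache.put(2, "2")
print(cache.get(1))         # "1" -- key 1 is now most recently used
cache.put(3, "3")           # evicts key 2
print(cache.get(2))         # None -- key 2 was just evicted
cache.put(4, "4")           # evicts key 1
print(cache.get(1))         # None -- key 1 was evicted earlier
print(cache.get(3))         # "3"
print(cache.get(4))         # "4"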
Time/Space Complexity
- Time complexity: O(1). Both get() and put() run in constant time. The hashmap gives O(1) lookup of any node, and the doubly linked list supports O(1) insertion and removal, so maintaining the recency order never requires traversing the list.
- Space complexity: O(n), where n is the maximum capacity of the cache specified during initialization. The space used scales linearly with the capacity, since the cache stores up to n key-value pairs across the hashmap and the doubly linked list.

About interviewing.io
interviewing.io is a mock interview practice platform. We've hosted over 100K mock interviews, conducted by senior engineers from FAANG & other top companies. We've drawn on data from these interviews to bring you the best interview prep resource on the web.