Problem Statement:
Implement basic LRU (Least Recently Used) Cache
Full Source Code : LINK
Solution:
LRU cache properties:
- Adds an item to the front of the list if it is not already present.
- Updates the item if it is already present, and moves it to the front of the list.
- If the cache is full, removes the last element before adding the new item to the front of the list.
The LRU cache below is implemented using a queue (built on a doubly linked list) together with a hash table.
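Conceptually, the two structures cooperate like this (a simplified sketch; head and tail are the sentinel nodes created in the implementation below):

    hash table : "C" -> nodeC,  "B" -> nodeB,  "A" -> nodeA

    linked list: head <-> nodeC <-> nodeB <-> nodeA <-> tail
                 (nodeC = most recently used, nodeA = least recently used)

The hash table gives O(1) lookup of a node by key, and the doubly linked list lets that node be unlinked and moved to the front in O(1).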
Item Structure in the LRU Cache:
/**
 * Item Node structure in the LRU Cache
 * @author Prateek
 */
class Item {

    public Item prev;
    public Item next;
    public String key;
    public String data;

    public Item() {
    }

    public Item(String key, String data) {
        this.key = key;
        this.data = data;
    }

    @Override
    public String toString() {
        return this.key + " : " + this.data;
    }
}
Implementation of the LRU Cache
import java.util.HashMap;

/**
 * LRU Cache implementation
 * @author Prateek
 */
public class LRUCache {

    public Item head;
    public Item tail;
    public int maxsize;

    // to hold the list node for every key
    private HashMap<String, Item> hash;

    public LRUCache(int size) {
        this.maxsize = size;
        hash = new HashMap<String, Item>();

        // sentinel head and tail nodes simplify insertion and removal
        head = new Item("0", "0");
        tail = new Item("0", "0");
        head.next = tail;
        tail.prev = head;
    }

    /**
     * Add item to the beginning of the linked list
     * @param item
     */
    public void addItemAtFirst(Item item) {
        item.next = head.next;
        item.prev = head;
        head.next.prev = item;
        head.next = item;
    }

    /**
     * Remove item from the linked list
     * @param item
     */
    public void removeItem(Item item) {
        item.prev.next = item.next;
        item.next.prev = item.prev;
    }

    /**
     * @return the value for the key; the node is unlinked and moved to the
     *         beginning of the list
     */
    public String get(String key) {
        Item item = hash.get(key);
        if (item != null) {
            if (hash.size() == 1)
                return item.data;
            removeItem(item);
            addItemAtFirst(item);
            return item.data;
        }
        else
            return null;
    }

    /**
     * Insert into the hashmap and linked list; if the capacity is exceeded,
     * remove the least recently used (tail) element
     */
    public void put(String key, String data) {
        Item item = hash.get(key);
        if (item == null) {                 // item not found
            item = new Item(key, data);
            hash.put(key, item);
            addItemAtFirst(item);

            if (hash.size() > maxsize) {    // evict only once capacity is exceeded
                hash.remove(tail.prev.key);
                removeItem(tail.prev);
            }
        }
        else {                              // item found in the hashmap, update data
            item.data = data;
            removeItem(item);
            addItemAtFirst(item);
        }
    }

    /**
     * Display cache content from most to least recently used
     */
    public void displayLRUCache() {
        System.out.println("Items in the Cache");
        Item temp = head.next;
        while (temp.next != null) {
            System.out.print(temp.data + " ");
            temp = temp.next;
        }
        System.out.println();
    }
}
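A small driver like the one below (the class name LRUCacheDemo and the sample keys and values are just for illustration) shows the cache in action:

public class LRUCacheDemo {
    public static void main(String[] args) {
        LRUCache cache = new LRUCache(3);   // capacity of 3 items

        cache.put("A", "1");                // list: A
        cache.put("B", "2");                // list: B A
        cache.put("C", "3");                // list: C B A

        cache.get("A");                     // A becomes most recently used: A C B

        cache.put("D", "4");                // cache is full, "B" is evicted: D A C

        cache.displayLRUCache();            // should print: 4 1 3
        System.out.println(cache.get("B")); // should print: null, since "B" was evicted
    }
}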
This post is only meant to provide a basic understanding of an LRU cache. You can take the core idea from it and extend it for production code that works in a multi-threaded environment and supports insertion of generic items. The approach above is similar to the LRU behavior offered by LinkedHashMap in Java.
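For comparison, here is a minimal sketch of the LinkedHashMap approach (the class name SimpleLRUCache is just for illustration): constructing the map in access-order mode keeps the most recently accessed entries at the end, and overriding removeEldestEntry evicts the oldest entry once the capacity is exceeded.

import java.util.LinkedHashMap;
import java.util.Map;

/**
 * LRU cache backed by LinkedHashMap: access-order iteration plus the
 * removeEldestEntry hook give the same eviction behavior as the
 * hand-rolled cache above.
 */
class SimpleLRUCache<K, V> extends LinkedHashMap<K, V> {

    private final int maxsize;

    public SimpleLRUCache(int maxsize) {
        super(16, 0.75f, true);   // true = order entries by access instead of insertion
        this.maxsize = maxsize;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > maxsize;  // evict the least recently used entry when over capacity
    }
}

A SimpleLRUCache<String, String> created with new SimpleLRUCache<>(3) behaves like the hand-rolled cache above, but inherits get and put directly from LinkedHashMap.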
Feel free to post your comments or suggestions; critiques are most welcome too.
Happy Coding !! :)