cache algorithm

A cache algorithm (also known as a cache replacement policy) is a set of instructions that decides which items should be discarded from a computing device's cache when the cache is full.

Examples of cache algorithms include:

Least Frequently Used (LFU): This cache algorithm uses a counter to keep track of how often an entry is accessed. With the LFU cache algorithm, the entry with the lowest count is removed first. This method is rarely used in its pure form, because it does not account for an item that had a high access rate initially and then went unaccessed for a long time: its inflated count keeps it in the cache even though it is no longer useful.
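The counting behavior described above can be sketched in a few lines of Python. This is a minimal illustration, not a production implementation; the class name, tie-breaking rule (oldest entry wins among equal counts), and the O(n) scan to find the victim are all simplifying assumptions.

```python
class LFUCache:
    """Minimal LFU sketch: evict the entry with the lowest access count.
    Ties are broken by dict insertion order (an assumption; real
    implementations vary)."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = {}    # key -> value
        self.counts = {}  # key -> access count

    def get(self, key):
        if key not in self.data:
            return None
        self.counts[key] += 1  # each access bumps the counter
        return self.data[key]

    def put(self, key, value):
        if key not in self.data and len(self.data) >= self.capacity:
            # Evict the least frequently used key (O(n) scan for clarity).
            victim = min(self.counts, key=self.counts.get)
            del self.data[victim]
            del self.counts[victim]
        self.data[key] = value
        self.counts.setdefault(key, 0)
```

The flaw noted above is visible here: an entry that accumulated a large count early on will survive every eviction scan, no matter how stale it is.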

Least Recently Used (LRU): This cache algorithm keeps recently used items near the top of the cache. Whenever an item is accessed, the LRU moves it to the top of the cache. When the cache limit has been reached, the items that have been accessed least recently are removed, starting from the bottom of the cache. Naive implementations can be expensive, because they keep "age bits" that record when each item was accessed and may need to update that bookkeeping across many items; practical implementations avoid this with a linked list or similar structure that makes each update cheap.
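The "move to the top on access, evict from the bottom" behavior maps naturally onto Python's OrderedDict, which avoids the per-item age-bit bookkeeping mentioned above. A minimal sketch, with the class name and API chosen for illustration:

```python
from collections import OrderedDict

class LRUCache:
    """Minimal LRU sketch: the end of the OrderedDict is the 'top' of the
    cache (most recently used); eviction pops from the front."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()

    def get(self, key):
        if key not in self.data:
            return None
        self.data.move_to_end(key)  # mark as most recently used
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)  # evict least recently used
```

Because OrderedDict maintains recency order internally, both `get` and `put` run in constant time; no timestamps on other items need to change when one item is touched or evicted.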

Adaptive Replacement Cache (ARC): Developed at the IBM Almaden Research Center, this cache algorithm tracks both frequency and recency of use, combining the strengths of LRU and LFU, and also keeps a history of recently evicted entries so it can adapt its balance between the two and get the best use out of the available cache.

Most Recently Used (MRU): This cache algorithm removes the most recently used items first. An MRU algorithm works well in situations where the older an item is, the more likely it is to be accessed, such as repeated sequential scans over a data set larger than the cache.
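MRU is essentially LRU with the eviction end flipped: when the cache is full, the entry touched most recently is the one discarded. A minimal sketch (class name and API are illustrative assumptions):

```python
from collections import OrderedDict

class MRUCache:
    """Minimal MRU sketch: the end of the OrderedDict is the most recently
    used entry, and that is the end eviction pops from."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()

    def get(self, key):
        if key not in self.data:
            return None
        self.data.move_to_end(key)  # mark as most recently used
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        elif len(self.data) >= self.capacity:
            self.data.popitem(last=True)  # evict the most recently used entry
        self.data[key] = value
```

The only change from the LRU sketch is `last=True` in `popitem`, which discards from the recently-used end instead of the stale end.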

See also: Memcached

This was last updated in January 2014
