What is the relationship between cache technology, CPU and RAM?

With flash memory caching on its way in because of its lower cost and non-volatile nature, RAM caching may be on its way out the door, says analyst Jon Toigo.

"Cache" is an over-used term, describing everything from extremely fast memory built directly into the computer's CPU (called "Level 1" or "L1") or on an adjacent chip (L2 or L3), to RAM accessed across a motherboard backplane, to flash or disk drives used to store frequently accessed data (Read caching) or to organize a lot of write operations for greater efficiency (write cache). Where tape is used for purposes such as active archiving, a front-end rank of disk may be used as a tape cache.

Generally speaking, caching is an optimization technique deployed to bridge the performance gap between components. The CPU uses cache memory to hold instructions and data that are used repeatedly as programs run. L1 cache (and, in some configurations, L2 cache) is built directly into the CPU chip to provide the fastest possible access to those memory locations, supporting faster CPU performance. In other cases, adjacent chips are architected with direct pathways to the CPU, again to optimize performance. When L1 and L2 are built into the CPU chip, the adjacent chip is often referred to as "L3 cache." If the CPU has only L1 cache, the adjacent chip may play the role of L2 cache.
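The practical benefit of these small, fast memories is easiest to see through memory access patterns. The following Python experiment is only an illustrative sketch, with the 2,000-by-2,000 matrix and the timing harness chosen arbitrarily, and the gap is far smaller in an interpreted language than it would be in C: walking the matrix row by row keeps reusing data the CPU caches have just fetched, while walking it column by column keeps jumping to memory the caches have not kept.

```python
import timeit

N = 2000
matrix = [list(range(N)) for _ in range(N)]   # a 2,000 x 2,000 table of integers

def row_order():
    # Consecutive accesses stay within one inner list, so recently cached data
    # is reused immediately.
    return sum(matrix[r][c] for r in range(N) for c in range(N))

def col_order():
    # Each access jumps to a different inner list, so cached data is rarely reused.
    return sum(matrix[r][c] for c in range(N) for r in range(N))

print("row-major traversal:   ", round(timeit.timeit(row_order, number=3), 2), "seconds")
print("column-major traversal:", round(timeit.timeit(col_order, number=3), 2), "seconds")
```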

RAM is dynamic and usually volatile memory (meaning its contents are lost when power is removed) that users can install on a motherboard. It is significantly slower than L1, L2 or L3 cache, but much less expensive. Because the CPU accesses it across the motherboard, it is subject to the speed limits of the bus. RAM is, however, much faster to access than mechanical storage devices such as hard disks or tape, and over the past few decades it has come into wide use as a place to hold frequently accessed disk data with the goal of expediting I/O performance.
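That RAM-as-a-disk-cache pattern survives in everyday software. The snippet below is a minimal sketch built on Python's standard functools.lru_cache; the file path and the cache size of 256 entries are arbitrary assumptions. The first read of a file pays the full storage-device cost, while repeat reads of the same file are served from RAM.

```python
from functools import lru_cache
from pathlib import Path

@lru_cache(maxsize=256)              # keep up to 256 recently read files in RAM
def read_file_cached(path: str) -> bytes:
    # Only a cache miss reaches the (much slower) disk or other backing store.
    return Path(path).read_bytes()

# Usage (hypothetical path): the second call for the same path never touches the disk.
# data = read_file_cached("/var/data/report.csv")
# data_again = read_file_cached("/var/data/report.csv")
```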

Flash memory has come into vogue partly as an alternative to traditional RAM caching. Flash is less expensive than RAM and is non-volatile. Disk vendors are also showing great interest in pairing flash with higher-capacity disks or disk pools to improve their performance, ahead of what some analysts claim will be the wholesale replacement of all magnetic media by solid-state storage in the medium term. There is also substantial discussion of storage architectures such as flape (flash plus tape), in which data is written both to flash storage and to tape; when access to the data tapers off, it is removed from flash and retained only on tape.
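A flape-style policy can be sketched in a few lines of Python. Everything here, from the class name to the 30-day retention window, is an illustrative assumption rather than any vendor's implementation: new data is written to both flash and tape, and data that goes untouched for the retention window is dropped from flash and kept only on tape.

```python
import time

class FlapeTier:
    """Illustrative flash-plus-tape tiering: write to both, age cold data off flash."""

    def __init__(self, flash_retention_seconds=30 * 24 * 3600):  # assumed 30-day window
        self.flash = {}            # fast tier: object_id -> (data, last_access_time)
        self.tape = {}             # stand-in for the tape archive: object_id -> data
        self.retention = flash_retention_seconds

    def write(self, object_id, data):
        # New data goes to flash for fast reads and to tape for long-term retention.
        self.flash[object_id] = (data, time.time())
        self.tape[object_id] = data

    def read(self, object_id):
        if object_id in self.flash:                    # hot path: served from flash
            data, _ = self.flash[object_id]
            self.flash[object_id] = (data, time.time())
            return data
        return self.tape[object_id]                    # cold path: recalled from tape

    def age_out(self):
        # Periodically drop anything not accessed within the retention window;
        # the only remaining copy is then the tape copy.
        cutoff = time.time() - self.retention
        for object_id in [k for k, (_, t) in self.flash.items() if t < cutoff]:
            del self.flash[object_id]
```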

