
write-through

What is write-through in storage?

Write-through is a storage method in which data is written into the cache and the corresponding main memory location at the same time.

With write-through caching, the computer's central processing unit writes data to its local processor cache and pauses the corresponding application until the same data is also successfully written to the associated storage resource, which is usually main memory -- RAM -- or disk storage. The processor maintains high-speed access to cached data while ensuring complete consistency between cache and data storage.

How does write-through cache work?

The concept of caching is well established in computer design as a means of accelerating processor and computer performance by placing frequently used data in a small amount of extremely fast memory -- the cache -- located adjacent to the processor. This lets the processor find often-used data much faster than it could retrieve the same data from disk, or even from main memory, which is itself far faster than disk. Modern processors frequently work with several levels of cache, such as a first-level -- L1 -- cache and a second-level -- L2 -- cache.

When a processor writes or outputs data, the way that data moves between the cache and main memory or disk is governed by one of the following three major algorithms or policies (a minimal code sketch of all three follows the chart below):

  • Write-through cache. With a write-through caching policy, the system's processor writes data to the cache first and then immediately copies that new cache data to the corresponding memory or disk. The application that's working to produce this data halts execution until the new data writes to storage are completed. Write-through caching ensures data is always consistent between cache and storage, though application performance could see a slight penalty because the application must wait for the slower input/output operations to memory or even to disk.
  • Write-back cache. With a write-back caching policy, the processor writes data to the cache first, and then application execution is allowed to continue before the cached data is copied to main memory or disk. Cache contents are copied to storage later as a background task -- such as during idle processor cycles. This allows the processor to continue working on the application sooner and helps to improve application performance. However, this also means there could be brief time periods when application data in cache differs from data in storage, which raises a possibility of application data loss in the event of application disruption or crashes.
  • Write-around cache. A write-around caching policy doesn't use cache at all and writes data directly to storage assets such as main memory or even disk. This is sometimes preferred when an application must move large amounts of data that aren't well suited to cache, though applications that quickly reread the data they write gain little from this technique.
This chart illustrates how write-through caching compares with write-back and write-around caching and the relative strengths of each method.
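The difference between these policies can be sketched in a few lines of Python. This is a conceptual model only -- the dictionary-backed cache and store, the Cache class and the flush() method are illustrative assumptions, not part of any real processor or storage implementation.

    class Cache:
        def __init__(self, backing_store):
            self.cache = {}                      # small, fast storage (the cache)
            self.backing_store = backing_store   # slower, durable storage (RAM or disk)
            self.dirty = set()                   # keys written to cache but not yet to storage

        def write_through(self, key, value):
            # Write to the cache and immediately to backing storage;
            # the caller doesn't continue until both writes are done.
            self.cache[key] = value
            self.backing_store[key] = value

        def write_back(self, key, value):
            # Write only to the cache and mark the entry dirty; the copy
            # to backing storage happens later, so the caller resumes sooner.
            self.cache[key] = value
            self.dirty.add(key)

        def write_around(self, key, value):
            # Bypass the cache entirely and write straight to backing storage.
            self.backing_store[key] = value

        def flush(self):
            # Background task used by write-back: commit dirty entries to storage.
            for key in self.dirty:
                self.backing_store[key] = self.cache[key]
            self.dirty.clear()

    store = {}
    c = Cache(store)
    c.write_through("a", 1)   # cache and store agree immediately
    c.write_back("b", 2)      # store lags behind until flush() runs
    c.write_around("c", 3)    # store is updated, but the cache never sees "c"
    print(store)              # {'a': 1, 'c': 3} -- "b" appears only after c.flush()

Until flush() runs, the write-back entry exists only in the cache, which is exactly the window of risk described above.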

What is write-through cache used for?

Write-through cache is a caching technique or policy that controls how a processor and its local cache interact with other principal storage assets within the computer, such as RAM or disk -- whether solid-state drive or traditional hard disk drive -- as well as within some storage-centric enterprise applications, such as Structured Query Language, or SQL, databases.

Write-through cache, as the name implies, writes data to the processor's local cache first and then immediately writes that cached data through to the final storage target in memory or disk. The application's execution halts until the data is successfully written to the final storage asset.

The purpose of write-through cache is to achieve data consistency between the processor's cache and application storage. Write-through caching ensures data in cache and storage is always identical, so there's no chance of data loss or corruption if the application crashes or the computer system fails before cached data is committed to storage -- a risk that does exist with write-back caching. Write-through caching is often the preferred technique for critical computing tasks that can't tolerate the risk of data loss.

However, write-through caching holds application execution until new cache data is committed to storage. This can impose a slight penalty on effective or apparent application performance.

Write-through vs. write-back

Both write-through and write-back caching policies define the way that data is moved from the cache to major storage such as memory or disk.

Write-through cache immediately copies new cache data to corresponding storage in main memory or disk. This ensures data is always consistent between cache and storage before application execution is allowed to continue. Write-through cache is considered the safest form of caching because no data is lost or corrupted if the application or system fails. But this comes at the expense of application performance.

By comparison, write-back cache allows application execution to continue before cached data is committed to storage. Cached data is copied to storage at certain intervals or as a background task. This allows for better application performance, but it also poses a risk of data loss or corruption if the application or system should fail before cached data is committed to storage.
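One way to make this risk concrete is to simulate a crash before any background flush has had a chance to run. The function and field names below are illustrative assumptions; only the backing store survives the simulated crash, because cache contents are volatile.

    def simulate_crash(policy):
        """Write one value under the given policy, 'crash' before any
        background flush runs, and return what survives in storage."""
        cache, store = {}, {}
        if policy == "write-through":
            cache["balance"] = 100
            store["balance"] = 100    # committed before execution continues
        elif policy == "write-back":
            cache["balance"] = 100    # dirty entry; the flush would come later
        # Crash happens here: the cache is lost, only the store survives.
        return store

    print(simulate_crash("write-through"))   # {'balance': 100} -- data survived
    print(simulate_crash("write-back"))      # {} -- the pending write was lost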

What are the benefits and disadvantages of write-through cache?

As a general concept, processor cache often helps to enhance application performance by allowing a processor to access recent data from cache far faster than it could access that same data from other storage resources such as main memory or disk. However, for a cache to be helpful, the processor must read the same cached data often -- otherwise there's little need for a cache.

Write-through cache provides this processor performance benefit by writing data to a cache first. However, the new data placed into the cache is also committed or copied to the corresponding location in main memory or disk before the application's execution is allowed to continue. This ensures the data in cache and storage is always consistent and there's no data lost or corrupted if the application fails for any reason. Write-through cache is usually a preferred technique for important applications that can't tolerate the risk of data loss.

However, the write-through process must halt application execution until the data is fully committed to storage. While each pause might only take a matter of milliseconds, frequent writes result in frequent pauses, and this can reduce the application's apparent performance. Human users typically won't be able to discern any impact of caching policies on application performance, but write-through caching might not be appropriate for performance-sensitive applications.
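To see where this performance penalty comes from, the toy benchmark below stalls every write-through write behind a simulated storage delay. The 5-millisecond delay and the function names are assumptions chosen only to make the difference visible; real memory and disk latencies vary widely.

    import time

    def slow_storage_write(value, latency=0.005):
        # Stand-in for a write to main memory or disk.
        time.sleep(latency)

    def write_through(cache, key, value):
        cache[key] = value
        slow_storage_write(value)        # the caller blocks on the slow write

    def write_back(cache, dirty, key, value):
        cache[key] = value
        dirty.append((key, value))       # the slow write is deferred to a later flush

    cache, dirty = {}, []

    start = time.perf_counter()
    for i in range(100):
        write_through(cache, i, i)
    print(f"write-through: {time.perf_counter() - start:.2f} s")   # roughly 0.5 s of stalls

    start = time.perf_counter()
    for i in range(100):
        write_back(cache, dirty, i, i)
    print(f"write-back:    {time.perf_counter() - start:.4f} s")   # near-instant; flush is deferred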

Learn how cloud storage caching works and the different types of caching appliances.

This was last updated in September 2023
