
distributed computing

What is distributed computing?

Distributed computing is a model in which components of a software system are shared among multiple computers or nodes. Even though the software components may be spread out across multiple computers in multiple locations, they're run as one system. This is done to improve efficiency and performance. The systems on different networked computers communicate and coordinate by sending messages back and forth to achieve a defined task.

Distributed computing can increase performance, resilience and scalability, making it a common computing model in database and application design.

How distributed computing works

Distributed computing networks can be connected as local networks or, if the machines are in different geographic locations, through a wide area network. Processors in distributed computing systems typically run in parallel.
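The coordination-by-messages idea can be sketched in a few lines. This is a toy model, assuming Python's standard multiprocessing module stands in for networked nodes: each process plays the role of a node, and queues play the role of the network.

```python
# A minimal sketch of message-passing coordination between parallel "nodes",
# using local processes and queues in place of networked machines.
from multiprocessing import Process, Queue

def worker(node_id: int, inbox: Queue, outbox: Queue) -> None:
    """Each 'node' receives a task message, processes it, and replies."""
    task = inbox.get()              # wait for a message from the coordinator
    result = sum(task)              # do this node's share of the work
    outbox.put((node_id, result))   # send the partial result back as a message

if __name__ == "__main__":
    inbox, outbox = Queue(), Queue()
    # Start two parallel worker processes, each acting as one node.
    workers = [Process(target=worker, args=(i, inbox, outbox)) for i in range(2)]
    for p in workers:
        p.start()
    # The coordinator splits the data and sends one message per node.
    inbox.put([1, 2, 3])
    inbox.put([4, 5, 6])
    # Collect one reply per node, then combine the partial results.
    results = dict(outbox.get() for _ in range(2))
    for p in workers:
        p.join()
    print(sum(results.values()))  # combined result: 21
```

In a real system the queues would be network sockets or a message broker, but the pattern -- send a task, compute locally, reply with a partial result -- is the same.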

In enterprise settings, distributed computing generally puts various steps in business processes at the most efficient places in a computer network. For example, a typical distribution has a three-tier model that organizes applications into the presentation tier (or user interface), the application tier and the data tier. These tiers function as follows:

  1. User interface processing occurs on the PC at the user's location.
  2. Application processing takes place on a remote computer.
  3. Database access and processing algorithms happen on another computer that provides centralized access for many business processes.
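The three tiers above can be sketched as separate functions. This is a deliberately simplified, single-machine sketch with hypothetical names (`DATABASE`, `application_tier` and so on are illustrative, not part of any real framework); in practice each tier would run on a different computer and communicate over the network.

```python
# A toy sketch of the three-tier split; each function stands in for a tier
# that would normally run on its own machine.

DATABASE = {"alice": {"balance": 120}}   # data tier: centralized storage

def data_tier(user: str) -> dict:
    """Database access: look up the stored record for a user."""
    return DATABASE[user]

def application_tier(user: str) -> str:
    """Business logic: fetch data and apply a processing rule."""
    record = data_tier(user)
    status = "in credit" if record["balance"] > 0 else "overdrawn"
    return f"{user} is {status}"

def presentation_tier(user: str) -> None:
    """User interface: format the result for display."""
    print(f"Account summary: {application_tier(user)}")

presentation_tier("alice")  # prints "Account summary: alice is in credit"
```

The key property is that each tier only talks to the tier directly below it, so any one tier can be moved to different hardware or scaled out without rewriting the others.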

In addition to the three-tier model, other types of distributed computing include client-server, n-tier and peer-to-peer:

  • Client-server architectures. These use smart clients that contact a server for data, then format and display that data to the user.
  • N-tier system architectures. Typically used in application servers, these architectures use web applications to forward requests to other enterprise services.
  • Peer-to-peer architectures. These divide all responsibilities among all peer computers, which can serve as clients or servers.
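The client-server pattern in the first bullet can be demonstrated with Python's standard socket module. This is a self-contained local sketch -- the port is chosen automatically and the "protocol" (a single request and reply) is an illustrative assumption, not a real service.

```python
# A minimal client-server sketch: the client contacts the server for data,
# then receives and displays the server's reply.
import socket
import threading

def run_server(sock: socket.socket) -> None:
    """Server: accept one client, read its request, send back a reply."""
    conn, _addr = sock.accept()
    with conn:
        request = conn.recv(1024).decode()
        conn.sendall(f"server processed: {request}".encode())

# Bind to an ephemeral port on localhost so the example is self-contained.
server_sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server_sock.bind(("127.0.0.1", 0))
server_sock.listen(1)
port = server_sock.getsockname()[1]

# Run the server in a background thread, standing in for a remote machine.
thread = threading.Thread(target=run_server, args=(server_sock,))
thread.start()

# Client: contact the server, send a request, then display the response.
with socket.create_connection(("127.0.0.1", port)) as client:
    client.sendall(b"get user data")
    reply = client.recv(1024).decode()
print(reply)  # server processed: get user data

thread.join()
server_sock.close()
```

An n-tier system chains several of these request/reply hops together, while a peer-to-peer node would run both the client and server roles at once.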
[Image: An example of how networks, servers and computers are structured in distributed computing.]

Benefits of distributed computing

Distributed computing includes the following benefits:

  • Performance. Distributed computing can help improve performance by having each computer in a cluster handle different parts of a task simultaneously.
  • Scalability. Distributed computing clusters are scalable by adding new hardware when needed.
  • Resilience and redundancy. Multiple computers can provide the same services. This way, if one machine isn't available, others can fill in for the service. Likewise, if two machines that perform the same service are in different data centers and one data center goes down, an organization can still operate.
  • Cost-effectiveness. Distributed computing can use low-cost, off-the-shelf hardware.
  • Efficiency. Complex requests can be broken down into smaller pieces and distributed among different systems. Each piece is then worked on in parallel, reducing the time needed to compute the overall request.
  • Distributed applications. Unlike traditional applications that run on a single system, distributed applications run on multiple systems simultaneously.
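The performance and efficiency benefits above come from splitting one request into pieces and combining partial results. A minimal sketch of that split-apply-combine pattern, assuming Python's standard process pool as a stand-in for a cluster of machines:

```python
# Break one request into chunks, farm the chunks out to parallel workers,
# then combine the partial results -- the core idea behind the performance
# and efficiency benefits of distributed computing.
from concurrent.futures import ProcessPoolExecutor

def handle_chunk(chunk: list[int]) -> int:
    """Each worker computes a partial result for its piece of the task."""
    return sum(x * x for x in chunk)

def distributed_sum_of_squares(data: list[int], workers: int = 4) -> int:
    """Split the data, process the chunks in parallel, combine the results."""
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(handle_chunk, chunks))

if __name__ == "__main__":
    print(distributed_sum_of_squares(list(range(10))))  # 0²+1²+…+9² = 285
```

In a real cluster the workers would be separate machines reached over the network, which is also where the resilience benefit appears: if one worker fails, its chunk can be reassigned to another.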

Grid computing, cloud computing and distributed computing

Grid computing is a computing model involving a distributed architecture of multiple computers connected to solve a complex problem. In the grid computing model, servers or PCs run independent tasks and are linked loosely by the internet or low-speed networks. Individual participants can contribute some of their computer's processing time to help solve complex problems.

SETI@home is one example of a grid computing project. Although the project's first phase wrapped up in March 2020, for more than 20 years, individual computer owners volunteered some of their spare processing cycles -- while concurrently still using their computers -- to the Search for Extraterrestrial Intelligence (SETI) project. This compute-intensive project used thousands of PCs to download and search radio telescope data.

Grid computing and distributed computing are similar concepts that can be hard to tell apart. Generally, distributed computing has a broader definition than grid computing. Grid computing is typically a large group of dispersed computers working together to accomplish a defined task. Conversely, distributed computing can work on numerous tasks simultaneously. Some may also define grid computing as just one type of distributed computing. In addition, while grid computing typically has well-defined architectural components, distributed computing can have various architectures, such as grid, cluster and cloud computing.

Cloud computing is also similar in concept to distributed computing. Cloud computing is a general term for anything that involves delivering hosted services over the internet. These services, however, are divided into three main types: infrastructure as a service (IaaS), platform as a service (PaaS) and software as a service (SaaS). Cloud computing is also divided into private and public clouds. A public cloud sells services to another party, while a private cloud is a proprietary network that supplies a hosted service to a limited number of people, with specific access and permissions settings. Cloud computing aims to provide easy, scalable access to computing resources and IT services.

Cloud and distributed computing both focus on spreading a service or services across a number of different machines. However, cloud computing typically offers a service, such as specific software or storage, for organizations to use on their own tasks, whereas distributed computing spreads the pieces of a single task across different computers so they can work on it together.

Learn more about distributed computing and how edge object storage helps improve distributed systems.  

This was last updated in August 2022
