
hyperscale computing

Hyperscale computing is a distributed computing infrastructure that can quickly accommodate increased demand for internet-facing and back-end computing resources without requiring additional physical space, cooling or electrical power. Hyperscale computing is characterized by standardization, automation, redundancy, high-performance computing (HPC) and high availability (HA). The term is often associated with cloud computing and the very large data centers operated by companies such as Facebook, Google, Amazon and Netflix.

While a corporate data center might support hundreds of physical servers and thousands of virtual machines (VMs), a hyperscale data center must support thousands of physical servers and millions of VMs. To accommodate such demand, cloud providers like Amazon Web Services (AWS), Microsoft Azure and Google Cloud Platform (GCP) have developed infrastructures that maximize hardware density while minimizing cooling costs and administrative overhead. A simplified sketch of this scale-out approach follows.
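
The sketch below is a deliberately simplified, hypothetical illustration of the scale-out model described above: capacity grows by provisioning more standardized, commodity nodes rather than by upgrading individual machines. The Cluster class, its node counts, the VM density figure and the provision_node()/retire_node() helpers are assumptions for illustration only, not any provider's real API.

```python
# Hypothetical sketch of horizontal ("scale-out") capacity management, the
# pattern behind hyperscale computing. All names and numbers are illustrative.

from dataclasses import dataclass


@dataclass
class Cluster:
    vms_per_node: int = 1000   # assumed VM density per standardized physical server
    nodes: int = 10            # physical servers currently in service

    @property
    def capacity(self) -> int:
        # Total VMs the cluster can host at current node count.
        return self.nodes * self.vms_per_node

    def scale_to_demand(self, demanded_vms: int) -> None:
        """Add or remove identical commodity nodes so capacity tracks demand."""
        needed_nodes = -(-demanded_vms // self.vms_per_node)  # ceiling division
        while self.nodes < needed_nodes:
            self.provision_node()
        while self.nodes > needed_nodes:
            self.retire_node()

    def provision_node(self) -> None:
        # In a real hyperscale data center this step is fully automated:
        # imaging, network configuration and monitoring of an identical server.
        self.nodes += 1

    def retire_node(self) -> None:
        self.nodes -= 1


if __name__ == "__main__":
    cluster = Cluster()
    print(f"Before: {cluster.capacity:,} VMs on {cluster.nodes} nodes")
    cluster.scale_to_demand(2_000_000)   # demand spikes to two million VMs
    print(f"After:  {cluster.capacity:,} VMs on {cluster.nodes} nodes")
```

The point of the sketch is the design choice it encodes: because every node is standardized and provisioning is automated, adding capacity is a repeatable, scripted operation rather than a manual project, which is what keeps administrative overhead low at hyperscale node counts.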

Interest in hyperscale computing is strong because the open source software and architectural changes created for hyperscale data centers are expected to trickle down to smaller data centers, helping them use physical space more efficiently, consume less power and respond more quickly to users' needs. Hyperscale innovations currently being adopted by smaller organizations include software-defined networking (SDN), converged infrastructure and microsegmentation.

This was last updated in February 2018
