
order of magnitude

What is order of magnitude?

An order of magnitude is a change of plus or minus 1 in the exponent of a quantity or unit's value -- that is, a tenfold increase or decrease. The term is generally used in conjunction with power-of-10 scientific notation.

Order of magnitude is used to make the size of numbers and measurements of things more intuitive and understandable. It is generally used to provide approximate comparisons between two numbers. For example, if the circumference of the Sun is compared with the circumference of the Earth, the Sun's circumference would be described as many orders of magnitude larger than the Earth's.

How is order of magnitude calculated?

Order of magnitude refers to the class of scale of a numerical value, where each class holds values a fixed ratio -- typically 10 -- larger than those in the class before it.

On a logarithmic scale -- such as base 10, the most common numeration scheme worldwide -- an increase of one order of magnitude is the same as multiplying a quantity by 10. That increases the exponent by one, to the next power of 10. An increase of two orders of magnitude is the equivalent of multiplying by 100, or 10^2. In general, an increase of n orders of magnitude is the equivalent of multiplying a quantity by 10^n. Thus, 2,315 is one order of magnitude larger than 231.5, which in turn is one order of magnitude larger than 23.15.

As values get smaller, a decrease of one order of magnitude is the same as multiplying a quantity by 0.1. A decrease of two orders of magnitude is the equivalent of multiplying by 0.01, or 10^-2. In general, a decrease of n orders of magnitude is the equivalent of multiplying a quantity by 10^-n. Thus, 23.15 is one order of magnitude smaller than 231.5, which in turn is one order of magnitude smaller than 2,315. As the order of magnitude of a number gets smaller, the decimal moves to the left.
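The calculation described above can be sketched in Python: a number's order of magnitude is the floor of its base-10 logarithm, so multiplying by 10 raises it by 1 and multiplying by 0.1 lowers it by 1. The function name below is illustrative, not a standard library routine.

```python
import math

def order_of_magnitude(x: float) -> int:
    """Return the order of magnitude of a positive number,
    i.e. the power of 10 of its leading digit: floor(log10(x))."""
    if x <= 0:
        raise ValueError("defined here for positive numbers only")
    return math.floor(math.log10(x))

# Each value below is one order of magnitude apart.
print(order_of_magnitude(23.15))   # 1
print(order_of_magnitude(231.5))   # 2
print(order_of_magnitude(2315))    # 3
```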

In the International System of Units, most quantities can be expressed in multiple or fractional terms according to the order of magnitude. For example, attaching the prefix kilo- to a unit increases the size of the unit by three orders of magnitude, or 1,000 (10^3). Attaching the prefix micro- to a unit decreases the size of the unit by six orders of magnitude, the equivalent of multiplying it by 1 millionth (10^-6). Scientists and engineers have designated prefix multipliers from septillionths (10^-24) to septillions (10^24), a span of 48 orders of magnitude.
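The SI prefixes can be modeled as a simple mapping from prefix name to power-of-10 exponent. This is a minimal sketch -- the dictionary name is illustrative, and only the prefixes mentioned in the SI brochure as of 2022 are listed.

```python
# Mapping from SI prefix to its power-of-10 exponent.
SI_PREFIXES = {
    "yocto": -24, "zepto": -21, "atto": -18, "femto": -15,
    "pico": -12, "nano": -9, "micro": -6, "milli": -3,
    "kilo": 3, "mega": 6, "giga": 9, "tera": 12,
    "peta": 15, "exa": 18, "zetta": 21, "yotta": 24,
}

# Attaching "kilo" multiplies a unit by 10^3.
print(10 ** SI_PREFIXES["kilo"])  # 1000
print(SI_PREFIXES["micro"])       # -6

# The designated prefixes span 48 orders of magnitude.
span = max(SI_PREFIXES.values()) - min(SI_PREFIXES.values())
print(span)  # 48
```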

Order of magnitude uses and examples

Order of magnitude is used to make estimates and approximate comparisons in scientific notation. It is a way to represent numbers that are comparatively larger or smaller than other numbers.

Very large and very small numbers -- such as astronomical distances or atomic scales -- benefit most from order of magnitude representation.

Some different ways that order of magnitude can be used include the following:

Differences. This is the number of factors of 10 that lie between two values. For example, if one value is 100 times bigger than another value, the first value is two orders of magnitude larger than the other.
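The difference in orders of magnitude between two positive values can be sketched as the rounded base-10 logarithm of their ratio. The helper name below is hypothetical, not a standard function.

```python
import math

def orders_of_magnitude_between(a: float, b: float) -> int:
    """Approximate number of factors of 10 separating two positive values."""
    return round(math.log10(a / b))

# A value 100 times bigger is two orders of magnitude larger.
print(orders_of_magnitude_between(500_000, 5_000))  # 2
```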

Estimates. This is when an order of magnitude is used to estimate the value of a variable whose precise value is unknown. For example, Earth's population is around 7.9 billion. The exact number is unknown, but it falls between 1 billion and 10 billion. Rounding to the nearest order of magnitude, someone would say Earth's population is approximately 10 billion, because 10 billion is the nearest power of 10.
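This kind of estimate amounts to rounding on a logarithmic scale, which can be sketched in Python as follows (the function name is illustrative):

```python
import math

def nearest_power_of_10(x: float) -> float:
    """Round a positive value to the nearest power of 10 on a log scale."""
    return 10 ** round(math.log10(x))

# Earth's population, roughly 7.9 billion, rounds to 10 billion,
# since log10(7.9e9) ~ 9.9 is closer to 10 than to 9.
print(nearest_power_of_10(7.9e9))  # 10000000000
```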

There are free order of magnitude calculation tools online. These tools represent large numbers more simply and intuitively using scientific notation -- for example, the number of bytes in an exabyte-scale digital storage medium.
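A sketch of the kind of conversion such a tool performs: Python's "e" format specifier renders a value in power-of-10 scientific notation. The 3-exabyte figure below is a made-up example, using the decimal definition of 1 exabyte as 10^18 bytes.

```python
# A hypothetical 3-exabyte storage medium, in bytes.
storage_bytes = 3 * 10**18

# Scientific notation with two digits after the decimal point.
print(f"{storage_bytes:.2e} bytes")  # 3.00e+18 bytes
```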

The definition of order of magnitude is also used more loosely in various contexts. The term can simply mean a very large or small number, a large or small amount of something, or something that is significantly larger or smaller.


This was last updated in February 2022
