A brief history of the evolution and growth of IT

The history of information technology began long before the modern computer was invented. The developments that led to IT as it's known today go back millennia.

But the term information technology is a relatively recent development. The phrase first appeared in "Management in the 1980s," a 1958 Harvard Business Review article that predicted the new technology's future effects:

"Over the last decade a new technology has begun to take hold in American business, one so new that its significance is still difficult to evaluate ... The new technology does not yet have a single established name. We shall call it information technology."

Information technology has evolved and changed ever since. This article explores that history and what IT means today.

What is IT today?

Information technology is no longer just about installing hardware or software, solving computer issues or controlling who can access a particular system. Today's IT professionals are in high demand, and they also:

  • create policies to ensure that IT systems run effectively and are aligned with an organization's strategic goals;
  • maintain networks and devices for maximum uptime;
  • automate processes to improve business efficiency;
  • research, implement and manage new technologies to accommodate changing business needs; and
  • maintain service levels, security and connectivity to ensure business continuity and longevity.

In fact, today's hyperconnected data economy would collapse without information technology.

The slow evolution of computers and computing technology

Before the modern-day computer ever existed, there were precursors that helped people achieve complex tasks.

The abacus is the earliest known calculating tool, in use since around 2400 B.C.E. and still used in parts of the world today. An abacus consists of rows of movable beads on rods that represent numbers.

But it wasn't until the early 1800s, with the development of the Jacquard loom, that the idea of programmable devices took hold. The loom used punched cards, fed into the machine, to control intricate woven patterns in fabric. Computers relied on the same punched card approach to issue machine instructions automatically well into the 20th century, until electronic input devices eventually replaced it.

In the 1820s, English mechanical engineer Charles Babbage -- known as the father of the computer -- designed the Difference Engine to automate the calculation of mathematical tables, including those used in navigation. It is regarded as the first mechanical computing device.

Then, in the 1830s, he released plans for his Analytical Engine, which would have operated on a punched card system. Babbage's collaborator, Ada Lovelace, expanded on these plans, taking them beyond simple mathematical calculation and designing a series of operational instructions for the machine -- what is now known as a computer program. The Analytical Engine would have been the world's first general-purpose computer, but it was never completed, and the instructions were never executed.

Many of the data processing and execution capabilities of modern IT, such as conditional branches (if statements) and loops, are derived from the early work of Jacquard, Babbage and Lovelace.
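
To make these two inherited constructs concrete, here is a minimal sketch in modern Python; the values are arbitrary and chosen purely for illustration.

    # A loop repeats an operation over a sequence of values.
    for n in [3, 8, 15]:
        # A conditional branch (if statement) chooses between actions
        # based on a test -- here, whether the number is divisible by 2.
        if n % 2 == 0:
            print(n, "is even")
        else:
            print(n, "is odd")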

Herman Hollerith, an American inventor and statistician, also used punched cards to feed data into his census-tabulating machine in the 1890s -- an important precursor of the modern electronic computer. Hollerith's machine recorded statistics by automatically reading and sorting cards numerically encoded by the position of their perforations. Hollerith founded the Tabulating Machine Company in 1896 to manufacture these machines; it merged with other firms in 1911 to form the Computing-Tabulating-Recording Co., which was renamed International Business Machines Corp. (IBM) in 1924.

German engineer Konrad Zuse built the Z2, one of the world's earliest electromechanical relay computers, in 1940. Its operating speeds were extremely low by today's standards. Later in the 1940s came the Colossus computers, developed during World War II by British codebreakers to help decipher encrypted teleprinter messages from the German Lorenz cipher machine, which the British code-named "Tunny." Around the same time, British mathematician Alan Turing designed the Bombe, an electromechanical machine that helped decrypt messages enciphered by the German Enigma machine.

Turing -- immortalized by the Turing test -- first conceptualized the modern computer in his 1936 paper "On Computable Numbers." In it, he proposed that programmable instructions could be stored in a machine's memory and executed to perform specific tasks. This concept forms the very basis of modern computing.

In 1951, British electrical engineering company Ferranti Ltd. produced the Ferranti Mark 1, the world's first commercially available general-purpose digital computer. The machine was based on the Manchester Mark 1, developed at the Victoria University of Manchester.

The IT revolution picks up pace

J. Lyons and Co. released the LEO I computer in 1951 and ran the first routine business application on it that same year. MIT's Whirlwind, also completed in 1951, was one of the first digital computers capable of operating in real time, and in 1956 it became the first computer to let users enter commands with a keyboard.

As computers evolved, so did the foundations of the field of IT. From the 1960s onward, the development of the following technologies set the stage for an IT revolution:

  • screens
  • text editors
  • the mouse
  • hard drives
  • fiber optics
  • integrated circuits
  • programming languages such as FORTRAN and COBOL

Today's IT sector is no longer the exclusive domain of mathematicians. It employs professionals from a variety of backgrounds and skill sets, such as network engineers, programmers, business analysts, project managers and cybersecurity analysts.

The information revolution and the invention of the internet

In the 1940s, '50s and '60s, governments, defense establishments and universities dominated computing. But computing also spilled over into the corporate world with the development of office applications such as spreadsheets and word processing software. This created a need for specialists who could design, build, adapt and maintain the hardware and software required to support business processes.

As new computer languages were created, experts in those languages followed. Database specialists emerged to run Oracle and SAP systems, and C programmers to write and update networking software. These skills were in high demand -- a trend that continues to this day, especially in areas such as cybersecurity, AI and compliance.

The invention of email in the early 1970s revolutionized IT and communications. Email began as an experiment to see whether two computers could exchange a message, but it evolved into a fast and easy way for people to stay in touch. The term "email" itself was coined later, but many of its early conventions, including the use of the @ symbol, remain in use today.

Many IT technologies owe their existence to the internet and the World Wide Web. ARPANET, a U.S. government-funded network conceptualized as an "intergalactic computer network" by MIT scientists in the 1960s, is considered the precursor of the modern internet. ARPANET began with just four connected computers and grew into an interconnected network of networks. It eventually led to the development of the Transmission Control Protocol (TCP) and the Internet Protocol (IP), which enabled distant computers to communicate with each other. Packet switching -- breaking data into small, independently routed packets that are reassembled at their destination -- turned machine-to-machine communication from a possibility into a reality.
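
To make the TCP/IP model concrete, here is a minimal sketch in Python. For brevity, it runs the listening side and the connecting side in one script over the loopback address; the address and port number are arbitrary choices for the example. The operating system's TCP/IP stack handles the packetizing, routing and reassembly described above.

    import socket

    # Server side: listen for a TCP connection. SOCK_STREAM requests
    # TCP's reliable byte stream; beneath it, IP carries the data as
    # independently routed packets.
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.bind(("127.0.0.1", 9090))  # loopback address and arbitrary port
    server.listen(1)

    # Client side: normally a distant machine; here, the same script.
    client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    client.connect(("127.0.0.1", 9090))
    client.sendall(b"hello over TCP/IP")

    # Accept the queued connection and read the reassembled bytes.
    conn, _addr = server.accept()
    print(conn.recv(1024).decode())  # prints: hello over TCP/IP

    conn.close()
    client.close()
    server.close()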

In 1991, Tim Berners-Lee introduced the World Wide Web, a web of interlinked information retrievable by anyone with an internet connection. In 1996, the Nokia 9000 Communicator became the world's first internet-enabled mobile device. By this time, the world's first search engine, the first laptop computer and the first domain name search engine were already available. In the late '90s, search engine giant Google was established.

The turn of the century saw the development of WordPress, an open source web content management system that enabled people to move from being web consumers to active participants who publish their own content.

IT continues to expand

Since the invention of the World Wide Web, the IT realm has expanded quickly. Today, IT encompasses tablets, smartphones, voice-activated technology, nanometer-scale computer chips, quantum computers and more.

Cloud computing, whose roots date back to the 1960s, is now an inseparable part of many organizations' IT strategies. The concept of time-sharing -- letting multiple users share computing resources at the same time -- was developed in the 1960s and '70s, and by 1994 the cloud metaphor was being used to describe virtual services and machines that behave like real computer systems.

But it wasn't until 2006 and the launch of Amazon Web Services (AWS) that cloud computing really took off. AWS and its top competitors -- Microsoft Azure, Google Cloud Platform and Alibaba Cloud -- now hold the largest share of the cloud computing market. The top three providers -- AWS, Microsoft and Google -- accounted for 58% of total cloud spending in the first quarter of 2021.

Over the past decade, other technological advancements have also influenced the world of IT, including developments in:

  • social media
  • internet of things
  • artificial intelligence
  • computer vision
  • machine learning
  • robotic process automation
  • big data
  • mobile computing -- in both devices and communications technologies such as 4G and 5G

Connectivity between systems and networks is also on the rise. By 2030, there will be an estimated 500 billion devices connected to the internet, according to a Cisco report.
