The history of semiconductors and the chip-making industry

Semiconductors are the building blocks of modern technology and are essential to continued innovation.

Semiconductors are the foundation of modern technology. They are found in day-to-day electronic devices such as cars, laptops, medical devices and smartphones.

 A semiconductor is a material, often silicon, that can act as a conductor or insulator and serves as a foundation for computers and other electronic devices. The term is now widely associated with integrated circuits.

Materials that enable the flow of electricity are conductors, while materials that block the flow of electricity are insulators. Semiconductors sit somewhere in the middle, with properties between those of a conductor and an insulator: they can conduct or block electricity as needed, acting like a switch.

The importance of semiconductors has only grown in recent decades with the development of technologies such as smartphones and computers. As technology expands into all realms of modern life, the global economy increasingly depends on a steady supply of advanced chips. With the advent of technologies such as AI, electric cars, wind turbines and 5G networks, semiconductors act as the foundation for innovation, enabling the creation of transistors, integrated circuits and the components used to store data, control electronic signals and process information.

Early developments of the semiconductor (1800s-1940s)

The first practical semiconductor device, the transistor, was not invented until 1947 by John Bardeen, Walter Brattain and William Shockley at Bell Laboratories. However, early developments and observations dating back to the early 19th century were building blocks toward its invention.

  • 1821. German physicist Thomas Johann Seebeck discovered that a temperature difference between two dissimilar metals can create a voltage. This phenomenon, known as the Seebeck effect, indirectly contributed to the invention of semiconductors by highlighting how the electrical properties of different materials vary with temperature.
  • 1833. English scientist Michael Faraday, namesake of the electrical capacitance unit farad (F), discovered that silver sulfide's electrical conductivity increased as temperature rose during his investigations.
  • 1874. German electrical engineer Karl Ferdinand Braun discovered the first semiconductor rectifier effect: the process of converting alternating current (AC) into direct current (DC).
  • 1901. Indian physics professor Jagadish Chandra Bose filed a patent for the first semiconductor rectifier using crystals to detect radio waves.
  • 1906. Lee De Forest's invention of the triode vacuum tube -- or Audion -- strengthened weak signals and controlled the flow of current. The Audion paved the way for radio and telephone technologies.
  • 1927. Julius Lilienfeld, an American electrical engineer, patented the concept of a field-effect semiconductor device. The device was based on the semiconducting properties of copper sulfide.
  • 1930s. The advent of quantum mechanics in the 1930s laid the theoretical foundation of the semiconductor and explained how to manipulate materials to create devices such as integrated circuits. In 1931, Alan Wilson published The Theory of Electronic Semiconductors.
  • 1940. Russell Ohl discovered the p-n junction (the junction formed by joining a p-type and an n-type semiconductor) as well as the photovoltaic effect in silicon. This discovery uncovered the unique electrical properties of silicon, which proved essential in the creation of the transistor.

      The invention of the transistor

      In 1947, John Bardeen, Walter Brattain and William Shockley at Bell Labs invented the point-contact transistor, marking a significant moment in the creation of the semiconductor.

      This first functional transistor won the team at Bell Labs the Nobel Prize in Physics in 1956. The award recognized their "research on semiconductors and their discovery of the transistor effect." The small device was able to manipulate electrical signals and shift the landscape of electronics, the effects of which we are still experiencing today.

      The transistor consisted of a germanium base, two gold foil contacts and a plastic triangle holding each component together. It acted as an electronic switch, amplifying electrical signals and operating with much less power than vacuum tubes.

      The invention of the point-contact transistor was pivotal in the history of the semiconductor. The transistor became the basis of modern computers and led to the miniaturization of electronic devices.

      The move to silicon

      While germanium was used in early transistors, the shift to silicon marked a pivotal moment in the history of the semiconductor. Prior to the 1950s, germanium was favored over silicon because of silicon's less stable properties, including the following:

      • High reactivity with strong bases.
      • Oxidation when exposed to air.
      • Sensitivity to ionizing radiation.
      • Susceptibility to electromagnetic interference.
      • Thermal expansion.

      Though these unstable qualities originally pushed manufacturers toward germanium, the abundance and low cost of silicon ultimately made it the material of choice for the industry. In 1954, Morris Tanenbaum, a chemist at Bell Labs, used the grown-junction technique -- a technique that mixes p-type and n-type impurities into a single crystal during manufacturing -- to create the first silicon transistor. Later that same year, Gordon Teal at Texas Instruments independently created the first commercial silicon transistor. Though Tanenbaum was the first to create a silicon transistor, it was Texas Instruments that commercialized the invention.

      Famously, at the Institute of Radio Engineers National Conference in 1954, Gordon Teal announced, "Contrary to what my colleagues have told you about the bleak prospects for silicon transistors, I happen to have a few of them here in my pocket."

      While silicon did present certain challenges, it offered several advantages over germanium, including:

      • Better oxide formation capabilities (which became crucial for integrated circuits).
      • Greater abundance in nature.
      • Higher thermal stability.
      • More stable and predictable electrical properties.
      • Higher voltage tolerance.

      Though Texas Instruments was soon joined by companies such as Raytheon in commercializing silicon transistors, its early start established the company as a major player in the semiconductor industry.

      The development of integrated circuits

      In 1958, Jack Kilby at Texas Instruments demonstrated the first integrated circuit (IC), and Robert Noyce at Fairchild Semiconductor independently developed his own version shortly after. These breakthroughs led to the widespread adoption and commercial production of ICs throughout the 1960s.

      ICs are small devices that combine components such as transistors, resistors and capacitors on a single piece of semiconductor material. Prior to the IC, circuits relied on vacuum tubes and individually wired components, which were not space-efficient and required a large amount of power to run.

      Integrated circuits have many key benefits, including the following:

      • Cost-effectiveness.
      • Enhanced reliability.
      • Greater design flexibility.
      • Improved performance.
      • Space efficiency.

      In 1965, Gordon Moore, who later co-founded Intel, introduced what became known as Moore's Law. Moore observed that the number of transistors on an integrated circuit doubled roughly every two years while the cost per transistor fell. The observation matters because it captured the pace of semiconductor advancement and propelled investment in the semiconductor industry. Today, though the pace has slowed somewhat, the observation still largely holds true.
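
      The arithmetic behind Moore's observation compounds quickly. The short Python sketch below is a minimal illustration, not an industry model: it projects a transistor count forward under an assumed doubling period of two years, starting from roughly the Intel 4004's count of about 2,300 transistors.

      # Illustrative sketch of Moore's Law-style growth (doubling every ~2 years).
      def project_transistor_count(start_count, years, doubling_period_years=2.0):
          doublings = years / doubling_period_years
          return int(start_count * 2 ** doublings)

      # Example: ~2,300 transistors (about the Intel 4004's count) projected 20 years
      # ahead grows by a factor of 2**10 = 1,024, to roughly 2.4 million.
      print(project_transistor_count(2300, 20))  # -> 2355200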

      The microprocessor age

      A microprocessor is a type of semiconductor device: a single chip containing an IC that functions as a computer's central processing unit (CPU). It contains millions of transistors on a single chip and performs the basic operations of a computer program. The microprocessor works by fetching instructions, decoding them and then executing the required calculations. Today, microprocessors can perform billions of operations per second.
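
      To make that instruction cycle concrete, the following Python sketch models a toy fetch-decode-execute loop. It is purely illustrative: the tiny three-instruction set (LOAD, ADD, PRINT) and the single accumulator register are invented for this example and do not correspond to any real processor.

      # A toy fetch-decode-execute loop, for illustration only.
      # Each instruction is an (opcode, operand) pair; state is one accumulator register.
      program = [
          ("LOAD", 2),      # place the value 2 in the accumulator
          ("ADD", 3),       # add 3 to the accumulator
          ("PRINT", None),  # output the accumulator's contents
      ]

      accumulator = 0
      for opcode, operand in program:  # fetch the next instruction
          if opcode == "LOAD":         # decode it ...
              accumulator = operand    # ... and execute it
          elif opcode == "ADD":
              accumulator += operand
          elif opcode == "PRINT":
              print(accumulator)       # prints 5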

      In 1971, Intel introduced the 4004, the first commercial microprocessor, and by 1978, the company had introduced the 8086 processor. Unlike previous processors, the 8086 could handle significantly larger chunks of information and could run at a much faster speed. The 8086 kickstarted the "x86" family of processors, which are still used to power computers today.

      The development of the microprocessor had a major impact on the semiconductor industry by making personal computers more powerful and practical. The advent of the microprocessor also opened new markets for the industry, including memory chips and interface circuits. At its core, the invention of the microprocessor helped increase demand for semiconductors around the world.

      The modern semiconductor industry

      The early 2000s saw the semiconductor industry continue to grow exponentially as personal computing became a staple of modern life and smartphones drove demand for mobile processors. Power efficiency became central to chip design as consumer demands, such as longer battery life, higher performance and smaller devices, shifted with technology development. Power-saving technologies such as dynamic voltage scaling, sleep states and more efficient transistor designs helped meet these demands while using less energy.

      Cloud computing also reshaped the semiconductor industry, particularly chip demand. As data centers expanded, they needed specialized processors, memory chips that could handle larger amounts of data and network processors that enabled faster connections.

      The cloud computing era also created new markets for the semiconductor industry. Companies such as Amazon and Microsoft became prominent chip buyers, and some cloud providers began to design their own chips. As innovation boomed to keep up with demand, new types of memory and storage chips were developed.

      Today, AI is a major driver of the semiconductor industry. Demand for AI-driven hardware is at an all-time high, with Deloitte predicting that generative AI chips will be worth $150 billion in 2025. A new growth cycle is expected over the next few years, with Pierre Cambou, principal analyst at Yole Group, predicting that the semiconductor industry will be a $1 trillion market by 2030, driven by digital transformations such as AI and machine learning.

      As of April 2025, the largest semiconductor companies, according to Statista, are:

      • Nvidia.
      • Broadcom.
      • TSMC.
      • Samsung.
      • ASML.

      Challenges facing the semiconductor industry

      As the semiconductor industry grows, it faces several challenges, including the following:

      • Supply chain vulnerabilities. The semiconductor industry depends on a complex global supply chain, and with manufacturing highly concentrated in Asia, that chain can be volatile. There are a limited number of suppliers for critical materials, and long lead times for equipment and materials make it difficult to respond to spikes in demand.
      • Geopolitical tensions. With much of the industry's manufacturing capacity concentrated overseas, geopolitical tension can quickly harm the industry. New tariffs, disrupted supply chains and increased costs are affecting manufacturers and consumers alike. U.S.-China trade restrictions and tech controls are changing market dynamics and creating uncertainty. Geopolitical tensions are also apparent in the competition for semiconductor sovereignty and in tightening export control regulations.
      • Environmental challenges. The manufacturing of semiconductors involves high energy consumption. According to McKinsey & Company, a typical semiconductor fabrication plant will use as much power in a single year as 50,000 homes. Production also involves significant water use and chemical waste management. Extreme weather conditions can also disrupt manufacturing and damage the global supply chain of semiconductor chips.

      Rosa Heaton is a content manager for the Learning Content group at Informa TechTarget.
