Union IT Minister Ashwini Vaishnaw presented India’s first fully indigenous microprocessor — a type of semiconductor chip — to Prime Minister Narendra Modi at Semicon India 2025 on September 2. The chip, called Vikram 3201, has been developed by the semiconductor laboratory of the Indian Space Research Organisation (ISRO).
At the event, Vaishnaw said, “Just a few years ago, we met for the first time to make a new beginning driven by our Prime Minister’s farsighted vision, we launched the India Semiconductor mission [in 2021]… In a short span of 3.5 years, we have the world looking at India with confidence. Today, the construction of five Semiconductor units is going on at a rapid pace… We just presented the first ‘Made-in-India’ chip to PM Modi.”
Semiconductor chips are the building blocks of modern computation. From smartphones to the vast data centres powering the Internet, from electric cars to cruise missiles, from high-end luxury products to weather-predicting supercomputers — all of them run on these tiny chips.
But what exactly are semiconductor chips? What do they have to do with computers? How are they manufactured? We take a look.
Most materials are either conductors, which let electric current flow, or insulators, which block it. A copper wire, for instance, is a conductor, while glass is an insulator.
Semiconductors are different. In their natural state, they are weak conductors of electricity, but when certain materials are added to them and an electric field is applied, current can start to flow. Adding phosphorus to semiconducting materials such as silicon and germanium, for example, frees up electrons that carry a negative current.
Semiconducting materials became crucial with the invention of the transistor in the late 1940s. The transistor, which powers all modern electronics today, was one of the first electronic components to be built using a semiconductor.
The first transistor, invented by American physicists Walter Brattain and John Bardeen, consisted of strips of gold foil on a plastic triangle, pushed down into contact with a slab of germanium.
“All information in a computer is transmitted or stored in forms of binary digits – zeros and ones – and these zeros and ones are ‘voltages’ that are generated, transmitted and stored using little switches made out of transistors,” according to a 2023 report published by Stanford University in the United States.
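To make the “zeros and ones” idea concrete, here is a minimal Python sketch (an illustration of binary representation, not something from the Stanford report). It shows how a number and a letter reduce to strings of binary digits; on a chip, each of those digits would be held as a high or low voltage on a transistor acting as a tiny switch.

```python
# Minimal illustration: the zeros and ones a chip stores as transistor voltages.

def to_bits(value: int, width: int = 8) -> str:
    """Return the binary representation of an integer, padded to `width` bits."""
    return format(value, f"0{width}b")

# Every piece of data is ultimately a number; characters map to numeric codes.
print(to_bits(42))         # 00101010 -- the number 42 as eight on/off switches
print(to_bits(ord("A")))   # 01000001 -- the letter 'A' via its ASCII code, 65
```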
Before transistors, vacuum tubes, lightbulb-like metal filaments enclosed in glass, were used for the same purpose. However, there were several issues with them. Because they glowed like lightbulbs, they attracted insects and had to be “debugged” regularly. They would often burn out as well, bringing the whole system to a halt. Vacuum tubes were also bulky: ENIAC, a highly sophisticated computer built in 1945, had some 18,000 of these tubes, each the size of a fist.
Transistors proved to be much more efficient and compact than vacuum tubes and quickly replaced them. But in the initial years, using transistors posed a different problem. All the wiring required to connect them — there could be thousands of transistors in one system — created a jungle of complexity.
This issue was resolved in 1958 with the invention of an “integrated circuit” by American electrical engineer Jack Kilby.
“Rather than use a separate piece of silicon or germanium to build each transistor, he thought of assembling multiple components on the same piece of semiconductor material… Multiple transistors could be built into a single slab of silicon or germanium. Kilby called his invention an ‘integrated circuit,’ but it became known colloquially as a ‘chip,’ because each integrated circuit was made from a piece of silicon ‘chipped’ off a circular silicon wafer,” economic historian Chris Miller wrote in his book, Chip War: The Fight for the World’s Most Critical Technology (2022).
Six months after Kilby’s invention, another engineer, Robert Noyce, working in California, independently came up with the idea of an integrated circuit. His chip was better suited for mass production, and he went on to co-found Intel.
In 1971, the company launched a chip called the 4004 and described it as the world’s first microprocessor — “a micro-programmable computer on a chip”. This sparked a revolution in computing, and semiconductor materials have been an essential part of the industry since then.
Modern integrated circuits, also known as semiconductor chips, comprise transistors, diodes, capacitors, and resistors, along with the myriad interconnections between them, layered onto a thin wafer of silicon.
The tiny chips are made in highly specialised manufacturing facilities (known as foundries) through a rigorous process called wafer fabrication, or wafer fab. The process begins with slicing an ingot of semiconducting material, such as silicon, into thin discs called wafers.
Once the wafer has been created, it is ground and polished by a series of highly specialised machines, and integrated circuits are built up on its surface, according to a report by IBM.
Fabrication requires clean rooms designed to keep out airborne particles that could contaminate the chips, and the overall manufacturing process of a semiconductor chip can involve between 500 and 1,500 steps.