What is a Transistor?
The transistor was a major technological advance that replaced bulky vacuum tubes.
A transistor is an electronic component used to control current or voltage levels in a circuit. Its name is a contraction of “transfer resistance,” and the device is used chiefly in computers to switch and amplify electronic signals. The transistor was a major technological advance that helped pave the way to the computers of today: it marked the first step toward smaller, more efficient machines and effectively ended the reign of the bulky, high-maintenance vacuum tube.
The purpose of a transistor
These small devices are made of a semiconductor material and are designed either to amplify a signal or to open and close a circuit. Through this simple purpose, they are central to the functioning of digital circuits such as computer microprocessors. Manufacturing advances have shrunk transistors to the microscopic level, and modern microprocessors now contain billions of them. These tiny components replaced the larger and more antiquated vacuum tubes, which used more energy and created more heat.
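The two jobs described above, amplifying a signal and switching a circuit on or off, can be sketched in a few lines of Python. This is an intuition-building model only, not a simulation of real device physics; the function names and the 5-volt supply are illustrative assumptions, not taken from the article.

```python
# Illustrative sketch of a transistor's two roles: amplifier and switch.
# (For intuition only; gain and supply values are arbitrary assumptions.)

def amplify(v_in, gain=100.0, v_supply=5.0):
    """Scale a small input voltage, clipped at the supply rails."""
    return max(-v_supply, min(v_supply, gain * v_in))

def switch(base_on, current=1.0):
    """Pass current only when the control (base) input is driven."""
    return current if base_on else 0.0

print(amplify(0.01))   # small signal boosted to 1.0
print(amplify(1.0))    # large signal clipped at the 5.0-volt rail
print(switch(True))    # circuit closed: current flows
print(switch(False))   # circuit open: no current
```

The same physical device performs both roles; which one you get depends on how the surrounding circuit biases it.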
While technological trends have produced newer and more efficient iterations of the transistor, the technology still relies on the same fundamental principle: using a semiconductor material to amplify or switch a current. [Related: New Transistor is Made from Blood and Mucus]
The history of transistors
The transistor was created in response to advances in diode technology made during World War II. Vacuum tubes, the predecessor to transistors, were used to amplify audio and transmit telephone signals over great distances, but they proved quite high maintenance. The transistor found its humble beginning in 1947 at Bell Telephone Laboratories, where John Bardeen, William Shockley and Walter Brattain stumbled upon it while researching germanium crystals and their potential to replace vacuum tubes.
Just before the team gave up on what felt like an unsuccessful experiment, they created the first “point-contact” transistor amplifier. It consisted of two gold foil contacts resting on a germanium crystal, with the crystal boosting the strength of the current flowing between the contacts. Over the years the trio continued improving upon the technology, and in 1956 they received the Nobel Prize in Physics for their work.
The transistor began to gain popularity after Texas Instruments used it in radio technology. As transistors grew smaller and smaller, so did the radio, eventually allowing for a pocket-size transistor radio. These advances made engineers realize that transistors could replace vacuum tubes in computers, reducing the traditional 30-ton computers — which used more than 17,000 vacuum tubes — to a more manageable size.
How transistors work
Transistors are made up of three layers of semiconductor material, each capable of carrying a current. Early transistors used germanium, though silicon dominates modern transistor manufacturing because it is more reliable; both are materials that conduct electricity “semi-enthusiastically.” The semiconductor is given its useful properties through the chemical process of “doping,” which introduces impurities that either add free electrons or create electron deficiencies, changing how readily the material conducts.
One layer of semiconductor receives the “N-type” treatment, which gives it an excess of free (negative) electrons, and it is then sandwiched between two layers with the “P-type” treatment, which contain “holes” that behave like positive charge carriers. The resulting “PNP” configuration facilitates rapid changes in current, allowing the transistor to “open” and “close” the gate of a circuit many times per second (billions of times per second in a modern processor). On a microprocessor routing thousands of currents among many simultaneous processes, this rapid switching is imperative for modern computing. [Related: What is the Future of Computers?]
In a microprocessor, transistors act as part of an integrated circuit, or microchip, where they work in tandem to help the computer complete calculations. Computers use the currents moderated by transistors to perform Boolean algebra, the logic behind even simple decisions. The most basic tasks on a modern computer require billions of transistors, a density made possible only by decades of miniaturization. Thanks to the reliability of transistors as well as their small size, engineers are able to pack enormous numbers of them into any modern device. [Related: Computing Revolution — 3D Transistor Means Smaller & Faster Gadgets]
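The Boolean algebra mentioned above comes from wiring transistor switches in series and in parallel. The sketch below assumes idealized on/off switches rather than real circuit behavior; the gate names and the XOR composition are illustrative, not drawn from the article.

```python
# Logic gates modeled as idealized transistor switches.
# Series switches must all conduct (AND); parallel switches need only one (OR).

def and_gate(a: bool, b: bool) -> bool:
    """Two transistors in series: current flows only if both are on."""
    return a and b

def or_gate(a: bool, b: bool) -> bool:
    """Two transistors in parallel: current flows if either is on."""
    return a or b

def not_gate(a: bool) -> bool:
    """An inverter: the output is pulled low when the transistor conducts."""
    return not a

# Any Boolean decision can be composed from these gates, e.g. exclusive-or:
def xor_gate(a: bool, b: bool) -> bool:
    return or_gate(and_gate(a, not_gate(b)), and_gate(not_gate(a), b))

print(xor_gate(True, False))  # True
print(xor_gate(True, True))   # False
```

A real processor builds these gates from a handful of transistors each, then composes millions of gates into adders, multipliers and control logic.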