Although we rarely notice them -- mostly because they're so small and sealed up inside machines -- transistors are among the most important technologies shaping modern life. They're everywhere in the world of electronics. Without them, we wouldn't have CDs, MP3s, DVDs, cell phones or personal computers, and you certainly couldn't read this text over the Internet.
Transistors are the building blocks of microprocessors, the computer chips that run our desktops and laptops. They act like simple on-and-off switches, routing electronic signals from point to point, storing information and carrying out the commands we give. Initially, transistors were large and had to be built one at a time. But between 1958 and 1959, two inventors, Jack Kilby at Texas Instruments and Robert Noyce at Fairchild Semiconductor, found a way to put all of the necessary parts of a computer chip into one tiny package, something called the integrated circuit. Their invention was the first step in a half-century-long revolution in miniaturized technology.
These developments led Gordon Moore, one of the founders of Intel Corporation, to make a straightforward, yet very influential, prediction: The number of transistors on a computer chip would double about every two years. People began referring to this observation as Moore's Law, and so far the statement has held up. As transistors have shrunk, chipmakers can now fit billions of them onto a single computer chip. Intel's newest microprocessor, for instance, will have transistors that are only 32 nanometers wide [source: Intel].
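To see how quickly doubling every two years adds up, here's a minimal sketch in Python. It assumes a starting count of about 2,300 transistors (roughly Intel's 4004 chip from 1971, used here only as an illustrative baseline) and projects forward; the function name and numbers are our own, not from the article.

```python
def moore_projection(initial_transistors, years):
    """Project a transistor count forward, doubling every two years
    per Moore's Law. Assumes whole two-year doubling periods."""
    doublings = years // 2
    return initial_transistors * 2 ** doublings

# Illustrative baseline: ~2,300 transistors (Intel 4004, 1971).
# After 40 years, that's 20 doublings:
print(moore_projection(2300, 40))  # about 2.4 billion
```

Twenty doublings multiply the count by over a million, which is why a chip that began with a few thousand transistors can plausibly reach the billions mentioned above.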
So more transistors equal good things for computer processing, right? Some people don't think so, and there's even a completely different law that pushes back against Moore's statement -- Wirth's Law. Who is Niklaus Wirth, and why does he think Moore's Law is beside the point?