How Wirth's Law Works

Does the number of transistors on a computer chip really matter? According to Wirth's Law, not so much.
Steve McAlister/Getty Images

Although we rarely notice them -- mostly because they're so small and sealed up inside machines -- transistors are among the most important pieces of technology in modern life. They're everywhere in the world of electronics. Without them, we wouldn't have CDs, MP3s, DVDs, cell phones or personal computers, and you certainly couldn't read this text over the Internet.

Transistors are the building blocks of microprocessors, the computer chips that run our desktops and laptops. They act like simple on-and-off switches, allowing electronic data to travel from point to point, store information and deliver the commands we make. Initially, transistors were large, and they were constructed one at a time. But between 1958 and 1959, two inventors, Jack Kilby at Texas Instruments and Robert Noyce at Fairchild Semiconductor, found a way to put all of the necessary parts of an electronic circuit onto one tiny chip -- the integrated circuit. Their invention was the first step in a half-century-long revolution in miniaturized technology.

These developments led Gordon Moore, one of the founders of Intel Corporation, to make a straightforward yet very influential prediction: the number of transistors on a computer chip will double about every two years. People began referring to this observation as Moore's Law, and so far the statement has held true. As transistors have become smaller and smaller, engineers can now fit billions of them onto a single computer chip. Intel's newest microprocessor, for instance, will have transistors that are only 32 nanometers wide [source: Intel].
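
To see how fast that doubling compounds, here's a quick back-of-the-envelope sketch in Python. The starting point is real -- Intel's 4004 from 1971 held about 2,300 transistors -- but the projection simply applies the doubling rule mechanically; it isn't Intel's actual product history.

    # Project Moore's Law: transistor counts double roughly every two years.
    # Starting point: Intel's 4004 (1971), with about 2,300 transistors.
    def projected_transistors(start_count, start_year, year, doubling_years=2):
        doublings = (year - start_year) / doubling_years
        return start_count * 2 ** doublings

    for year in (1971, 1981, 1991, 2001, 2009):
        print(year, round(projected_transistors(2300, 1971, year)))
    # 1971 -> 2,300 ... 2009 -> about 1.2 billion, the same ballpark as
    # the billions of transistors on real chips of that era.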

So more transistors equal good things for computer processing, right? Some people don't think so, and there's even a completely different law that upends Moore's statement -- Wirth's Law. Who is Niklaus Wirth, and why does he think Moore's Law is irrelevant?

Beyond Moore's Law

Wirth's Law states that software is getting slower more rapidly than hardware becomes faster.
Matt Hind/Getty Images

Moore's Law has practical consequences. By increasing the number of transistors on integrated circuits to several billion and shrinking them to mere nanometers, engineers can produce ever-faster microprocessors that are the same size as, or even smaller than, the ones in today's computers. At the same time, Moore's Law increases efficiency and reduces production costs. This consistent improvement in processing speed and memory capacity has paved the way for numerous advances -- without it, we wouldn't have technologies such as more pixels on high-definition televisions and digital cameras. Intel's Web site lists some of the more impressive things electronics might achieve in the future with the help of Moore's Law, including facial recognition software and real-time language translation [source: Intel].

While some expect Moore's Law to continue for at least another decade and others -- especially Intel -- think it will hold true for much longer, some have questioned whether the statement will continue to matter. Piling more transistors onto computer chips, according to critics, doesn't really matter in the end. One of the most prominent critics of Moore's Law is Niklaus Wirth, a Swiss computer scientist who introduced his own "law" as a sort of counterproposal.

Wirth, born in 1934, is an expert in software engineering, known for developing the programming language Pascal and other notable computer languages during the 1960s and 1970s. His is an influential voice in computer engineering, and in 1995 he published a paper titled "A Plea for Lean Software." In it, Wirth called attention to two statements, with tongue slightly in cheek. This is what he said:

  • Software expands to fill the available memory.
  • Software is getting slower more rapidly than hardware becomes faster.

Both statements are important in Wirth's thinking, but it's the second one that we associate with Wirth's Law. In the paper, Wirth actually attributes the sentence to Martin Reiser, so the popular statement we know as Wirth's Law is really a paraphrase of something Reiser supposedly said. Ironically, Reiser felt he had nothing to do with the idea whatsoever, saying: "It is not the first time I am accused of having said something that I cannot remember having said -- and most likely never have said" [source: IEEE].

So what do these two statements, and especially the second one, have to say about computer engineering?

Slow Software, Fast Hardware

Regardless of whether Wirth or Reiser made the statement, Wirth's comments on software caught people's attention. While proponents of Moore's Law remain pretty excited about how much faster next week's hardware will be, Wirth continues to tell everyone to slow down a moment. Even though the hardware may be performing faster, it doesn't necessarily mean the work you're doing is actually getting done faster. From the outset, it's clear that Wirth's focus is on software, not hardware. But what exactly does it mean when software is getting slower faster than hardware gets faster?

Although it's a roundabout way to say it, Wirth is essentially arguing that even though processing speed has continually increased over the years and continues to do so, the software running our applications isn't much faster -- and is sometimes even slower -- than older software that ran on far leaner machines decades ago. A word processing program from the 1970s, for example, might have needed only 8,000 bytes to run properly, an astonishingly small amount of memory by today's standards; current word processing applications need many thousands of times more storage to get essentially the same simple task done. The only reason we can actually use these programs, even supposedly simple ones like Microsoft Word, is the increase in processing speed and memory that Moore's Law delivers.
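
The arithmetic behind that claim is easy to sketch. In the toy comparison below, the 8,000-byte figure comes from the example above, while the modern footprint is a hypothetical stand-in, not a measurement of any real product.

    # Toy comparison: software growth vs. hardware growth over ~35 years.
    old_editor_bytes = 8_000               # a lean 1970s word processor
    modern_editor_bytes = 2_000_000_000    # hypothetical ~2 GB footprint
    years_elapsed = 35

    software_growth = modern_editor_bytes / old_editor_bytes  # 250,000x
    hardware_growth = 2 ** (years_elapsed / 2)  # ~185,000x, per Moore's Law

    print(f"software footprint grew {software_growth:,.0f}x")
    print(f"hardware capacity grew about {hardware_growth:,.0f}x")
    # When the software line outgrows the hardware line, the same task
    # feels no faster -- and sometimes slower. That's Wirth's Law.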

This situation, according to Wirth, is not desirable in terms of design efficiency. If more thought were put into how we make and use software, the amount of work a processor does and the number of calculations it takes to run a program could be reduced considerably.

Wirth attributes the performance problems of today's software to something called software bloat, a term that refers to the ever-growing complexity of software applications. That's related to Wirth's other statement: "Software expands to fill the available memory." Because computer manufacturers keep increasing processing power and the amount of memory our computers can hold, software developers are free to keep adding features and complexity to their programs -- and that's exactly what they do.

Causes of Software Bloat

Customers unaware of the difference between useful functions and useless features help keep software complicated, but Wirth feels the people making the software are even more responsible for allowing such bloated software to exist.
Scott Olson/Getty Images

Why is this software bloat happening? There are two general reasons, according to Wirth, that software development has lagged behind hardware speed. One is the customer -- computer users in the general public who use applications casually. The other is the software vendor -- the people developing the software and deciding how everything is put together. These two factors don't work independently, of course; they come together to create a mutually reinforcing relationship.

First, it's the customer's inability to distinguish essential functions from unnecessary ones that promotes software design that's a bit too complicated and fussy. Wirth considers things on our desktops that we normally accept as routine -- such as fancy picture icons representing trash cans or musical notes -- to be essentially worthless. But customers expect these flashy representations, mostly because they're all part of a so-called user-friendly experience.

Of course, this is all very attractive to the customer, and what customers like is usually in the best interest of the company producing the software. As users ask for more complex, flashy features, developers release bloated software that contains any number of complexities that might be in demand. Even worse, this creates dependence on customer service. Instead of taking the time to learn how to use an application, many people dive right in, knowing they'll have to rely on someone else's help.

If Wirth's Law is general and straightforward, then so is the solution Wirth lays out for reducing software bloat. To cut our dependence on Moore's Law, he suggests stripping unnecessary design elements out of software. The fewer instructions a processor has to execute to run a program, the more efficiently our work actually gets done -- as the sketch below illustrates.
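
To make that concrete, here's a deliberately exaggerated sketch. Both functions below produce the same greeting, but the second routes the job through needless layers of the kind that pile up in bloated software. Every class here is hypothetical, invented purely for illustration.

    import sys

    def lean_greet(name):
        # The whole job in one direct expression.
        return "Hello, " + name.upper() + "!"

    # An over-engineered version: each layer stands in for the
    # indirection that accumulates in bloated software.
    class Formatter:
        def format(self, template, value):
            return template.format(value)

    class GreetingBuilder:
        def with_name(self, name):
            self.name = name
            return self
        def build(self):
            return Formatter().format("Hello, {}!", self.name.upper())

    def bloated_greet(name):
        return GreetingBuilder().with_name(name).build()

    def count_calls(fn, *args):
        # Count the Python-level function calls that fn triggers.
        calls = [0]
        def profiler(frame, event, arg):
            if event == "call":
                calls[0] += 1
        sys.setprofile(profiler)
        fn(*args)
        sys.setprofile(None)
        return calls[0]

    print(count_calls(lean_greet, "world"))     # 1 call
    print(count_calls(bloated_greet, "world"))  # 4 calls for the same result

Both versions print the same string; the difference is how much machinery runs to produce it. Multiply that overhead across an entire application and the processor ends up doing far more work than the task itself requires.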

For more information about microprocessors and related topics, take a look at the links below.

Sources

  • "Moore's Law." (Feb. 2, 2009)
  • Loosey, Chris. "Why Moore's Law is irrelevant." March 7, 2007. (Feb. 2, 2009)
  • Madison, N. "What is Pascal?" (Feb. 2, 2009)
  • Ross, Phillip. "5 Commandments." IEEE Spectrum Careers.
  • Wirth, Niklaus. "A Plea for Lean Software." IEEE Computer. February 1995. (Feb. 2, 2009)