Is Moore's Law outdated?

[Image: Gordon Moore, co-founder of Intel, is the Moore in Moore's Law. AP Photo/Paul Sakuma]

In 1965, the publication Electronics ran an article written by Dr. Gordon E. Moore, the director of research and development at Fairchild Semiconductor. Moore titled the article "Cramming more components onto integrated circuits." He observed that semiconductor companies like Fairchild could double the number of discrete components on a square inch of silicon every 12 months.

This is a type of exponential growth. A square-inch (6.5-square-centimeter) chip made in 1964 would have half as many components -- such as transistors -- as a chip manufactured in 1965. Moore predicted the trend would continue until chip manufacturers ran into fundamental barriers that blocked their progress.
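
To get a feel for how quickly that doubling adds up, here's a minimal Python sketch of the trend. The starting count of about 64 components in 1965 is an illustrative assumption, not a figure from this article:

```python
# Minimal sketch of the doubling trend Moore described in 1965.
# The 1965 starting count (~64 components) is an illustrative assumption.

def components(start_count, start_year, year, doubling_months=12):
    """Project a component count assuming a fixed doubling period."""
    doublings = (year - start_year) * 12 / doubling_months
    return start_count * 2 ** doublings

for year in range(1965, 1976):
    print(year, round(components(64, 1965, year)))
# With yearly doubling, 64 components in 1965 grows to about 65,000 by
# 1975 -- the same order of magnitude Moore extrapolated in his article.
```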

Moore's observation depended on two important factors: technological advances and the economics of mass manufacturing. For his observation to remain valid, we have to innovate and find new ways to fit ever-smaller elements onto a chip. But we also have to make sure the manufacturing process is economically viable, or there will be no way to support further development.

Today, we call Moore's observation Moore's Law. Despite the name, it's not really a law. There's no fundamental rule in the universe that guides how powerful a newly made integrated circuit will be at any given time. But Moore's Law has become something of a self-fulfilling prophecy as chip manufacturers have pushed to keep up with the predictions Dr. Moore made way back in 1965. Whether it's out of a sense of pride or simply a desire to lead in the marketplace, companies like Intel have spent billions of dollars in research and development to keep pace.

So, is this nearly 50-year-old observation still relevant?

Quantum Leaps

[Image: A model of the earliest transistor -- today's microprocessors have millions or even billions of them. AP Photo/Paul Sakuma]

It seems that with every year that passes, some technology pundit or journalist predicts that Moore's Law will come to an end. The components on today's microprocessors are now on the nanoscale -- a scale so tiny that you can't see the individual elements even with a powerful light microscope. Physics behaves differently at this scale, and quantum mechanics begins to take over from classical physics. Things get pretty weird.

For example, there's quantum tunneling. Imagine that an electron isn't a particle with a single, defined position. Instead, it behaves like a wave, and the probability of finding the electron varies across that wave. In a way, the wave looks like a bell curve -- the narrow tails represent areas where it's possible, but not probable, for the electron to be. The wide middle section represents the area where the electron would most likely be found.

As this wave comes close to a barrier, such as a gap between two conductors, one end of the wave might overlap the barrier and reach the other conductor. That means there's a small probability the electron will turn up on the other side of the gap -- and if the probability is there, it will happen some of the time. It's as if the electron tunneled right through the barrier.
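
The article's sources call this barrier penetration. The standard textbook approximation (not given in the article itself) makes the key point explicit: the chance of tunneling grows exponentially as the barrier gets thinner.

```latex
% Approximate transmission probability for an electron of energy E
% striking a barrier of height U_0 > E and width L (standard textbook
% approximation, not taken from the article itself):
T \approx e^{-2\kappa L},
\qquad
\kappa = \frac{\sqrt{2m\,(U_0 - E)}}{\hbar}
```

Because the barrier width L sits in the exponent, even a modest shrink in the gap can increase the leakage dramatically -- which is exactly the problem described next.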

In a microprocessor, this is what we would call a bad thing. You can think of a microprocessor as a complex road system for electrons to travel through. Transistors in the microprocessor act as gates -- they govern the flow of traffic. A closed gate shouldn't allow electrons to pass through. But if you make the gates thin enough -- shrinking those elements further to keep up with Moore's Law -- you start to encounter quantum problems like electron tunneling. Electron leakage will cause computer errors as the microprocessor gets the wrong results in its calculations.
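
As a toy illustration -- not a model of any real transistor -- here's a Python sketch of how even a tiny leakage probability turns into wrong answers. The leakage rate is a made-up number used only for illustration:

```python
import random

# Toy sketch: a "closed" gate that should block current but occasionally
# leaks, flipping the result of a logic operation. The leakage probability
# is a made-up illustrative value, not measured data.

def leaky_and(a, b, leak_prob=0.001):
    """AND gate whose output is occasionally flipped by a leaked electron."""
    ideal = a & b
    if random.random() < leak_prob:
        return ideal ^ 1  # leakage flips the bit
    return ideal

random.seed(0)
errors = sum(leaky_and(1, 0) != 0 for _ in range(1_000_000))
print(f"wrong results in 1,000,000 operations: {errors}")
# Roughly a thousand bad answers per million operations -- and a real
# processor performs billions of operations every second.
```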

Over the years, engineers have found new ways to build transistors on the nanoscale while minimizing effects like quantum tunneling. Sometimes this involves using a different type of material within the transistor gates. Sometimes it means creating a three-dimensional gate to increase the efficiency of the microprocessor. These advances have helped companies keep pace with the predictions of Moore's Law. But another reason Moore's Law hasn't gone away is that we keep fiddling with the definition.

Redefining a Law

[Image: The Intel Xeon processor E7 family die, which has up to 10 cores and 2.6 billion transistors. Courtesy Intel]

Originally, Moore's Law covered a pretty specific concept: the number of discrete components on a newly manufactured integrated circuit doubles every 12 months. Today, we fudge that number a bit -- you'll hear people in the tech industry say it's every 18 to 24 months. And we're not just talking about the number of elements on a chip.

One common way we reword Moore's observation is to say that over a given amount of time (again, usually 18 to 24 months), the processing power of microprocessors doubles. This doesn't necessarily mean there are twice as many transistors on a chip in 2012 as there were in 2010. Instead, we may find new ways to design chips that make them more efficient, giving us a boost in processing speed without exponential growth in the transistor count.
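
The stretch from a 12-month to a 24-month doubling period is bigger than it sounds once compounding kicks in. A quick back-of-the-envelope comparison:

```python
# Compound annual growth implied by different doubling periods.
for months in (12, 18, 24):
    annual_factor = 2 ** (12 / months)
    print(f"doubling every {months} months -> x{annual_factor:.2f} per year")
# doubling every 12 months -> x2.00 per year
# doubling every 18 months -> x1.59 per year
# doubling every 24 months -> x1.41 per year
```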

By redefining Moore's Law so that we're looking at processing power rather than physical components, we've extended the usefulness of the observation. Companies can combine advances in manufacturing technology with better microprocessor architecture designs to keep pace with the law.

Is redefining Moore's Law like this akin to cheating? Does it matter? In 1965, Moore predicted that a chip manufactured in 1975 would have 65,000 transistors on it should his observation hold true. Today, Intel builds processors that have 2.6 billion transistors [source: Intel]. Computers can process data much faster today than they could decades ago -- a home PC packs as big a punch as some of the early supercomputers.
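
As a rough sanity check on those two numbers -- and assuming the 2.6-billion-transistor Xeon E7 figure dates to around 2011 -- the growth works out like this:

```python
import math

# Back-of-the-envelope check: 65,000 transistors (Moore's prediction for
# 1975) versus 2.6 billion (Intel Xeon E7). The ~2011 date for the Xeon E7
# figure is an assumption made for this estimate.
doublings = math.log2(2.6e9 / 65_000)
years = 2011 - 1975
print(f"doublings needed: {doublings:.1f}")                   # ~15.3
print(f"months per doubling: {years * 12 / doublings:.0f}")   # ~28
# About 15 doublings in roughly 36 years -- one every ~28 months, in the
# same neighborhood as the fudged 18-to-24-month version of the law.
```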

Another way to look at the question is to ask if it even matters if computers are twice as powerful today as they were two years ago. If we live in a post-PC era, as Steve Jobs once suggested, then it could mean that faster microprocessors aren't as relevant as they used to be. It may be more important that our devices are energy-efficient and portable. If that's the case, we may see Moore's Law come to an end not because we hit some sort of fundamental limitation, but because it doesn't make economic sense to keep pushing the boundaries of what we can do.

Some segments of the computer-buying population will continue to demand the highest standards in processing. Video game enthusiasts and people who work with high-definition media need -- or crave -- all the processing power they can get. But what about the rest of us?

Even if all our personal computers turn into dumb terminals that access everything through the cloud, somewhere there will need to be a computer with a powerful processor. Perhaps we'll see another new definition of Moore's Law with a longer lead time before processors double in power. With its mutable history, it seems likely Moore's Law will stick around a while longer in some form or another.

Author's Note

To me, the most fascinating aspect of Moore's Law is its effect on the microprocessor industry. It's a goal everyone wants to meet. It inspires engineers to try new approaches and materials rather than risk falling behind. Ultimately, this observation guided the industry and paved the way for the PC and post-PC eras.

Sources

  • Computer History Museum. "1965 - 'Moore's Law' Predicts the Future of Integrated Circuits." 2007. (Sept. 11, 2012) http://www.computerhistory.org/semiconductor/timeline/1965-Moore.html
  • Intel. "Intel Xeon Processor E7-8800/4800/2800 Product Families." (Sept. 13, 2012) http://www.intel.com/newsroom/kits/xeon/e7e3/gallery/gallery.htm
  • Miller, Michael J. "Does Moore's Law Still Apply to Desktop Speeds?" ForwardThinking. Aug. 12, 2012. (Sept. 12, 2012) http://forwardthinking.pcmag.com/none/301435-does-moore-s-law-still-apply-to-desktop-speeds
  • Moore, Gordon E. "Cramming more components onto integrated circuits." Electronics, Vol. 38, No. 8, April 19, 1965. http://download.intel.com/museum/Moores_Law/Articles-Press_releases/Gordon_Moore_1965_Article.pdf
  • Motta, Leonardo. "Tunneling." Wolfram Research. 2007. (Sept. 12, 2012) http://scienceworld.wolfram.com/physics/Tunneling.html
  • Nave, R. "Barrier Penetration." HyperPhysics. (Sept. 12, 2012) http://hyperphysics.phy-astr.gsu.edu/hbase/quantum/barr.html

Frequently Answered Questions

What is Moore's Law in simple terms?
Moore's Law is the observation that the number of transistors on a microchip doubles roughly every two years. Gordon Moore first made the observation in 1965.
