Is Moore's Law outdated?

Redefining a Law

The Intel Xeon processor E7 family die, which has up to 10 cores and 2.6 billion transistors.
Courtesy Intel

Originally, Moore's Law covered a pretty specific concept: the number of discrete components on a newly manufactured integrated circuit doubles every 12 months. Today, we fudge that number a bit -- you'll hear people in the tech industry say it's every 18 to 24 months. And we're not just talking about the number of elements on a chip.

One common way we reword Moore's observation is to say that over a given amount of time (again, usually between 18 and 24 months), the processing power of microprocessors doubles. This doesn't necessarily mean there are twice as many transistors on a chip in 2012 as there were in 2010. Instead, we may find new ways to design chips to make them more efficient, giving us a boost in processing speed without the need for exponential growth.

By redefining Moore's Law so that we're looking at processing power rather than physical components, we've extended the usefulness of the observation. Companies can combine advances in manufacturing technology with better microprocessor architecture designs to keep pace with the law.

Is redefining Moore's Law this way akin to cheating? Does it matter? In 1965, Moore predicted that a chip manufactured in 1975 would have 65,000 transistors on it, should his observation hold true. Today, Intel builds processors that have 2.6 billion transistors [source: Intel]. Computers can process data much faster today than they could decades ago -- a home PC packs as big a punch as some of the early supercomputers.
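The arithmetic behind that 1975 prediction is simple compounding. As a minimal sketch (the 64-component starting point is an assumption for illustration, roughly matching the densest chips of 1965, and the function name is made up here), doubling annually for ten years lands almost exactly on Moore's figure:

```python
# Sketch of the doubling arithmetic behind Moore's 1965 extrapolation.
# Assumption (not from the article): a 1965 chip held roughly 64
# components; doubling every year for ten years gives ~65,000 by 1975.

def projected_components(start_count, years, doubling_period_years=1):
    """Project a component count after `years` of doubling every
    `doubling_period_years` years."""
    return start_count * 2 ** (years / doubling_period_years)

# 64 components in 1965, doubling annually until 1975:
print(int(projected_components(64, 10)))   # 64 * 2**10 = 65536

# The same ten years under the looser modern 24-month cadence would
# yield only 2**5 = 32x growth instead of 1024x:
print(int(projected_components(64, 10, doubling_period_years=2)))
```

Swapping the doubling period from 12 to 24 months in the last line shows why the "18 to 24 months" rewording matters: over a decade it is the difference between a thousandfold and a thirtyfold increase.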

Another way to look at the question is to ask if it even matters if computers are twice as powerful today as they were two years ago. If we live in a post-PC era, as Steve Jobs once suggested, then it could mean that faster microprocessors aren't as relevant as they used to be. It may be more important that our devices are energy-efficient and portable. If that's the case, we may see Moore's Law come to an end not because we hit some sort of fundamental limitation, but because it doesn't make economic sense to keep pushing the boundaries of what we can do.

Some segments of the computer-buying population will continue to demand the highest standards in processing. Video game enthusiasts and people who work with high-definition media need -- or crave -- all the processing power they can get. But what about the rest of us?

Even if all our personal computers turn into dumb terminals that access everything through the cloud, somewhere there will need to be a computer with a powerful processor. Perhaps we'll see another new definition of Moore's Law with a longer lead time before processors double in power. With its mutable history, it seems likely Moore's Law will stick around a while longer in some form or another.