Moore's law stems from an observation Intel co-founder Gordon Moore made back in 1965: The number of transistors on a 1-inch (2.5-centimeter) silicon chip tends to double every couple of years. While there's no universal law that dictates this must be so, technology companies like Intel have spent countless hours and billions of dollars on research and development to keep pace with Moore's law.
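The doubling described by Moore's law compounds quickly, which a short sketch makes concrete. The 1971 starting point (Intel's 4004 at roughly 2,300 transistors) is an illustrative assumption, not a figure from this article:

```python
# Back-of-the-envelope Moore's law projection: transistor counts
# doubling roughly every two years.

def projected_transistors(start_count, start_year, target_year, doubling_period=2):
    """Project a transistor count forward, assuming one doubling per period."""
    doublings = (target_year - start_year) / doubling_period
    return start_count * 2 ** doublings

# Forty years (1971 to 2011) is about 20 doublings, so a chip that
# started near 2,300 transistors projects out to the billions.
print(round(projected_transistors(2300, 1971, 2011)))
```

Twenty doublings multiply the starting count by about a million, which is why a trend that sounds modest year to year carries chips from thousands of transistors to billions in four decades.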
But Intel's strategy goes beyond finding ways to shrink components to tinier scales in order to boost power. The company has what it calls a tick-tock strategy, developing chip technologies in two alternating phases. The tick phase involves shrinking a chip's elements down to a smaller size. The tock phase is all about rearranging those elements into a new, more efficient configuration.
Intel's Sandy Bridge chip is an example of a tock technology. The previous tock chip, codenamed Nehalem, arranged 45-nanometer transistors in a way that allowed data multithreading, looping and branching, which made it a more powerful processor than the preceding 45-nanometer Penryn microprocessors.
After Nehalem came the next tick: the Westmere family of microprocessors. While they have the same configuration as the Nehalem family of chips, Intel engineered Westmere's components down to 32 nanometers. Following Westmere is the tock of Sandy Bridge.
The Sandy Bridge family of chips introduces some new features into Intel's bag of tricks. One of the most talked-about is Intel's decision to dedicate part of the microprocessor to graphics processing -- a task traditionally handled by a separate graphics processor. You might say Intel is firing a warning shot across the bow of companies that produce graphics processing units (GPUs).
Before we learn more about Sandy Bridge, it's important to understand how things work at this incredibly small scale.
It's Not Easy Being Small
There are lots of challenges that come with building electronics at the nanoscale. One of those problems is that materials display different properties at that size. Another is that it becomes harder to control electrons. And since electronics are based on channeling electrons to get results, that becomes a problem.
It boils down to quantum physics. The nanoscale world is one in which classical physics doesn't necessarily apply. For a transistor to work, it has to be able either to allow electrons to pass through or to block their path. But electrons can be sneaky -- if the material blocking their path is the right substance and is thin enough, electrons can jump right through it as if nothing were there at all. Quantum physicists call this electron tunneling, even though the electrons aren't literally digging through the barrier.
Solving a problem like electron tunneling isn't a walk in the park. It requires experimentation with different materials to discover which are more resistant to electron tunneling than others. Then it takes new production procedures to build microprocessors with elements at the right scale. It takes even more work to standardize those procedures so that the company can mass manufacture the new chips.
A lot can go wrong during this phase. If the material for the electron gates isn't just right, the microprocessors won't work properly. Electrons will leak through, causing processing errors and instability. Leakage can also contribute to heat production and too much heat spells disaster for microprocessors. And the small size has other challenges, too -- during the manufacturing process, even a single mote of dust can ruin a chip. Dust particles are much larger than the individual elements in a microprocessor.
The Sandy Bridge Architecture
Westmere, Sandy Bridge's predecessor, had an architecture based upon Nehalem. Sandy Bridge's architecture shares some similarities with the older chips but features a couple of major departures as well.
Sandy Bridge is a multicore microprocessor. That means each Sandy Bridge microprocessor has at least two processing cores capable of handling computational operations. At launch, the most advanced Sandy Bridge chip had four cores, making it a quad-core processor. But why use multiple cores at all? Why not just develop faster single-core processors?
It turns out that many computer processes are made up of smaller computational problems called parallel problems. Imagine a room that has a genius and four smart -- but not genius -- math students. You give the genius a sheet of four math problems to solve. You give each of the four smart students one of the math problems. While the genius might solve a single problem faster than any of the four smart students, collectively the students will finish before the genius. That's the idea behind multicore processors -- individually, they may be slower than a powerful single-core processor. But collectively, they're more efficient for lots of computer problems.
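The genius-versus-students analogy can be put into numbers with a toy scheduling model. The solve times below (2 minutes per problem for the genius, 3 for each student) are made-up figures chosen only to illustrate the trade-off:

```python
import math

def serial_time(per_problem_time, n_problems):
    # One fast worker solves every problem back to back.
    return per_problem_time * n_problems

def parallel_time(per_problem_time, n_workers, n_problems):
    # Problems are dealt out evenly; the finish time is set by
    # however many rounds the busiest worker must complete.
    rounds = math.ceil(n_problems / n_workers)
    return per_problem_time * rounds

genius = serial_time(2, 4)         # 2 min/problem x 4 problems = 8 minutes
students = parallel_time(3, 4, 4)  # 4 slower workers, one round = 3 minutes
print(genius, students)
```

Even though each student is slower per problem, the group finishes the sheet in 3 minutes to the genius's 8 -- the same reasoning that makes multiple modest cores beat one fast core on parallel workloads.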
In addition to the multiple cores, each core itself can handle two threads of data. Intel included multithreading in the Nehalem microarchitecture and carried it over into Sandy Bridge. Intel now calls the technology Hyper-Threading. Multithreading depends heavily on software developers creating programs that can take advantage of the feature. With the right application, a single core can handle two threads of data, in effect doubling the processing power of the chip for those applications.
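Hyper-Threading itself is a hardware feature, but the software side -- splitting a program into independent threads so idle moments on one thread can be filled by another -- can be sketched in ordinary Python. In this example, two tasks that mostly wait (simulated with `sleep`) run on two threads and finish in roughly the time of one:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fetch(task_id):
    # Stand-in for work that spends most of its time waiting on data.
    time.sleep(0.2)
    return task_id

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=2) as pool:
    # Both tasks are in flight at once, so the waits overlap.
    results = list(pool.map(fetch, [1, 2]))
elapsed = time.perf_counter() - start
print(results, round(elapsed, 1))  # finishes in about 0.2s, not 0.4s
```

As the article notes, none of this happens automatically: the program has to be written to divide its work into threads before the hardware's extra thread capacity does any good.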
Each core in a Sandy Bridge chip has two levels of individual cache memory -- that means the cores can store some data within the processor itself to refer to while making computations. A third level of cache memory, called the last-level cache, is a shared resource. The cores refer to the last-level cache for shared data and to communicate with other cores.
The big departure for Sandy Bridge is the inclusion of a dedicated section on the chip for graphics processing. Out of the 995 million transistors on the Sandy Bridge quad-core desktop computer chip, 114 million of them reside in the graphics processing section [source: Lal Shimpi]. It can handle 3-D graphics processing. This capability decreases the necessity for a dedicated graphics card, though high-end applications like cutting-edge video games or video processing software may still require a graphics processing unit to run without a hitch.
Clock Cycles and Controversy
The launch of Sandy Bridge wasn't flawless. One decision Intel made that upset some computer enthusiasts was the decision to move the clock generator for the processor off the motherboard and onto the chipset itself. To understand why that would upset anyone, we need a quick reminder about clock cycles.
A clock cycle on a processor is an electronic pulse in which a processor can complete a basic operation such as retrieving a specific data point. Most computational operations actually require multiple clock cycles. Faster processors can complete more clock cycles per second than slower processors. We measure this in hertz -- the number of cycles per second. A 1-gigahertz processor can complete a billion cycles every second.
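The arithmetic here is simple enough to check directly. The 4-cycles-per-operation figure below is an illustrative assumption; real instructions vary widely in how many cycles they need:

```python
def time_per_operation(clock_hz, cycles_per_op):
    """Seconds a processor needs to finish one multi-cycle operation."""
    return cycles_per_op / clock_hz

one_ghz = 1_000_000_000  # 1 gigahertz = a billion cycles per second

# At 1 GHz, each cycle lasts one nanosecond, so a 4-cycle
# operation takes about 4 nanoseconds.
print(time_per_operation(one_ghz, 4) * 1e9)
```

The same relationship explains why a higher clock speed translates so directly into performance: every operation's cycle count gets divided by a bigger number of cycles per second.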
Manufacturers build limits, or caps, into microprocessors. Many microprocessors are capable of completing far more cycles per second than the manufacturer allows. There are a couple of reasons to limit a microprocessor's clock speed. First, microprocessors generate heat, and running them as fast as they can go ramps up heat production significantly. Without sufficient cooling measures in place, these microprocessors can fail as things heat up. Second, by imposing limits on a single microprocessor design, the company can market the same chip with a less restrictive clock speed cap at a higher price to people willing to pay for the speed.
One way to get around this restriction is to overclock your processor. Overclocking depends upon the equipment you own -- there's no single approach to overclock your computer. But essentially you use software -- and sometimes a little tweaking of actual hardware -- to allow your processor to run at a faster clock speed than the manufacturer normally allows.
When Intel moved the clock generator, which controls the clock speed and synchronizes processor functions, it also locked down the clock speed on most of the chips for Sandy Bridge. Some chips allow overclocking enthusiasts to give their processors a modest boost in speed. But if you really want to be a speed demon, you'll have to purchase one of the more expensive Sandy Bridge chips that don't include the clock-speed lock.
That wasn't the only controversy to plague Sandy Bridge. Next, we'll look at some errors that resulted in microprocessor underperformance.
It's Not Sandy Bridge's Fault!
Shortly after computers with Sandy Bridge chips hit store shelves, the news broke: an element in the computers' motherboards had a manufacturing flaw. Over time, this flaw would cause the computer's performance to decline, negating the benefits of the new microprocessor. Depending upon the source, a reader might get the impression that the Sandy Bridge chip was a lemon.
In truth, the problem didn't have anything to do with the microprocessor. It had to do with the series-6 chipset called Cougar Point. While still an Intel product, Cougar Point is not the same thing as Sandy Bridge. The chipset includes special chips that provide the link between computer hardware and computer software, including the chips that process information coming in and going out through various computer ports.
The culprit was a serial ATA 2 (SATA 2) chip. A SATA chip allows a computer to communicate with storage devices such as disk drives over a serial connection. The problem goes back to the issue of electron leakage -- the transistor gates received too high a voltage to restrict electron flow properly [source: Lal Shimpi]. And there's no easy fix beyond retooling the manufacturing process to eliminate the flaw. For that reason, some motherboard manufacturers and computer vendors put a halt on ordering Sandy Bridge systems until Intel resolved the issue.
Intel claimed that only 15 percent of all Sandy Bridge customers would notice a decrease in performance. The company also stated that the problem didn't affect all machines with a Sandy Bridge processor -- only some desktop computers with the Intel P67 chipset. But the damage was done -- shipments stopped for a short time while Intel addressed the problem. Intel then resumed shipments of Sandy Bridge machines that weren't affected by the problem.
Even for customers who own machines that have the flawed chipset, things aren't so dire. The flaw shouldn't be noticeable in the short term. It's expected that Intel will issue a recall once an improved chipset is available. It's also important to remember that this problem had nothing to do with the Sandy Bridge processor itself.
While the Cougar Point problem was a stumbling block for Sandy Bridge, it's likely Intel will bounce back from it without too many scars. The chip provides more performance than any of its predecessors in the consumer market. And what's in the future? That would be the next tick in Intel's strategy: Ivy Bridge. That chip will have transistors at the 22-nanometer scale. That's big news for a small transistor.
Learn more about microprocessors and related topics by following the links below.
More Great Links
- Abi-Chahla, Fedy. "A Three-Level Cache Hierarchy." Tom's Hardware. Oct. 14, 2008. (March 1, 2011) http://www.tomshardware.com/reviews/Intel-i7-nehalem-cpu,2041-10.html
- Ackerman, Dan. "Intel's 'Sandy Bridge' for laptops tested." CNET. Jan. 12, 2011. (Feb. 25, 2011) http://news.cnet.com/8301-17938_105-20028200-1.html
- Burt, Jeffrey. "Intel Plans for 22-nm 'Ivy Bridge,' 15-nm Atom Chips." eWeek. Sept. 17, 2010. (March 1, 2011) http://www.eweek.com/c/a/Desktops-and-Notebooks/Intel-Plans-for-22nm-Ivy-Bridge-15nm-Atom-Chips-585696/
- Hachman, Mark. "Intel Resumes 'Sandy Bridge' Chipset Shipments, with Restrictions." PCMag. Feb. 7, 2011. (March 2, 2011) http://www.pcmag.com/article2/0,2817,2379628,00.asp
- Intel. "High-k and Metal Gate Research." (Feb. 25, 2011) http://www.intel.com/technology/silicon/high-k.htm
- Intel. "Intel 32nm Logic Technology." (Feb. 25, 2011) http://www.intel.com/technology/architecture-silicon/32nm/
- Intel. "Intel Core Processor Family." (Feb. 25, 2011) http://www.intel.com/consumer/products/processors/core-family.htm
- Intel. "Intel Hyper-Threading Technology." (Feb. 23, 2011) http://www.intel.com/technology/platform-technology/hyper-threading/index.htm
- Intel. "Intel Microarchitecture Codename Sandy Bridge." (Feb. 24, 2011) http://www.intel.com/technology/architecture-silicon/2ndgen/
- Intel. "Intel Turbo Boost Technology 2." (March 2, 2011) http://www.intel.com/technology/turboboost/
- Intel. "Introduction to Intel's 32nm Process Technology." 2010. (Feb. 25, 2011) http://download.intel.com/technology/architecture-silicon/32nm/Intel_32nm_Overview.pdf
- Lal Shimpi, Anand. "Intel's Sandy Bridge Architecture Exposed." AnandTech. Sept. 14, 2010. (Feb. 24, 2011) http://www.anandtech.com/show/3922/intels-sandy-bridge-architecture-exposed/8
- Lal Shimpi, Anand. "The Sandy Bridge Review: Intel Core i7-2600K, i5-2500K and Core i3-2100 Tested." AnandTech. Jan. 3, 2011. (Feb. 28, 2011) http://www.anandtech.com/show/4083/the-sandy-bridge-review-intel-core-i7-2600k-i5-2500k-core-i3-2100-tested
- Lal Shimpi, Anand. "The Source of Intel's Cougar Point SATA Bug." AnandTech. Jan. 31, 2011. (March 2, 2011) http://www.anandtech.com/show/4143/the-source-of-intels-cougar-point-sata-bug
- Murray, Matthew. "Intel's Sandy Bridge Glitch: 7 Things You Need to Know." PCMag. Feb. 3, 2011. (March 1, 2011) http://www.pcmag.com/article2/0,2817,2379241,00.asp
- Nguyen, Tuan. "Sandy Bridge Debacle: What It Means for You." Tom's Hardware. Jan. 31, 2011. (March 2, 2011) http://www.tomshardware.com/news/sandy-bridge-sata-error-sata-3,12112.html
- Parrish, Kevin. "Intel May Show Ivy Bridge CPUs at Computex." Tom's Hardware. Feb. 8, 2011. (March 1, 2011) http://www.tomshardware.com/news/Sandy-Bridge-Ivy-Bridge-DirectX-11-LGA1155,12155.html