Supercomputers may not soar through the air in tights, heroically saving damsels falling from burning skyscrapers, but their capabilities are still pretty impressive, and President Obama seems sold on them.
Obama issued an executive order this summer establishing the National Strategic Computing Initiative (NSCI), which aims to maintain the United States' position as a leader in high-performance computing (HPC) and to build one serious supercomputer.
A supercomputer can process far more information, far faster, than a traditional small- to medium-sized machine.
"It embodies the hardware, software and algorithms needed to deliver the very highest capability at any time," writes Jack Dongarra, a professor in the electrical engineering and computer science department at the University of Tennessee. Dongarra should know. He's an author for TOP500, an organization that compiles and maintains a list of the world's 500 most powerful computer systems.
Supercomputer speed is measured in floating-point operations per second (FLOPS), the number of arithmetic operations the machine can perform each second. Researchers are aiming to produce a supercomputer capable of reaching exascale (one exaflop, or 10¹⁸ operations per second) within the next decade.
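To get a feel for what that exponent means, here's a minimal back-of-the-envelope sketch. The workload size is a hypothetical number chosen only for illustration; the petaflop and exaflop rates come from the definitions above.

```python
# Rough sense of scale for FLOPS (floating-point operations per second).
PETAFLOP = 10**15   # operations per second (roughly today's top machines)
EXAFLOP = 10**18    # operations per second: the "exascale" target

# Hypothetical workload: a simulation needing 10**21 arithmetic operations.
operations = 10**21

seconds_at_petascale = operations / PETAFLOP  # 1,000,000 seconds
seconds_at_exascale = operations / EXAFLOP    # 1,000 seconds

print(seconds_at_petascale / 86400)  # ~11.6 days at one petaflop
print(seconds_at_exascale / 60)      # ~16.7 minutes at one exaflop
```

The same job that would occupy a petascale system for nearly two weeks finishes in under 20 minutes at exascale, which is why the thousand-fold jump matters.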
And the White House could help them to do it.
"This executive order really sets the stage for the work needed to reach exascale," Dongarra says.
Along with reaching exascale, the White House hopes the computing initiative can accomplish several other things. Most of them have to do with big data (one exabyte, or 10¹⁸ bytes, big) and with developing systems that "drive the convergence of compute-intensive and data-intensive systems." By compute-intensive, they mean a system that can do an enormous number of calculations at once. Data-intensive refers to the amount of data, like medical records or Web pages, that a computer can sift through.
That convergence could come in handy for studying something like weather. Researchers could pair simulations of weather with actual data from satellites and other sensors. Combining these two capacities would offer deeper insights into varying scientific events and concepts.
The initiative also aims to make supercomputer programming much simpler, as current systems are pretty tough to program. The public-private partnership plans to support research on new methods for building and programming in order to address this issue.
They also plan to make supercomputers readily available to companies and academics that could benefit from HPC capabilities but lack the resources.
Finally, they'll work to establish innovative hardware technologies for future HPC systems, successors to today's semiconductor technology, which is quickly reaching its physical limits.
All of that sounds pretty great, but why the heck should you care about supercomputers? After all, they won't improve your Netflix streaming experience or make you a master of computer solitaire. Nope, they're capable of way more than that. We can use them to make giant leaps and improvements in the industries of health care, meteorology and even national security.
"More powerful computing capability will allow diverse industries to more quickly engineer superior new products that could improve a nation's competitiveness," says Dongarra. "In addition, there are considerable flow-down benefits that will result from meeting both the hardware and software high-performance computing challenges. These would include enhancements to smaller computer systems and many types of consumer electronics, from smartphones to cameras."
Yep, supercomputers are capable of big things. And, with the White House's new initiative, those capabilities could get bigger.
NOW THAT'S COOL
Whenever Tianhe-2, currently the world's fastest supercomputer, is running, it draws enough power to supply approximately 5,100 Chinese households.