University of Rochester

Rochester Review
September–October 2012
Vol. 75, No. 1


A Super Supercomputer
Rochester is one of the first academic homes in the nation for IBM’s next-generation supercomputer, designed to ‘make knowledge out of data.’
POWERHOUSE: At peak performance, IBM’s Blue Gene/Q supercomputer can make 209 trillion calculations a second. (Photo: Adam Fenster)

A pivotal piece of big data research at Rochester is the Health Sciences Center for Computational Innovation (HSCCI), which in August became home to the next-generation supercomputer built by IBM—the Blue Gene/Q. It makes Rochester one of the five most powerful university-based supercomputing sites in the nation.

“It will be one of the most, if not the most, powerful supercomputers dedicated to health science research in the world,” says David Topham, director of HSCCI. The University created the center in 2008, in partnership with IBM, and began work with what is now the previous generation of supercomputer, the Blue Gene/P. The University, New York State, and IBM have since collaborated to upgrade the center with the Blue Gene/Q. The Center for Governmental Research estimates that the project could create 900 jobs in the community and generate $205 million in new research funding over the next 10 years.

At peak performance, the Blue Gene/Q can make 209 trillion calculations per second. It’s 15 times more powerful than the previous-generation supercomputer—and has the computing power of about 20,000 laptops.

The supercomputer, which will enable scientists to sift through mountains of data and create complex models, has vast potential for applications in medicine, and Rochester scientists are applying high-performance computing to research programs in vaccine development, brain injury, and cardiac disease.

Topham calls it “truly a new domain” for research, as supercomputing technology allows researchers to ask fundamentally different questions about health, creating “knowledge out of data.”

Just a decade ago, when researchers at universities around the country needed more computing power than a conventional machine could provide, they built what were known as “Beowulf clusters”: 100 or 200 desktop computers linked together. A cluster took up a lot of space, produced a lot of heat, and consumed a lot of energy, and it was sustainable for only a day or two. Other researchers clustered their own servers, requiring special air conditioning and running up electricity bills.

Faced with that unsustainable situation, faculty at Rochester came together to find a solution.

“They had different research domains, but they had the same core issue: they needed a facility that could help them accelerate their research results,” recalls David Lewis, vice president for information technology and CIO.

The solution they hit upon was a shared center—what is today the Center for Integrated Research Computing (CIRC), which advances research through high-performance computing. It supports 150 research groups across the University.

“We started with 16 pilot users. Now we have more than 500 researchers” from 35 University departments and centers, says Brendan Mort, director of CIRC.

“The results have been exponential,” says Lewis. “We’ve built this as a community, and people have contributed hardware and people. They feel they can get bigger outcomes by being part of a shared resource rather than trying to build their own thing.”