Creating Adaptable Computers
R&D could change the way people do science

Throughout the 50 or more years of the modern computing era, virtually all machines, from laptops to smartphones, have had one feature in common: a fixed, conventional “one size fits all” processor. This means that software developers must craft applications to match the inflexible design of the processor, rather than the other way around.

Within the next 10 to 20 years, however, scientists hope to have many computers moving in a new direction, one that will enable machines to adapt easily to a wide range of software applications. The researchers call it “morphing,” an advance they say will make computers run faster while using less energy.

“It’s a new and innovative way to build and use computers,” says Alan George, professor of electrical and computer engineering at the University of Florida and director of the Center for High-Performance Reconfigurable Computing, or CHREC (pronounced “shreck”), a research center based at four major universities with more than two dozen industry and government partners.

“With reconfigurable computing, the architecture of the processor is adaptive, and thus can be customized to match the unique needs of each application,” he adds. “By changing the mindset of computing, from processor-centric to application-centric, reconfigurable computing can perform at a fraction of the time and cost of traditional servers or supercomputers.”

The new design, for example, ultimately could transform computational biology, providing computers that could analyze DNA in a matter of seconds or minutes, rather than hours or days, or sequence genes in order to detect disease, or identify the best treatments for individual patients.

“It won’t replace conventional computing, but it will speed up certain aspects of computing, with much less energy consumption,” says Herman Lam, associate professor of electrical and computer engineering at the University of Florida.

Lam, who helped develop the center’s Novo-G supercomputer, believed to be the world’s most powerful reconfigurable computer, says the impact likely will be greatest in data analysis.

“Five or ten years ago, the big bottleneck in many science domains was the inability to collect or generate enough data,” he says.  “Today, machines are efficient at producing data, and the bottleneck now is that there is too much data, and computers can’t analyze it quickly. That’s what’s going to change.”

The National Science Foundation currently provides the University of Florida, the lead institution for CHREC, with $116,000 annually over five years, and $66,000 to each of its three research partners: Brigham Young University, George Washington University and Virginia Tech. Most of the center’s funding, however, comes from an industry consortium whose members include processor giant Intel, The Boeing Co., Monsanto, Honeywell Aerospace and Lockheed Martin, among many others.

Traditional computers with fixed processors can perform many tasks, but they require substantial overhead in space and energy. Special-purpose, or “domain-specific,” computers, on the other hand, can perform certain tasks very well but cannot adapt to new ones.

Reconfigurable computers offer the best of both worlds. Using devices such as field-programmable gate arrays, or FPGAs, they allow the processor to “morph,” that is, to rearrange its internal circuitry, somewhat like children’s Legos, to create the right architecture for each job that comes along.

“Rather than coming up with one architecture that’s good for many applications, we’ve come up with one that’s adaptable,” George says. “We configure it one way for one application, then reconfigure it for another. It’s like Lego blocks. Instead of giving a child a toy truck or a cowboy figure, give him a bunch of Lego blocks in different sizes and shapes and he can make whatever he wants. One minute it’s this, and five minutes later, it’s something else.”
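The Lego analogy can be sketched in software terms. The short Python program below is purely illustrative, not how FPGAs are actually programmed (real devices load hardware bitstreams compiled from hardware description languages); the ReconfigurableAccelerator class and its load_bitstream method are hypothetical names invented for this sketch:

    # Toy software analogy for reconfigurable computing; the names below
    # are hypothetical. A real FPGA is rewired with a hardware bitstream,
    # not a Python function.
    from typing import Callable, List, Optional

    class ReconfigurableAccelerator:
        """One device, many architectures, loaded on demand."""
        def __init__(self) -> None:
            self._pipeline: Optional[Callable[[List[int]], List[int]]] = None

        def load_bitstream(self, pipeline: Callable[[List[int]], List[int]]) -> None:
            # Analogous to rearranging the chip's internal circuitry.
            self._pipeline = pipeline

        def run(self, data: List[int]) -> List[int]:
            if self._pipeline is None:
                raise RuntimeError("no configuration loaded")
            return self._pipeline(data)

    # "We configure it one way for one application..."
    accel = ReconfigurableAccelerator()
    accel.load_bitstream(lambda data: [x * x for x in data])  # e.g., a filtering stage
    print(accel.run([1, 2, 3]))  # [1, 4, 9]

    # "...then reconfigure it for another."
    accel.load_bitstream(sorted)  # e.g., a sorting pipeline
    print(accel.run([3, 1, 2]))  # [1, 2, 3]

The point of the analogy is that one physical device takes on a different internal structure for each job, which is how reconfigurable machines aim to combine the flexibility of general-purpose processors with the speed and energy efficiency of special-purpose hardware.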

The researchers believe that most computers will become increasingly reconfigurable over time. “We see this as a ubiquitous technology, even if it won’t always be visible to the user,” George says. “Inevitably, this is the route most computers will have to take in one form or another.”

The Novo-G computer uses 288 reconfigurable processors and, according to the researchers, rivals the speed of the world’s largest supercomputers for some applications at a fraction of the cost, size, power and cooling. Conventional supercomputers, for example, can be the size of a large building, often require millions of watts of electrical power and produce considerable heat, while Novo-G is about the size of two home refrigerators and uses less than 12,000 watts.

Novo-G currently is involved in several projects, with applications in genomic research, cancer diagnosis, investment finance, signal and image processing and plant science. Monsanto, for example, is working with center researchers “to come up with new and better ways to make crops resistant to drought and pests,” George says. “Much of this is done using computational techniques. But the company is finding it increasingly difficult to analyze because the data are so massive and detailed. They can’t process it. They need a computer that can adapt to the unique nature of all this information, which traditional computers can’t do.”

George predicts that reconfigurable computers ultimately will prompt a dramatic shift in personalized medicine, enabling physicians to design custom treatments based on a patient’s individual genetic profile.  “These computers will have the ability to customize and thereby rapidly process the unique data for each patient,” George says.  “You’ll be able to go to the doctor and have personalized treatment for whatever ailment you have.”

Lam agrees, adding that scientific research also likely will receive a huge boost from the new technology.

“A lot of science depends on computers, and some of it depends on computers doing a lot of things fast,” Lam says. “It’s not too effective if it takes two hours every time you try something. But if you can do an experiment that takes 30 seconds, instead of two hours, you could change the way you do science. I think that’s what’s going to happen. With reconfigurable computers, there could be a fundamental change in the way people do science.”
