Can enough federal petaflops whack the virus?

Over the weekend, my cable provider's on-demand channels were unavailable. I figured it was a function of demand, with the whole world shut in. Under the circumstances, we didn't bother calling Verizon about a temporary inconvenience.

Instead, we started watching a recording from last summer of a 90-minute documentary about Apollo 11. Three men went moonward in a rocket as tall as a 30-story building. Equally remarkable, though, was the army of people on the ground, at Cape Canaveral and across the country. All those white-shirted, necktied men, and a number of impeccably bloused women, sat at consoles fitted with dial telephones, monochrome CRTs, and rows of analog switches and meters. Many industries backed them; IBM alone had 4,000 people running the software that did the calculations.

Subsequent history has revealed the stories of NASA's “calculators,” who worked the mathematics with paper and pencil.

In taking on the current “moonshot” — the pandemic — NASA and other government agencies have something those punchcard-blackboard-era scientists and engineers did not: supercomputers. The Energy Department’s IBM Summit machine, for example, can, according to the company, perform 200 quadrillion floating-point calculations per second using 27,648 NVIDIA GPUs and 9,216 IBM processors.


NASA favors Silicon Graphics/HPE. Its Pleiades machine can do nearly 6 petaflops with 241,324 processor cores.

If you like all of those speeds and feeds, several published lists track the worldwide supercomputer inventory.

Now, thanks to a collaboration led by NASA and the National Science Foundation, much of the U.S. supercomputing capacity is becoming available to researchers looking for answers about the coronavirus. Lots and lots of petaflops, each one a quadrillion floating-point operations per second.

The COVID-19 High Performance Computing Consortium, announced by the White House barely a week ago, is already receiving proposals to use the offered supercomputers. That’s according to NASA’s Dr. Piyush Mehrotra, division chief of NASA Advanced Supercomputing.

“You can do the modeling of the drugs or the virus, the proteins in the virus, how they fold, how they bind to each other. You can do quantum-mechanical simulations. And because you have such large machines, you can do it at a much faster rate,” Mehrotra said.

Orders of magnitude faster than possible with regular computers, that is. Runs that take seconds or minutes on supercomputers could take months or years on conventional systems.
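The scale of that gap is easy to sketch with back-of-envelope arithmetic. Assuming, purely for illustration, a workstation sustaining about one teraflop against Summit's stated 200 petaflops (these figures are assumptions, not benchmarks), a single minute of supercomputer time corresponds to months of conventional computing:

```python
# Back-of-envelope comparison: how long would a one-minute Summit run
# take on a typical workstation? Figures are illustrative assumptions.
SUMMIT_FLOPS = 200e15      # ~200 petaflops, IBM's stated figure
WORKSTATION_FLOPS = 1e12   # assume ~1 teraflop sustained

speedup = SUMMIT_FLOPS / WORKSTATION_FLOPS   # ratio of the two rates
supercomputer_seconds = 60                   # a one-minute run
workstation_days = supercomputer_seconds * speedup / 86_400  # seconds per day

print(f"Speedup: {speedup:,.0f}x")
print(f"One Summit minute is roughly {workstation_days:,.0f} workstation days")
```

Under those assumptions the ratio works out to 200,000x, so a minute becomes about 139 days, which is the "months or years" territory Mehrotra describes.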

Proposals started coming in just a week ago. The committee, composed of people from the federal government, academia and industry, is working fast to evaluate them and, if they’re deemed worthy, assign them to the most suitable computer.

That’s a long list. The supercomputers of the Energy Department’s national labs plus those of the NSF and NASA are available. MIT, Rensselaer Polytechnic Institute, and UC San Diego will contribute their machines. And from the corporate side, supercomputers and cloud facilities will be available from Amazon, Google, IBM, Microsoft and Hewlett Packard Enterprise.

Mehrotra said that NASA computer scientists also have tools to help new users run their code on its machines, if the code was originally developed for another environment.

What about the data against which the research algorithms will run? Mehrotra said that from what he’s seen, the data sets associated with virus research or medication tests are tiny compared to the data sets NASA uses in its own research and simulations. From a data standpoint, the supercomputers would be loafing.

Sheer horsepower alone won’t get the nation out of the problem. Focused, coordinated brain power, backed by the computing equivalent of a Saturn V rocket, just may.