Inside the supercomputer being used in the fight against coronavirus

The Energy Department is bringing something a little larger to the coronavirus fight, the IBM Summit supercomputer.


Every television station has been running b-roll of coronavirus research: endless loops of test tubes, tiny needles and pipettes. But the Energy Department is bringing something a little larger to the fight, namely the IBM Summit supercomputer. It won’t fit into a petri dish. With more on exactly what Summit is doing, the vice president for technical computing at IBM Cognitive Systems, Dave Turek, joined Federal Drive with Tom Temin.

Interview transcript:

Tom Temin: Mr. Turek good to have you on.

Dave Turek: Thanks for having me.

Tom Temin: Tell us about IBM Summit. What is it going to do? What does it do and how can it be applied to the questions regarding coronavirus?

Dave Turek: The Summit system is the world’s fastest supercomputer, and it’s situated at Oak Ridge National Laboratory. It’s owned by the Department of Energy, and scientists at Oak Ridge and the University of Tennessee have collaborated to use the machine to investigate the molecular properties of the virus and to begin the exploration of different sets of compounds in terms of how they might affect that virus. So it’s all in search of a cure, in a sense, or a series of therapeutic agents that would mitigate the disease.

Tom Temin: So this is a markedly different type of computing process than might be done by, say, IBM Watson, where you ingest all of this data and it tries to answer questions. This is more in the basic scientific research realm, it sounds like.

Dave Turek: That’s right. So what they’ll do is they’ll actually use mathematical formulas to characterize the molecular structure of the virus and of the associated compounds they’re exploring to use against it, and run gigantic sets of simulations that dictate which of those compounds have the greatest likelihood of actually having an impact on the virus. To date, they began with a set of 8,000 compounds, and they whittled it down to 77, based on the theory of molecular dynamics and everything else that suggests which ones have the highest likelihood of having an impact on the disease.
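The screening workflow Turek describes — score a large library of candidate compounds and keep only the most promising — can be sketched in miniature. This is a toy illustration, not Oak Ridge’s actual pipeline: `binding_score` here is a deterministic random placeholder standing in for the molecular-dynamics simulations that actually rank each compound.

```python
import random

def binding_score(compound_id):
    # Placeholder for the real score; on Summit this is a gigantic
    # molecular-dynamics simulation of how the compound binds to
    # the virus's proteins.
    random.seed(compound_id)
    return random.random()

def screen(compounds, keep=77):
    # Rank every candidate by predicted affinity and keep only the
    # most promising ones for deeper study.
    ranked = sorted(compounds, key=binding_score, reverse=True)
    return ranked[:keep]

candidates = range(8000)        # the initial library of ~8,000 compounds
shortlist = screen(candidates)  # whittled down to 77
print(len(shortlist))           # 77
```

The expensive part in practice is the scoring function, not the ranking; the supercomputer’s job is running thousands of those simulations at once.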

Tom Temin: And so is there a visualization piece such that you can see it? I mean, I keep seeing pictures on TV of what looks like a balloon with all kinds of red spots all over it. Is there a visualization aspect of this that gives the answers, or does it come out as ones and zeros?

Dave Turek: Well, it comes out principally in forms that, I guess you would say, are ones and zeros. It’s more of a mathematical representation of what’s going on. The processes themselves can be visualized; it’s just that it doesn’t impart that much direct new information over what the calculations themselves produce. And this, of course, gives rise to follow-on steps: simulations, under different kinds of scenarios, of the 77 selected compounds that further refine the probability of their success. And ultimately this gets to clinical trials. So as they narrow the compounds, they get a deeper understanding of how these processes work. Eventually, you’ll see the investigation move from this kind of computing environment back to the real world, where they’re actually testing things against live viruses and so on to see if they’re as effective as the computer predicts they will be.

Tom Temin: Give us a sense of the size of this computer. We do have a lot of speeds-and-feeds people who listen and read here, so just tell us about some of the specs of Summit.

Dave Turek: So it’s composed of what we call nodes. Each node is itself a self-contained computer, and there are 4,600 of those. Each of those nodes is equipped with a couple of IBM POWER9 processors, and in turn, each of those processors is attached to three NVIDIA GPUs. So in total you have roughly 25,000 NVIDIA GPUs and a little more than 9,000 POWER9 microprocessors, and these are configured in a substantial number of cabinets that cover a fair amount of space. You’re not talking a football field or anything like that; think of it as more perhaps the size of a tennis court.
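Multiplying out the round numbers Turek quotes gives the totals he cites (a back-of-the-envelope sketch; Summit’s official configuration differs slightly from these round figures):

```python
nodes = 4600          # self-contained computers that make up Summit
cpus_per_node = 2     # IBM POWER9 processors per node
gpus_per_cpu = 3      # NVIDIA GPUs attached to each processor

total_cpus = nodes * cpus_per_node
total_gpus = total_cpus * gpus_per_cpu

print(total_cpus)  # 9200  ("a little more than 9,000")
print(total_gpus)  # 27600 ("roughly 25,000" in round terms)
```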

Tom Temin: And I think somewhere along the line I read that there are so many interconnections in this machine that it’s almost like a tiny portion of the human brain.

Dave Turek: It’s certainly suggestive of that. The way these computers work, of course, is that they employ these 4,600 nodes and connect them with a network. The network itself is orchestrated under the direction of software, and the software will take a problem that you’re trying to analyze and essentially decompose it into pieces, where each piece gets parceled out to one of these nodes. So it’s very akin to the way you might think of a brain working, with a whole bunch of different cells working on a given problem and nerves and so on connecting things together. So that goes on, and then the software will orchestrate the execution. Then, as each of the nodes reports back the results of its responsibility in the calculation, the overarching software running on the system will synthesize that all together and come up with the conclusion of what the calculations produced.
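The decompose-scatter-gather pattern Turek outlines can be sketched with a thread pool standing in for Summit’s nodes. This is a minimal illustration of the orchestration idea only; the real system distributes work over a high-speed network across thousands of physical machines.

```python
from concurrent.futures import ThreadPoolExecutor  # threads stand in for nodes

def simulate_piece(piece):
    # Each "node" works independently on its parcel of the problem.
    return sum(x * x for x in piece)

def decompose(problem, n_nodes):
    # Split the problem into roughly equal pieces, one per node.
    chunk = max(1, len(problem) // n_nodes)
    return [problem[i:i + chunk] for i in range(0, len(problem), chunk)]

def orchestrate(problem, n_nodes=4):
    # Scatter the pieces, let every worker compute in parallel, then
    # gather and synthesize the partial results into one conclusion.
    pieces = decompose(problem, n_nodes)
    with ThreadPoolExecutor(max_workers=n_nodes) as pool:
        partials = pool.map(simulate_piece, pieces)
    return sum(partials)

result = orchestrate(list(range(1000)))
print(result)  # same answer a single machine would produce: 332833500
```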

Tom Temin: Now, you mentioned earlier this machine belongs to the Energy Department. This is not the type of research that Energy, as I understand it, would normally do. So from a program standpoint, are they simply clearing out the energy research for the time being, and making it available to whom?

Dave Turek: Actually, the Department of Energy has responsibility for advanced computing strategically for the U.S. government. And the Department of Energy has many different departments in it, but they have a number of scientific laboratories. Oak Ridge is one, Argonne is another, Pacific Northwest National Lab and so on. Those are distinguished from the laboratories that are working, for example, on nuclear weapons, etc. So the mission of Oak Ridge is actually to pursue scientific inquiry into many, many different basic scientific disciplines and to look at problems of a diverse nature. So they did a lot of work, even prior to the advent of Summit, in the biological field, the chemical field and so on. And it’s a natural extension of their mission to pursue basic science. So it’s not unusual to see them engaged in this fashion.

Tom Temin: But for people who might be new to using supercomputers in research, who might be joining the fight here from different academic institutions, or maybe from some of the CDC or NIH components, how do they get their questions translated into the types of algorithms that this computer is optimized to handle?

Dave Turek: Well, it’s interesting, because President Trump announced the formation of the COVID-19 High Performance Computing Consortium, which we’ve been working on with the Department of Energy to make this computing resource generally available to researchers around the country, and eventually internationally as well. And that joins a number of other computers that the Department of Energy owns, that IBM owns, and other institutions as well. So what we’ve been trying to do over the last week is take the magnitude of what Summit represents and go bigger and beyond that with the amalgamation of all this incremental additional compute resource around the country. As people encounter scientific problems related to COVID-19 and want to explore the use of advanced computing to tackle them, there is a website and a methodology by which one can get access and bring your problems to any of these centers, including Summit at Oak Ridge. And if, for example, you brought a problem to Oak Ridge, the people at Oak Ridge would help you get your software and your program set up to run on the system. So it’s a diverse set of very, very powerful resources being brought to bear against a pandemic.

Tom Temin: And this reduction, in a short period, from 8,000 possible compounds to 77 promising ones: how long might that have taken on a regular mainframe setup?

Dave Turek: Well, it could have taken years. The mathematics are extraordinarily complex and the number of simulations is extraordinarily large. If you look at the size of the Summit computer, it’s thousands of times bigger and faster than the kind of computer that a regular industrial company might have. Given the fact that they were able to crunch this problem in roughly two days, you multiply that by 1,000 and you’re getting maybe three years, or two and a half years, to get the same result on a much smaller computer. So there’s a huge advantage in terms of bringing this amount of computing power to bear with respect to compressing the discovery cycle of trying to find out what the best pathways are to attack the disease.
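Taking the figures at face value, the back-of-the-envelope scaling works like this. The two-day runtime is from the interview; the slowdown factors are illustrative, since "thousands of times bigger and faster" covers a range (a factor of roughly 500 lands near the "two and a half to three years" Turek quotes).

```python
summit_days = 2  # time Summit took to screen the ~8,000 compounds

def equivalent_years(slowdown_factor):
    # Naive scaling: a machine N times slower takes N times as long.
    return summit_days * slowdown_factor / 365

for factor in (500, 1000):
    print(factor, round(equivalent_years(factor), 1))
# 500  -> 2.7 years
# 1000 -> 5.5 years
```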

Tom Temin: Dave Turek is vice president for technical computing at IBM Cognitive Systems. Thanks so much for joining me.

Dave Turek: You’re welcome.


Copyright © 2024 Federal News Network. All rights reserved.
