© Morris MacMatzen/Getty: An employee of the German Climate Computing Center (DKRZ, or Deutsches Klimarechenzentrum) poses next to the "Mistral" supercomputer, installed in 2016, in Hamburg, Germany, on June 7, 2017.
By Jason Murdock, Newsweek
Scientific efforts to analyze, map and track the global COVID-19 outbreak are being aided by powerful supercomputers that scientists say may soon help to combat the infectious disease.
Computers capable of ingesting and processing vast amounts of raw data are coming together with the aim of better understanding the novel coronavirus that has now spread to at least 582,000 people in the U.S. and close to two million people globally, according to Johns Hopkins University statistics.
Last month, the White House unveiled the COVID-19 High Performance Computing Consortium, a coalition of technology giants, federal agencies and academic institutions including IBM, Amazon Web Services, Microsoft, the Massachusetts Institute of Technology (MIT), NASA, Google, HP and more.
The collective, drawing on supercomputer systems from five U.S. Department of Energy National Laboratories, now has at least 17 active projects, using the machines to radically reduce the time needed for research in fields that experts say include epidemiology and molecular modeling.
"These experiments would take years to complete if worked by hand, or months if handled on slower, traditional computing platforms," Dario Gil, Director of IBM Research, wrote in a blog post this month detailing how his division was contributing to the project.
According to Gil, the work already showed signs of promise after the Summit supercomputer was used by researchers at the Oak Ridge National Laboratory and the University of Tennessee.
"[Researchers screened] 8,000 compounds to find those that are most likely to bind to the main 'spike' protein of the coronavirus, rendering it unable to infect host cells," he noted.
"They were able to recommend the 77 promising small-molecule drug compounds that could now be experimentally tested. This is the power of accelerating discovery through computation."
Nvidia, too, confirmed this month that a team of its computer scientists, led by Ian Buck, the firm's vice president of Accelerated Computing, had joined the new research effort. Nvidia graphics processing units (GPUs) are used in many of the world's leading supercomputers.
"The COVID-19 HPC Consortium is the Apollo program of our time," Buck said in a statement that was published to the company's website. "Not a race to the moon, this is a race for humanity. The rocket ships are GPU supercomputers, and their fuel is scientific knowledge."
"NVIDIA is going to help by making these rockets travel as fast as they can. Achieving progress will ultimately require combining three essential ingredients—domain scientists, computer scientists and high-performance computers. We're honored to play a role in this effort," he added.
The launch of the consortium has fueled a fresh wave of supercomputing activity, all dedicated to solving key riddles of the disease responsible for more than 119,000 deaths globally.
This month, the National Center for Atmospheric Research (NCAR) said its 5.34-petaflop supercomputer, known as "Cheyenne," would now be used to help research the pandemic, including potential transmission patterns and whether the virus is affected by seasonal changes.
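One common way researchers probe seasonality questions like NCAR's is with compartmental models whose transmission rate varies over the year. Below is a minimal, illustrative SIR sketch with sinusoidal seasonal forcing; the model form and the parameter values are assumptions chosen for demonstration, not Cheyenne's actual configuration.

```python
# Illustrative sketch: a basic SIR model whose transmission rate "beta"
# oscillates over the year, to show how seasonal forcing shifts an epidemic.
# Parameter values are arbitrary for demonstration purposes.
import math

def run_sir(beta0, seasonal_amplitude, days=365, dt=1.0,
            gamma=1/14, s0=0.999, i0=0.001):
    s, i, r = s0, i0, 0.0
    peak_i, peak_day = i0, 0
    for day in range(days):
        # Sinusoidal seasonal forcing of the transmission rate.
        beta = beta0 * (1 + seasonal_amplitude * math.cos(2 * math.pi * day / 365))
        new_infections = beta * s * i * dt
        recoveries = gamma * i * dt
        s -= new_infections
        i += new_infections - recoveries
        r += recoveries
        if i > peak_i:
            peak_i, peak_day = i, day
    return peak_i, peak_day

for amp in (0.0, 0.2, 0.4):
    peak, day = run_sir(beta0=0.25, seasonal_amplitude=amp)
    print(f"amplitude {amp}: infections peak at {peak:.1%} of population on day {day}")
```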
"[Researchers screened] 8,000 compounds to find those that are most likely to bind to the main 'spike' protein of the coronavirus, rendering it unable to infect host cells," he noted.
"They were able to recommend the 77 promising small-molecule drug compounds that could now be experimentally tested. This is the power of accelerating discovery through computation."
Nvidia, too, confirmed this month that a team of its computer scientists had joined the new research effort, under the leadership of Ian Buck, who is the firm's vice president of Accelerated Computing. NVIDIA graphics processing units (GPUs) are used in many of the leading supercomputers.
"The COVID-19 HPC Consortium is the Apollo program of our time," Buck said in a statement that was published to the company's website. "Not a race to the moon, this is a race for humanity. The rocket ships are GPU supercomputers, and their fuel is scientific knowledge."
"NVIDIA is going to help by making these rockets travel as fast as they can. Achieving progress will ultimately require combining three essential ingredients—domain scientists, computer scientists and high-performance computers. We're honored to play a role in this effort," he added.
The launch of the consortium has fueled a fresh wave of supercomputing activity, all dedicated to solving key riddles of the disease responsible for more than 119,000 deaths globally.
This month, the National Center for Atmospheric Research (NCAR) said its supercomputer, which is a 5.34-petaflop machine known as "Cheyenne," would now be used to help research the pandemic, including potential transmission patterns and whether it is affected by seasonal changes.
The same week, experts from University College London (UCL) said big data and supercomputing would be useful for identifying new antiviral medications by screening libraries of potential drugs, studying the spread of the virus within communities and analyzing the structure of the virus behind COVID-19.
Dan Stanzione, executive director of the Texas Advanced Computing Center, told Bloomberg this month that supercomputers could dramatically accelerate medical research efforts.
"Fundamentally, it is size and speed," he said, noting the systems were running tens of thousands of epidemiology models at once—not a volume normal computers could manage. And looking ahead, Stanzione said they could play a role in the narrowing down of a potential vaccination.
"In the end, I think computing is not going to be our bottleneck," he elaborated. "We can run these things pretty fast, there are many many different chemical compounds we might want to try."
Professor Peter Coveney of UCL agreed that the consortium's supercomputing plan could speed up the hunt for viable COVID-19 treatments, or at the very least flag some promising drugs.
"This is a much quicker way of finding suitable treatments than the typical drug development process," Professor Coveney elaborated in a statement. "It normally takes pharmaceutical companies 12 years and $2 billion to take one drug from discovery to market, but we are rewriting the rules by using powerful computers to find a needle in a haystack in a fraction of that time and cost."
Stanzione conceded that computing power is unlikely to be an issue, but noted there are limits to the work. He told Bloomberg: "Really, what the computers do is help reduce the number of things we need to try in the lab. Ultimately for vaccines [we are] upstream of what the medical chemists do."
"All we can do is say, of the ten to the 60th possible chemical compounds in the universe, here is a very large set that we know won't work, here's maybe a few dozen candidates to try. But they are still going to have to go through the clinical trials," he added. "That's where a lot of time will still be taken."
See more at: Newsweek