Mathematicians are at the frontline in the fight against Covid
The 20th century was marked by major scientific and technological breakthroughs: nuclear power, DNA sequencing, the development of the aerospace industry, and advanced weather forecasting, among many other remarkable examples. All these discoveries were only possible thanks to a new way of doing science, in which the physical laws of the universe were emulated using complex computer systems. High-Performance Computing (HPC) has now become a fundamental tool for tackling modern challenges, such as the modelling of financial markets, drug discovery, and developing response strategies to the Covid-19 pandemic.
In a recent UKRI project, together with colleagues at the University of Warwick, we are studying the use of HPC for determining optimised, localised lockdown policies. Both empirical and computational evidence indicates that optimised lockdowns are effective in mitigating the public health effects of the pandemic. Prolonged lockdowns, however, come at a high societal and economic cost, with the risk of triggering severe economic turmoil with unforeseeable consequences. By harnessing the power of supercomputers, we can prevent this by determining optimal quarantine protocols, which safeguard NHS operation while minimising economic impact.
But how do we integrate complex mathematical models with computing and data to generate a forecast and trigger an analytics-based decision?
Integrating complex mathematical models with the ability to process vast amounts of data gives us the ability to forecast the future and make better decisions.
Let’s illustrate this idea with a personal favourite: the weather forecast. It has become commonplace to make fun of the weather forecast, but the story behind the last 100 years of numerical weather prediction is nothing short of amazing. The fantasy of predicting the weather has accompanied humans for centuries. However, it was not until 1904, when Vilhelm Bjerknes, founder of the Bergen School of Meteorology, proposed a systematic approach using fluid dynamics, that this came closer to reality.
Bjerknes identified a set of equations, known as the primitive equations, and a handful of physical variables such as wind, pressure, and temperature, which together characterise the state and evolution of the atmosphere. Such a model could generate an accurate weather forecast, given two vital ingredients: a complete set of observations describing the current state of the atmosphere, and sufficient computational power to solve the primitive equations.
"The idea of complementing theory and experimentation/observation along with computation, once revolutionary, has now become one of the foundations of our modern information society."
However, in 1904, weather stations were scarce and digital computers non-existent. In his 1922 book Weather Prediction by Numerical Process, English mathematician and pioneer of numerical weather prediction Lewis Fry Richardson estimated that it would require 60,000 people working with slide rules - old mechanical computing devices - to predict tomorrow’s weather before it actually arrives. In less than two decades, weather forecasting had transitioned from witchcraft to a physically meaningful - but still unreachable - computation.
In 1950, John and Klára Dán von Neumann used the ENIAC - the world’s first programmable digital computer, built in the US - to produce the first successful numerical weather forecast. Over the next 70 years, the development of supercomputers, along with an ever-growing volume of measurements (satellite images, aircraft records, and weather balloons), has improved the accuracy of operational weather forecast systems from 25% to 85% for a 36-hour forecast.
However, during the Covid-19 pandemic, travel bans have grounded thousands of aircraft, depriving the numerical weather prediction leviathan of an important source of data. A recent paper in Geophysical Research Letters reports a substantial decrease in weather forecast accuracy during the pandemic, due to air traffic observations being reduced by more than half, illustrating the complexity of the problem.
The idea of complementing theory and experimentation/observation along with computation, once revolutionary, has now become one of the foundations of our modern information society. A fundamental advantage of this “new scientific method” is the abstract framework provided by mathematical modelling and computation.
The methodology followed by the weather forecast pioneers is the same one used in our project in the fight against Covid-19. We develop mathematical models for the spatio-temporal evolution of the pandemic, in an attempt to estimate the basic reproduction number, or to forecast the effect of public policies such as lockdowns, vaccination programmes, or health passports. The mathematical models are then fed with public healthcare data, mobility data collected from mobile phone activity, and even macroeconomic values to account for unemployment and productivity. Many of these databases are downscaled to the 300 local authority districts in England, creating a problem of huge computational complexity.
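To give a flavour of what such epidemiological models look like, here is a minimal sketch of the classic SIR compartmental model, from which the basic reproduction number naturally emerges. This is purely illustrative: the project's actual spatio-temporal models are far richer, and all parameter values below are hypothetical, chosen only for demonstration.

```python
# Illustrative SIR (Susceptible-Infected-Recovered) sketch.
# NOT the project's model; parameters are hypothetical.

def simulate_sir(beta, gamma, s0, i0, r0, days, dt=0.1):
    """Integrate the SIR equations with simple forward Euler steps."""
    s, i, r = float(s0), float(i0), float(r0)
    n = s + i + r
    history = [(s, i, r)]
    for _ in range(int(days / dt)):
        new_inf = beta * s * i / n * dt  # S -> I transitions this step
        new_rec = gamma * i * dt         # I -> R transitions this step
        s -= new_inf
        i += new_inf - new_rec
        r += new_rec
        history.append((s, i, r))
    return history

# For SIR, the basic reproduction number is R0 = beta / gamma.
beta, gamma = 0.3, 0.1  # hypothetical daily infection / recovery rates
print(f"R0 = {beta / gamma:.1f}")

history = simulate_sir(beta, gamma, s0=9_999, i0=1, r0=0, days=120)
peak_infected = max(i for _, i, _ in history)
print(f"Peak simultaneous infections: {peak_infected:.0f} of 10,000")
```

A lockdown policy can be thought of as temporarily lowering beta; re-running the simulation with a reduced value flattens the infection peak, which is exactly the kind of effect the optimisation searches over, district by district.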
High-performance computing is a powerful and indispensable tool in our fight against the virus. It enables the calculation of localised lockdown policies in a matter of hours. As new data continuously enters the loop, models are re-calibrated, new forecasts are made, and policies are modified. We do this research with the conviction that the methodology we create will outlast the pandemic and will be used for forecasting other complex systems whose evolution can be linked to both models and measurements. Hopefully, from this new way of doing science will also emerge a new approach to agile decision-making driven by data, modelling and computation.
Institute for Policy and Engagement
This blog appears as part of the series from the Institute for Policy and Engagement on the university’s ongoing research contribution to the Covid-19 effort.