“Flatten the curve” was a rallying cry in early 2020 as Covid-19 began sweeping across the globe. Despite limited understanding of the virus and how it was transmitted, public health officials emphasized one point: reducing transmission was the surest way to deny Covid-19 the oxygen it needed to sustain itself.
Top disease experts were quickly able to model and reasonably predict Covid-19’s early behavior. Within just two months of the first recorded infection in the U.S., public health officials had effectively offered Epidemiology 101 to a classroom of more than 325 million people. These models were powerful educational tools during a period of intense uncertainty, offering insight into how quickly the virus spreads, the likelihood of fatal infection, and what a cresting wave of cases could look like.
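The "Epidemiology 101" those early models taught rested on classic compartmental mathematics. As a rough illustration of how a simple model conveys both the speed of spread and the effect of "flattening the curve," here is a minimal SIR (susceptible-infected-recovered) sketch; every parameter value is illustrative, not fitted to real Covid data:

```python
# A minimal SIR compartmental model, the textbook tool early Covid
# models built on. Parameter values are illustrative, not fitted.

def simulate_sir(population, initial_infected, beta, gamma, days):
    """Step the classic SIR equations forward one day at a time.

    beta  - transmission rate (contacts per day * infection probability)
    gamma - recovery rate (1 / infectious period in days)
    """
    s = population - initial_infected
    i = float(initial_infected)
    r = 0.0
    history = []
    for _ in range(days):
        new_infections = beta * s * i / population
        new_recoveries = gamma * i
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        history.append((s, i, r))
    return history

# "Flattening the curve": halving the transmission rate lowers the peak.
fast = simulate_sir(1_000_000, 10, beta=0.5, gamma=0.1, days=180)
slow = simulate_sir(1_000_000, 10, beta=0.25, gamma=0.1, days=180)
peak_fast = max(i for _, i, _ in fast)
peak_slow = max(i for _, i, _ in slow)
```

Running the two scenarios side by side shows the core public-health message of early 2020: the same virus in the same population produces a far lower, later peak when transmission is reduced.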
Nearly two years later, though, modeling plays a smaller role in the toolkit we use to discuss Covid or anticipate its future course.
This shift reflects how experts' understanding of the virus has evolved, along with new insights into how best to build effective models. Models are bound by the information and variables on which they are based: the more comprehensive and accurate the data, the more detailed and precise the model. Put simply, better data make better models.
With limited information in the early days of Covid-19, experts had relatively few variables to help predict how the virus would spread and what could be done to contain it. Even so, modeling was an effective way to deliver the most relevant information to the most people. But as scientists came to understand the genetic make-up of SARS-CoV-2, the virus that causes Covid-19, and as forecasts were measured against the actual effectiveness of prevention methods such as mask wearing, modeling proved most useful when applied to highly specialized cross-sections of the population rather than the nation as a whole.
The findings from this approach — that without a vaccine the virus would spread uncontrolled — ran up against the political and logistical realities of containment policies across the U.S., which tested the feasibility of extended lockdowns and mandated social distancing. It became increasingly clear that there was no silver bullet to stop Covid-19, so the emphasis on national modeling gave way to decentralized, highly localized policies based on current levels of infection within any given community.
With the emergence of the Omicron variant spurring new infections across the globe, there is even greater urgency to revise traditional epidemiological thinking through the lens of modeling’s rise and fall. How can this trend improve our response to Covid and its variants, and better prepare public health officials for the rapid spread of unknown viruses in the future?
Revisiting the value of modeling
The specialized data sets and variables that limited modeling’s effectiveness on a national scale show the value it can bring to defining a new operational normal for communities, businesses, and governments. Introducing information such as vaccination rates or established preventive policies can help local constituencies forecast behaviors and even derive economic benefits.
The key, however, is improving how and where specialized data are deployed and studied in safe environments. Rather than leaning on traditional tools, such as randomized controlled trials, cohort studies, or case-control studies — each of which may be subject to its own set of regulatory conditions, especially when used to test pharmaceuticals and vaccine response — solutions driven by artificial intelligence can yield highly tailored models at an unprecedented rate using the same initial data. These virtual laboratories can simulate any number of variables, improving the scale at which modeling can quickly provide reliable insight. By experimenting digitally, public health officials can bypass the slow-moving process of in-person studies and create models that inform real-world action, saving lives in the process.
For example, a virtual laboratory environment can safely measure how different preventive policies, such as social distancing, lockdowns, or vaccination rate reporting, would impact the spread of a specific strain of virus. By cross-referencing the known impact variables (the policies) with a newly introduced variant, scientists can determine the optimal avenue for disease containment in near real-time.
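A policy sweep of this kind can be sketched in a few lines. The code below is a hedged illustration of the idea, not a real virtual laboratory: the policy names, their assumed effects on transmission, and the variant's transmission rate are all hypothetical placeholders.

```python
# Hedged sketch of a "virtual laboratory" policy sweep: run the same
# simple transmission model under different preventive policies for a
# hypothetical new variant. Effect sizes are illustrative guesses.

POLICY_EFFECTS = {            # fractional reduction in transmission
    "none": 0.0,
    "social_distancing": 0.3,
    "lockdown": 0.6,
}

def peak_infections(base_beta, gamma, effect,
                    population=100_000, seed=10, days=365):
    """Peak simultaneous infections under a policy (daily SIR loop)."""
    beta = base_beta * (1 - effect)
    s, i, r = population - seed, float(seed), 0.0
    peak = i
    for _ in range(days):
        new_i = beta * s * i / population
        new_r = gamma * i
        s -= new_i
        i += new_i - new_r
        r += new_r
        peak = max(peak, i)
    return peak

# Cross-reference each known policy with a hypothetical, more
# transmissible variant (higher base transmission rate).
variant_beta = 0.6
results = {name: peak_infections(variant_beta, gamma=0.1, effect=eff)
           for name, eff in POLICY_EFFECTS.items()}
best_policy = min(results, key=results.get)   # lowest peak wins
```

The sweep returns, for each policy, the peak strain the variant would put on a community, letting officials rank containment options before committing to one.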
Transitioning modeling to highly digitized, AI-driven solutions will play an important role in mapping the future of Covid-19 behavior and better prepare experts for future pandemics. Given the very nature of AI — self-improvement through healthy data streams and analysis — virtual simulation mapping creates the foundation to apply any conceivable parameter. The more we understand different contagion factors, the easier it becomes to predict how quickly a pathogen may spread or what measures will be the most effective in preventing infection.
That’s not to say we should rely on AI alone: there is always the potential for bias when AI is involved, and it’s important for humans to work side by side with the technology to prevent erroneous conclusions. But as new threats emerge, AI’s use of historical data can help the scientific community anticipate potential infection rates and overall impact.
Within a virtual laboratory, for example, experts can cross-compare individual activities, such as grocery shopping or attending a sporting event, across specific population groups. Such a laboratory can help answer questions like: "If a virus is airborne, how quickly might it spread among a group of adults in a movie theater?" Understanding how a virus behaves in specific settings makes it possible for local communities to set policies tailored to their needs — especially if they have a higher-than-average population of at-risk individuals, such as elderly people. This hyperspecific modeling technique can give public health officials and government leaders the flexibility they need to make the appropriate decisions to protect their communities.
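The movie-theater question above is the kind answered by the classic Wells-Riley model of airborne transmission in a shared indoor space. The sketch below uses that model's standard form; every parameter value is an illustrative placeholder for a hypothetical theater, not a measurement.

```python
import math

# Wells-Riley model of airborne infection risk in a well-mixed room:
#   P = 1 - exp(-I * q * p * t / Q)
# I = infectious occupants, q = quanta emitted per hour, p = breathing
# rate (m^3/h), t = exposure time (h), Q = ventilation rate (m^3/h).
# All values below are illustrative placeholders, not measurements.

def wells_riley_risk(infectors, quanta_per_hour, breathing_m3_per_hour,
                     hours, ventilation_m3_per_hour):
    """Probability that a susceptible occupant becomes infected."""
    exposure = (infectors * quanta_per_hour * breathing_m3_per_hour
                * hours / ventilation_m3_per_hour)
    return 1 - math.exp(-exposure)

# Two-hour movie, one infectious person, modest vs. improved ventilation.
risk_low_vent = wells_riley_risk(1, 50, 0.5, 2.0, 500)
risk_high_vent = wells_riley_risk(1, 50, 0.5, 2.0, 2000)
```

Comparing the two scenarios shows how a single setting-specific lever, here ventilation, changes the risk estimate, which is exactly the kind of localized insight the article describes.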
The future of modeling will be forever shaped by Covid-19: redeploying a nationwide infrastructure to arm local communities with the best information possible to pursue the most effective action. Digital solutions, such as virtual laboratories and simulations, have the potential to take this work even further. Virtual modeling is the perfect realm for experimentation, so experts can ask the right questions, identify the right signals, forecast the most likely outcomes, and plan accordingly.
Amir Mokhtari is a chief scientist in the Strategic Innovation Group at Booz Allen Hamilton.