The 25th Solar Cycle is about to begin, with new evidence for enormous solar storms

Foster Langbein and Paul Somerville, Risk Frontiers.

Solar Cycle 25 is the upcoming solar cycle, the 25th since 1755, when extensive recording of solar sunspot activity began. It is expected to begin around April this year and to continue past 2030 (see Figure 1). Stronger solar activity tends to occur during odd-numbered cycles, and a number of notable events occurred during cycle 23 (see the table at the end of this briefing). Solar events are, however, more frequent near any maximum: the 1989 Quebec geomagnetic storm, for example, coincided with the maximum of the even-numbered cycle 22.

Solar sunspot cycle
Figure 1: The solar sunspot cycle, showing the roughly eleven-year periodicity and a recent NOAA forecast for cycle 25.

Even moderate space weather events pose significant risks to the airline, communications and power industries through service interruptions as well as potential equipment damage. Severe incidents are capable of damaging or destroying the very large high-voltage transformers on which our power networks depend, and replacing these custom-built components can take years. Internationally, such damage is often covered by traditional insurance policies through the prolonged effects of power outages. In Australia, insurers are likely to be less affected, but the impact on business would be severe: businesses would need to have negotiated the inclusion of a public utilities extension in their policies to be covered for failure of electricity supply, and utility companies are not liable for failure to supply electricity in the event of a natural disaster.

The most severe event in recorded history was the Carrington event of 1859, during which auroral effects were clearly visible at mid-latitudes around the globe, for example in Sydney. This event is estimated to have been at least twice as severe as the 1989 Quebec event. Extreme events such as these are a particular concern for reinsurers because their global scale limits the effectiveness of regional diversification.

Of particular note is the risk to satellites: approximately two thirds of the roughly 35 satellites launched annually are covered by damage and liability insurance, to values of up to $700 million (Lloyd's, 2010). Between 1996 and 2005, insurers paid nearly US$2 billion to cover satellite damage, a significant proportion of it solar-related, and solar disruptions to satellites are estimated to cost on the order of $100 million a year (Odenwald and Green, 2008).

The flow-on impacts of power cuts to other industries can be significant, with studies suggesting that brownouts and blackouts cause some $80 billion of economic losses in the USA every year (Odenwald and Green, 2008). Between June 2000 and December 2001, solar storms are estimated to have increased the total cost of electricity in the US by $500 million. The capacity of even moderate events to impose significant costs is exemplified by the solar incidents, listed in the table at the end of this briefing, that occurred during recent solar maxima.

With the onset of the next odd-numbered solar cycle this year, an increased frequency of solar event activity is expected as the cycle progresses, especially in the moderate to severe intensity range. Moderate events have historically been dealt with readily by insurance companies and are unlikely to affect large portions of the planet. Solar storm impacts are most significant at latitudes close to the poles, making widespread power failure less likely in Australia. The risk is, however, non-zero: the drive to interconnect our power networks through long transmission lines increases their susceptibility to geomagnetically induced currents (GICs). Although space-weather alerts issued by the Bureau of Meteorology are monitored so that the risk can be mitigated by compartmentalising the network while an event is in progress, the residual risk has not been studied. The implications for emergency management of a widespread power outage, and the cascading effects across our increasingly interconnected networks, particularly if a system is already under stress, say during a heatwave, are similarly unstudied in an Australian context.

In addition to these purely local effects, the impacts of solar storms on communication and navigation systems worldwide and in space are likely to have flow-on productivity effects for Australian businesses.

Although the incidence of moderate solar effects is expected to increase, extreme events at the Carrington scale are not well correlated with the solar cycle. Recently discovered evidence in Greenland of a huge solar storm that occurred 2,500 years ago points to an event of similar magnitude to one in AD 774–775, for which an increase of 1.2% in the concentration of the carbon-14 isotope was observed in Japanese tree rings dated to those years (Miyake et al., 2012). A surge in the beryllium isotope 10Be, detected in Antarctic ice cores, has also been associated with that event, suggesting it was a solar flare with global impact. Although a solar flare would not have the geomagnetic effects on our power network that events such as the Carrington or 1989 Quebec storms did, the implications for our satellite systems – imaging, GPS, communications, and so on – would be catastrophic. The new discovery gives some frequency context, suggesting a return period on the order of 1,000 years which, although long, should not be ignored given the likely severity of such an event. An event with an average recurrence interval (ARI) of 1,000 years has roughly a 3% chance of occurring in any 30-year period.
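As a check on that figure: assuming, as is common in hazard analysis (an assumption not stated above), that such events arrive as a Poisson process, the probability of at least one occurrence during an exposure time t, for an event with average recurrence interval T, is

P = 1 − exp(−t/T) = 1 − exp(−30/1000) ≈ 0.030,

or about 3%, consistent with the figure quoted above.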

The following article, by Ian Sample, Science Editor of The Guardian, appeared on 12 March 2019 under the title “Radioactive particles from huge solar storm found in Greenland.”


Traces of an enormous solar storm that battered the atmosphere and showered Earth in radioactive particles more than 2,500 years ago have been discovered under the Greenland ice sheet. Scientists studying ice nearly half a kilometre beneath the surface found a band of radioactive elements unleashed by a storm that struck the planet in 660BC. It was at least 10 times more powerful than any recorded by instruments set up to detect such events in the past 70 years, and as strong as the most intense known solar storm, which hit Earth in AD775.

Raimund Muscheler, a professor of quaternary sciences at Lund University in Sweden, said: “What our research shows is that the observational record over the past 70 years does not give us a complete picture of what the sun can do.” The discovery means that the worst-case scenarios used in risk planning for serious space weather events underestimate how powerful solar storms can be, he said.

Solar storms are whipped up by intense magnetic fields on the surface of the sun. When they are pointed toward Earth they can send highly energetic streams of protons crashing into the atmosphere. The sudden rush of particles can pose a radiation risk to astronauts and airline passengers, and can damage satellites, power grids and other electrical devices.

Scientists have come to realise over the past decade that intense solar storms can leave distinct traces when they crash into the planet. When high energy particles slam into the stratosphere, they collide with atomic nuclei to create radioactive isotopes of elements such as carbon, beryllium and chlorine. These can linger in the atmosphere for a year or two, but when they reach the ground they can show up in tree rings and ice cores used to study the ancient climate.

Muscheler’s team analysed two ice cores drilled from the Greenland ice sheet and found that both contained spikes in isotopes of beryllium and chlorine that date back to about 660BC. The material appears to be the radioactive remnants of a solar storm that battered the atmosphere.

The scientists calculate that the storm sent at least 10bn protons per square centimetre into the atmosphere. “A solar proton event of such magnitude occurring in modern times could result in severe disruption of satellite-based technologies, high frequency radio communication and space-based navigation systems,” they write in Proceedings of the National Academy of Sciences.

Britain’s emergency plans for severe space weather are based on a worst-case scenario that involves a repeat of the 1859 Carrington event. This was a powerful geomagnetic storm set off by a huge eruption on the sun known as a coronal mass ejection. A 2015 Cabinet Office report anticipated only 12 hours warning of a similar storm that could lead to power outages and other disruption. The discovery of more powerful solar storms in the past 3,000 years suggests that space weather can be worse than the UK plans for. “The Carrington event is often used as a worst-case scenario, but our research shows that this probably under-estimates the risks,” said Muscheler.


REFERENCES

Brooks, M. (2009). Space storm alert: 90 seconds from catastrophe. New Scientist, Issue 2700.

Burns, A.G., Killeen, T.L., Deng, W., Carignan, G.R., and Roble, R.G. (1995).

Dayton (1989). Solar storms halt stock market as computers crash. New Scientist, Issue 1681.

Lloyd's (2010). Insurance on the Final Frontier. Available from http://www.lloyds.com/News-and-Insight/News-and-Features/Specialist/Specialist-2010/Insurance_on_the_final_frontier

Marshall et al. (2011). A preliminary risk assessment of the Australian region power network to space weather. Space Weather, 9, S10004, doi:10.1029/2011SW000685.

Miyake, F., Nagaya, K., Masuda, K., and Nakamura, T. (2012). A signature of cosmic-ray increase in AD 774–775 from tree rings in Japan. Nature, 486, 240–242.

NASA (2008). A Super Solar Flare. Available from http://science.nasa.gov/science-news/science-at-nasa/2008/06may_carringtonflare/

NOAA (2003). October–November 2003 Solar Storm. Available from http://www.magazine.noaa.gov/stories/mag131b.htm

Odenwald, S.F. and Green, J.L. (2008). Bracing the Satellite Infrastructure for a Solar Superstorm. Scientific American. Available from http://www.scientificamerican.com/article.cfm?id=bracing-for-a-solar-superstorm

Sample, I. (2019). Radioactive particles from huge solar storm found in Greenland. The Guardian. https://www.theguardian.com/science/2019/mar/11/radioactive-particles-from-huge-solar-storm-found-in-greenland

Solar Storms (undated). Available from http://www.solarstorms.org/SRefStorms.html

Devil’s Staircase of Earthquake Occurrence: Implications for Seismic Hazard in Australia and New Zealand

Paul Somerville, Principal Geoscientist, Risk Frontiers

The temporal clustering of large surface faulting earthquakes observed in the western part of Australia has been elegantly explained by the Devil’s Staircase fractal model of fault behaviour. Although the only available paleoseismic observations in eastern Australia are from the Lake Edgar fault in Tasmania, it seems likely that the Devil’s Staircase also describes the occurrence of large surface faulting earthquakes in eastern Australia and, more generally, worldwide.


Paleoseismic Observations of Surface Faulting Recurrence in Australia

Clark et al. (2012, 2014) showed that large surface faulting earthquakes in Australia are clustered within relatively short time periods that are separated by longer and variable intervals of quiescence. Figure 1 shows the time sequences of large earthquakes on a set of faults in Australia over the past million years, inferred from paleoseismic studies. Most of these faults are in Western Australia, and it is remarkable that earthquakes have occurred on three of them in historical time; before these, the most recent period of activity was about 10,000 years ago. Few observations of this kind are available in other stable continental regions of the world analogous to Australia.

Occurrence of surface faulting earthquakes on individual faults in the past million years.
Figure 1. Occurrence of surface faulting earthquakes on individual faults in the past million years. Source: Clark et al. (2012).

The Devil’s Staircase in Global Earthquake Catalogues

Clark et al. (2012) proposed the earthquake recurrence model shown in Figure 2, in which clusters of several earthquakes are separated by long intervals of seismic quiescence. Chen et al. (2020) have shown that this irregular earthquake recurrence can be described mathematically by the “Devil’s Staircase” (Mandelbrot, 1982; Turcotte, 1997), a fractal property of complex dynamic systems that is commonly found in nature. Because fractal properties are scale invariant, the staircase is observed on all scales. Fractal systems are characterised by self-organised criticality, in which large interactive systems self-organise into a critical state where small perturbations result in chain reactions that can affect any number of elements within the system (Winslow, 1997).

Schematic model of earthquake recurrence on a fault in Australia
Figure 2. Schematic model of earthquake recurrence on a fault in Australia. Source: Clark et al. (2012)

Chen et al. (2020) fit probability models to the interevent times of a set of earthquake catalogues using the maximum likelihood method; one of these catalogues is shown on the left of Figure 3. They tested five probability models (Poisson, gamma, Weibull, lognormal, and Brownian passage time [BPT]). The Poisson model assumes that, although the mean interval between events in a sequence is known, the exact occurrence time of each event is random; the interevent-time distribution of such a sequence follows an exponential distribution. The Poisson model is a simple one-parameter model commonly used in seismic hazard analysis, and is a special case of the more general gamma and Weibull distributions. Both the gamma and Weibull models fit the data for earthquakes of magnitude 6 and larger better than the Poisson model, whereas the lognormal and BPT models fit worse, as shown on the right of Figure 3.
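For readers who want to experiment with this kind of fitting, the sketch below (ours, not the authors' code, and applied to synthetic interevent times rather than a real catalogue) fits the same five families by maximum likelihood using scipy; the Brownian passage time model corresponds to scipy's inverse Gaussian distribution.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)
    # Synthetic interevent times (arbitrary units); Weibull shape < 1 gives a bursty sequence.
    times = stats.weibull_min.rvs(0.6, scale=1.0, size=500, random_state=rng)

    candidates = {
        "exponential (Poisson)": stats.expon,
        "gamma": stats.gamma,
        "Weibull": stats.weibull_min,
        "lognormal": stats.lognorm,
        "BPT (inverse Gaussian)": stats.invgauss,
    }

    for name, dist in candidates.items():
        params = dist.fit(times, floc=0)               # maximum likelihood fit, location fixed at 0
        loglik = np.sum(dist.logpdf(times, *params))   # higher log-likelihood means a better fit
        print(f"{name:24s} log-likelihood = {loglik:9.2f}")

Because the synthetic sample is drawn from a bursty Weibull distribution, the Weibull and gamma fits score above the exponential (Poisson) fit, mirroring the ranking Chen et al. (2020) report for real catalogues.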

Figure 3. Left: Cumulative number of earthquakes in the world with magnitudes 8.5 or larger since 1900; declustering indicates the removal of dependent events (aftershocks). Right: Comparison of the relative frequency histograms (rectangular columns) of the distribution of interevent times with probabilities predicted by five probability models (curves) for all earthquakes in the world with magnitude 6 or larger. Source: Chen et al. (2020).

The variation of the interevent times can be measured by the coefficient of variation (COV), or aperiodicity, defined as the ratio of the standard deviation of the interevent times to their mean (Salditch et al., 2019). For a sequence generated by a Poisson process, the COV value is 1. To measure the deviation from the Poisson model, Chen et al. (2020) use a normalised COV, called the burstiness parameter B (Goh and Barabási, 2008), whose value ranges from −1 to 1. B = −1 corresponds to a perfectly periodic sequence with a COV of 0; B = 1 corresponds to the most bursty sequence, with infinite COV; and B = 0 corresponds to a sequence produced by an ideal Poisson process, with a COV of 1. Thus, a sequence is “bursty” when 0 < B < 1 (Fig. 4b) and quasiperiodic (the opposite of “bursty”) when −1 < B < 0 (Fig. 4c).
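In formula terms, Goh and Barabási (2008) define B = (COV − 1)/(COV + 1), which maps COV values of 0, 1 and infinity to B values of −1, 0 and 1 respectively. A minimal implementation (ours, for illustration):

    import numpy as np

    def burstiness(interevent_times):
        """Burstiness B = (COV - 1)/(COV + 1) of Goh and Barabasi (2008)."""
        t = np.asarray(interevent_times, dtype=float)
        cov = t.std() / t.mean()            # coefficient of variation (aperiodicity)
        return (cov - 1.0) / (cov + 1.0)

    print(burstiness(np.full(100, 10.0)))                                 # -1.0: perfectly periodic
    print(burstiness(np.random.default_rng(1).exponential(10.0, 10000)))  # ~0.0: Poisson
    print(burstiness(np.random.default_rng(1).pareto(1.5, 10000)))        # well above 0: bursty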

Figure 4. (a) A sequence of events generated by a Poisson model. (b) A bursty sequence generated by the Weibull interevent-time distribution. (c) A quasiperiodic sequence generated by the Gaussian interevent-time distribution. Source: Chen et al. (2020).

Implications of the Devil’s Staircase for Seismic Hazard Analysis in Australia and New Zealand

The Devil’s Staircase pattern of large earthquakes has important implications for earthquake hazard assessment. The mean recurrence time, a key parameter in seismic hazard analysis, can vary significantly depending on which part of the sequence the catalogue represents. This can be important in hazard assessment, because catalogues for large earthquakes are often too short to reflect their complete temporal pattern, and it is difficult to know whether the few events in a catalogue occurred within an earthquake cluster or spanned both clusters and quiescent intervals. Consequently, an event may not be “overdue” just because the time since the previous event exceeds a “mean recurrence time” based on an incomplete catalogue.

The Poisson model is a time-independent model in which each event in the sequence is independent of other events. However, Devil’s Staircase behaviour indicates that most earthquake sequences, especially when dependent events are not excluded, are burstier than a Poisson sequence and may be better fit by the gamma or Weibull distributions. The conditional probability of another large earthquake for both the gamma and Weibull models is higher than that of the Poisson model soon after a large earthquake.
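To illustrate that last point numerically (with hypothetical parameters, not values taken from Chen et al.), the sketch below compares the probability of another event within the next 10 years, given the time already elapsed since the last event, for a bursty Weibull model and a Poisson (exponential) model sharing the same 100-year mean interevent time:

    from math import gamma as gamma_fn
    from scipy import stats

    mean, dt, shape = 100.0, 10.0, 0.6            # years; Weibull shape < 1 gives a bursty model
    scale = mean / gamma_fn(1.0 + 1.0 / shape)    # choose the scale so the Weibull mean is 100 years

    weib = stats.weibull_min(shape, scale=scale)
    expo = stats.expon(scale=mean)

    for t in (1.0, 50.0, 200.0):
        # P(event in (t, t+dt] | quiet until t) = (S(t) - S(t+dt)) / S(t), S = survival function
        p_w = (weib.sf(t) - weib.sf(t + dt)) / weib.sf(t)
        p_e = (expo.sf(t) - expo.sf(t + dt)) / expo.sf(t)
        print(f"elapsed {t:5.0f} yr: Weibull {p_w:.3f}  Poisson {p_e:.3f}")

Soon after an event the Weibull conditional probability exceeds the constant Poisson value, and only after a long quiet interval does it fall below it.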

This concept underlies the earthquake forecast for Central New Zealand developed by an international review panel convened by GNS Science in 2018 and published by Geonet (2018). The forecast relies in part on the transfer of stress from the northeast coast of the South Island to the southeast coast of the North Island following recent earthquake activity in the region, notably the Mw 7.8 Kaikoura earthquake of 2016, which occurred off the northeast coast of the South Island. Risk Frontiers has implemented this time-dependent earthquake hazard model in our recent update of QuakeNZ. Earthquake clusters involving stress transfer are ubiquitous and have occurred recently on the Sumatra subduction zone (2004–2008) and along the North Anatolian fault in Turkey (1939–1999).

Given the pervasive occurrence of fractal phenomena in geology (Turcotte, 1997) and the identification by Chen et al. (2020) of Devil’s Staircase recurrence behaviour in a wide variety of earthquake catalogues, it is likely that this is a general feature of earthquake occurrence.

Temporal Clustering of Very Large Subduction Earthquakes

The left side of Figure 3 reflects two clusters of very large subduction earthquakes. The first occurred in the middle of the last century and included the 1952 Mw 9.0 Kamchatka earthquake, the 1960 Mw 9.5 Chile earthquake and the 1964 Mw 9.2 Alaska earthquake. The second cluster began with the Mw 9.15 Sumatra earthquake of 26 December 2004 and continued with the Mw 8.8 Chile earthquake on 27 February 2010 and the Mw 9.0 Tohoku earthquake on 11 March 2011. The usual approach to assessing the significance of this apparent clustering is to test statistically the hypothesis that the global earthquake catalogue is well explained by a Poisson process. Risk Frontiers analysed the power of such tests to detect non-Poissonian features, and showed that the low frequency of large events and the brevity of our earthquake catalogues reduce the power of the statistical tests and render them unable to provide an unequivocal answer to this question (Dimer de Oliveira, 2012). This conclusion is consistent with the Devil’s Staircase behaviour shown in Figure 3.
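The power problem can be illustrated with a small Monte Carlo sketch (ours, not the analysis of Dimer de Oliveira (2012)): generate catalogues of varying length from a deliberately non-Poissonian (bursty Weibull) renewal process, and ask how often a standard index-of-dispersion test rejects the Poisson hypothesis.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    def catalogue(n_events, shape=0.5):
        """Event times of a renewal process with bursty Weibull interevent times."""
        gaps = stats.weibull_min.rvs(shape, scale=1.0, size=n_events, random_state=rng)
        return np.cumsum(gaps)

    def rejects_poisson(times, n_windows=10, alpha=0.05):
        """Index-of-dispersion test: for a Poisson process, counts in equal windows
        give (k - 1) * var / mean distributed as chi-square with k - 1 degrees of freedom."""
        counts, _ = np.histogram(times, bins=n_windows)
        d = (n_windows - 1) * counts.var(ddof=1) / counts.mean()
        return d > stats.chi2.ppf(1.0 - alpha, n_windows - 1)

    for n_events in (10, 30, 100):
        rate = np.mean([rejects_poisson(catalogue(n_events)) for _ in range(2000)])
        print(f"{n_events:4d} events: Poisson hypothesis rejected in {rate:.0%} of trials")

The rejection rate is the power of the test; how far it falls short of 100% for short catalogues is the crux of the argument above.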

References

Chen, Y., M. Liu, and G. Luo (2020). Complex Temporal Patterns of Large Earthquakes: Devil’s Staircases, Bull. Seismol. Soc. Am. XX, 1–13, doi: 10.1785/0120190148

Clark, D., A. McPherson, and T. Allen (2014). Intraplate earthquakes in Australia, in Intraplate Earthquakes, Cambridge University Press, New York, New York, 49 pp.

Clark, D., A. McPherson, and R. Van Dissen (2012). Long-term behaviour of Australian stable continental region (SCR) faults, Tectonophysics 566, 1–30.

Dimer de Oliveira, F. (2012). Can we trust earthquake cluster detection tests? Risk Frontiers Newsletter Vol. 11 Issue 3.

Dimer de Oliveira, F. (2012). Can we trust earthquake cluster detection tests? Geophysical Research Letters, Vol. 39, L17305, doi:10.1029/2012GL052130.

Geonet (2018). Updated earthquake forecast for Central New Zealand. https://www.geonet.org.nz/news/5JBSbLk9qw8OU4uWeI86KG

Goh, K.-I., and A.-L. Barabási (2008). Burstiness and memory in complex systems, Europhys. Lett. 81, 48002.

Mandelbrot, B. B. (1982). The Fractal Geometry of Nature, W. H. Freeman, New York, New York.

Salditch, L., S. Stein, J. Neely, B. D. Spencer, E. M. Brooks, A. Agnon, and M. Liu (2019). Earthquake supercycles and long-term fault memory, Tectonophysics, 228289, doi: 10.1016/j.tecto.2019.228289.

Somerville, Paul (2018). Updated GNS Central New Zealand Earthquake Forecast, Risk Frontiers Briefing Note 364.

Turcotte, D. L. (1997). Fractals and Chaos in Geology and Geophysics, Cambridge University Press, New York, New York.

Winslow, N.  (1997). Introduction to Self-Organized Criticality and Earthquakes http://www2.econ.iastate.edu/classes/econ308/tesfatsion/SandpileCA.Winslow97.htm

Future of bushfire fighting in Australia

Andrew Gissing, Risk Frontiers, Neil Bibby, People & Innovation

Australia needs to be ambitious in its thinking about how future bushfires are managed and fought. Recent bushfires caused significant damage and widespread disruption, destroying some 3,093 homes (AFAC), claiming 35 lives and severely damaging community infrastructure. We must learn from this experience.

Today’s management of bushfire risk relies largely on long-standing approaches that are resource intensive and struggle to control fires when conditions are catastrophic. This issue is compounded under a warming climate, with fire seasons becoming longer and days of significant fire danger more frequent.

An inherent problem is that bushfire detection is complex, and by the time resources can be tasked and targeted, bushfires have often already spread to the point where suppression is difficult. This problem is exacerbated when ignition occurs in remote areas far from emergency management resources. Making the problem worse still is a growing bushland-urban interface where buildings and community infrastructure are highly vulnerable and exposure is growing.

Innovation to discover the next generation of firefighting capability should be a priority in any government response to the Black Summer bushfires. Our institutions must think big.

To explore blue sky thinking in respect of future firefighting capabilities and enhanced bushfire resilience, Risk Frontiers and People & Innovation hosted a forum with experts in construction, technology, aviation, insurance, risk management, firefighting and information technology. In what follows, insights and questions arising from this forum are outlined.

New thinking is required

There are two stages in considering future capabilities. The first stage is planning and investment to improve capabilities in the short term, particularly before the next bushfire season; the second is research and innovation to inspire the next generation of firefighting capability. What is needed is a blueprint for how bushfires will be fought in the future, focused on a vision whereby bushfires can be rapidly managed and controlled in a coordinated manner informed by advanced predictive intelligence, and where the built environment is resilient. Key research questions to be answered in developing such a blueprint include:

Bushfire detection and suppression

  • How can bushfires be detected more quickly?
  • How can bushfires be extinguished before they are able to spread?
  • How can the safety of firefighters be improved?

Coordination

  • How can communications enable effective coordination?
  • How can resources be tasked and tracked in a more effective manner?
  • How can situational awareness be enhanced to inform decision-making?

Community resilience

  • How can new buildings be made more resilient?
  • How can existing building stock be retrofitted for resilience?
  • How can community infrastructure such as energy distribution systems, telecommunications, water supplies and sewerage systems be designed with greater resilience?

Short term

It is widely agreed that in the short term there are many technologies and systems already existing that could enhance firefighting and broader disaster management capabilities. Specific opportunities identified by industry experts include:

  • Satellite data, such as that sourced from the Himawari satellite, should be evaluated for its ability to enhance fire detection. High Altitude Platform Systems may be another option.
  • In the United States, Unmanned Aerial Vehicles (UAV) have been employed to provide enhanced imagery over firegrounds and if equipped with infrared sensors these can support monitoring of fire conditions at night. The Victorian Government has established a panel contract with UAV providers to assist with real-time fire detection and monitoring. Further policy regarding airspace management is required to support wider demand-based deployments of UAVs.
  • Existing agricultural monitoring technologies could be repurposed to monitor bushfire fuels and soil conditions.
  • Balloons equipped with radio communications could provide coverage when traditional communications technologies have been disrupted. Alternatively, small UAVs could form a mesh network to provide wireless communications coverage, or similar equipment could be fitted to aircraft.
  • Advances in the use of robotics in the mining sector may provide applications to firefighting, for example autonomous trucks.
  • Resource tracking technologies could be implemented to improve coordination and firefighter safety.
  • Emerging fire extinguisher technologies could help to suppress bushfires.

Operational decisions could be improved by enhanced collation and fusion of data already available. There are many data sources that are managed by different organisations, not just government agencies. Collating these datasets to provide a common operating picture across all organisations would improve situational awareness and data analytics.

The widespread adoption of artificial intelligence and greater digital connectedness across the economy and the emergency management sector will open new ways to make sense of data and improve decisions. In the built environment, improved information to households about the resilience of their buildings, along with programs to implement simple retrofitting measures, should be considered. In the aftermath of bushfires, governments should consider land swaps and buy-outs to reduce exposure in high risk areas. Similarly, governments should better plan communities to ensure infrastructure is more resistant to failure when most needed in emergencies.

2030 and beyond

A key area for research and innovation investment over the coming decade should be how to rapidly suppress bushfires once detected. This could see swarms of large capacity UAVs supported by ground-based drones to target suppression and limit fire spread. Resources would be rapidly dispatched and coordinated autonomously once a bushfire was detected. Pre-staging of resources would be informed by advanced predictive analytics and enabled by unmanned traffic management systems. UAVs and drones would have applications beyond fire suppression including for rapid impact assessment, search and rescue, logistics and clearance of supply routes.

The way forward

A research and innovation blueprint is needed that outlines how technologies will be translated to enhance firefighting and resilience in the short term and, beyond this, how the next generation of capability will be designed and built. Its development should involve government, research and industry stakeholders in a collaborative manner. The final blueprint should be integrated with future workforce and asset planning to support broader change management.

Adopting new technologies will not be easy and existing cultural and investment barriers should be considered. In adopting new technologies, it is important to recognise that innovation is an iterative process of improvement and will rarely provide a perfect solution in the first instance.

Public-private partnerships will be key to realising opportunities, and government must seek to engage a broad range of stakeholders. In the aftermath of Hurricane Sandy in the United States in 2012, the US Government launched a competition called ‘Rebuild by Design’ focused on proactive solutions to minimise risk. Already in Australia, numerous innovation challenges involving businesses and universities are being held to help inspire ideas. There is an opportunity to harness and coordinate such challenges on a grand scale to promote new thinking and collaboration linking directly with responsible agencies.

We need to be bold in our thinking!

Acknowledgements

Forum participants included IAG, SwissRe, IBM, Defence Science and Technology, IAI, Cicada Innovations, Lend Lease and ARUP.

Bushfire impact research – NSW South Coast

Steven George, James O’Brien, Salomé Hussein, Jonathan van Leeuwen, Risk Frontiers

Risk Frontiers deployed a team to the NSW South Coast region in late January 2020 to undertake damage surveys following the bushfires. This research was supported by the Bushfire and Natural Hazards CRC (BNHCRC). The areas surveyed included Moruya, Mogo, Malua Bay, Rosedale, the Catalina area of Batemans Bay and Lake Conjola. The majority of damage occurred on December 31, 2019, as catastrophic weather conditions (extreme temperatures and strong winds) intensified existing fire fronts. These conditions transported large quantities of embers into vulnerable communities, destroying hundreds of residential and several commercial buildings. In total, the survey identified 426 bushfire-affected properties, most of which were destroyed. Affected industries and infrastructure included a bowling/services club, a unit block (12 units), a heritage park, an industrial complex housing numerous businesses, and electricity infrastructure (power poles and wires along the Princes Highway), which was extensively damaged. This report complements our report for northern NSW (Risk Frontiers, 2020).

Building age and resilience

As the 2019/2020 fire season progressed, the scale of damage and losses experienced across the country engendered a growing interest in evaluating the resilience of buildings to bushfires. Aspects of buildings such as age, performance of construction materials and a structure’s vulnerability due to its proximity to bushland were the key focus of the NSW South Coast survey. To evaluate the performance of building archetypes impacted by fire, the Insurance Council of Australia (ICA) charted the year of construction of over 25,000 residential buildings located within bushfire impacted areas across four states (Figure 1). Categories range from Old Colonial (pre-Victorian) to post-2009, when bushfire building standards began to be improved and were mandated in certain locations.

Building construction types impacted by 2019 bushfires in Australia
Figure 1: The period of construction for over 25,000 buildings located within the current bushfire impacted areas across four states. Source: Insurance Council of Australia, 2019.

The ICA data show that only 9.5% of residences were constructed post-2009, when changes were made to Australian Standard 3959 after the Black Saturday fires of February 2009 to ensure that new buildings in bushfire-prone areas were safer and more likely to survive a fire (BNHCRC, 2019). The scale of residential losses this fire season presented a small window of opportunity to conduct further damage surveys, prior to recovery and debris removal, and to assemble a considerable ‘post-2009’ cohort with which to assess building performance and inform future design. In the near future, Risk Frontiers will undertake further analysis to establish the construction age of the South Coast properties, with a focus on any post-2009 buildings, to expand existing research.

Observations of destruction/damage – construction materials

The survey team recorded aspects of fire affected buildings such as construction materials and damage ratios (destroyed/partially destroyed). The field observations from the South Coast survey are compared to those in Rappville (2019) and Tathra (2018) in Figure 2.

Proportion of buildings destroyed  South Coast, Tathra and Rappville
Figure 2: The proportion and number (in column) of buildings categorised as destroyed or partially destroyed. The South Coast and Rappville (2019) damage surveys used a building-footprint method in which ‘partially destroyed’ refers to the building, not the lot. The Tathra (2018) figures, based on a sampling method, assigned a ‘partially damaged’ rating on the basis of the whole lot – that is, if a shed was destroyed but the house was undamaged, the lot was rated partially damaged. The data show that once a building is alight, the likelihood of it being destroyed is very high; the total destruction rate across the three events ranged between two-thirds and 100%. The number of properties destroyed also indicates the difference in scale of the fire events between locations.

The South Coast findings reinforce those from the Rappville (2019) and Tathra (2018) surveys: once a building catches fire, regardless of construction material, it is likely to be totally destroyed. The official Tathra figures have 68% of all fire-affected premises ultimately destroyed. Data collected from the South Coast and Rappville surveys provide much stronger indications of this trend, with 92% and 100% respectively of the buildings observed destroyed. (The Rappville and South Coast results represent only those properties located and observed, not all fire-affected properties.)

In terms of building specifics, the South Coast survey provided numerous examples of fire-affected residences constructed primarily of ‘non-flammable’ materials (brick and blockwork piers and walls). These structures demonstrated some resilience to the fire, at times remaining wholly or partially intact. However, the remaining materials comprising the premises (structural roof and wall timbers, internal walls and house contents), once alight, ultimately rendered the entire building unsalvageable (destroyed). Timber beams supporting house roofs and carports lay uniformly level on the ground (as though dropped). Metal-framed buildings (e.g. sheds) and structural elements (e.g. lintels) did not perform well, failing under extreme heat, warping and damaging brick and masonry as they collapsed. There were numerous examples of vehicles completely burnt out in front and rear yards, and some isolated examples of aluminium boats that had partially melted.

For partially destroyed properties, the building features most often impacted were those constructed from timber, such as external stairs, decking and external cladding. There were numerous examples of destroyed properties categorised as ‘asbestos contaminated’, though this was less common than in the Rappville survey, where asbestos was present at over 50% of properties. Many of the asbestos-contamination assessments were speculative, based on visual observation, and erred on the side of caution, with further assessment and testing usually noted as necessary. The possible exception is Rosedale, which experienced near-total destruction and where homes predominantly appeared older, were often constructed of fibro or sheeting, and were surrounded by bushland.

Statistical dependence of bushfire risk on distance to bush and the influence of ember attack

Previous field research conducted by Risk Frontiers (Chen and McAneney, 2004, and more recent studies) has established that proximity to bushland is the most important factor in determining a building’s vulnerability. Figure 3 depicts bushfire damage based on aggregated data from recent major bushfires and shows the cumulative distribution of destroyed buildings in relation to distance from bushland.

As with previous fires studied, Figure 3 confirms the significant role that proximity to bushland played in the South Coast losses, where approximately 38% of destroyed buildings were situated within 1 metre of surrounding bush. The average distance from bushland of all 426 properties surveyed, measured from satellite imagery, was 55 metres. However, a feature not obvious from the South Coast data in Figure 3, but apparent in the Rappville and Duffy examples, is the impact of extreme conditions and the capacity of embers to propagate fire over large distances. Witness accounts from firefighters and locals describe embers being transported by extreme winds across Lake Conjola, over distances greater than 1 km. The South Coast survey data would appear to confirm such reports: two properties surveyed were more than 1.3 km from bushland, and 73 were located more than 100 metres from bush. A sketch of how curves like those in Figure 3 can be produced follows the figure below.

Buildings destroyed in relation to distance from bushland
Figure 3: Cumulative distribution of buildings destroyed in relation to distance from nearby bushland for recent major events.
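For readers wishing to reproduce this kind of chart, a minimal sketch follows (the distances are synthetic, drawn to match the 55-metre survey average; they are not the survey data):

    import numpy as np
    import matplotlib.pyplot as plt

    # Synthetic distances (metres) from each destroyed building to the nearest bushland.
    rng = np.random.default_rng(0)
    distances = rng.exponential(scale=55.0, size=426)

    # Empirical cumulative distribution: fraction of destroyed buildings within x metres of bush.
    x = np.sort(distances)
    fraction = np.arange(1, x.size + 1) / x.size

    plt.step(x, fraction, where="post")
    plt.xlabel("Distance from bushland (m)")
    plt.ylabel("Cumulative fraction of destroyed buildings")
    plt.title("Synthetic example of a destruction-vs-distance curve")
    plt.show()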

References

BNHCRC 2019. Black Saturday ten years on – what did we discover?

Insurance Council of Australia 2020. Period of residential building construction – chart. Posted by Karl Sullivan.

Risk Frontiers’ Newsletter Vol. 19, Issue 1.

Risk Frontiers’ Newsletter Vol. 17, Issue 3.

Chen, K., and K. J. McAneney, 2004: Quantifying bushfire penetration into urban areas in Australia. Geophys. Res. Lett., 31, L12212.

Acknowledgement

This research was funded through the Bushfire and Natural Hazards CRC Quick Response Fund.

 


Risk Frontiers Seminar Series 2020
Save the dates

Due to the COVID-19 pandemic, Risk Frontiers’ Annual Seminar Series for 2020 will be presented as a series of three one-hour webinars across three weeks.

Webinar 1. Thursday 17th September, 2:30-3:30pm
Webinar 2. Thursday 24th September, 2:30-3:30pm
Webinar 3. Thursday 1st October, 2:30-3:30pm

Further details to follow.

Modelling the Coronavirus Pandemic to Guide Policy in Real Time

Paul Somerville, Risk Frontiers


This briefing presents an article by Martin Enserink and Kai Kupferschmidt entitled “Mathematics of life and death: How disease models shape national shutdowns and other pandemic policies.” It ends by describing a three-way tussle between protecting physical health, protecting the economy, and protecting people’s well-being and emotional health. This tussle is now playing out in Australia and the United States, where state and local governments and their expert advisors have departed from the recommendations of federal governments by prioritising the first goal (health) over the second (the economy). As of 30 March, it appears that the United States government has reversed course in favour of health.

As the following article points out, the models shown here, which are being used to guide policy, have in most cases been posted as drafts on websites and have not yet been peer reviewed. The number of such postings probably exceeds the current capacity of the research community to provide peer review. The examples shown here were selected because they are thought to have influenced government policy.

The Australian government has not yet disclosed the modelling methods it has been using to develop its policies, but an example of the modelling that has been done in Australian universities is given by Chang et al. (2020). As indicated in the following article, modelling by Walker et al. (2020) at Imperial College London has influenced current policies in the United Kingdom. In the United States, the government is expected to disclose its modelling methods in the next few days but has stated that its model produces results similar to those of Murray (2020), shown below. At present, New Zealand is the only major English-speaking country that has made its modelling reports and their use in policy making completely open (New Zealand Ministry of Health, 2020).

The underlying approach in these models is illustrated in the two figures below. First, the growth in the number of cases (and in other quantities such as the number of deaths) is modelled for a range of policies, from limited or no action through to complete lockdown. For example, Figure 1 shows the cumulative number of projected hospitalisations in California as a function of time, and a table of corresponding outcomes after three months, for a set of different policies. Next, the estimated demand is compared with the capacity of the hospital system to meet that demand under the policy that is in effect. Figure 2 shows the excess demand for Intensive Care Units (ICUs) by state in the United States. Other charts show the time when excess demand peaks, and the time when the daily death rate falls below 0.3 per million.
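The epidemic-growth step in such models is typically built on compartmental dynamics. As a purely illustrative sketch (a minimal SIR model with hypothetical parameters, far simpler than the models cited above), a policy intervention can be represented as a reduction in the transmission rate:

    import numpy as np

    def sir(beta, gamma=0.1, days=180, i0=1e-4):
        """Integrate a simple SIR model with daily Euler steps.

        beta  - transmission rate per day (interventions lower this)
        gamma - recovery rate per day (0.1 implies roughly a 10-day infectious period)
        Returns the infected fraction of the population on each day.
        """
        s, i = 1.0 - i0, i0
        infected = np.empty(days)
        for day in range(days):
            new_inf = beta * s * i          # new infections this day
            new_rec = gamma * i             # new recoveries this day
            s, i = s - new_inf, i + new_inf - new_rec
            infected[day] = i
        return infected

    # Compare an uncontrolled epidemic (R0 = beta/gamma = 2.5) with a lockdown (R0 = 0.9).
    for label, beta in (("no action", 0.25), ("lockdown", 0.09)):
        print(f"{label:9s}: peak infected fraction = {sir(beta).max():.4f}")

Hospital-demand curves like those in Figure 1 then follow by multiplying the infected fraction by the population and by an assumed hospitalisation rate, and comparing the result against bed capacity.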

Rathi (2020) has pointed out two similarities between the underlying processes governing pandemics and climate change (and hence the nature of their modelling): both require global models, and both are nonlinear processes, with each incremental increase causing successively more severe increases unless mitigated.

Figure 1. Modelling of outcomes of alternative public policies in California. The dashed black line shows the number of available hospital beds. Source: CovidActNow (2020).

Figure 2. Modelling of Intensive Care Unit excess demand in the U.S. Source: Murray (2020).


Mathematics of life and death: How disease models shape national shutdowns and other pandemic policies

Jacco Wallinga’s computer simulations are about to face a high-stakes reality check. Wallinga is a mathematician and the chief epidemic modeler at the National Institute for Public Health and the Environment (RIVM), which is advising the Dutch government on what actions, such as closing schools and businesses, will help control the spread of the novel coronavirus in the country.

The Netherlands has so far chosen a softer set of measures than most Western European countries; it was late to close its schools and restaurants and hasn’t ordered a full lockdown. In a 16 March speech, Prime Minister Mark Rutte rejected “working endlessly to contain the virus” and “shutting down the country completely.” Instead, he opted for “controlled spread” of the virus among the groups least at risk of severe illness while making sure the health system isn’t swamped with COVID-19 patients. He called on the public to respect RIVM’s expertise on how to thread that needle. Wallinga’s models predict that the number of infected people needing hospitalization, his most important metric, will taper off by the end of the week. But if the models are wrong, the demand for intensive care beds could outstrip supply, as it has, tragically, in Italy and Spain.

COVID-19 isn’t the first infectious disease scientists have modeled—Ebola and Zika are recent examples—but never has so much depended on their work. Entire cities and countries have been locked down based on hastily done forecasts that often haven’t been peer reviewed. “It has suddenly become very visible how much the response to infectious diseases is based on models,” Wallinga says. For the modelers, “it’s a huge responsibility,” says epidemiologist Caitlin Rivers of the Johns Hopkins University Center for Health Security, who co-authored a report about the future of outbreak modeling in the United States that her center released yesterday.

Just how influential those models are became apparent over the past 2 weeks in the United Kingdom. Based partly on modeling work by a group at Imperial College London, the U.K. government at first implemented fewer measures than many other countries—not unlike the strategy the Netherlands is pursuing. Citywide lockdowns and school closures, as China initially mandated, “would result in a large second epidemic once measures were lifted,” a group of modelers that advises the government concluded in a statement. Less severe controls would still reduce the epidemic’s peak and make any rebound less severe, they predicted.

But on 16 March, the Imperial College group published a dramatically revised model that concluded—based on fresh data from the United Kingdom and Italy—that even a reduced peak would fill twice as many intensive care beds as estimated previously, overwhelming capacity. The only choice, they concluded, was to go all out on control measures. At best, strict measures might be periodically eased for short periods, the group said. The U.K. government shifted course within days and announced a strict lockdown.

Epidemic modelers are the first to admit their projections can be off. “All models are wrong, but some are useful,” statistician George Box supposedly once said—a phrase that has become a cliché in the field.

Textbook mathematics

It’s not that the science behind modeling is controversial. Wallinga uses a well-established epidemic model that divides the Dutch population into four groups, or compartments in the field’s lingo: healthy, sick, recovered, or dead. Equations determine how many people move between compartments as weeks and months pass. “The mathematical side is pretty textbook,” he says. But model outcomes vary widely depending on the characteristics of a pathogen and the affected population.
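
As a concrete sketch of what such a compartment model looks like, the following implements a minimal four-compartment (SIRD) system with daily Euler steps. The structure follows the textbook equations the article alludes to; the parameter values are illustrative assumptions, not RIVM's.

    # Minimal four-compartment (SIRD) model with daily Euler steps.
    # Parameter values are illustrative assumptions, not RIVM's.
    N = 17_000_000                        # population, roughly the Netherlands
    beta, gamma, mu = 0.40, 0.18, 0.002   # transmission, recovery, death rates per day
    S, I, R, D = N - 100.0, 100.0, 0.0, 0.0

    for day in range(180):
        new_infections = beta * S * I / N
        new_recoveries = gamma * I
        new_deaths = mu * I
        S -= new_infections
        I += new_infections - new_recoveries - new_deaths
        R += new_recoveries
        D += new_deaths

    print(f"after 180 days: {R:,.0f} recovered, {D:,.0f} dead, {I:,.0f} still infected")

In this formulation the basic reproduction number works out to beta / (gamma + mu) ≈ 2.2, consistent with the estimates discussed below.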

Because the virus that causes COVID-19 is new, modelers need estimates for key model parameters. These estimates, particularly in the early days of an outbreak, also come from the work of modelers. For instance, by late January several groups had published roughly similar estimates of the number of new infections caused by each infected person when no control measures are taken—a parameter epidemiologists call R0. “This approximate consensus so early in the pandemic gave modelers a chance to warn of this new pathogen’s epidemic and pandemic potential less than 3 weeks after the first Disease Outbreak News report was released by the WHO [World Health Organization] about the outbreak,” says Maia Majumder, a computational epidemiologist at Harvard Medical School whose group produced one of those early estimates.

Wallinga says his team also spent a lot of time estimating R0 for SARS-CoV-2, the virus that causes COVID-19, and feels sure it’s just over two. He is also confident about his estimate that 3 to 6 days elapse between the moment someone is infected and the time they start to infect others. From a 2017 survey of the Dutch population, the RIVM team also has good estimates of how many contacts people of different ages have at home, school, work, and during leisure. Wallinga says he’s least confident about the susceptibility of each age group to infection and the rate at which people of various ages transmit the virus. The best estimates come from a study done in Shenzhen, a city in southern China, he says.
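
A back-of-the-envelope sketch shows how these two estimates interact. In a simple exponential-growth approximation (a deliberate simplification; real models use the full generation-interval distribution rather than a single value), the early growth rate is roughly ln(R0) divided by the generation interval, which in turn fixes the epidemic's doubling time:

    import math

    # Exponential-growth approximation: r ≈ ln(R0) / tau.
    # Illustrative only; not how RIVM derives its estimates.
    R0 = 2.1                     # "just over two"
    for tau in (3, 6):           # generation interval of 3 to 6 days
        r = math.log(R0) / tau            # approximate daily growth rate
        doubling = math.log(2) / r        # implied doubling time, days
        print(f"tau = {tau} days: r = {r:.2f}/day, cases double every {doubling:.1f} days")

With these assumed values the uncontrolled epidemic doubles roughly every 3 to 6 days, which is why the uncertainty in the generation interval matters so much to policy timing.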

Compartment models assume the population is homogeneously mixed, a reasonable assumption for a small country like the Netherlands. Other modeling groups don’t use compartments but simulate the day-to-day interactions of millions of individuals. Such models are better able to depict heterogeneous countries, such as the United States, or all of Europe. WHO organizes regular calls for COVID-19 modelers to compare strategies and outcomes, Wallinga says: “That’s a huge help in reducing discrepancies between the models that policymakers find difficult to handle.”
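
The individual-based alternative can also be sketched compactly. The toy simulation below is purely illustrative, not any group's actual model: each infectious person draws a random number of daily contacts and infects susceptible contacts with a fixed probability, capturing some of the heterogeneity that compartment models average away.

    import random

    # Toy individual-based simulation (illustrative parameters only).
    random.seed(1)
    N = 10_000
    state = ["S"] * N                      # S, I or R for each individual
    for i in range(5):                     # seed a few infections
        state[i] = "I"
    P_INFECT, DAYS_INFECTIOUS = 0.05, 5

    for day in range(120):
        for i in [k for k, s in enumerate(state) if s == "I"]:
            for _ in range(random.randint(2, 15)):      # heterogeneous daily contacts
                j = random.randrange(N)
                if state[j] == "S" and random.random() < P_INFECT:
                    state[j] = "I"
            if random.random() < 1 / DAYS_INFECTIOUS:   # geometric infectious period
                state[i] = "R"

    print("ever infected:", sum(s != "S" for s in state))

With these assumed numbers the implied reproduction number is roughly 8.5 contacts × 0.05 infection probability × 5 infectious days ≈ 2.1, comparable to the compartment sketch above.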

Still, models can produce vastly different pictures. A widely publicized, controversial modeling study published yesterday by a group at the University of Oxford [Lourenco et al., 2020] argues that the deaths observed in the United Kingdom could be explained by a very different scenario from the currently accepted one. Rather than SARS-CoV-2 spreading in recent weeks and causing severe disease in a significant percentage of people, as most models suggest, the virus might have been spreading in the United Kingdom since January and could have already infected up to half of the population, causing severe disease only in a tiny fraction. Both scenarios are equally plausible, says Sunetra Gupta, the theoretical epidemiologist who led the Oxford work. “I do think it is missing from the thinking that there is an equally big possibility that a lot of us are immune,” she says. The model itself cannot answer the question, she says; only widespread testing for antibodies can, and that needs to be done urgently.

Adam Kucharski, a modeler at the London School of Hygiene & Tropical Medicine, says the Oxford group’s new scenario is unlikely. Scientists don’t know exactly how many people develop very mild symptoms or none at all, he says, but data from the Diamond Princess—a cruise ship docked in Yokohama, Japan, for 2 weeks that had a big COVID-19 outbreak—and from repatriation flights and other sources argue against a huge number of asymptomatic cases. “We don’t know at the moment, is it 50% asymptomatic or is it 20% or 10%,” he says. “I don’t think the question is: Is it 50% asymptomatic or 99.5%.”

Riding tigers

In their review of U.S. outbreak modeling, Rivers and her colleagues note that most of the key players are academics with little role in policy. They don’t typically “participate in the decision-making processes … they sort of pivot into a new world when an emergency hits,” she says. “It would be more effective if they could be on-site with the government, working side by side with decision makers.” Rivers argues for the creation of a National Infectious Disease Forecasting Center, akin to the National Weather Service. It would be the primary source of models in a crisis and strengthen outbreak science in “peacetime.”

Policymakers have relied too heavily on COVID-19 models, says Devi Sridhar, a global health expert at the University of Edinburgh. “I’m not really sure whether the theoretical models will play out in real life.” And it’s dangerous for politicians to trust models that claim to show how a little-studied virus can be kept in check, says Harvard University epidemiologist William Hanage. “It’s like, you’ve decided you’ve got to ride a tiger,” he says, “except you don’t know where the tiger is, how big it is, or how many tigers there actually are.”

Models are at their most useful when they identify something that is not obvious, Kucharski says. One valuable function, he says, was to flag that temperature screening at airports will miss most coronavirus-infected people.

There’s also a lot that models don’t capture. They cannot anticipate, say, the development of a faster, easier test to identify and isolate infected people or an effective antiviral that reduces the need for hospital beds. “That’s the nature of modeling: We put in what we know,” says Ira Longini, a modeler at the University of Florida. Nor do most models factor in the anguish of social distancing, or whether the public obeys orders to stay home. Recent data from Hong Kong and Singapore suggest extreme social distancing is hard to keep up, says Gabriel Leung, a modeler at the University of Hong Kong. Both cities are seeing an uptick in cases that he thinks stem at least in part from “response fatigue.” “We were the poster children because we started early. And we went quite heavy,” Leung says. Now, “It’s 2 months already, and people are really getting very tired.” He thinks both cities may be on the brink of a “major sustained local outbreak.”

Long lockdowns to slow a disease can also have catastrophic economic impacts that may themselves affect public health. “It’s a three-way tussle,” Leung says, “between protecting health, protecting the economy, and protecting people’s well-being and emotional health.”

The economic fallout isn’t something epidemic models address, Longini says—but that may have to change. “We should probably hook up with some economic modelers and try to factor that in,” he says.

References

Chang, S.L., N. Harding, C. Zachreson, O.M. Cliff, and M. Prokopenko (2020). Modelling transmission and control of the COVID-19 pandemic in Australia. Preprint, 24 March 2020.

CovidActNow (2020). https://covidactnow.org/model

Enserink, Martin and Kai Kupferschmidt (2020). Mathematics of life and death: How disease models shape national shutdowns and other pandemic policies. Science, doi:10.1126/science.abb8814, 25 March 2020.

Lourenco, Jose, Robert Paton, Mahan Ghafari, Moritz Kraemer, Craig Thompson, Peter Simmonds, Paul Klenerman, and Sunetra Gupta (2020). Fundamental principles of epidemic spread highlight the immediate need for large-scale serological surveys to assess the stage of the SARS-CoV-2 epidemic. medRxiv preprint doi: https://doi.org/10.1101/2020.03.24.20042291

Murray, Christopher J.L. (2020). Forecasting COVID-19 impact on hospital bed-days, ICU-days, ventilator-days, and deaths by US state in the next 4 months. medRxiv preprint, 25 March 2020, tracking ID MEDRXIV/2020/043752.

http://www.healthdata.org/research-article/forecasting-covid-19-impact-hospital-bed-days-icu-days-ventilator-days-and-deaths

https://covid19.healthdata.org/projections

New Zealand Ministry of Health (2020). COVID-19 modelling reports.

https://www.health.govt.nz/publication/covid-19-modelling-reports

Rathi, A. (2020). The pandemic reveals how the science of risk shapes our lives. 31 March 2020.

https://www.linkedin.com/pulse/pandemic-reveals-how-science-risk-shapes-our-lives-akshat-rathi/

Walker, Patrick G.T. et al. (2020). The Global Impact of COVID-19 and Strategies for Mitigation and Suppression. Imperial College COVID-19 Response Team, 26 March 2020.