Macquarie University’s Lighthouse publication recently showcased research by Risk Frontiers’ Andrew Gissing on planning and capability requirements for catastrophic events. This research, conducted through the Bushfire and Natural Hazards Cooperative Research Centre, is investigating better-practice approaches to planning and preparedness for extreme events that may overwhelm existing response frameworks. You can read the story, which highlights the research in the context of the recent Sulawesi earthquake and tsunami, here: https://lighthouse.mq.edu.au/article/october/sulawesis-earthquake-and-tsunami-provide-key-insights-into-catastrophe-response
Twenty-eight years on from the First Assessment Report in 1990, the IPCC’s most recent Special Report on Global Warming delivers an urgent warning to policymakers: we are approaching the point of no return for mitigating anthropogenic global warming and its associated climate change. The report has divided opinion in Australia and further highlights the polarising power of climate change across government, academia and industry.
The report finds that limiting global warming to 1.5 °C, although “possible within the laws of chemistry and physics”, would now require rapid and unprecedented change in all aspects of society. Global net human-caused emissions of CO2 would need to fall by approximately 45 percent from 2010 levels by 2030, reaching ‘net zero’ around 2050. This means that any remaining emissions would need to be balanced by utilising as-yet under-developed technologies to remove CO2 from the air.
The report also highlights that we are already seeing the consequences of 1 °C of global warming through more extreme weather, rising sea levels and diminishing Arctic sea ice. One of the difficulties in communicating the impacts of seemingly small increases in mean temperatures is related to how this affects extreme weather events. The immediate reaction of many to “a 1 °C temperature increase” is to imagine oneself lying on a beach at 24 °C and then at 25 °C with global warming. Not that bad, right?
The key point is that a small increase in the mean temperature also shifts the tails of the distribution, meaning the probability of extreme weather events can increase disproportionately – sometimes far more, depending on the shape of the distribution – relative to the shift in the mean itself (Figure 1). Prof Andy Pitman, director of the ARC Centre of Excellence for Climate Extremes at the University of New South Wales, described this nicely in an anecdote to BBC News back in January:
“the probability works a bit like if you stand at sea level and throw a ball in the air, and then gradually make your way up a mountain and throw the ball in the air again. The chances of the ball going higher increases dramatically. That’s what we’re doing with temperature.”
Figure 1 Small changes in the averages of many key climate variables can correspond to large changes in weather. Source: Solomon et al. (2007).
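The tail-shift effect is easy to demonstrate numerically. The sketch below is a minimal illustration only, assuming daily maximum temperatures follow a normal distribution with purely illustrative parameters (mean 24 °C, standard deviation 3 °C): a 1 °C shift in the mean roughly triples the probability of exceeding a fixed extreme threshold.

```python
import math

def exceedance_prob(threshold, mean, sd):
    """P(X > threshold) for a normal distribution, via the complementary error function."""
    z = (threshold - mean) / sd
    return 0.5 * math.erfc(z / math.sqrt(2))

# Illustrative assumption: summer daily maxima ~ N(24 degC, sd 3 degC)
before = exceedance_prob(33, mean=24.0, sd=3.0)  # P(day > 33 degC) with current mean
after = exceedance_prob(33, mean=25.0, sd=3.0)   # same threshold, mean shifted by +1 degC

print(f"P(>33 degC) before: {before:.4f}")   # about 0.0013
print(f"P(>33 degC) after:  {after:.4f}")    # about 0.0038
print(f"Relative increase:  {after / before:.1f}x")
```

A shift of roughly 4% in the mean produces a near-threefold increase in the frequency of days above the threshold, which is the point of the ball-throwing analogy above.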
What the report says
Abridged findings from the report that have high confidence (80% chance) are:
Global warming is likely to reach 1.5 °C between 2030 and 2052 if temperatures continue to increase at the current rate (Figure 2);
There are robust differences in climate model projections of regional climate characteristics between present-day and global warming of 1.5 °C and between 1.5 °C and 2 °C, most notably sea level rise and extreme heat;
Most climate change adaptation needs will be lower for global warming of 1.5 °C compared to 2 °C;
Estimates of the global emissions outcomes of current nationally stated mitigation ambitions as per the Paris Agreement would not limit warming to 1.5 °C, even if supplemented by challenging emissions reductions after 2030.
Figure 2 Observed monthly global mean surface temperature change and likely modelled responses to anthropogenic emission and forcing pathways relative to the 1.5 °C threshold, extending to 2.0 °C. Source: Figure SPM.1 in IPCC (2018).
The report advocates for anthropogenic climate change to be limited to 1.5 °C, and cites considerable additional impacts for land, energy, industry, buildings and transport in a “2 °C world”. The marine world is singled out for particular impacts under a 2 °C scenario, with modelling and observations suggesting the large-scale die-off of tropical coral reefs including, of course, the Great Barrier Reef (GBR).
Changes to the GBR not only have direct impacts for marine biodiversity, but also for cyclone risk along the adjacent mainland coast, which would potentially experience higher storm surge and wave exposure under a combination of rising sea levels and reduced energy dissipation by coral reefs.
The report is published at a time of international discord on climate mitigation, with most scientists acknowledging that the likelihood of stabilising warming at the proposed 1.5 °C is very small. This is essentially a reflection of the short-term focus of global political institutions, set against the long-term nature of the problem at hand.
It also highlights the divisive nature of climate change in Australia. As elsewhere, it has become entangled with political agendas, class, energy and living standards. However, unlike elsewhere, adaptation to climate change has yet to occupy a central role in government policy as it has done, for example, in Europe. It has exposed an interesting divide between sectors that have come to the fore in recent years – with banking, insurance and industry at large leading the charge in understanding climate change risk and exposures, and the federal government lagging.
The righteous indignation of some in the public eye too often overshadows the high standards of objectivity that the science community imposes on itself in delivering the most robust findings possible. This was highlighted last week by the coincidental media release of an ‘audit’ of climate data used by climate models, undertaken as part of a PhD at James Cook University, with the apparent intention of undermining the IPCC’s report.
The audit claims that the underlying data used by Global Climate Models (GCMs) are unfit for purpose, citing concerns around temperature anomalies, coverage and sample size, and that GCM predictions cannot be relied on as a result.
While the audit was undertaken as part of a high-quality PhD thesis (McLean, 2017), it is as yet unpublished in the peer-reviewed scientific literature. The concerns over observational data coverage and sample size in years prior to the satellite era are well known and this is why climate reanalysis data should be handled with care – particularly in the Southern Hemisphere.
The assertion that a limited number of spurious temperature anomalies in observational records would distort the global suite of ensemble climate model output is difficult to prove, given the strict uncertainty estimates and sampling checks that climate institutions such as the Bureau of Meteorology and the UK Met Office undertake. However, it is still important that end-users understand the multiple layers of uncertainty inherent in climate modelling.
By comparison, the IPCC’s report included the contributions of 91 climate experts from 40 different countries and draws on over 6,000 cited references. The simultaneous reporting of both the climate audit and the IPCC report in the media gives equal weighting to the two and undermines the climate science, at an important juncture for climate politics internationally.
The global impasse on mitigation efforts only serves to highlight the importance of climate change adaptation planning and risk management in Australia, as we transition to a period in which we look to accommodate climate change impacts rather than reduce them, or indeed to utilise a combination of the two.
It also suggests (fascinatingly, from a data science perspective) that, as anthropogenic warming proceeds, we may no longer be able to use the near past to predict near-future climate risk, as relationships between climate variables observed in the recent past cease to hold.
IPCC (2018) Global Warming of 1.5°C. An IPCC Special Report on the impacts of global warming of 1.5°C above pre-industrial levels and related global greenhouse gas emission pathways. Summary for Policymakers. IPCC, Geneva, Switzerland.
McLean, J.D. (2017) An audit of uncertainties in the HadCRUT4 temperature anomaly dataset plus the investigation of three other contemporary climate issues. PhD thesis, James Cook University, available https://researchonline.jcu.edu.au/52041/.
Solomon, S., et al. (2007) Technical Summary. In: Climate Change 2007: The Physical Science Basis. Contribution of Working Group I to the Fourth Assessment Report of the Intergovernmental Panel on Climate Change. Cambridge University Press, Cambridge, United Kingdom and New York, NY, USA.
Wednesday 31st October, 2018
at The Museum of Sydney
cnr Bridge and Phillip Streets, Sydney
2pm until 4.30pm followed by light refreshments in the foyer.
The focus on natural hazards, climate change and cyber risk is rising. The World Economic Forum has identified extreme weather as the number one global risk. Australian company directors must consider risks associated with a changing climate, and the Commonwealth Government is set to deliver a national framework for disaster mitigation. Major advances have been made in modelling earthquake risks, and cyber remains a significant challenge for industry. Cutting-edge scientific research and policy thinking has never been more important.
The 2018 Risk Frontiers’ seminar series continues a well-forged tradition of sharing scientific knowledge with the Australian insurance and disaster management industry. Come along to hear from our experts about the latest in science, policy and modelling advances, and join the team for light refreshments. This year we also welcome Barry Hanstrum, formerly the Regional Director for the Bureau of Meteorology, to deliver an informative keynote about the risks posed by East Coast Lows. We look forward to seeing you on the 31st of October.
Shaking it up: QuakeAUS reborn (6.0) – Paul Somerville and Valentina Koschatsky
Stormy horizon: the East Coast Low effect – Barry Hanstrum
Speed Talks – Modelling
Phoenix rising: FireAUS 3.0 – Mingzhu Wang
Shaken not stirred: Quake NZ 4.0 – Niyas Madappatt
A family of floods: improving cross-catchment relationships in FloodAUS – Thomas Mortlock
Speed Talks – Research
Towards modelling cyber risk – Tahiry Rabehaja
The new normal: ICA List revisited – John McAneney
A tale of two catastrophes: what determines behaviour during disasters? – Andrew Gissing
Employees of Sponsor Companies
Attendance is free for employees of our Sponsor companies and their subsidiaries (Aon Benfield, Guy Carpenter, IAG, QBE, Suncorp and Swiss Re). Please email your name, employer and email address to firstname.lastname@example.org.
Risk Frontiers through the Bushfire and Natural Hazards Cooperative Research Centre is undertaking research into catastrophic disasters. As part of this research we are exploring how businesses can become more involved in the response to and recovery after major disasters. Risk Frontiers’ Andrew Gissing has recently published a piece in the Asia Pacific Fire Magazine summarising some thoughts on the topic. See link below.
Paul Somerville, Chief Geoscientist, Risk Frontiers
The 28 September Mw 7.5 Sulawesi Earthquake occurred on the Palu-Koro fault, which ruptured southward from the epicenter to a location south of Palu. The Palu-Koro fault is a strike-slip fault, on which the two sides slide horizontally past each other (in this case, on a roughly north-south fault, the east side moved north and the west side south), usually without much vertical movement of the ground. In contrast, earthquakes on thrust faults (including subduction zone faults) occur when one side is thrust under the other. Consequently they are much more likely to trigger a tsunami, because the vertical movement of the sea floor raises a column of seawater, setting a tsunami in motion. Although most media attention has been focused on the tsunami, it is clear that strong near-fault ground motions from the earthquake caused massive structural damage and large-scale soil liquefaction (which also caused major structural damage) before the arrival of the tsunami in Palu.
Map of the region surrounding the 28 September Mw 7.5 Sulawesi earthquake showing forecast tsunami inundation and arrival time contours. The north-south alignment of aftershocks (red dots) approximately outlines the location of the Palu-Koro fault rupture zone. Sources: USGS/Indonesia Tsunami Early Warning System/Reuters.
Fifteen earthquakes with magnitudes larger than 6.5 have occurred near Palu in the past 100 years. The largest was a magnitude 7.9 event in January 1996, about 100 km north of the September 2018 earthquake. Several of these large earthquakes have also generated tsunamis. In 1927, an earthquake and tsunami caused about 50 deaths and damaged buildings in Palu, and in 1968 a magnitude 7.8 earthquake near Donggala generated a tsunami that killed more than 200 people.
Despite this local history and the 2004 Boxing Day Sumatra earthquake and tsunami, many people in Palu were apparently unaware of the risk of a tsunami following the earthquake. The tsunami occurred in an area where there are no tide gauges that could give information about the height of the wave. The Indonesian Tsunami Warning System issued a warning only minutes after the earthquake, but officials were unable to contact officers in the Palu area. The warning was cancelled 34 minutes later, just after the third tsunami wave arrived in Palu. It is likely that the bay’s narrow V-shape intensified the effect of the wave as it funneled through the narrow opening of the bay, inundating Palu at the end of the bay.
While it is possible that a more advanced tsunami warning system could have saved lives if it had been fully implemented, a system currently in the prototype stage may not have helped the people of Palu, as the tsunami arrived at the shore within 20 minutes of the earthquake. Such early warning systems are most useful for areas several hundred kilometres from the tsunami source. In regions like Palu, where the earthquake and tsunami source are very close, education is the most effective warning system. If people feel more than 20 seconds of ground shaking, that should serve as the warning to immediately move to higher ground.
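The distance dependence of warning time can be sketched with the shallow-water wave approximation, in which a tsunami travels at roughly √(g·h) for water depth h. The figures below (a local source 30 km away in ~500 m deep water, versus a distant source 1,000 km away across ~4,000 m deep ocean) are illustrative assumptions, not measurements from the Palu event:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def tsunami_speed(depth_m):
    """Shallow-water wave speed c = sqrt(g * h), in m/s."""
    return math.sqrt(G * depth_m)

def travel_time_minutes(distance_km, depth_m):
    """Approximate travel time assuming constant depth along the path."""
    return (distance_km * 1000.0) / tsunami_speed(depth_m) / 60.0

# Illustrative local source: 30 km away, ~500 m deep water
local = travel_time_minutes(30, 500)      # roughly 7 minutes
# Illustrative distant source: 1,000 km away, ~4,000 m deep ocean
distant = travel_time_minutes(1000, 4000) # roughly 84 minutes

print(f"Local source:   ~{local:.0f} minutes")
print(f"Distant source: ~{distant:.0f} minutes")
```

The order-of-magnitude gap, minutes for a local source versus an hour or more for a distant one, is why instrument-based warning systems help distant coastlines while education is the only effective warning for near-field communities like Palu.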
It is not yet clear whether the tsunami was caused by fault movement or by submarine landslides within Palu Bay triggered by shaking from the earthquake. It is possible that the fault cut through a submarine slope, with the horizontal displacement of the sloping sea floor pushing the water aside horizontally, causing it to pile up in a wave. Alternatively, as seems more likely, the tsunami may have been generated by a submarine landslide within the bay. The sides of the bay are steep and unstable, and maps of the sea floor suggest that submarine landslides have occurred there in the past. In that case, even if there had been tsunami sensors or tide gauges at the mouth of the bay, they would not have sensed the tsunami before it struck the shore in Palu.
It is clear from images of building damage that there was strong ground shaking in Palu and surrounding regions, as would be expected in the near-fault region of an earthquake of this magnitude. This shaking damage would have made structures even more vulnerable to the ensuing tsunami in low lying areas.
Another major cause of damage was the soil liquefaction in large areas within Palu and surrounding regions. Palu is situated on a plain composed of water-saturated soft sandy soils. Images from the disaster area show large-scale lateral spreading, in which buildings on chunks of thin brittle crust slide across the underlying liquefied sands as if floating on water. This has resulted in the total destruction of buildings in large areas, leaving a churned landscape composed of debris and buildings that have sunk into the liquefied soil.
Australian Tsunami Risk and Warning
Australia is sufficiently remote from major subduction earthquake source zones that there is enough time (a few hours) for tsunami warning for such events, and in any case the hazard from such tsunamis is quite low. The main source of tsunami hazard may come from local earthquakes offshore that trigger submarine landslides on the continental slope. Such earthquakes are thought to be infrequent, so the hazard they pose is considered low. Marine surveys have been undertaken to identify potential locations of past underwater landslides and to estimate their recency and frequency of occurrence. Such landslides would generate local tsunamis that would give little time for effective tsunami warning.
The Australian east coast has experienced at least 47 tsunami events in historical time. The largest was generated by the 22 May 1960 Mw 9.6 earthquake in Chile, the largest earthquake in recorded history. The recorded wave height at Fort Denison in Sydney Harbour was 1 metre; strong flow velocities caused damage to boats in Sydney Harbour and the Hunter River, and there was some minor inundation at Batemans Bay.
The Australian Government operates the Australian Tsunami Warning System, and states and territories maintain disaster plans and education programs. In the rare event of a large tsunami generated by a local source, emergency services would likely be overwhelmed and faced with significant challenges in achieving access to impacted areas due to damage to infrastructure.