Risk Frontiers through the Bushfire and Natural Hazards Cooperative Research Centre is undertaking research into catastrophic disasters. As part of this research we are exploring how businesses can become more involved in the response to and recovery after major disasters. Risk Frontiers’ Andrew Gissing has recently published a piece in the Asia Pacific Fire Magazine summarising some thoughts on the topic. See link below.
Paul Somerville, Chief Geoscientist, Risk Frontiers
The 28 September Mw 7.5 Sulawesi earthquake occurred on the Palu-Koro fault, which ruptured southward from the epicenter to a location south of Palu. The Palu-Koro fault is a strike-slip fault on which the two sides slide horizontally past each other (east side north and west side south in this case, on a fault aligned north-south), usually without causing much vertical movement of the ground. In contrast, thrust earthquakes (including subduction earthquakes) occur when one side of the fault is thrust under the other. They are consequently much more likely to trigger a tsunami, because the vertical movement of the ground raises a column of seawater, setting a tsunami in motion. Although most media attention has focused on the tsunami, it is clear that strong near-fault ground motions from the earthquake caused massive structural damage and large-scale soil liquefaction (which also caused major structural damage) before the tsunami arrived in Palu.
Map of the region surrounding the 28 September Mw 7.5 Sulawesi earthquake showing forecast tsunami inundation and arrival time contours. The north-south alignment of aftershocks (red dots) approximately outlines the location of the Palu-Koro fault rupture zone. Sources: USGS/Indonesia Tsunami Early Warning System/Reuters.
Fifteen earthquakes with magnitudes larger than 6.5 have occurred near Palu in the past 100 years. The largest was a magnitude 7.9 event in January 1996, about 100 km north of the September 2018 earthquake. Several of these large earthquakes have also generated tsunamis. In 1927, an earthquake and tsunami caused about 50 deaths and damaged buildings in Palu, and in 1968 a magnitude 7.8 earthquake near Donggala generated a tsunami that killed more than 200 people.
Despite this local history and the 2004 Boxing Day Sumatra earthquake and tsunami, many people in Palu were apparently unaware of the risk of a tsunami following the earthquake. The tsunami occurred in an area where there are no tide gauges that could give information about the height of the wave. The Indonesian Tsunami Warning System issued a warning only minutes after the earthquake, but officials were unable to contact officers in the Palu area. The warning was cancelled 34 minutes later, just after the third tsunami wave arrived in Palu. It is likely that the bay’s narrow V-shape intensified the effect of the wave as it funneled through the narrow opening of the bay, inundating Palu at the end of the bay.
While it is possible that a more advanced tsunami warning system could have saved lives if it had been fully implemented, a system currently in the prototype stage may not have helped the people of Palu, as the tsunami arrived at the shore within 20 minutes of the earthquake. Such early warning systems are most useful for areas several hundred kilometres from the tsunami source. In regions like Palu where the earthquake and tsunami source are very close, education is the most effective warning system. If people feel more than 20 seconds of ground shaking, that should serve as the warning to move immediately to higher ground.
It is not yet clear whether the tsunami was caused by fault movement or by submarine landslides within Palu Bay triggered by shaking from the earthquake. It is possible that the fault cut through a submarine slope, with the horizontal displacement of the sloping sea floor pushing the water aside horizontally, causing it to pile up in a wave. Alternatively, as seems more likely, the tsunami may have been generated by a submarine landslide within the bay. The sides of the bay are steep and unstable, and maps of the sea floor suggest that submarine landslides have occurred there in the past. In that case, even if there had been tsunami sensors or tide gauges at the mouth of the bay, they would not have sensed the tsunami before it struck the shore in Palu.
It is clear from images of building damage that there was strong ground shaking in Palu and surrounding regions, as would be expected in the near-fault region of an earthquake of this magnitude. This shaking damage would have made structures even more vulnerable to the ensuing tsunami in low lying areas.
Another major cause of damage was the soil liquefaction in large areas within Palu and surrounding regions. Palu is situated on a plain composed of water-saturated soft sandy soils. Images from the disaster area show large-scale lateral spreading, in which buildings on chunks of thin brittle crust slide across the underlying liquefied sands as if floating on water. This has resulted in the total destruction of buildings in large areas, leaving a churned landscape composed of debris and buildings that have sunk into the liquefied soil.
Australian Tsunami Risk and Warning
Australia is sufficiently remote from major subduction earthquake source zones that there is enough time (a few hours) for tsunami warning for such events, and in any case the hazard from such tsunamis is quite low. The main source of tsunami hazard may come from local earthquakes offshore that trigger submarine landslides on the continental slope. Such earthquakes are thought to be infrequent, so the hazard they pose is considered low. Marine surveys have been undertaken to identify potential locations of past underwater landslides and estimate their recency and frequency of occurrence. Such landslides would generate local tsunamis that would give little time for effective tsunami warning.
The Australian east coast has experienced at least 47 tsunami events in historical time. The largest occurred in 1960 as a result of the 22 May 1960 Mw 9.6 earthquake in Chile, the largest earthquake in recorded history. The recorded wave height at Fort Denison in Sydney Harbour was 1 metre; strong flow velocities caused damage to boats in Sydney Harbour and the Hunter River, and there was some minor inundation at Batemans Bay.
The Australian Government operates the Australian Tsunami Warning System, and states and territories maintain disaster plans and education programs. In the rare event of a large tsunami generated by a local source, emergency services would likely be overwhelmed and faced with significant challenges in achieving access to impacted areas due to damage to infrastructure.
Local sociality, which is local people’s everyday lives in and with their community, influences recovery in disaster-affected communities. This paper examines recovery in four disaster-impacted communities. In the two Australian examples, rural communities were impacted by the 2011 Queensland floods. The two Japanese communities discussed suffered in the 2011 Tohoku earthquake and tsunami and, in one case, from radiation contamination arising from the damaged Fukushima Daiichi nuclear power plant. We argue that local sociality is often poorly understood by external parties such as disaster recovery experts and agencies. The Japanese planning concept of machizukuri – literally “creating communities” – incorporates physical, structural and social aspects in urban planning practices and was successfully applied to recovery processes in one of the Japanese cases. Drawing on that case, the paper concludes that machizukuri offers a valuable tool to foster better consideration of local sociality – both pre- and post-disaster – as an intrinsic component of communities’ vulnerability and resilience.
As the Earth’s atmosphere warms, the atmospheric circulation changes. Understanding how tropical cyclone activity may change in response to this warming is no easy task, with recent studies showing considerable dispersion in projected changes in activity for the Australian region. For example, Knutson et al. (2015) projected a decrease in tropical cyclone activity, including Cat 4-5 storms, around northeast Australia. Earlier, in 2014, the IPCC Fifth Assessment Report (Reisinger et al. 2014) summarised the projected changes as “Tropical cyclones are projected to increase in intensity and stay similar or decrease in numbers and occur further south (low confidence)”.
Identifying anthropogenic climate change influences on observational records of tropical cyclone activity is also challenging. Reliable records are relatively short and contain high year-to-year variability. Most research effort has focused on identifying changes in frequency and intensity: Callaghan and Power (2010), for example, documented a long-term decline in numbers of Cat 3-5 events making landfall over eastern Australia. Other recent studies have begun to consider changes in the distribution of events and other characteristics, including their forward motion (referred to as the translation speed). Sharmila and Walsh (2018) showed that events in the Australian region may reach further south while Kossin (2018), reported on by Risk Frontiers in Briefing Note 370 in July this year, found a global slowing of translation speeds. Here we further discuss the findings of Kossin (2018) for the Australian region.
Anthropogenic warming may also cause a general weakening of summertime tropical circulation (Vecchi et al., 2006; Mann et al., 2017) and, because tropical cyclones are carried along within their ambient environmental wind, the translation speed of tropical cyclones may slow, thereby increasing the potential for flooding and longer duration sustained high wind speeds (Kossin 2018). Tropical Cyclone Debbie (Queensland, March 2017) and Hurricane Harvey (Texas, August 2017) are two recent examples of slow-moving events.
In addition to the reported global slowdown in tropical cyclone translation speeds, Kossin (2018) also analysed trends across various regions. While those for the Northern Hemisphere were strong, those for the Australian region, both over land and over water, were only marginally significant and exhibited high multi-annual variability.
Here we present an exploratory investigation of the extent to which changes in tropical cyclone translation speeds around Australia (Kossin, 2018) are driven by internal climate variability, in addition to any possible anthropogenic warming signal. The proxy for translation speeds is the ambient winds that control the movement of tropical cyclones. We begin with the tropical Indian Ocean, west of 100°E (Fig. 1), where Kossin (2018) reported a -0.01 km/hr/yr trend between 1949 and 2016.
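A trend of this kind is usually estimated as an ordinary least-squares fit to annual-mean translation speeds. The sketch below illustrates the calculation on synthetic data (the series is invented for illustration and is not Kossin's dataset):

```python
import numpy as np

# Synthetic annual-mean translation speeds (km/h) for 1949-2016,
# built with a weak downward drift plus noise, purely for illustration.
rng = np.random.default_rng(0)
years = np.arange(1949, 2017)
speeds = 19.0 - 0.01 * (years - 1949) + rng.normal(0.0, 0.5, years.size)

# Ordinary least-squares linear fit; the slope is the trend in km/h per year.
slope, intercept = np.polyfit(years, speeds, deg=1)
print(f"trend: {slope:+.3f} km/h per year")
```

With real data the same fit would be applied to the observed best-track translation speeds; the synthetic drift here is simply set near the reported -0.01 km/hr/yr for context.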
Chan and Gray (1981) suggested that winds between 500 and 700 hPa are the most relevant measure of the ambient winds that transport tropical cyclones. We extracted the 500 hPa scalar wind speed monthly means (November to April, coinciding with our tropical cyclone season) from 1980/81 to 2017/18, using the NCEP-NCAR Reanalysis, for the region between 5°S and 20°S and 50°E and 100°E. (Prior to 1980, the homogeneity of the reanalysis record is questionable.)
We then compared the year-on-year scalar wind speeds (averaged within the analysis region) to the Pacific Decadal Oscillation (PDO) Index. The PDO is the leading principal component of North Pacific monthly sea surface temperature variability and can be seen as a long-lived (multi-decadal) ENSO-like pattern of Pacific climate variability. While the PDO is a Pacific-origin index, the tropical cyclone climatologies in Queensland, Northern Territory and Western Australia are principally influenced by Pacific ENSO variability, in addition to other regional climate indices such as the Indian Ocean Dipole and the Madden-Julian Oscillation.
Our results show a strong correlation between the ambient environmental winds in the tropical Indian Ocean (TIO) and the PDO (the average of Nov-Apr PDO values for each year) during the period 1981–2000 (R = 0.54, p < 0.05, Fig. 2b), but this is much diminished during the period 2001–2018 (R = 0.23, p < 0.05, Fig. 2c). The two time-series in Fig. 2a show that a change in the relationship between the variables occurred around the year 2000. They also show that wind speeds are consistently higher post-2000.
The PDO was in a sustained ‘warm’ phase (i.e. PDO positive, or El Niño–like) from approximately 1977 to 1999, after which it has experienced less coherent polarity (Fig. 3). Our analysis suggests that during this period, ambient winds (and by inference, tropical cyclone translation speeds) in the Indian Ocean were closely related to variability in the PDO. Post-2000, a weakening of the PDO signal coincides with a much-reduced level of correlation, and a jump to higher wind speeds.
It is well known that the PDO influences interdecadal variability of tropical cyclogenesis in northern Australia (Grant and Walsh, 2001). However, the importance of the PDO on cyclone translation speeds for this region remains unclear. Our brief analysis suggests PDO positive conditions suppress wind speeds in the upper atmosphere in the TIO and, by inference, reduce tropical cyclone translation speeds in this region. This is because during PDO positive (El Niño–like) conditions, sea surface temperature anomalies occur further east in the Pacific Ocean – causing the area of cyclogenesis to move eastwards away from Australia.
When the PDO signal becomes more La Niña-like to ENSO-neutral (i.e. post-2000, Fig. 3), wind speeds in the TIO increase but become less correlated with the PDO Index. This suggests a more complex relationship between upper-atmosphere winds in this region and other regional climate indices (like the Indian Ocean Dipole or the Madden-Julian Oscillation) during multi-decadal periods when the PDO signal is not strong.
Further work is needed to fully explore these relationships, and to extend the analysis into the Pacific. What can be concluded at this juncture is that the role of internal climate variability must also be considered when analysing tropical cyclone records.
Callaghan, J. & Power, S. (2010). Variability and decline in the number of severe tropical cyclones making land-fall over eastern Australia since the late nineteenth century. Clim. Dyn., 37, 647-662.
Chan, J.C. and Gray, W.M. (1981). Tropical Cyclone Movement and Surrounding Flow Relationships. Mon. Weather Rev., 110, 1354-1374.
Grant, A.P. and Walsh, K.J.E. (2001). Interdecadal variability in north-east Australian tropical cyclone formation. Atmos. Sci. Let., 1530-261X.
Kossin, J.P. (2018). A global slowdown of tropical-cyclone translation speed. Nature, 558, 104-107.
Knutson, T.R. et al. (2015). Global projections of intense tropical cyclone activity for the late twenty-first century from dynamical downscaling of CMIP5/RCP4.5 scenarios. J. Clim., 28, 7203-7224.
Mann, M.E. et al. (2017). Influence of anthropogenic climate change on planetary wave resonance and extreme weather events. Sci. Rep., 7, 19831.
Reisinger, A., R.L. Kitching, F. Chiew, L. Hughes, P.C.D. Newton, S.S. Schuster, A. Tait, and P. Whetton, 2014: Australasia. In: Climate Change 2014: Impacts, Adaptation, and Vulnerability. Part B: Regional Aspects. Contribution of Working Group II to the Fifth Assessment Report of the Intergovernmental Panel on Climate Change [Barros, V.R., C.B. Field, D.J. Dokken, M.D. Mastrandrea, K.J. Mach, T.E. Bilir, M. Chatterjee, K.L. Ebi, Y.O. Estrada, R.C. Genova, B. Girma, E.S. Kissel, A.N. Levy, S. MacCracken, P.R. Mastrandrea, and L.L. White (eds.)]. Cambridge University Press, Cambridge, United Kingdom and New York, NY, USA, pp. 1371-1438.
Sharmila, S. & Walsh, K.J.E. (2018). Recent poleward shift of tropical cyclone formation linked to Hadley cell expansion. Nature Climate Change, 8, 730-736.
Vecchi, G. A. et al. (2006). Weakening of tropical Pacific atmospheric circulation due to anthropogenic forcing. Nature 441, 73–76.
Risk Frontiers’ new Australian earthquake loss model is now available.
We are excited to announce the release of our new probabilistic earthquake loss model for Australia.
The updated model incorporates Geoscience Australia’s recent revision of the Australian Earthquake Catalogue and, for the first time, the inclusion of an active fault model.
The model also includes a number of updates incorporating the latest data and methodologies.
Estimated losses have generally decreased across the country due to the update of the historical earthquake catalogue. This effect is partly mitigated at longer return periods in regions where active faults have now been modelled.
This briefing contains excerpts from a recently published article in the journal Proceedings of the National Academy of Sciences (PNAS) by Will Steffen and colleagues. The paper has sparked recent media interest and scientific discussion on the possibility of abrupt climate change that lies outside ‘likely’ projections, through the surpassing of climate thresholds and the instigation of positive feedback loops. It calls for stronger action on climate mitigation because of this risk. Will Steffen is Emeritus Professor at the Climate Change Institute at ANU, and a Councillor for the Climate Council, an Australian climate change communications organisation.
The following are some extracts from Steffen’s paper, followed by some comments on this work. The full article and associated references can be accessed here.
Steffen et al.’s article – in short
We explore the risk that self-reinforcing feedbacks could push the Earth System toward a planetary threshold that, if crossed, could prevent stabilization of the climate at intermediate temperature rises and cause continued warming on a “Hothouse Earth” pathway even as human emissions are reduced. Crossing the threshold would lead to a much higher global average temperature than any interglacial in the past 1.2 million years and to sea levels significantly higher than at any time in the Holocene.
We examine the evidence that such a threshold might exist and where it might be. If the threshold is crossed, the resulting trajectory would likely cause serious disruptions to ecosystems, society, and economies. Collective human action is required to steer the Earth System away from a potential threshold and stabilize it in a habitable interglacial-like state.
Such action entails stewardship of the entire Earth System—biosphere, climate, and societies—and could include decarbonization of the global economy, enhancement of biosphere carbon sinks, behavioral changes, technological innovations, new governance arrangements, and transformed social values.
Our analysis suggests that the Earth System may be approaching a planetary threshold that could lock in a continuing rapid pathway toward much hotter conditions—Hothouse Earth. This pathway would be propelled by strong, intrinsic, biogeophysical feedbacks difficult to influence by human actions, a pathway that could not be reversed, steered, or substantially slowed.
Where such a threshold might be is uncertain, but it could be only decades ahead at a temperature rise of ∼2.0 °C above preindustrial, and thus, it could be within the range of the Paris Accord temperature targets. The impacts of a Hothouse Earth pathway on human societies would likely be massive, sometimes abrupt, and undoubtedly disruptive. Avoiding this threshold by creating a Stabilized Earth pathway can only be achieved and maintained by a coordinated, deliberate effort by human societies to manage our relationship with the rest of the Earth System, recognizing that humanity is an integral, interacting component of the system. Humanity is now facing the need for critical decisions and actions that could influence our future for centuries, if not millennia.
The idea of abrupt climate change and threshold events is well established, and there is evidence in the sedimentary record that such events have occurred multiple times in the past. To provide some context, we are talking about the transition from glacial to interglacial climate (or vice versa) within a matter of decades. These thresholds are difficult to forecast but readily identifiable in hindsight. Once a threshold is passed, a feedback loop can develop that reinforces and amplifies the climate signal – and this is the scenario that Steffen et al. explore. However, it is important to highlight that such feedbacks can equally lead to an abrupt climate signal that is opposite to the initial forcing.
A well-cited example of this is the ‘8.2 event’, where a warming trend led to a sudden decrease in atmospheric temperatures, most notably over the North Atlantic and Europe, around 8,200 years before present. One theory is that warming ocean temperatures in the Arctic led to sea ice melt, which freshened and warmed the surface ocean and inhibited the sinking of salty, cold water to the ocean floor. This sinking is required to sustain the ocean’s thermohaline circulation, of which the Gulf Stream (which transports warm water to NW Europe) is the surface signal. The slowing or shutdown of this mechanism around the Arctic may have led to a slowing or deviation of the Gulf Stream, and abruptly cooler air temperatures (on the order of 3 to 4 °C) over NW Europe. Paleo-climatic evidence suggests this all happened in the space of 20 years. Similarly, today, there is a strong ice melt and positive temperature signal around the Arctic. The climate response is highly complex and difficult to predict.
In their paper, Steffen et al. also use the term ‘Anthropocene’. This is a somewhat politically-charged term proposed for the present geological epoch dating from the commencement of significant human impact on the Earth’s environment and ecosystems, including, but not limited to, anthropogenic climate change (Waters et al., 2016). The past 10,000 years or so is known as the Holocene (the present inter-glacial period), thus the ‘Anthropocene’ would be a sub-division of this. There are suggestions that the Anthropocene should start from the beginning of the Industrial Revolution, or even the detonation of the first Atomic Bomb.
However, the International Commission on Stratigraphy (ICS), which has the prerogative of naming geological epochs, does not concur. Almost coincident with the publication of Steffen’s paper, the ICS ratified the subdivision of the Holocene and renamed the Late Holocene as the Meghalayan Epoch, snubbing the term Anthropocene. According to the ICS, the Meghalayan started about 4,250 years ago with a mega-drought that caused the collapse of a number of civilisations in Egypt, the Middle East, India and China about 2,250 years BCE. The ICS objects that the Anthropocene does not arise from geology and is not associated with a “stratigraphic unit” (rock layer); it is based more on the future than the past; is more a part of human history than the immensely long history of Earth; and is a political statement, rather than a scientific one (The Australian, August 11, 2018).
As reported by Mark Maslin (Professor of Earth System Science at University College London) in The Conversation (August 9, 2018), the ICS’s decision is a blow to those pushing for tough action on climate change, and “has profound philosophical, social, economic and political implications”. Maslin says “there is a huge difference to the story of humanity if we are living in the Meghalayan Age that makes no mention of the human impact on the environment — or in the Anthropocene Epoch, which says human actions constitute a new force of nature. The Meghalayan Age says the present is just more of the same as the past. The Anthropocene rewrites the human story, highlighting the need for planetary stewardship.”
The call to arms for stronger mitigation on climate change is a positive one, because it is unlikely any level of planning or adaptation could cope with temperature changes (and associated hazards) of 3–4 °C occurring over a couple of decades. However, inertia – in both the climate system and on a political level – may result in it being too little too late.
Colin N. Waters, Jan Zalasiewicz, Colin Summerhayes, Anthony D. Barnosky, Clément Poirier, Agnieszka Gałuszka, Alejandro Cearreta, Matt Edgeworth, Erle C. Ellis, Michael Ellis, Catherine Jeandel, Reinhold Leinfelder, J. R. McNeill, Daniel deB. Richter, Will Steffen, James Syvitski, Davor Vidas, Michael Wagreich, Mark Williams, An Zhisheng, Jacques Grinevald, Eric Odada, Naomi Oreskes, Alexander P. Wolfe (2016), The Anthropocene is functionally and stratigraphically distinct from the Holocene. Science, 351, 6269.
Lloyd, G. (2018). Will Steffen’s paper gets scientists hot under the collar. The Australian, August 11, 2018.
Maslin, Mark (2018). Anthropocene vs Meghalayan: why geologists are fighting over whether humans are a force of nature. Article published in The Conversation, August 9, 2018.
Steffen, Will, Johan Rockström, Katherine Richardson, Timothy M. Lenton, Carl Folke, Diana Liverman, Colin P. Summerhayes, Anthony D. Barnosky, Sarah E. Cornell, Michel Crucifix, Jonathan F. Donges, Ingo Fetzer, Steven J. Lade, Marten Scheffer, Ricarda Winkelmann, and Hans Joachim Schellnhuber (2018). Trajectories of the Earth System in the Anthropocene. Proceedings of the National Academy of Science, August 6, 2018. 201810141.
We are excited to announce we have released our new probabilistic earthquake loss model for Australia, QuakeAUS 6.0. The updated model, developed by Dr Valentina Koschatzky with input from Risk Frontiers’ Chief Geoscientist, Dr Paul Somerville, incorporates Geoscience Australia’s recent revision of the Australian Earthquake Catalogue (Allen et al., 2017), which has more than halved the rate of earthquakes exceeding magnitude 4.5. The main features of the new model are:
New Distributed Earthquake Source Model (based on RF analysis of the new GA catalogue – 2018)
Inclusion of an Active Fault Model
Updated Soil Classification (McPherson 2017)
Updated Soil Amplification Model (Campbell & Bozorgnia 2014)
A new distributed earthquake source model was implemented using the revised Geoscience Australia earthquake catalogue from the National Seismic Hazard Assessment (NSHA18) project (Allen et al., 2017), which will be released by GA in September 2018.
Active Fault Model
The active fault model incorporates earthquakes on potentially active faults based on GA’s Neotectonic Feature Database (Clark, 2012). These geologically identified rare and large prehistorical events are not represented in the short historical record of earthquakes in Australia.
Updated Soil Class and Soil Amplification Models
We implement the Australian Seismic Site Conditions Map (ASSCM), released by GA in June 2017, in the calculation of site amplification. This is a significant revision and upgrade of the previous map published in 2007. The site amplification model has also been updated (Campbell & Bozorgnia, 2014).
Variable Resolution Grid
We implemented the latest GNAF (2018) and Nexis (2018 V.9) data in the design of an updated variable resolution grid (VRG) and market portfolio to best reflect the current property exposure across all lines of business.
Effects on Losses
Compared with the previous version of Risk Frontiers’ QuakeAUS model, losses have generally decreased across the country (on a testing national portfolio, the average annual loss is 80% and the 200-year return period loss is 63% of former values) due to the update of the historical earthquake catalogue. This effect is partly mitigated at longer return periods in regions where active faults have now been modelled. The changes in losses are not uniform spatially or temporally. Sydney, for example, shows a drastic reduction in losses at every return period, while the losses for Melbourne show a slight increase. In other areas such as Adelaide, the losses are lower than in the previous model for short return periods, but that trend is reversed for return periods greater than 1,000 years.
Allen, T., J. Griffin, M. Leonard, D. Clark and H. Ghasemi (2017). An updated National Seismic Hazard Assessment for Australia: Are we designing for the right earthquakes? Proceedings of the Annual Conference of the Australian Earthquake Engineering Society in Canberra, November 24-26, 2017.
Campbell, Kenneth & Bozorgnia, Yousef. (2014). NGA-West2 Ground Motion Model for the Average Horizontal Components of PGA, PGV, and 5% Damped Linear Acceleration Response Spectra. Earthquake Spectra. 30. 1087-1115.
McPherson, A. A. (2017). A Revised Seismic Site Conditions Map for Australia. Record 2017/XX. Geoscience Australia, Canberra. DOI
Clark, D. (2012). Neotectonic Features Database. Commonwealth of Australia (Geoscience Australia).
Australian Journal of Emergency Management. July 2018 edition.
Flood levees are a commonly used method of flood protection. Previous research has proposed the concept of the ‘levee paradox’ to describe the situation whereby the construction of levees leads to a lowered community awareness of the risks of flooding and increased development in the ‘protected’ area. The consequences of this are the risks of larger losses in less frequent but deeper floods when levees overtop or fail. This paper uses the recent history of flooding and levee construction to investigate the ‘levee paradox’ through a study of flood preparedness and floodplain development in Lismore, NSW.
As reported in the San Francisco Chronicle on 21 June 2018, Pacific Gas and Electric Co. and its parent company, PG&E Corp., announced that they will take a $2.5 billion charge to cover expected losses from October’s deadly Wine Country wildfires. PG&E, blamed for sparking some of the most destructive blazes in California history, warned investors that the financial pain may just be beginning. The damage charge, which will be recorded in the current quarter, is larger than PG&E Corp.’s 2017 profit of $1.66 billion. But PG&E executives said that it represents just the low end of the utility’s potential losses from the fires; the final amount could be much higher. In Australia, there were numerous large court cases against power companies in the aftermath of the Black Saturday fires.
The following article, written by David R. Baker, appeared in the San Francisco Chronicle on 9 June 2018.
Firefighters were still struggling to contain the flames scorching the North Bay last October when residents first started lining up to sue Pacific Gas and Electric Co. State fire officials had already named PG&E’s power lines as possible ignition sources for the dozens of fires that erupted during a windstorm on Oct. 8, destroying more than 8,800 buildings across Northern California and killing 45 people. But they cautioned that their investigation was just getting under way.
Many survivors, however, were convinced that the culprit was PG&E. Eight days after the fires began, at least 175 people gathered at the Santa Rosa Hyatt to hear from three law firms preparing to take on the utility. In the months that followed, more than 150 individual suits would be filed against PG&E. Investigators with the California Department of Forestry and Fire Protection, or Cal Fire, are now finally releasing their reports on the causes of the fires. In every case so far, Cal Fire has traced the flames back to PG&E’s equipment.
Even more damning, in 11 of the 16 fires for which Cal Fire has issued reports, investigators found reason to believe that PG&E had broken state safety rules. They sent their findings to the district attorneys in the counties involved to explore possible prosecution. Cal Fire still has not named a cause for the biggest blaze that night — the Tubbs Fire, which raced from Calistoga to Santa Rosa, leveled whole neighborhoods and killed 24 people. The Cal Fire reports issued to date, however, could lead to criminal charges against PG&E, which was convicted on six felony charges following the fatal 2010 San Bruno gas pipeline explosion.
Even some of the lawyers now suing the company, however, consider criminal charges unlikely. Instead, the agency’s findings could give the survivors suing PG&E a way to hold the utility liable for economic damages caused by the fires, even in the instances in which Cal Fire did not accuse the utility of doing anything wrong. Under a legal concept called inverse condemnation, California utilities can be made to pay economic damages for fires tied to their equipment, regardless of whether they followed the state’s safety regulations.
Furthermore, by raising the possibility of wrongdoing, the reports could end up blocking PG&E’s ability to pass along any of those costs to its more than 5 million customers. California regulators have refused to let utilities incorporate wildfire lawsuit costs into their rates when negligence is involved. PG&E and the state’s other big utilities have been waging a lobbying campaign in Sacramento to change the state’s liability laws and shield them from wildfire lawsuits, or at least let them make their customers pay the costs. That effort may now be moot.
“I think that is now off the table,” said Patrick McCallum, a Sacramento lobbyist who lost his own home in the fires and has been trying to thwart PG&E’s push on liability laws. He leads a campaign called Up from the Ashes funded by some of the lawyers suing PG&E. “In my opinion, there are not the votes in the Legislature today to change inverse condemnation or strict liability,” McCallum said. “These reports show the Legislature and their staff what others have known, that there’s a history of mismanagement at PG&E.”
PG&E said it will continue pushing for liability changes, as well as work with state officials on fire prevention measures. “Liability regardless of negligence undermines the financial health of the state’s utilities, discourages investment in California and has the potential to materially impact the ability of utilities to access the capital markets to fund utility operations and California’s bold clean energy vision,” the company said.
The stakes for PG&E are high. Damage estimates from all of the Northern California wildfires, viewed together, stand at nearly $10 billion, according to the California Department of Insurance. Wall Street analysts don’t believe liability for the fires would bankrupt PG&E, but it would at the very least raise insurance prices for the company, and its customers would bear that extra cost. PG&E Corp., the utility’s parent company, made a $1.7 billion profit last year, on $17.1 billion in revenue. PG&E in December suspended its dividend to stockpile money, should it be held responsible for the fires.
Much still hinges on whether Cal Fire blames the Tubbs Fire on PG&E’s equipment. The company has claimed that a power line installed and owned by a private property owner started the blaze. Gerald Singleton, one of the attorneys suing PG&E, estimates that the Tubbs Fire alone accounts for close to half of the liability PG&E could face. “If the Tubbs report comes back, and they say, ‘No, remarkably, PG&E’s equipment wasn’t involved,’ then PG&E no longer has an immediate financial problem,” said Singleton, with the Singleton law firm.
PG&E said in its first-quarter financial report that it could need to raise money to deal with the fallout. Already, it has spent $259 million on repairs and service restoration. It has approximately $840 million in liability insurance — far short of what it might be required to pay.
The findings issued to date should make it easier for insurance companies, fire departments, cities and others to sue PG&E for losses caused by the fires, a process known as subrogation. Insurance companies will seek to recoup at least some of the claims they paid out to policyholders, the same way auto insurers after an accident will pay their customers first, then seek reimbursement from the at-fault party or his insurer. “It’s part of the normal process on how issues like this are resolved, making sure that the responsible party pays for the damage they cause,” said Mark Sektan, vice president with the Property Casualty Insurers Association of America. Sektan said any result from industry lawsuits against PG&E “will be a couple years away. Where it will help homeowners who have insurance is that when the insurers receive a settlement, they will refund whatever deductible the homeowner has paid.”
A glass sculptor who lost his life’s work in Napa’s Atlas Fire, Clifford Rainey is among many victims who are suing PG&E. Rainey also lost the Napa home he shared with his partner, Rachel Raiser, a floral designer, who also lost her studio. “We’re in a pickle financially,” Rainey said, noting that he had no insurance on his art studio. “The only way I can see to get any compensation is through one of these lawsuits.” So Friday’s news that Cal Fire investigators have determined that the Atlas Fire started with a PG&E power line gives him hope that he and Raiser will one day be able to rebuild. But he fears that PG&E power lines will remain unsafe, despite the finding. “It’s amazing that in California we still have these power cables above ground,” he said. “I’m from the U.K., and across most of Europe, electrical wires are always underground. In Napa, where I live, the power cables actually weave through trees. I cringe.”
Chronicle staff writers Kathleen Pender and Nanette Asimov contributed to this report.
In December 2017, the credit rating agency Moody’s warned U.S. cities and states to prepare for the effects of climate change or risk being downgraded. It explained how it assesses the credit risks to a city or state that’s being impacted by climate change — whether that impact be a short-term “climate shock” like a wildfire, hurricane or drought, or a longer-term “incremental climate trend” like rising sea levels or increased temperatures. It also takes into consideration communities’ preparedness for such shocks and their activities adapting to climate trends.
A recent report by Charles Donovan and Christopher Corbishley of Imperial College predicts that countries disproportionately impacted by climate change could have to pay an extra $170 billion in interest over the next 10 years. The following article by Henry Grabar, which appeared on Slate on 28 October 2017, explains why the bond market is not more worried by climate change.
The article draws examples from recent flooding in US cities and the infamous National Flood Insurance Program. Parts of the US like Miami, New Orleans and New York are feeling the effects of sea level rise now during extreme weather events, in part because of the low-lying topography and high population density of these coastal areas. In Eastern Australia, shorelines have – until now – broadly been able to keep pace with a rising tidal prism because of antecedent sediment conditions and a relatively steep coastal hinterland.
However, high and rising coastal populations and expanding infrastructure (~85% of Australia’s population currently lives near the coast) leave some big east coast cities like Newcastle, Brisbane and Cairns with significant exposure to higher sea levels in the coming decades.
We should perhaps be looking to examples in the US and elsewhere as a present-day ‘litmus test’ of financial markets response, and a window to the near-future time when sea level rise begins to have a more significant impact on some of the big east coast cities in Australia.
Early this month, when the annual king tide swept ocean water into the streets of Miami, the city’s Republican mayor, Tomás Regalado, used the occasion to stump for a vote. He’d like Miami residents to pass the “Miami Forever” bond issue, a $400-million property tax increase to fund seawalls and drainage pumps (they’ll vote on it on Election Day). “We cannot control nature,” Regalado says in a recent television ad, “but we can prepare the city.”
Miami is considered among the most exposed big cities in the U.S. to climate change. One study predicts the region could lose 2.5 million residents to climate migration by the end of the century. As on much of the Eastern Seaboard, the flooding is no longer hypothetical. Low-lying properties already get submerged during the year’s highest tides. So-called “nuisance flooding” has surged 400 percent since 2006.
Business leaders are excited about the timing of the vote in part because Miami currently has its best credit ratings in 30 years, meaning that the city can borrow money at low rates. Amid the dire predictions and the full moon floods, that rating is a bulwark. It signifies that the financial industry doesn’t think sea level rise and storm risk will prevent Miami from paying off its debts. In December, a report issued by President Obama’s budget office outlined a potential virtuous cycle: Borrow money to build seawalls and the like while your credit is good, and your credit will still be good when you need to borrow in the future.
The alternative: Flood-prone jurisdictions go into the financial tailspin we recognize from cities like Detroit, unable to borrow enough to protect the assets whose declining value makes it harder to borrow. The long ribbon of vulnerable coastal homes from Brownsville to Acadia has managed to stave off that cycle in part thanks to a familiar, federally backed consensus between homebuyers and politicians. Homebuyers continue to place high values on homes, even when they’ve suffered repeated flood damage. That’s because the federal government is generous with disaster aid and its subsidy of the National Flood Insurance Program, which helps coastal homeowners buy new washing machines when theirs get wrecked. Banks require coastal homeowners with FHA-backed mortgages to purchase flood insurance, and in turn, coastal homes are rebuilt again and again and again—even when it might no longer be prudent.
But there’s another element that helps cement the bargain: investors’ confidence that coastal towns will pay back the money they borrow. Homebuyers are irrational. Politicians are self-interested. But lenders—and the ratings agencies that help direct their investments—ought to have a more clinical view. Evaluating long-term risk is exactly their business model. If they thought environmental conditions threatened investments, they would sound the alarm—or just vote with their wallets. They’ve done it before—cities like New Orleans, Galveston, Texas, and Seaside Heights, New Jersey were all downgraded by rating agencies after damage from Hurricanes Katrina, Ike, and Sandy. But all have since rebounded. There does not appear to be a single jurisdiction in the United States that has suffered a credit downgrade related to sea level rise or storm risk. Yet.
To understand why, it helps to look at communities like Seaside Heights, the boardwalk enclave along the Jersey Shore whose marooned roller coaster provided the definitive image of the 2012 storm. Seaside Heights was given an A3 rating from Moody’s in 2013, meaning “low credit risk.” Ocean County, New Jersey—the county in which Seaside Heights sits—has a AAA rating. In the summer of 2016, before Ocean County sold $31 million in 20-year bonds, neither Moody’s Investor Services nor S&P Global Ratings asked about how climate change might affect its finances, the county’s negotiator told Bloomberg this summer. “It didn’t come up, which says to me they’re not concerned about it.”
The credit rating agencies would deny that characterization—to a point. They do know about sea level rise. They just don’t think it matters yet. In 2015, analysts from Fitch concluded, “sea level rise has not played a material role” in assessing creditworthiness, despite “real threats.” Hurricane Sandy had no discernible effect on the median home prices in Monmouth, Ocean, and Atlantic Counties, which make up New Jersey’s Atlantic Coast. The effect on tourism spending was also negligible.
“We take a lot from history, and historically what’s happened is that these places are desirable to be in,” explains Amy Laskey, a managing director at Fitch Ratings. “People continue to want to be there and will rebuild properties, usually with significant help from federal and state governments, so we haven’t felt it affects the credit of the places we rate.”
There are three reasons for that. The first is that disasters tend to be good for credit, thanks to cash infusions from FEMA’s generous Disaster Relief Fund. “The tax base of New Orleans now is about twice what it was prior to Katrina,” Laskey says, despite a population that remains 60,000 persons shy of its 2005 peak. “Longer term what tends to happen is there’s rebuilding, a tremendous influx of funds from the federal and state governments and private insurers.” Local Home Depots are busy. Rental apartments fill up with construction workers. Contractors have to schedule work months in advance. Look at Homestead, Florida, Laskey advised, a sprawling city south of Miami that was nearly destroyed by Hurricane Andrew. Today it is bigger than ever. “If there was going to be a place that wasn’t going to come back, that would have been it.”
What emerges from the destruction, for the most part, are communities full of properties that are more valuable than they were before, because they’re both newer and better prepared for the next storm. Or as a Moody’s report on environmental risk puts it, “generally disasters have been positive for state finances.” But this is entirely dependent on federal largesse: After Massachusetts’ brutal winter of 2015, FEMA granted only a quarter of the state’s request for aid. Moody’s determined that could negatively impact the credit ratings of local governments that had to shoulder the cost of snow and ice removal.
Second is that people still want to live on the shore. “The amenity value of the beach is something you can enjoy every day of the summer,” says Robert Muir-Wood, the chief research officer at Risk Management Solutions. “People may say, ‘The benefits of living on the beach to my health and wellbeing outweigh the impact of the flood.’” That calculus is strongly influenced by affordable flood insurance policies, but it has not changed. In a way, despite the risks, the sea is a more dependable economic engine for a community than, say, a factory that could shut its doors and move away any minute. Most bonds get paid off from property taxes. If property values remain high, bondholders have little to worry about. If, on the other hand, property values fall, tax rates must rise. If buildings go into foreclosure, or neighborhoods undergo “buy-outs” to restore wetlands or dunes, more of the burden to pay off that new seawall falls on everyone else.
Third: Most jurisdictions are large. New Jersey’s coastal counties also contain thousands of inland homes whose risk exposure is much, much lower. Adam Stern, a co-head of research at Boston’s Breckinridge Capital Advisors, argues that the first credit problems will come for small communities devastated by major storms.
Still, Stern said, his firm looks at these issues. “One of the things we try to get at when we look at an issuer of bonds that’s on the coast: Do you take climate change seriously? Are you planning for that?” But, he said, bond buyers—like everyone else—discount the value of future money, and hence future risk. When could the breaking point for the muni market come? Stern predicts that will happen when property values start to discernibly change in reaction to climate risk. It’s a game of chicken between infrastructure investors and homeowners.
As the Earth’s atmosphere warms, the atmospheric circulation changes. These changes vary by region and time of year, but there is evidence to suggest that anthropogenic warming causes a general weakening of summertime tropical circulation. Because tropical cyclones are carried along within the ambient environmental wind, there is an expectation that the translation speed of tropical cyclones has or will slow with warming.
Severe Tropical Cyclone Debbie, which made landfall near Mackay in March 2017, was an unusually slow event, crossing the coast at only seven kilometres per hour. Likewise, the “stalling” of Hurricane Harvey over Texas in August 2017 is another example of a recent, slow-moving event. While two events by no means constitute a trend, slow-moving cyclones can be especially damaging in terms of the rainfall volumes that are precipitated out over a single catchment or town (Fig. 1). A slow translation speed means strong wind speeds are sustained for longer periods of time, and it can also increase the surge-producing potential of a tropical cyclone.
But have changes in the translation speeds of tropical cyclones been observed in the Australian region and can we draw any conclusions about any impact of these changes on related flooding?
A recent article published in the journal Nature by James Kossin of NOAA looks at tropical cyclone translation speeds from 1949 through to 2016, using data from the US National Hurricane Center (NHC) and Joint Typhoon Warning Center (JTWC), and finds a 10 percent global decrease. For western North Pacific and North Atlantic tropical cyclones, he reports a slowdown over land areas of 30 percent and 20 percent respectively, and a slowdown of 19 percent over land areas in Australia.
The following is an extract from Kossin’s article, followed by some comments on the significance of his work for the Australian region. The full article and associated references are available here.
Kossin’s article – in short
Anthropogenic warming, both past and projected, is expected to affect the strength and patterns of global atmospheric circulation. Tropical cyclones are generally carried along within these circulation patterns, so their past translation speeds may be indicative of past circulation changes. In particular, warming is linked to a weakening of tropical summertime circulation and there is a plausible a priori expectation that tropical-cyclone translation speed may be decreasing. In addition to changing circulation, anthropogenic warming is expected to increase lower-tropospheric water-vapour capacity by about 7 percent per degree (Celsius) of warming. Expectations of increased mean precipitation under global warming are well documented. Increases in global precipitation are constrained by the atmospheric energy budget but precipitation extremes can vary more broadly and are less constrained by energy considerations.
Because the amount of local tropical-cyclone-related rainfall depends on both rain rate and translation speed (with a decrease in translation speed having about the same local effect, proportionally, as an increase in rain rate), each of these two independent effects of anthropogenic warming is expected to increase local rainfall.
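The multiplicative way these two effects combine can be sketched in a few lines of code. This is an illustration only, with hypothetical numbers (rain rate, footprint width, warming amount) that are not drawn from Kossin's analysis; it simply treats the rainfall total at a fixed point as rain rate times the residence time of the rain field.

```python
# Illustrative sketch (hypothetical numbers, not from Kossin, 2018) of how the
# two warming effects described above combine in local rainfall totals:
# rainfall at a point ~ rain rate x residence time of the rain field.

CC_RATE = 0.07  # ~7 percent increase in water-vapour capacity per deg C


def local_rainfall_mm(rain_rate_mm_h, footprint_km, speed_km_h):
    """Total rainfall at a fixed point under a passing storm (mm)."""
    return rain_rate_mm_h * (footprint_km / speed_km_h)


# Baseline: 20 mm/h over a 100 km rain footprint moving at 20 km/h -> 100 mm.
base = local_rainfall_mm(20, 100, 20)

# Halving the translation speed has the same local effect as doubling the
# rain rate at the original speed.
slow = local_rainfall_mm(20, 100, 10)        # 200 mm
heavy = local_rainfall_mm(40, 100, 20)       # 200 mm

# Now combine both warming effects: scale the rain rate by 2 deg C of
# capacity increase and slow the translation speed by 10 percent.
warmed_rate = 20 * (1 + CC_RATE) ** 2
combined = local_rainfall_mm(warmed_rate, 100, 20 * 0.9)

print(base, slow, heavy)          # 100.0 200.0 200.0
print(round(combined, 1))         # 127.2 - the effects compound
```

The point of the sketch is the proportionality: because translation speed sits in the denominator, a given percentage slowdown raises the local total by roughly the same proportion as an equal percentage increase in rain rate.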
Time series of annual-mean global and hemispheric translation speed are shown in Fig. 2, based on global tropical-cyclone ‘best-track’ data. A highly significant global slowdown of tropical-cyclone translation speed is evident, of −10 percent over the 68-yr period 1949–2016. During this period, global-mean surface temperature has increased by about 0.5 °C. The global distribution of translation speed exhibits a clear shift towards slower speeds in the second half of the 68-yr period, and the differences are highly significant throughout most of the distribution.
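A percentage change of this kind is typically read off a linear trend fitted to the annual means. The sketch below uses synthetic data generated to mimic a roughly 10 percent decline over 1949–2016; it is not the best-track record itself, only a demonstration of the calculation.

```python
# Sketch of deriving a percentage slowdown from a linear trend fitted to
# annual-mean translation speeds. The series is synthetic (seeded noise around
# a gentle linear decline), NOT the best-track data used by Kossin (2018).
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1949, 2017)                       # 68-yr period, as in the text
true_speed = 19.0 - 0.028 * (years - 1949)          # km/h, hypothetical decline
annual_mean = true_speed + rng.normal(0.0, 0.4, years.size)  # year-to-year noise

# Fit a straight line and express the fitted change as a percentage of the
# fitted value at the start of the record.
slope, intercept = np.polyfit(years, annual_mean, 1)
fit_start = slope * years[0] + intercept
fit_end = slope * years[-1] + intercept
pct_change = 100 * (fit_end - fit_start) / fit_start

print(f"fitted trend: {pct_change:.1f}% over {years[-1] - years[0]} years")
```

With noisy real-world data the fitted slope, not the raw endpoints, is what gives a defensible percentage figure, which is why significance testing of the trend matters so much in the basin-by-basin results discussed below.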
This slowing is found in both the Northern and Southern Hemispheres (Fig. 2b) but is stronger and more significant in the Northern Hemisphere, where the annual number of tropical cyclones is generally greater. The time series for the Southern Hemisphere exhibits a change point around 1980, but the reason for this is not clear.
The trends in tropical-cyclone translation speed and their signal-to-noise ratios vary considerably when the data are parsed by region but slowing over water is found in every basin except the northern Indian Ocean. Significant slowing of −20 percent in the western North Pacific Ocean and of −15 percent in the region around Australia (Southern Hemisphere, east of 100° E) are observed.
When the data are constrained within global latitude belts, significant slowing is observed at latitudes above 25° N and between 0° and 30° S. Slowing trends near the equator tend to be smaller and not significant, whereas there is a substantial (but insignificant) increasing trend in translation speed at higher latitudes in the Southern Hemisphere.
Changes in tropical-cyclone translation speed over land vary substantially by region (Fig. 3). There is a substantial and significant slowing trend over land areas affected by North Atlantic tropical cyclones (20 percent reduction over the 68-yr period), by western North Pacific tropical cyclones (30 percent reduction) and by tropical cyclones in the Australian region (19 percent reduction, but the significance is marginal).
Conversely, the tropical-cyclone translation speeds over land areas affected by eastern North Pacific and northern Indian tropical cyclones, and of tropical cyclones that have affected Madagascar and the east coast of Africa, all exhibit positive trends, although none are significant.
In addition to the global slowing of tropical-cyclone translation speed identified here, there is evidence that tropical cyclones have migrated poleward in several regions. The rate of migration in the western North Pacific was found to be large, which has had a substantial effect on regional tropical-cyclone-related hazard exposure.
These recently identified trends in tropical-cyclone track behaviour emphasize that tropical-cyclone frequency and intensity should not be the only metrics considered when establishing connections between climate variability and change and the risks associated with tropical cyclones, both past and future.
These trends further support the idea that the behaviours of tropical cyclones are being altered in societally relevant ways by anthropogenic factors. Continued research into the connections between tropical cyclones and climate is essential to understanding and predicting the changes in risk that are occurring on a global scale.
Significance for the Australian region
While this is an interesting piece of work, the results for the Southern Hemisphere and the Australian region are less clear than those for the North Atlantic and North Pacific basins.
The trend shown in Fig. 2b for the whole of the Southern Hemisphere is not significant and is clearly composed of two separate trends, each spanning around 30 years. Assuming a homogeneous dataset, the time series may be reflecting the strong influence of inter-decadal climate forcing.
In the Southern Hemisphere, the role of multi-decadal climate-ocean variability, like the Pacific Decadal Oscillation (PDO) or the Indian Ocean Dipole (IOD) has a large influence on decadal-scale climate variability (particularly in Australia) and can mask a linear, anthropogenically-forced trend.
The paper also notes that the global slowdown is only significant over water (which makes up around 90 percent of the best-track data used), whereas the trend for the 10 percent of global data corresponding to cyclones over land (where rainfall effects become most societally relevant) is not significant. It is therefore unclear, at a global scale, whether tropical cyclones have slowed down over land or not. The trend for the Australian region (Fig. 3f, Southern Hemisphere > 100° E), for both over-land and over-water slowdowns (approximately −19 percent), is only marginally significant. Further work could analyse translation speeds in the Australian region using the Bureau of Meteorology tropical cyclone database.
As with previous studies of changes to tropical cyclone behaviour in Australia, results are unclear. The relatively short time span of consistent records, combined with high year-to-year variability, makes it difficult to discern any clear trends in tropical cyclone frequency or intensity in this region (CSIRO, 2015).
For the period 1981 to 2007, no statistically significant trends in the total numbers of cyclones, or in the proportion of the most intense cyclones, have been found in the Australian region, South Indian Ocean or South Pacific Ocean (Kuleshov et al. 2010). However, observations of tropical cyclone numbers from 1981–82 to 2012–13 in the Australian region show a decreasing trend that is significant at the 93–98 percent confidence level when variability associated with ENSO is accounted for (Dowdy, 2014). Only limited conclusions can be drawn regarding tropical cyclone frequency and intensity in the Australian region prior to 1981, due to a lack of data. However, a long-term decline in numbers on the Queensland coast has been suggested (Callaghan and Power, 2010), and northeast Australia is also a region of projected decrease in tropical cyclone activity, including category 4–5 storms, according to Knutson et al. (2015).
In summary, based on global and regional studies, tropical cyclones are in general projected to become less frequent, with a greater proportion of high-intensity storms (stronger winds and greater rainfall). This may be accompanied by a general slowdown in translation speed. A greater proportion of storms may also reach further south (CSIRO, 2015).
The take home message? The known-unknowns are still quite a bit greater than the known-knowns.
CALLAGHAN, J. & POWER, S. 2010. A reduction in the frequency of severe land-falling tropical cyclones over eastern Australia in recent decades. Climate Dynamics.
CSIRO and BoM [CSIRO] 2015. Climate Change in Australia Information for Australia’s Natural Resource Management Regions: Technical Report, CSIRO and Bureau of Meteorology, Australia, pp 222.
DOWDY, A. J. 2014. Long-term changes in Australian tropical cyclone numbers. Atmospheric Science Letters.
KNUTSON, T.R., SIRUTIS, J.J., ZHAO, M., TULEYA, R.E., BENDER, M., VECCHI, G.A., VILLARINI, G. & CHAVAS, D. 2015. Global Projections of Intense Tropical Cyclone Activity for the Late Twenty-First Century from Dynamical Downscaling of CMIP5/RCP4.5 Scenarios. Journal of Climate, 28, 7203-7224.
KOSSIN, J.P. 2018. A global slowdown of tropical-cyclone translation speed. Nature 558, 104-107.
KULESHOV, Y., FAWCETT, R., QI, L., TREWIN, B., JONES, D., MCBRIDE, J. & RAMSAY, H. 2010. Trends in tropical cyclones in the South Indian Ocean and the South Pacific Ocean. Journal of Geophysical Research-Atmospheres, 115.
OFFICE OF THE INSPECTOR-GENERAL EMERGENCY MANAGEMENT 2017. The Cyclone Debbie Review: Lessons for delivering value and confidence through trust and empowerment. Report 1: 2017-18.