A second earthquake with magnitude larger than 5 occurred today (November 9, 2018) near Lake Muir in southwestern Western Australia, and Geoscience Australia assigned it a magnitude of 5.4. This earthquake, shown by the large red dot in Figure 1, occurred about 10 km southeast of the magnitude 5.7 earthquake that occurred on 17 September 2018. The aftershocks of today’s earthquake, shown by small red dots in Figure 1, are located at the southern end of the aftershock zone of the September 17 event, shown by the yellow dots. Today’s earthquake was preceded by a series of foreshocks that occurred overnight, and was felt between Albany and Perth. The shaking intensities of the two earthquakes are shown in Figures 2 and 3.
The orientation of the fault plane on which the 17 September earthquake occurred is shown on a Wulff net projection in Figure 4. This shows that the earthquake had a thrust mechanism on a fault plane striking east-northeast. The InSAR (Interferometric Synthetic Aperture Radar) map in Figure 5 shows that the west side moved up and the east side moved down on a plane dipping down to the west-northwest. The change in elevation along this dip direction is shown at the upper left of Figure 5.
The two earthquakes are shown by the yellow stars on a map of historical earthquake epicentres in the Southwest Western Australia Seismic Zone (SWWASZ) in Figure 6. The contours show annual probability of events of magnitude 5 and above from QuakeAUS6. The two earthquakes occurred on the southwestern edge of the SWWASZ.
As described in our Briefing Note 373, we have recently released our new probabilistic earthquake loss model for Australia, QuakeAUS 6.0. The updated model, developed by Dr Valentina Koschatzky with input from Risk Frontiers’ Chief Geoscientist, Dr Paul Somerville, incorporates Geoscience Australia’s recent revision of the Australian Earthquake Catalogue (Allen et al., 2017), which has more than halved the rate of earthquakes exceeding 4.5 in magnitude. The main features of the new model are:
New Distributed Earthquake Source Model (based on RF analysis of the new GA catalogue – 2018)
Inclusion of an Active Fault Model
Updated Soil Classification (McPherson 2017)
Updated Soil Amplification Model (Campbell & Bozorgnia 2014)
After the Christchurch earthquake sequence we are very aware of liquefaction and the large-scale damage it caused. It may come as a surprise to learn that liquefaction is also a major safety issue for the shipping industry, where it is sometimes called dry cargo liquefaction or dynamic separation. The article below was written by Susan Gourvenec, a professor of offshore geotechnical engineering at the University of Southampton, and was published in Ars Technica, a website covering news, reviews and opinion on technology, science, policy and society.
A lot is known about the physics of liquefaction, yet it is still causing ships to sink.
Think of a dangerous cargo, and toxic waste or explosives might come to mind. But granular cargoes such as crushed ore and mineral sands are responsible for the loss of numerous ships every year. On average, 10 “solid bulk cargo” carriers have been lost at sea each year for the last decade.
Solid bulk cargoes – defined as granular materials loaded directly into a ship’s hold – can suddenly turn from a solid state into a liquid state, a process known as liquefaction. And this can be disastrous for any ship carrying them – and its crew.
In 2015, the 56,000-tonne bulk carrier Bulk Jupiter rapidly sank around 300km south-west of Vietnam, with only one of its 19 crew surviving. This prompted warnings from the International Maritime Organization about the possible liquefaction of the relatively new solid bulk cargo bauxite (an aluminum ore).
A lot is known about the physics of the liquefaction of granular materials from geotechnical and earthquake engineering. The vigorous shaking of the Earth causes pressure in the ground water to increase to such a level that the soil “liquefies.” Yet despite our understanding of this phenomenon (and the guidelines in place to prevent it occurring), it is still causing ships to sink and take their crew with them.
Solid bulk cargoes
Solid bulk cargoes are typically “two-phase” materials, as they contain water between the solid particles. When the particles can touch, the friction between them makes the material act like a solid (even though there is liquid present). But when the water pressure rises, these inter-particle forces reduce and the strength of the material decreases. When the friction is reduced to zero, the material acts like a liquid (even though the solid particles are still present).
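The balance described above is captured by Terzaghi's effective stress principle: the frictional strength available between particles scales with the total stress minus the pore-water pressure. A minimal sketch of the idea, using illustrative numbers rather than measured cargo properties:

```python
# Terzaghi's effective stress principle: sigma' = sigma - u.
# Frictional strength is proportional to sigma'; when the pore-water
# pressure u approaches the total stress sigma, strength falls to zero
# and the material behaves like a liquid. Values are illustrative only.

def shear_strength(total_stress_kpa, pore_pressure_kpa, friction_coeff=0.6):
    """Mobilisable frictional shear strength (kPa) of a granular material."""
    effective_stress = max(total_stress_kpa - pore_pressure_kpa, 0.0)
    return friction_coeff * effective_stress

# A cargo element under 100 kPa total stress, with rising pore pressure:
for u in (0, 50, 100):
    print(f"u = {u:3d} kPa -> strength = {shear_strength(100, u):5.1f} kPa")
```

As the loop shows, the same material goes from fully frictional to strengthless purely through a rise in water pressure, with no change in the solid particles themselves.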
A solid bulk cargo that is apparently stable on the quayside can liquefy because pressures in the water between the particles build up as it is loaded onto the ship. This is especially likely if, as is common practice, the cargo is loaded with a conveyor belt from the quayside into the hold, which can involve a fall of significant height. The vibration and motion of the ship from the engine and the sea during the voyage can also increase the water pressure and lead to liquefaction of the cargo.
When a solid bulk cargo liquefies, it can shift or slosh inside a ship’s hold, making the vessel less stable. A liquefied cargo can shift completely to one side of the hold. If it regains its strength and reverts to a solid state, the cargo will remain in the shifted position and cause the ship to permanently tilt (or “list”) in the water. The cargo can then liquefy again and shift further, increasing the angle of list.
At some point, the angle of list becomes so great that water enters the hull through the hatch covers, or the vessel is no longer stable enough to recover from the rolling motion caused by the waves. Water can also move from within the cargo to its surface as a result of liquefaction, and subsequent sloshing of this free water can further reduce the vessel's stability. Unless the sloshing can be stopped, the ship is in danger of sinking.
The International Maritime Organization has codes governing how much moisture is allowed in solid bulk cargo in order to prevent liquefaction. So why does it still happen?
The technical answer is that the existing guidance on stowing and shipping solid bulk cargoes is too simplistic. Liquefaction potential depends not just on how much moisture is in a bulk cargo but also other material characteristics, such as the particle size distribution, the ratio of the volume of solid particles to water and the relative density of the cargo, as well as the method of loading and the motions of the vessel during the voyage.
The production and transport of new materials, such as bauxite, and increased processing of traditional ores before they are transported, means more cargo is being carried whose material behavior is not well understood. This increases the risk of cargo liquefaction.
Commercial agendas also play a role. For example, pressure to load vessels quickly encourages more forceful loading, even though this risks raising the water pressure in the cargo. And pressure to deliver the same tonnage of cargo as was loaded may discourage the crew from draining water from cargoes during the voyage.
What’s the solution?
To tackle these problems, the shipping industry needs to better understand the material behavior of solid bulk cargoes now being transported and prescribe appropriate testing. New technology could help. Sensors in a ship’s hold could monitor the water pressure of the bulk cargo. Or the surface of the cargo could be monitored, for example using lasers, to identify any changes in its position.
The challenge is developing a technology that is cheap enough, quick to install and robust enough to survive loading and unloading of the cargo. If these challenges can be overcome, combining data on the water pressure and movement of the cargo with information on the weather and the ship’s movements could produce a real-time warning of whether the cargo is about to liquefy.
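As a sketch of how such a warning might combine readings, the rule below flags a hold when the pore-pressure ratio (pore pressure over total vertical stress) approaches 1 while the vessel is rolling heavily. The threshold values and the sensor interface are hypothetical, not an industry standard:

```python
# Hypothetical rule-based liquefaction alert combining two sensor streams:
# in-hold pore pressure and vessel roll angle. Thresholds are illustrative;
# a real system would be calibrated per cargo and vessel.

def liquefaction_alert(pore_pressure_kpa, total_stress_kpa, roll_deg,
                       ru_threshold=0.7, roll_threshold=10.0):
    """Return True if the cargo is judged at risk of liquefying.

    ru = u / sigma_v is the pore-pressure ratio; as ru approaches 1 the
    water carries the full overburden and inter-particle friction vanishes.
    """
    ru = pore_pressure_kpa / total_stress_kpa
    return ru >= ru_threshold and abs(roll_deg) >= roll_threshold

print(liquefaction_alert(40, 100, roll_deg=3))    # calm seas, low pressure
print(liquefaction_alert(80, 100, roll_deg=15))   # heavy rolling, high pressure
```

A production system would of course fuse many more inputs (weather forecasts, loading history, cargo moisture tests), but the core decision is a comparison of measured pore pressure against the stress the cargo must carry.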
The crew could then act to prevent the water pressure in the cargo rising too much, for example, by draining water from the cargo holds (to reduce water pressure) or changing the vessel’s course to avoid particularly bad weather (to reduce ship motions). Or if those options are not possible, the crew could evacuate the vessel. In this way, this phenomenon of solid bulk cargo liquefaction could be overcome, and fewer ships and crew would be lost at sea.
Susan Gourvenec is a professor of offshore geotechnical engineering at the University of Southampton. This article was originally published on The Conversation and has been lightly edited to conform to Ars Technica style guidelines. Read the original article.
Macquarie University’s Lighthouse publication recently showcased research being undertaken by Risk Frontiers’ Andrew Gissing on planning and capability requirements for catastrophic events. This research, undertaken through the Bushfire and Natural Hazards Cooperative Research Centre, is investigating better practice approaches to planning and preparedness for extreme events that may overwhelm existing response frameworks. You can read the story highlighting the research in the context of the recent Sulawesi earthquake and tsunami here: https://lighthouse.mq.edu.au/article/october/sulawesis-earthquake-and-tsunami-provide-key-insights-into-catastrophe-response
Twenty-eight years on from the First Assessment Report in 1990, the IPCC’s most recent Special Report on Global Warming delivers an urgent warning to policymakers that we are reaching the point of no return for mitigating anthropogenic impacts on global warming and associated climate change. The report has divided opinion in Australia and further highlights the polarising power of climate change across government, academia and industry.
The report finds that limiting global warming to 1.5 °C, although “possible within the laws of chemistry and physics”, would now require rapid and unprecedented change in all aspects of society. Global net human-caused emissions of CO2 would need to fall by approximately 45 percent from 2010 levels by 2030, reaching ‘net zero’ around 2050. This means that any remaining emissions would need to be balanced by utilising as-yet under-developed technologies to remove CO2 from the air.
The report also highlights that we are already seeing the consequences of 1 °C of global warming through more extreme weather, rising sea levels and diminishing Arctic sea ice. One of the difficulties in communicating the impacts of seemingly small increases in mean temperatures is related to how this affects extreme weather events. The immediate reaction of many to “a 1 °C temperature increase” is to imagine oneself lying on a beach at 24 °C and then at 25 °C with global warming. Not that bad, right?
The key notion is that a small increase in the mean temperature also shifts the tails of the distribution, meaning the probability of extreme weather events increases just as much – and sometimes more, depending on the shape of the distribution – as the shift in the mean temperature (Figure 1). Prof Andy Pitman, the director of the ARC Centre of Excellence for Climate Extremes at the University of New South Wales, described this nicely in an anecdote to BBC News back in January:
“the probability works a bit like if you stand at sea level and throw a ball in the air, and then gradually make your way up a mountain and throw the ball in the air again. The chances of the ball going higher increases dramatically. That’s what we’re doing with temperature.”
Figure 1 Small changes in the averages of many key climate variables can correspond to large changes in weather. Source: Solomon et al. (2007).
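The tail effect can be illustrated with a simple calculation: shifting a normal temperature distribution by 1 °C multiplies the probability of exceeding a fixed extreme threshold severalfold. The numbers below (mean 25 °C, standard deviation 5 °C, 40 °C threshold) are illustrative, not observed climate statistics:

```python
from scipy.stats import norm

# Exceedance probability of an extreme threshold before and after a
# 1 degree shift in the mean; parameters are illustrative only.
mean, sd, threshold = 25.0, 5.0, 40.0

p_before = norm.sf(threshold, loc=mean, scale=sd)        # P(T > 40), mean 25
p_after = norm.sf(threshold, loc=mean + 1.0, scale=sd)   # P(T > 40), mean 26

print(f"before: {p_before:.4%}  after: {p_after:.4%}  "
      f"ratio: {p_after / p_before:.2f}x")
```

A 4% shift in the mean (25 °C to 26 °C) roughly doubles the chance of a 40 °C day in this toy example; the further into the tail the threshold sits, the larger the multiplier.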
What the report says
Abridged findings from the report that have high confidence (80% chance) are:
Global warming is likely to reach 1.5 °C between 2030 and 2052 if temperatures continue to increase at the current rate (Figure 2);
There are robust differences in climate model projections of regional climate characteristics between present-day and global warming of 1.5 °C and between 1.5 °C and 2 °C, most notably sea level rise and extreme heat;
Most climate change adaptation needs will be lower for global warming of 1.5 °C compared to 2 °C;
Estimates of the global emissions outcomes of current nationally stated mitigation ambitions as per the Paris Agreement would not limit warming to 1.5 °C, even if supplemented by challenging emissions reductions after 2030.
Figure 2 Observed monthly global mean surface temperature change and likely modelled responses to anthropogenic emission and forcing pathways relative to the 1.5 °C threshold, extending to 2.0 °C. Source: Figure SPM.1 in IPCC (2018).
The report advocates for anthropogenic climate change to be limited to 1.5 °C, and cites considerable additional impacts for land, energy, industry, buildings and transport in a “2 °C world”. The marine world is singled out for particular impacts under a 2 °C scenario, with modelling and observations suggesting the large-scale die-out of tropical coral reefs including, of course, the Great Barrier Reef (GBR).
Changes to the GBR not only have direct impacts for marine biodiversity, but also for cyclone risk along the adjacent mainland coast, which would potentially experience higher storm surge and wave exposure under a combination of rising sea levels and reduced energy dissipation by coral reefs.
The report is published at a time of international discord on climate mitigation, with most scientists acknowledging that the likelihood of achieving a plateau at the proposed 1.5 °C is very small. This is essentially a reflection of the myopic nature of global political institutions, set against the long-term nature of the problem at hand.
It also highlights the divisive nature of climate change in Australia. As elsewhere, it has become entangled with political agendas, class, energy and living standards. However, unlike elsewhere, adaptation to climate change has yet to occupy a central role in government policy as it has done, for example, in Europe. It has exposed an interesting divide between sectors that have come to the fore in recent years – with banking, insurance and industry at large leading the charge in understanding climate change risk and exposures, and the federal government lagging.
The righteous indignation of some in the public eye too often overshadows the high standards of objectivity that the science community imposes on itself in delivering the most robust findings possible. This was highlighted last week by the coincidental media release of an ‘audit’ of climate data used by climate models, undertaken as part of a PhD at James Cook University, with the apparent intention of undermining the IPCC’s report.
The audit claims that the underlying data used by Global Climate Models (GCM) is unfit for purpose, citing concerns around temperature anomalies, coverage and sample size, and that GCM predictions cannot be relied on as a result.
While the audit was undertaken as part of a high-quality PhD thesis (McLean, 2017), it is as yet unpublished in the peer-reviewed scientific literature. The concerns over observational data coverage and sample size in years prior to the satellite era are well known and this is why climate reanalysis data should be handled with care – particularly in the Southern Hemisphere.
The assertion that a limited number of spurious temperature anomalies in observational records would distort the global suite of ensemble climate model output is difficult to prove, given the strict uncertainty estimates and sampling checks climate institutions such as the Bureau of Meteorology and the UK Met Office undertake. However, it is still important that end-users understand the multiple layers of uncertainty inherent in climate modelling.
By comparison, the IPCC’s report included the contributions of 91 climate experts from 40 different countries and draws on over 6,000 cited references. The simultaneous reporting of both the climate audit and the IPCC report in the media gives equal weighting to the two and undermines the climate science, at an important juncture for climate politics internationally.
The global impasse on mitigation efforts only serves to highlight the importance of climate change adaptation planning and risk management in Australia, as we transition to a period in which we look to accommodate climate change impacts rather than reduce them, or indeed to pursue a combination of the two.
It also suggests (fascinatingly, from a data science perspective) that, as anthropogenic warming proceeds, we may no longer be able to use the recent past to predict near-future climate risk, as relationships between climate variables that held in the short-term past cease to be valid.
McLean, J.D. (2017) An audit of uncertainties in the HadCRUT4 temperature anomaly dataset plus the investigation of three other contemporary climate issues. PhD thesis, James Cook University, available https://researchonline.jcu.edu.au/52041/.
Solomon, S., et al. (2007) Technical Summary. In: Climate Change 2007: The Physical Science Basis. Contribution of Working Group I to the Fourth Assessment Report of the Intergovernmental Panel on Climate Change. Cambridge University Press, Cambridge, United Kingdom and New York, NY, USA.
Wednesday 31st October, 2018
at The Museum of Sydney
cnr Bridge and Phillip Streets, Sydney
2pm until 4.30pm followed by light refreshments in the foyer.
The focus on natural hazards, climate change and cyber risk is rising. The World Economic Forum has identified extreme weather as the number one global risk, Australian company directors must consider risks associated with a changing climate, and the Commonwealth Government is set to deliver a national framework for disaster mitigation. Major advances have been made in modelling earthquake risks, and cyber remains a significant challenge for industry. Cutting-edge scientific research and policy thinking has never been more important.
The 2018 Risk Frontiers seminar series continues a well-forged tradition of sharing scientific knowledge with the Australian insurance and disaster management industry. Come along to hear from our experts about the latest in science, policy and modelling advances, and join the team for light refreshments. This year we also welcome Barry Hanstrum, formerly the Regional Director for the Bureau of Meteorology, to deliver an informative keynote about the risks posed by East Coast Lows. We look forward to seeing you on the 31st of October.
Shaking it up: QuakeAUS reborn (6.0) – Paul Somerville and Valentina Koschatzky
Stormy horizon: the East Coast Low effect – Barry Hanstrum
Speed Talks – Modelling
Phoenix rising: FireAUS 3.0 – Mingzhu Wang
Shaken not stirred: Quake NZ 4.0 – Niyas Madappatt
A family of floods: improving cross-catchment relationships in FloodAUS – Thomas Mortlock
Speed Talks – Research
Towards modelling cyber risk – Tahiry Rabehaja
The new normal: ICA List revisited – John McAneney
A tale of two catastrophes: what determines behavior during disasters? – Andrew Gissing
Employees of Sponsor Companies
Attendance is free for employees of our Sponsor companies and their subsidiaries (Aon Benfield, Guy Carpenter, IAG, QBE, Suncorp and Swiss Re). Please email your name, employer and email address to firstname.lastname@example.org.
Risk Frontiers through the Bushfire and Natural Hazards Cooperative Research Centre is undertaking research into catastrophic disasters. As part of this research we are exploring how businesses can become more involved in the response to and recovery after major disasters. Risk Frontiers’ Andrew Gissing has recently published a piece in the Asia Pacific Fire Magazine summarising some thoughts on the topic. See link below.
Paul Somerville, Chief Geoscientist, Risk Frontiers
The 28 September Mw 7.5 Sulawesi earthquake occurred on the Palu-Koro fault, which ruptured southward from the epicentre to a location south of Palu. The Palu-Koro fault is a strike-slip fault on which the two sides slide horizontally past each other (east side north and west side south on a fault aligned north-south in this case), and such faults usually do not cause much vertical movement of the ground. In contrast, in thrust faults (including subduction earthquake faults) one side is thrust under the other. Consequently, thrust faults are much more likely to trigger a tsunami because the vertical movement of the ground raises a column of seawater, setting a tsunami in motion. Although most media attention has been focused on the tsunami, it is clear that strong near-fault ground motions from the earthquake caused massive structural damage and large-scale soil liquefaction (which also caused major structural damage) before the arrival of the tsunami in Palu.
Map of the region surrounding the 28 September Mw 7.5 Sulawesi earthquake showing forecast tsunami inundation and arrival time contours. The north-south alignment of aftershocks (red dots) approximately outlines the location of the Palu-Koro fault rupture zone. Sources: USGS/Indonesia Tsunami Early Warning System/Reuters.
Fifteen earthquakes with magnitudes larger than 6.5 have occurred near Palu in the past 100 years. The largest was a magnitude 7.9 event in January 1996, about 100 km north of the September 2018 earthquake. Several of these large earthquakes have also generated tsunamis. In 1927, an earthquake and tsunami caused about 50 deaths and damaged buildings in Palu, and in 1968 a magnitude 7.8 earthquake near Donggala generated a tsunami that killed more than 200 people.
Despite this local history and the 2004 Boxing Day Sumatra earthquake and tsunami, many people in Palu were apparently unaware of the risk of a tsunami following the earthquake. The tsunami occurred in an area where there are no tide gauges that could give information about the height of the wave. The Indonesian Tsunami Warning System issued a warning only minutes after the earthquake, but officials were unable to contact officers in the Palu area. The warning was cancelled 34 minutes later, just after the third tsunami wave arrived in Palu. It is likely that the bay’s narrow V-shape intensified the effect of the wave as it funneled through the narrow opening of the bay, inundating Palu at the end of the bay.
While it is possible that a more advanced tsunami warning system could have saved lives if it had been fully implemented, a system currently in the prototype stage may not have helped the people of Palu, as the tsunami arrived at the shore within 20 minutes of the earthquake. Such early warning systems are most useful for areas several hundred kilometres from the tsunami source. In regions like Palu where the earthquake and tsunami source are very close, education is the most effective warning system. If people feel more than 20 seconds of ground shaking, that should form the warning to immediately move to higher ground.
It is not yet clear whether the tsunami was caused by fault movement or by submarine landslides within Palu Bay triggered by shaking from the earthquake. It is possible that the fault cut through a submarine slope, with the horizontal displacement of the sloping sea floor pushing the water aside horizontally, causing it to pile up in a wave. Alternatively, as seems more likely, the tsunami may have been generated by a submarine landslide within the bay. The sides of the bay are steep and unstable, and maps of the sea floor suggest that submarine landslides have occurred there in the past. In that case, even if there had been tsunami sensors or tide gauges at the mouth of the bay, they would not have sensed the tsunami before it struck the shore in Palu.
It is clear from images of building damage that there was strong ground shaking in Palu and surrounding regions, as would be expected in the near-fault region of an earthquake of this magnitude. This shaking damage would have made structures even more vulnerable to the ensuing tsunami in low lying areas.
Another major cause of damage was the soil liquefaction in large areas within Palu and surrounding regions. Palu is situated on a plain composed of water-saturated soft sandy soils. Images from the disaster area show large-scale lateral spreading, in which buildings on chunks of thin, brittle crust slide across the underlying liquefied sand as if floating on water. This has resulted in the total destruction of buildings in large areas, leaving a churned landscape of debris and buildings that have sunk into the liquefied soil.
Australian Tsunami Risk and Warning
Australia is sufficiently remote from major subduction earthquake source zones that there is enough time (a few hours) for tsunami warning for such events, and in any case the hazard from such tsunamis is quite low. The main source of tsunami hazard may come from the occurrence of local earthquakes offshore that trigger submarine landslides on the continental slope. Such earthquakes are thought to be infrequent, and so the hazard from them is thought to be low. Marine surveys have been undertaken to identify potential locations of past underwater landslides and estimate their recency and frequency of occurrence. Such landslides would generate local tsunamis that would give little time for effective tsunami warning.
The Australian east coast has experienced at least 47 tsunami events in historical time. The largest was generated by the 22 May 1960 Mw 9.6 earthquake in Chile, the largest earthquake in recorded history. The recorded wave height at Fort Denison in Sydney Harbour was 1 metre, strong flow velocities damaged boats in Sydney Harbour and the Hunter River, and there was some minor inundation at Batemans Bay.
The Australian Government operates the Australian Tsunami Warning System, and states and territories maintain disaster plans and education programs. In the rare event of a large tsunami generated by a local source, emergency services would likely be overwhelmed and faced with significant challenges in achieving access to impacted areas due to damage to infrastructure.
Local sociality, which is local people’s everyday lives in and with their community, influences recovery in disaster-affected communities. This paper examines recovery in four disaster-impacted communities. In the two Australian examples rural communities were impacted by the 2011 Queensland floods. The two Japanese communities discussed suffered in the 2011 Tohoku earthquake and tsunami and, in one case, from radiation contamination arising from the damaged Fukushima Daiichi nuclear power plant. We argue that local sociality is often poorly understood by external parties such as disaster recovery experts and agencies. The Japanese planning concept of machizukuri – literally “creating communities” – incorporates physical, structural and social aspects in urban planning practices and was successfully applied to recovery processes in one of the Japanese cases. Drawing on that case, the paper concludes that machizukuri offers a valuable tool to foster better consideration of local sociality – both pre- and post-disaster – as an intrinsic component of communities’ vulnerability and resilience.
As the Earth’s atmosphere warms, the atmospheric circulation changes. Understanding how tropical cyclone activity may change in response to this warming is no easy task, with recent studies showing considerable dispersion in projected changes in activity for the Australian region. For example, Knutson et al. (2015) projected a decrease in tropical cyclone activity, including Cat 4-5 storms, around northeast Australia. Earlier, in 2014, the IPCC Fifth Assessment Report (Reisinger et al. 2014) summarised the projected changes as “Tropical cyclones are projected to increase in intensity and stay similar or decrease in numbers and occur further south (low confidence)”.
Identifying anthropogenic climate change influences on observational records of tropical cyclone activity is also challenging. Reliable records are relatively short and contain high year-to-year variability. Most research effort has focused on identifying changes in frequency and intensity: Callaghan and Power (2010), for example, documented a long-term decline in numbers of Cat 3-5 events making landfall over eastern Australia. Other recent studies have begun to consider changes in the distribution of events and other characteristics, including their forward motion (referred to as the translation speed). Sharmila and Walsh (2018) showed that events in the Australian region may reach further south while Kossin (2018), reported on by Risk Frontiers in Briefing Note 370 in July this year, found a global slowing of translation speeds. Here we further discuss the findings of Kossin (2018) for the Australian region.
Anthropogenic warming may also cause a general weakening of summertime tropical circulation (Vecchi et al., 2006; Mann et al., 2017) and, because tropical cyclones are carried along within their ambient environmental wind, the translation speed of tropical cyclones may slow, thereby increasing the potential for flooding and longer duration sustained high wind speeds (Kossin 2018). Tropical Cyclone Debbie (Queensland, March 2017) and Hurricane Harvey (Texas, August 2017) are two recent examples of slow-moving events.
In addition to the reported global slowdown in tropical cyclone translation speeds, Kossin (2018) also analysed trends across various regions. While those for the Northern Hemisphere were strong, those for the Australian region, both over land and over water, were only marginally significant and exhibited high multi-annual variability.
Here we present an exploratory investigation of the extent to which changes in tropical cyclone translation speeds around Australia (Kossin, 2018) are driven by internal climate variability, in addition to any possible anthropogenic warming signal. The proxy for translation speeds is the ambient winds that control the movement of tropical cyclones. We begin with the tropical Indian Ocean, west of 100 °E (Fig. 1), where Kossin (2018) reported a trend of −0.01 km/hr/yr between 1949 and 2016.
Chan and Gray (1981) suggested that winds between the 500 and 700 hPa pressure levels are the most relevant measure of the ambient winds that transport tropical cyclones. We extracted the 500 hPa scalar wind speed monthly means (November to April, coinciding with our tropical cyclone season) from 1980/81 to 2017/18, using the NCEP-NCAR Reanalysis, for the region between 5 and 20 °S and 50 and 100 °E. (Prior to 1980, the homogeneity of the reanalysis record is questionable.)
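The seasonal aggregation step can be sketched as follows: each November–April season is labelled by the year in which it ends, so that November and December are grouped with the following January–April. The monthly series below is synthetic; the real analysis would read NCEP-NCAR Reanalysis fields for the region stated above:

```python
import numpy as np
import pandas as pd

# Synthetic monthly 500 hPa scalar wind means (m/s); in the real analysis
# these would come from the NCEP-NCAR Reanalysis for 5-20 S, 50-100 E.
rng = np.random.default_rng(0)
months = pd.date_range("1980-01-01", "2018-12-01", freq="MS")
wind = pd.Series(8 + rng.normal(0, 1, len(months)), index=months)

# Keep only the tropical cyclone season (Nov-Apr) and label each season
# by the calendar year in which it ends (e.g. Nov 1980 -> season 1981).
season = wind[wind.index.month.isin([11, 12, 1, 2, 3, 4])]
season_year = season.index.year + (season.index.month >= 11)
seasonal_mean = season.groupby(season_year).mean()

print(seasonal_mean.head())
```

The first and last seasons in the record are partial (the series starts in January and ends in December), which is worth remembering before computing trends.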
We then compared the year-on-year scalar wind speeds (averaged within the analysis region) to the Pacific Decadal Oscillation (PDO) Index. The PDO is the leading principal component of North Pacific monthly sea surface temperature variability and can be seen as a long-lived (multi-decadal) ENSO-like pattern of Pacific climate variability. While the PDO is a Pacific-origin index, the tropical cyclone climatologies in Queensland, Northern Territory and Western Australia are principally influenced by Pacific ENSO variability, in addition to other regional climate indices such as the Indian Ocean Dipole and the Madden-Julian Oscillation.
Our results show a strong correlation between the ambient environmental winds in the tropical Indian Ocean (TIO) and the PDO (average of Nov–Apr PDO values for each year) during the period 1981–2000 (R = 0.54, p < 0.05, Fig. 2b), but this is much diminished during the period 2001–2018 (R = 0.23, p < 0.05, Fig. 2c). The two time series in Fig. 2a show that a change in the relationship between the variables occurred around the year 2000. They also show that wind speeds are consistently higher post-2000.
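The split-period correlation test can be sketched with `scipy.stats.pearsonr`. The series below are synthetic stand-ins for the seasonal wind means and the Nov–Apr-averaged PDO index, constructed so the coupling weakens after 2000 in the manner described above:

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(42)

# Synthetic stand-ins: a PDO-like index and a wind series that tracks it
# closely before 2000 and only weakly afterwards (with a higher mean).
years = np.arange(1981, 2019)
pdo = rng.normal(0, 1, len(years))
wind = np.where(years <= 2000,
                0.8 * pdo + rng.normal(0, 0.5, len(years)),        # coupled
                0.2 * pdo + rng.normal(0, 1.0, len(years)) + 1.0)  # decoupled

early = years <= 2000
r_early, p_early = pearsonr(wind[early], pdo[early])
r_late, p_late = pearsonr(wind[~early], pdo[~early])
print(f"1981-2000: R = {r_early:.2f}   2001-2018: R = {r_late:.2f}")
```

In the real analysis the two inputs would be the reanalysis-derived seasonal wind means and the observed PDO index, with the significance of each sub-period correlation assessed from the returned p-values.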
The PDO was in a sustained ‘warm’ phase (i.e. PDO positive, or El Niño–like) from approximately 1977 to 1999, after which it has experienced less coherent polarity (Fig. 3). Our analysis suggests that during this period, ambient winds (and by inference, tropical cyclone translation speeds) in the Indian Ocean were closely related to variability in the PDO. Post-2000, a weakening of the PDO signal coincides with a much-reduced level of correlation, and a jump to higher wind speeds.
It is well known that the PDO influences interdecadal variability of tropical cyclogenesis in northern Australia (Grant and Walsh, 2001). However, the influence of the PDO on cyclone translation speeds in this region remains unclear. Our brief analysis suggests that PDO-positive conditions suppress wind speeds in the upper atmosphere over the TIO and, by inference, reduce tropical cyclone translation speeds in this region. This is because during PDO-positive (El Niño–like) conditions, sea surface temperature anomalies occur further east in the Pacific Ocean, causing the area of cyclogenesis to move eastwards, away from Australia.
When the PDO signal becomes more La Niña-like to ENSO-neutral (i.e. post-2000, Fig. 3), wind speeds in the TIO increase but become less correlated with the PDO Index. This suggests a more complex relationship between upper-atmosphere winds in this region and other regional climate indices (such as the Indian Ocean Dipole or the Madden-Julian Oscillation) during multi-decadal periods when the PDO signal is not strong.
Further work is needed to fully explore these relationships and to extend the analysis into the Pacific. What can be concluded at this juncture is that the role of internal climate variability must also be considered when analysing tropical cyclone records.
Callaghan, J. & Power, S. (2010). Variability and decline in the number of severe tropical cyclones making land-fall over eastern Australia since the late nineteenth century. Clim. Dyn., 37, 647-662.
Chan, J.C. and Gray, W.M. (1981). Tropical Cyclone Movement and Surrounding Flow Relationships. Mon. Weather Rev., 110, 1354-1374.
Grant, A.P. and Walsh, K.J.E. (2001). Interdecadal variability in north-east Australian tropical cyclone formation. Atmos. Sci. Let., 1530-261X.
Kossin, J.P. (2018). A global slowdown of tropical-cyclone translation speed. Nature, 558, 104-107.
Knutson, T.R. et al. (2015). Global projections of intense tropical cyclone activity for the late twenty-first century from dynamical downscaling of CMIP5/RCP4.5 scenarios. J. Clim., 28, 7203-7224.
Mann, M.E. et al. (2017). Influence of anthropogenic climate change on planetary wave resonance and extreme weather events. Sci. Rep., 7, 19831.
Reisinger, A., R.L. Kitching, F. Chiew, L. Hughes, P.C.D. Newton, S.S. Schuster, A. Tait, and P. Whetton, 2014: Australasia. In: Climate Change 2014: Impacts, Adaptation, and Vulnerability. Part B: Regional Aspects. Contribution of Working Group II to the Fifth Assessment Report of the Intergovernmental Panel on Climate Change [Barros, V.R., C.B. Field, D.J. Dokken, M.D. Mastrandrea, K.J. Mach, T.E. Bilir, M. Chatterjee, K.L. Ebi, Y.O. Estrada, R.C. Genova, B. Girma, E.S. Kissel, A.N. Levy, S. MacCracken, P.R. Mastrandrea, and L.L. White (eds.)]. Cambridge University Press, Cambridge, United Kingdom and New York, NY, USA, pp. 1371-1438.
Sharmila, S & Walsh, K.J.E. (2018). Recent poleward shift of tropical cyclone formation linked to Hadley cell expansion. Nature, 8, 730-736.
Vecchi, G. A. et al. (2006). Weakening of tropical Pacific atmospheric circulation due to anthropogenic forcing. Nature 441, 73–76.
Risk Frontiers’ new Australian earthquake loss model is now available.
We are excited to announce the release of our new probabilistic earthquake loss model for Australia.
The updated model incorporates Geoscience Australia’s recent revision of the Australian Earthquake Catalogue and, for the first time, an active fault model.
The model also includes a number of updates incorporating the latest data and methodologies.
Estimated losses have generally decreased across the country due to the update of the historical earthquake catalogue. This effect is partly mitigated at longer return periods in regions where active faults have now been modelled.
This briefing contains excerpts from a recently published article in the journal Proceedings of the National Academy of Sciences (PNAS) by Will Steffen and colleagues. The paper has sparked recent media interest and scientific discussion on the possibility of abrupt climate change that lies outside ‘likely’ projections, through the crossing of climate thresholds and the instigation of positive feedback loops. It calls for stronger action on climate mitigation because of this risk. Will Steffen is Emeritus Professor at the Climate Change Institute at ANU, and a Councillor for the Climate Council, an Australian climate change communications organisation.
The following are some extracts from Steffen’s paper, followed by some comments on this work. The full article and associated references can be accessed here.
Steffen et al.’s article – in short
We explore the risk that self-reinforcing feedbacks could push the Earth System toward a planetary threshold that, if crossed, could prevent stabilization of the climate at intermediate temperature rises and cause continued warming on a “Hothouse Earth” pathway even as human emissions are reduced. Crossing the threshold would lead to a much higher global average temperature than any interglacial in the past 1.2 million years and to sea levels significantly higher than at any time in the Holocene.
We examine the evidence that such a threshold might exist and where it might be. If the threshold is crossed, the resulting trajectory would likely cause serious disruptions to ecosystems, society, and economies. Collective human action is required to steer the Earth System away from a potential threshold and stabilize it in a habitable interglacial-like state.
Such action entails stewardship of the entire Earth System—biosphere, climate, and societies—and could include decarbonization of the global economy, enhancement of biosphere carbon sinks, behavioral changes, technological innovations, new governance arrangements, and transformed social values.
Our analysis suggests that the Earth System may be approaching a planetary threshold that could lock in a continuing rapid pathway toward much hotter conditions—Hothouse Earth. This pathway would be propelled by strong, intrinsic, biogeophysical feedbacks difficult to influence by human actions, a pathway that could not be reversed, steered, or substantially slowed.
Where such a threshold might be is uncertain, but it could be only decades ahead at a temperature rise of ∼2.0 °C above preindustrial, and thus, it could be within the range of the Paris Accord temperature targets. The impacts of a Hothouse Earth pathway on human societies would likely be massive, sometimes abrupt, and undoubtedly disruptive. Avoiding this threshold by creating a Stabilized Earth pathway can only be achieved and maintained by a coordinated, deliberate effort by human societies to manage our relationship with the rest of the Earth System, recognizing that humanity is an integral, interacting component of the system. Humanity is now facing the need for critical decisions and actions that could influence our future for centuries, if not millennia.
The idea of abrupt climate change and threshold events is well established, and there is evidence in the sedimentary record that such events have occurred multiple times in the past. To provide some context, we are talking about a transition from a glacial to an interglacial-type climate (or vice versa) within a matter of decades. These thresholds are difficult to forecast but readily identifiable in hindsight. Once a threshold is passed, a feedback loop can develop that reinforces and amplifies the climate signal, and this is the scenario that Steffen et al. explore. However, it is important to highlight that such feedbacks can equally lead to an abrupt climate signal that is opposite to the initial forcing.
A well-cited example of this is the ‘8.2 event’, where a warming trend led to a sudden decrease in atmospheric temperatures, most notably over the North Atlantic and Europe, around 8,200 years before present. One theory is that warming ocean temperatures in the Arctic led to sea ice melt, which freshened and warmed the surface ocean and inhibited the sinking of salty, cold water to the ocean floor. This mechanism is required to sustain the ocean’s thermohaline circulation, of which the Gulf Stream (which transports warm water to NW Europe) is the surface signal. The slowing or closing down of this mechanism around the Arctic may have led to a slowing or deviation of the Gulf Stream, and abruptly cooler air temperatures (on the order of 3 to 4 °C) over NW Europe. Paleo-climatic evidence suggests this all happened in the space of 20 years. Similarly, today, there is a strong ice melt and positive temperature signal around the Arctic. The climate response is highly complex and difficult to predict.
In their paper, Steffen et al. also use the term ‘Anthropocene’. This is a somewhat politically charged term proposed for the present geological epoch, dating from the commencement of significant human impact on the Earth’s environment and ecosystems, including, but not limited to, anthropogenic climate change (Waters et al., 2016). The past 10,000 years or so is known as the Holocene (the present interglacial period), thus the ‘Anthropocene’ would be a subdivision of this. There are suggestions that the Anthropocene should start from the beginning of the Industrial Revolution, or even the detonation of the first atomic bomb.
However, the International Commission on Stratigraphy (ICS), which has the prerogative of naming geological time intervals, does not concur. Almost coincident with the publication of Steffen’s paper, the ICS ratified the subdivision of the Holocene and renamed the Late Holocene as the Meghalayan Age, snubbing the term Anthropocene. According to the ICS, the Meghalayan started about 4,250 years ago with a mega-drought that caused the collapse of a number of civilisations in Egypt, the Middle East, India and China about 2,250 BCE. The ICS objects that the Anthropocene does not arise from geology and is not associated with a “stratigraphic unit” (rock layer); it is based more on the future than the past; is more a part of human history than the immensely long history of Earth; and is a political statement, rather than a scientific one (The Australian, August 11, 2018).
As reported by Mark Maslin (Professor of Earth System Science at University College London) in The Conversation (August 9, 2018), the ICS’s decision is a blow to those pushing for tough action on climate change, and “has profound philosophical, social, economic and political implications”. Maslin says “there is a huge difference to the story of humanity if we are living in the Meghalayan Age that makes no mention of the human impact on the environment — or in the Anthropocene Epoch, which says human actions constitute a new force of nature. The Meghalayan Age says the present is just more of the same as the past. The Anthropocene rewrites the human story, highlighting the need for planetary stewardship.”
The call to arms for stronger mitigation on climate change is a positive one, because it is unlikely that any level of planning or adaptation could cope with temperature changes (and associated hazards) of 3 to 4 °C occurring over a couple of decades. However, inertia – in both the climate system and at the political level – may mean that action comes too little, too late.
Waters, C.N. et al. (2016). The Anthropocene is functionally and stratigraphically distinct from the Holocene. Science, 351, 6269.
Lloyd, G. (2018). Will Steffen’s paper gets scientists hot under the collar. The Australian, August 11, 2018.
Maslin, Mark (2018). Anthropocene vs Meghalayan: why geologists are fighting over whether humans are a force of nature. Article published in The Conversation, August 9, 2018.
Steffen, W. et al. (2018). Trajectories of the Earth System in the Anthropocene. Proceedings of the National Academy of Sciences, August 6, 2018, 201810141.