Climate change may lead to bigger atmospheric rivers

The following briefing, by Esprit Smith of NASA’s Jet Propulsion Laboratory, was published on the NASA website on 24 May 2018.

The study described below considers projections based on two Representative Concentration Pathways (RCPs) – 4.5 and 8.5. There are four pathways in total (including RCP2.6 and RCP6) and the findings of the IPCC Fifth Assessment Report are based upon these. Most of the discussion of results presented below is based on the RCP8.5 analysis, the most extreme scenario, which assumes minimal effort to reduce emissions. Toward the end of the briefing the results from the RCP4.5 analysis are noted as follows: ‘The team also tested the algorithm with a different climate model scenario that assumed more conservative increases in the rate of greenhouse gas emissions. They found similar, though less drastic changes.’


A new NASA-led study shows that climate change is likely to intensify extreme weather events known as atmospheric rivers across most of the globe by the end of this century, while slightly reducing their number.  The new study projects atmospheric rivers will be significantly longer and wider than the ones we observe today, leading to more frequent atmospheric river conditions in affected areas.

“The results project that in a scenario where greenhouse gas emissions continue at the current rate, there will be about 10 percent fewer atmospheric rivers globally by the end of the 21st century,” said the study’s lead author, Duane Waliser, of NASA’s Jet Propulsion Laboratory in Pasadena, California. “However, because the findings project that the atmospheric rivers will be, on average, about 25 percent wider and longer, the global frequency of atmospheric river conditions — like heavy rain and strong winds — will actually increase by about 50 percent.” The results also show that the frequency of the most intense atmospheric river storms is projected to nearly double.

Atmospheric rivers are long, narrow jets of air that carry huge amounts of water vapor from the tropics to Earth’s continents and polar regions. These “rivers in the sky” typically range from 250 to 375 miles (400 to 600 kilometers) wide and carry as much water — in the form of water vapor — as about 25 Mississippi Rivers. When an atmospheric river makes landfall, particularly against mountainous terrain (such as the Sierra Nevada and the Andes), it releases much of that water vapor in the form of rain or snow.

These storm systems are common — on average, there are about 11 present on Earth at any time. In many areas of the globe, they bring much-needed precipitation and are an important contribution to annual freshwater supplies. However, stronger atmospheric rivers — especially those that stall at landfall or that produce rain on top of snowpack — can cause disastrous flooding. Atmospheric rivers show up on satellite imagery, including in data from a series of actual atmospheric river storms that drenched the U.S. West Coast and caused severe flooding in early 2017.

In early 2017, the western United States experienced rain and flooding from a series of storms that arrived on multiple streams of moist air, each individually known as an atmospheric river. Image credit: NASA/JPL-Caltech

The study

Climate change studies on atmospheric rivers to date have been mostly limited to two specific regions, the western United States and Europe. They have typically used different methodologies for identifying atmospheric rivers and different climate projection models — meaning results from one are not quantitatively comparable to another.

The team sought to provide a more streamlined and global approach to evaluating the effects of climate change on atmospheric river storms.   The study relied on two resources — a set of commonly used global climate model projections for the 21st century developed for the Intergovernmental Panel on Climate Change’s latest assessment report, and a global atmospheric river detection algorithm that can be applied to climate model output. The algorithm, developed earlier by members of the study team, identifies atmospheric river events from every day of the model simulations, quantifying their length, width and how much water vapor they transport.

The team applied the atmospheric river detection algorithm to both actual observations and model simulations for the late 20th century. Comparing the data showed that the models produced a relatively realistic representation of atmospheric rivers for the late 20th century climate.  They then applied the algorithm to model projections of climate in the late 21st century. In doing this, they were able to compare the frequency and characteristics of atmospheric rivers for the current climate with the projections for future climate.

The team also tested the algorithm with a different climate model scenario that assumed more conservative increases in the rate of greenhouse gas emissions. They found similar, though less drastic changes. Together, the consideration of the two climate scenarios indicates a direct link between the extent of warming and the frequency and severity of atmospheric river conditions.

What does this mean?

The significance of the study is two-fold.   First, “knowing the nature of how these atmospheric river events might change with future climate conditions allows for scientists, water managers, stakeholders and citizens living in atmospheric river-prone regions [e.g. western N. America, western S. America, S. Africa, New Zealand, western Europe] to consider the potential implications that might come with a change to these extreme precipitation events,” said Vicky Espinoza, postdoctoral fellow at the University of California-Merced and first author of the study. And secondly, the study and its approach provide a much-needed, uniform way to research atmospheric rivers on a global level — illustrating a foundation to analyze and compare them that did not previously exist.

Limitations

Data across the models are generally consistent — all support the projection that atmospheric river conditions are linked to warming and will increase in the future; however, co-author Marty Ralph of the University of California, San Diego, points out that there is still work to be done. “While all the models project increases in the frequency of atmospheric river conditions, the results also illustrate uncertainties in the details of the climate projections of this key phenomenon,” he said. “This highlights the need to better understand why the models’ representations of atmospheric rivers vary.”

The study, titled “Global Analysis of Climate Change Projection Effects on Atmospheric Rivers,” was recently published in the journal Geophysical Research Letters.

Drivers risk death when driving into flood water: new study

This article by Fran Molloy was published in yesterday’s issue of  Macquarie University’s The Lighthouse.

New research shows that most Australian drivers think they can work out when it is safe to enter flood waters – as foolhardy Hobart drivers proved during last week’s natural disaster.

Read more: https://lighthouse.mq.edu.au/article/drivers-risk-death-when-driving-into-floodwater-new-study

Newsletter Volume 17, Issue 3

The new QuakeAUS: impact of revised GA earthquake magnitudes on hazards and losses

Paul Somerville and Valentina Koschatsky, Risk Frontiers

Geoscience Australia (GA) is updating the seismic hazard model for Australia through the National Seismic Hazard Assessment (NSHA18) project (Allen et al., 2017). The update includes corrections to measurements of local magnitude, ML, and the conversion of the ML values to moment magnitude, MW. Moment magnitude is the preferred magnitude type for probabilistic seismic hazard analyses, and all modern ground motion prediction equations use this magnitude type. This is because ML is a purely empirical estimate of earthquake size, whereas MW is a theoretically based measure of earthquake size derived from the seismic moment, M0, of the earthquake, which is given by:

M0 = μ A D

where A is the rupture area of the fault, D is the average displacement on the fault and μ is the shear modulus of rock. The seismic moment quantifies the size of each of the pair of opposing force couples that constitute the force representation of the shear dislocation on the fault plane. For comparison with the more familiar magnitude scale, MW is calibrated to M0 using the following equation:

MW = 2/3 log10 M0 – 10.7
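
As a worked example, the following minimal Python sketch applies both relations, assuming the standard Hanks and Kanamori convention in which M0 is expressed in dyne-cm; the rupture parameters are illustrative values, not those of any particular Australian earthquake.

import math

def moment_magnitude(m0_dyne_cm):
    # Hanks & Kanamori relation; assumes M0 is expressed in dyne-cm.
    return (2.0 / 3.0) * math.log10(m0_dyne_cm) - 10.7

# Illustrative rupture parameters (assumed, for demonstration only):
A = 100e6       # rupture area of the fault, m^2 (100 km^2)
D = 0.5         # average displacement on the fault, m
u = 3.3e10      # shear modulus of rock, Pa

M0_Nm = u * A * D            # seismic moment, N.m
M0_dyne_cm = M0_Nm * 1e7     # 1 N.m = 1e7 dyne-cm
print(f"M0 = {M0_dyne_cm:.2e} dyne-cm, MW = {moment_magnitude(M0_dyne_cm):.1f}")  # MW of about 6.1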

Prior to the early 1990s, most Australian seismic observatories relied on the Richter (1935) local magnitude (ML) formula developed for southern California. At regional distances (where many earthquakes are recorded), the Richter scale tends to overestimate ML relative to modern Australian magnitude formulae. Because of the likely overestimation of local magnitudes for Australian earthquakes recorded at regional distances, there is a need to account for pre-1990 magnitude estimates due to the use of inappropriate Californian magnitude formulae. A process was employed that systematically corrected local magnitudes using the difference between the original (inappropriate) magnitude formula (e.g., Richter, 1935) and the Australian-specific correction curves (e.g., Michael-Leiba and Malafant, 1992) at a distance determined by the nearest recording station likely to have recorded a specific earthquake.

The relationship between ML and MW developed for the NSHA18 demonstrates that MW is approximately 0.3 magnitude units lower than ML for moderate-to-large earthquakes (4.0<MW<6.0). Together, the ML corrections and the subsequent conversions to MW more than halve the number (and consequently the annual rate) of earthquakes exceeding magnitude 4.5 and 5.0, as shown in Figure 1. This has downstream effects on hazard calculations when forecasting the rate of rare large earthquakes using Gutenberg-Richter magnitude-frequency distributions in PSHA. A secondary effect of the ML to MW magnitude conversion is that it tends to increase the number of small and moderate-sized earthquakes relative to large earthquakes. This increases the Gutenberg–Richter b-value, which in turn further decreases the relative annual rates of larger potentially damaging earthquakes (Allen et al., 2017).
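
The effect of the b-value on large-event rates can be illustrated with the Gutenberg-Richter relation, log10 N(>=M) = a - b*M. The Python sketch below uses purely illustrative a and b values (not the NSHA18 parameters) to show how a higher b-value lowers the relative annual rate of large, potentially damaging earthquakes.

def annual_rate(m, a, b):
    # Annual number of earthquakes with magnitude >= m, from log10 N = a - b*m.
    return 10 ** (a - b * m)

a = 3.0                         # illustrative productivity term only
for b in (0.8, 1.0):            # a higher b-value steepens the magnitude-frequency distribution
    r4 = annual_rate(4.0, a, b)
    r6 = annual_rate(6.0, a, b)
    print(f"b = {b}: N(M>=4.0) = {r4:.2f}/yr, N(M>=6.0) = {r6:.4f}/yr, ratio = {r4 / r6:.0f}")

# With b = 0.8 roughly 1 in 40 events of M>=4.0 also reaches M>=6.0; with b = 1.0 it is 1 in 100,
# so large earthquakes become relatively rarer as the b-value increases.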

Figure 1. Cumulative number of earthquakes with magnitudes equal to or exceeding 4.5 (left) and 5.0 (right) for earthquakes in eastern Australia (east of 135°E longitude) from 1900 to 2010. The different curves show different stages of the NSHA18 catalogue preparation: original catalogue magnitudes, modified magnitudes (only local magnitude modified) and preferred MW (for all earthquakes). Source: Modified from Allen et al., (2017).

Preliminary seismic hazard calculations by Allen et al. (2017b) using the new earthquake source catalogue are compared with the existing peak ground acceleration (PGA) hazard map for Be site conditions for a return period of 500 years in Figure 2. We have updated the earthquake source model to incorporate the new GA catalogue into QuakeAUS, and obtained a new hazard map for Australia similar to that in Figure 2.

Figure 2. Existing (left) and draft (right) PGA maps for site class Be for a return period of 500 years. Source: Modified from Allen et al. (2017).

Preliminary loss estimates using the new version of QuakeAUS show large-scale reductions. Losses in a national residential portfolio for the 200-year ARP and for the AAL (average annual loss) are 30% and 35% of their former values respectively. The changes are not regionally uniform, with the largest reductions occurring in Perth and the smallest in Darwin. Among the five perils modelled on Risk Frontiers’ Multiperil Workbench (earthquake, fire, flood, hail and tropical cyclone), earthquake previously had the largest 200-year ARP loss but now lies below tropical cyclone in a near tie with flood and hail, and its AAL has dropped from second last to last, below hail.

We expect to release QuakeAUS 6.0, including these changes, early in the third quarter of 2018.

References

Allen, T., J. Griffin, M. Leonard, D. Clark and H. Ghasemi (2017). An updated National Seismic Hazard Assessment for Australia: Are we designing for the right earthquakes? Proceedings of the Annual Conference of the Australian Earthquake Engineering Society in Canberra, November 24-26, 2017.
Michael-Leiba, M., and Malafant, K. (1992). A new local magnitude scale for southeastern Australia, BMR J. Aust. Geol. Geophys. Vol 13, No 3, pp 201-205.

Tathra 2018 Bushfires

James O’Brien, Mingzhu Wang, Jacob Evans

The 2017/18 bushfire season across southeastern Australia burned through 237,869 hectares in 11,182 fires, prompting seven Emergency Warnings, 25 Watch and Act alerts and 16 Total Fire Ban days1. Despite the high number of fires, losses were limited until the Tathra fire and the loss of two homes at Comboyne. True to its mission of better understanding natural disasters, Risk Frontiers produced in-depth intelligence from aerial photography, field survey and GIS analytics. In what follows we report the results of these exercises.

Observations from the field

The early December 2017 heatwave (December was the 5th hottest on record) set the conditions for the bushfires in New South Wales on 18 March 2018. The high temperatures combined with high winds created the conditions under which an electrical fault apparently triggered the fire. The bushfires in Tathra destroyed around 65 homes, damaged 48 homes, destroyed 35 caravans and cabins and burned 1250 hectares of bushland, in addition to the emotional trauma experienced by survivors. Fortunately there were no casualties.

Risk Frontiers scientists (James, Mingzhu and Jacob) arrived in Tathra on April 10th, a little over three weeks after the peak of the bushfire damage; access was delayed by the high proportion (around 50%) of destroyed properties that contained asbestos. Our objective was to investigate the most affected areas in Tathra.

New above-ground electricity infrastructure in the region was a clear sign of the work undertaken to repair the obliterated power network and an indication of the extensive damage to infrastructure that left Tathra without power and water for a number of days following the fire.

We were able to quickly cover the whole town in less than a day on foot with the exception of some isolated areas in Reedy Swamp where the fire started and a small number of houses are located. This survey was useful to qualitatively gauge the assumptions used in our bushfire loss model, FireAUS. Our observations can be summarised as follows:

Zero-One (binary) damage ratios: We saw very few cases of partial damage to structures. It appears that once fire hits a structure during a bushfire it will almost certainly be completely destroyed. That’s not to say that the adjacent structures at the same address will always burn; we observed several cases of sheds that were burnt while the main house was unscathed and vice versa. The partial damage we did observe was charring to the sides of properties, where it appeared an active effort had been made to save the property.

Statistical dependence of bushfire risk on distance to bush: As described above, there is no clear pattern in the spatial distribution of damage when observed at close range. However, the statistics of bushfire damage based on aggregated data from a broad area do show the importance of the distance of a property to the nearby bush (see Figure 2). Whether a property is burnt in a bushfire seems determined by random chance, and this chance is conditioned by the distance to the bushland. In FireAUS, we assume that any two addresses equidistant from the bush have equal probabilities of burning; a minimal sketch of this assumption is given after these observations.

Independence of risk from building types: We observed damage to different construction types: unreinforced masonry, wood, fibro, mobile homes and even stone. There were destroyed brick houses away from the bush and spared wood and fibro houses close to the bush and vice-versa. The damage for this locality appears independent of building types even when globally influenced by proximity to bushland. If there are other risk factors that could explain the building damage, they are not visible in a short inspection and would require a full forensic investigation of each damaged building. The prevailing view was that newer homes generally seemed to perform better than older homes – and in one case a home built within the last 5 years sustained minimal bushfire damage (timber steps were destroyed) although that property was also actively defended by neighbours.
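
To make this concrete, the following is a minimal Python sketch of a FireAUS-style damage model consistent with the observations above: damage is binary (zero-one) and the probability of a home burning depends only on its distance to bushland. The probability curve used here is a hypothetical placeholder chosen for illustration, not the calibrated FireAUS relationship.

import math
import random

def burn_probability(distance_m):
    # Hypothetical decay of burn probability with distance to bushland (illustrative only).
    return max(0.05, 0.6 * math.exp(-distance_m / 100.0))

def simulate_damage(distance_m, rng):
    # Zero-one damage ratio: a structure is either destroyed (1) or effectively undamaged (0).
    return 1 if rng.random() < burn_probability(distance_m) else 0

rng = random.Random(0)
for d in (0, 50, 200, 500):
    destroyed = sum(simulate_damage(d, rng) for _ in range(10000))
    print(f"distance {d:>3} m: P(burn) ~ {burn_probability(d):.2f}, destroyed per 10,000 simulated homes: {destroyed}")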

Mapping damage

Figure 1 – Vicinity of Tathra / Reedy Swamp bushfire with prevailing wind direction on the day indicated by arrow and X indicating approximate ignition point.

As the events in Tathra unfolded, Risk Frontiers started the data gathering process to provide a view of this event. Our damage analysis is based on post-fire ground surveys and RFS burned area data captured from live data feeds on Sunday. We also acquired 25 km2 of pre-fire satellite imagery (WorldView-2, 2m resolution) for vegetation analysis and utilized Pitney Bowes Geovision for building location and bushland / tree data.

Figure 2 provides a complete map of damaged properties (house icons) overlain with bushland boundaries (green shading) derived from GeoVision data. It is clear that a number of these properties are surrounded by bushland and are therefore deemed to be at a distance of zero metres from the urban and bushland interface. Properties not within the bushland areas are assigned the linear distance in metres to the nearest pre-fire bushland area greater than 0.5 sq km in area, not necessarily the bushland that burned. Further analysis could be undertaken to classify the burned vegetation – however, in the Tathra region, the majority of bushland burned around properties and it is difficult to recover the clear timeline of local ignition.

There are eyewitness reports of ember attack, and the pattern of damage shows houses destroyed at some distance from the bushland interface, with adjacent properties destroyed either by further ember attack or by contagion from a neighbouring property.

Figure 2 – Location of destroyed homes and adjacent bushland in Tathra classified from pre-fire imagery and GeoVision (Minimum area threshold for contiguous vegetation: 500 m2)

Individual data

While Figure 2 demonstrates the spatial distribution of destroyed homes graphically, it is useful to quantify the loss as a function of distance to adjacent bushland. The data are presented in cumulative form so as to be consistent with other Risk Frontiers reports and other research. Figure 3 shows the cumulative percentage of destroyed buildings in relation to nearby bushland from recent major bushfires in Australia:

  • January 2003 Canberra bushfires (damaged suburbs include Duffy)
  • February 2009 “Black Saturday” bushfires in Victoria (damaged suburbs include Marysville and Kinglake)
  • February 2011 Perth bushfires (damaged suburbs include Roleystone)
  • January 2013 Tasmania bushfires (damaged suburbs include Dunalley)
  • January 2016 Yarloop, WA bushfire

Some new statistics and evidence that emerged from the bushfire damage in Tathra are as follows (a sketch of how such cumulative distance statistics are computed follows the list):

  • 42% of destroyed homes were within 0m of classified bushland boundaries.
  • 50% of surveyed destroyed homes were within 30m of the bushland interface and 72.6% of surveyed homes destroyed were within 100m of the bushland interface. These results closely match the findings previously presented in the “Bushfire Penetration into Urban Areas in Australia” report prepared for the 2009 Victorian Bushfires Royal Commission by Risk Frontiers.
  • No homes were destroyed further than 630m from bushland.
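
To illustrate how cumulative statistics of this kind are derived, the following minimal Python sketch computes the percentage of destroyed homes within a given distance of the bushland interface. The distance values used are hypothetical placeholders for illustration, not the Tathra survey data.

# Minimal sketch: cumulative percentage of destroyed homes by distance to bushland.
distances_m = [0, 0, 0, 5, 12, 25, 30, 45, 80, 95, 120, 250, 400, 630]  # placeholder values, metres

def cumulative_pct(distances, threshold_m):
    # Percentage of destroyed homes located within threshold_m metres of the bushland interface.
    return 100.0 * sum(d <= threshold_m for d in distances) / len(distances)

for t in (0, 30, 100, 630):
    print(f"within {t:>3} m of bushland: {cumulative_pct(distances_m, t):.1f}% of destroyed homes")
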
Figure 4 – A view of a destroyed property from Riverview Crescent, Tathra looking west in the direction of the fire’s ignition point across the Bega River. Note the burned vegetation in the distance and the lower green belt on the river’s edge demonstrating ember attack across the river.
Figure 5 – Map and aerial imagery showing property losses in the vicinity of Oceanview Drive, Tathra (1) in top left corner. Note the proximity to bushland immediately behind those properties and the distance to those lost in the lower right corner at Francis Hollis (2) and Bay View Drive (3), suggesting ember attack. House icons again denote destroyed properties. Wind direction was from top left to bottom right of image, red line and shading showing burnt boundary.

1https://www.rfs.nsw.gov.au/news-and-media/ministerial-media-releases/minister-urges-public-to-remain-prepared-with-ongoing-dry-conditions

Thwaites and Pine Island Glaciers of Antarctica and the Prospect of Rapid Sea Level Rise

Thomas Mortlock and Paul Somerville, Risk Frontiers.

The Thwaites and Pine Island glaciers in Antarctica are flowing toward the Amundsen Sea along a 250 km wide front. Further inland, the glaciers widen into a 3 km thick mass of ice covering an area the size of Texas. Scientists are worried that the glaciers are going into irreversible retreat, meaning that no amount of climate change reversal could stop them from melting into the ocean. If both of these glaciers were to melt completely, they would raise global sea level by 1 metre. Worse, together these glaciers act as a plug holding back enough ice to raise global sea level by over 3 metres – an amount that would submerge large areas of the world’s coastal cities.

When in balance, the quantity of snow at the glacier’s head matches the ice lost to the ocean at its front through the calving of icebergs (top of Figure 2). But Thwaites is out of balance: it has sped up and is currently flowing at over 4km per year. It is also thinning at a rate of almost 40cm a year. According to Dr Anna Hogg of Leeds University, this thinning started after 2000, spreading inland at a rate of 10-12km/year at its fastest.  She suggested that on Thwaites Glacier, the increase in ice speed has coincided with a period of rapid ice thinning, and grounding line retreat, which suggests that the observed changes may have been caused by warm ocean water reaching the glacier and accelerating ice melt. The grounding line refers to the zone where the glacier enters the sea and lifts up to form a buoyant platform of ice.

If warm ocean bottom-waters are able to get under this shelf (bottom of Figure 2), the grounding line can be eroded and the glacier forced backwards even if local air temperatures are sub-zero. In the case of Thwaites, a large portion of the ice stream sits below sea level, with the rock bed sloping back towards the continent.  This can produce marine ice sheet instability, in which the tall cliff that forms at the front of the glacier begins to calve in a runaway fashion.  This has not yet been seen in this part of Antarctica.

Figure 2.  Schematic diagram of stable and retreating glaciers.  Source: BBC

It is unclear how long it would take for the glaciers to melt completely – it may take decades or centuries.  Scientists have been looking back to the end of the last ice age, about 11,000 years ago, when global temperatures stood at roughly their current levels. There is growing evidence that the glaciers collapsed rapidly back then, flooding the world’s coastlines.  Unfortunately, as indicated above, the ocean floor on which the glaciers rest gets deeper toward the interior of Antarctica (Figure 2), so each new iceberg that breaks away exposes progressively taller and taller cliffs. When the cliffs become so tall that they cannot support their own weight, they may collapse catastrophically.

Scientists funded by the U.K. National Environment Research Council and the U.S. National Science Foundation are planning to go to the field to try to find out how quickly these glaciers might collapse.  They will monitor the way in which ocean water moves beneath the floating shelf, drill sediments from under and just in front of the glacier to find out what it did during past warming events on Earth, and use a submersible to explore the cavity under the buoyant sections of Thwaites.

Such massive ice sheet collapses have occurred in the past, but the climate effect of a huge freshwater input into the Southern Ocean, in the form of ice sheet melt, is far from certain. In the Northern Hemisphere, there is evidence to suggest that past periods of rapid ice sheet melt have actually led to periods of climate cooling, called Heinrich events, after the paleoclimatologist Hartmut Heinrich. Scientists have hypothesised that these freshwater dumps reduced ocean salinity enough to slow deepwater formation in the Arctic and the ocean circulation that relies on seawater density differences (in the form of salinity and temperature) to operate. Since the ‘thermohaline’ circulation plays an important role in transporting heat towards Europe, a slowdown would cause the North Atlantic to cool. Such deepwater formation also occurs around the rim of Antarctica.

The U.S. National Oceanic and Atmospheric Administration reports that, globally, sea level has risen about 6.6 cm above the 1993 average level, and it continues to rise by about 3 mm per year.  Meltwater streaming into the Amundsen Sea from Antarctica’s Thwaites glacier  accounted for about 4 percent of total global sea level rise in recent years — twice its contribution from the mid-1990s.

Glaciers like Thwaites matter a great deal to sea level because they are large masses of landlocked ice that hold back even larger masses of ice, keeping them from sliding into the sea. Landlocked ice changes sea level because when it melts, it introduces new water to the ocean. Sea ice, on the other hand, like the ice cap in the Arctic, can have major effects on climate when it melts, but it is basically water that is already in the ocean, and whether it is liquid or solid does not directly affect sea level around the world.

The current suite of projections of sea level rise are derived from a range of global climate models and a range of future carbon emission scenarios (Representative Concentration Pathways, RCPs) – thus inter-scenario and intra-model uncertainty is not insignificant. The range of uncertainty for global sea level rise to 2100 is largely shaped by the uncertain contributions of the Antarctic Ice Sheet and Greenland Ice Sheet, and thermal expansion of the oceans (Figure 3).

Figure 3.  Future probabilistic global sea-level projections for the 21st century under RCP2.6 (dark blue), RCP4.5 (light blue) and RCP8.5 (red) forcing scenarios. Source: IPCC (2014).

On the east coast of Australia, sea level rise to 2030 is expected to be on the order of 0.09 – 0.19 m, and between 0.22 – 0.88 m by 2090 (change relative to 1986 – 2005, taking the 95 % confidence limits of RCP 2.6, 4.5 and 8.5). A typical, convenient horizon for most coastal planning is to consider sea level rise of 0.9 – 1.0 m by 2100.

Over the past four years, Risk Frontiers has been developing a coastal risk visualisation tool, in association with the Office of Environment and Heritage, to help State Government entities and Local Governments visualise the impacts of sea level rise on assets and infrastructure over planning timeframes in NSW. Figure 4 shows an example of the expected seawater inundation around Newcastle with 0.5 to 1.5 m of sea level rise, using the tool. As can be seen, inundation even at this level impacts critical infrastructure and residential areas, with potentially significant costs to asset owners, insurers and the local community.

Figure 4.  Potential seawater inundation as a result of sea level rise between 0.5 to 1.5 m above the present-day high tide level in Newcastle using the Risk Frontiers / OEH coastal risk visualisation tool.

However, there is great uncertainty in the standard IPCC projections related to the West Antarctic Ice Sheet (WAIS), of which Thwaites is a small part. Some studies (Bakker et al., 2017; Pollard et al., 2015) suggest that if the whole of the WAIS were to collapse, it could contribute a further 3 – 4 m to global sea levels (Figure 5).

Figure 5.  Future sea-level projections including very uncertain contribution of the WAIS. Red line shows most extreme RCP scenario considered by the IPCC, and yellow and brown shaded areas demarcate different WAIS collapse scenarios, with deep uncertainty in between. Source: Bakker et al (2017)

Obviously, these levels of sea level rise would be catastrophic for the ~80% of the Australian population that currently lives within the coastal zone, and are well beyond current planning capabilities. While Government deliberates over how best to plan for sea level rise of 1 m by 2100, we perhaps should also be thinking about what provisions should be in place if sea level rise of 4 m or more were to occur. For the time being, all eyes are on Thwaites.

References

Bakker, A.M.R., et al. (2017), Sea-level projections representing the deeply uncertain contribution of the West Antarctic ice sheet, Scientific Reports, 7, 3880

Pollard, D., et al (2015), Potential Antarctic ice sheet retreat driven by hydrofracturing and ice cliff failure. Earth Plan. Sci. Lett. 412, 112–121.

IPCC, 2014: Climate Change 2014: Synthesis Report. Contribution of Working Groups I, II and III to the Fifth Assessment Report of the Intergovernmental Panel on Climate Change [Core Writing Team, R.K. Pachauri and L.A. Meyer (eds.)]. IPCC, Geneva, Switzerland, 151 pp.

FS-ISAC 2018 Cybersecurity Trends

By Tahiry Rabehaja.  Email: tahiry.rabehaja@riskfrontiers.com.

2017 was not a good year for cyber security. Victims ranged from small businesses to corporate giants such as Equifax, Deloitte and Kmart, with ‘improved’ ransomware such as WannaCry and NotPetya just two well-publicised examples. Such breaches emphasise that cybersecurity is not just a headache for IT departments but an issue warranting a top-down solution, starting with C-level executives. To this end, the Financial Services Information Sharing and Analysis Center (FS-ISAC) has recently published a report summarising the thoughts of over 100 financial sector Chief Information Security Officers (CISOs) regarding key priorities to improve digital security postures for 2018 (FS-ISAC, 2018). The survey shows that most executives are focused on improving their defensive strategies against cyber attacks.

Figure 1: Snapshot from the FS-ISAC report ranking the key priorities to improve cyber security postures in 2018.


[FS-ISAC is a non-profit global organisation providing a platform for sharing and analysing cyber and physical security information and intelligence. It currently has approximately 7000 members from 39 different countries. It was an initiative established by the financial service sector in response to the 1998 US Presidential Directive 63.] 

For more than a third (35%) of the executives, improving employees’ awareness about digital threats ranks top of the list. This comes as no surprise given employees have always been on the front line of defence against cyber attacks while remaining the weakest link. Indeed, most attacks against financial services companies exploit human weaknesses using social engineering, spear phishing and account take-over due to weak and reused passwords, etc. In 2017, Verizon reported that 1 in 14 employees were opening attachments or links sent through phishing emails and 1 in 4 were giving out account credentials or personal information (Verizon, 2017).

Investment into modern cyber resilient infrastructures (25%) comes in as runner up. Such an investment includes a progressive upgrade of existing network defence hardware and software as well as the creation of specialised departments that ensure digital information security.

Another recent study shows that subscription to Threat Intelligence, the emergent use of defence systems based on Machine Learning as well as strategic use of Cyber Analytics rank amongst the more cost-effective security investments (Accenture, 2017). That same study shows many companies over-investing in technologies that fail to deliver the desired cost-benefit ratios. These include extensive applications of Advanced Perimeter Controls and incongruous use of data loss prevention such as full disk encryption. Thus, efficient security programs should be implemented by ensuring an optimal cost-benefit ratio. This can be achieved by prioritising the security of critical assets and related infrastructures.

Figure 2: Snapshot from the Accenture report showing spending in security technology and the associated business benefit value.

2018 will also mark the long-awaited entry into force of various breach notification regulations. These include the General Data Protection Regulation in Europe, the Notifiable Data Breaches scheme that has just come into effect in Australia, and upcoming changes to China’s Cybersecurity and Data Protection laws. This means that compliance, explicitly cited as a priority by 2% of the surveyed executives, will also play an important role in shaping digital security, especially for companies dealing with personally identifiable information.

The focus on defensive solutions (FS-ISAC, 2018) is disturbing. The report also investigates the impact of hierarchical organisation on reporting frequency, but nothing is said about incident response. This may be because the executives interviewed were mainly from the financial industry. However, historical breaches show that response is just as important as defence. In fact, it is very likely that a resourceful hacker interested in a particular asset of a certain company will be able to hack in and extract or destroy the targeted information.

Targeted attacks are amongst the most costly and usually affect critical assets such as Intellectual Property. A successful attack on these key assets can have destructive impacts on the victim’s business model itself. Expenses incurred during a cyber event will span from direct costs — forensic and remediation cost, customer protection, regulatory penalty, etc. — to collateral damages — loss of customers, damage to reputation and brand name, increased cost of capital, etc. These costs can be considerably reduced using efficient incident response and mitigation policies as well as cyber insurance.

The White House Council of Economic Advisers estimates the average cost of a breach to be as high as $330 million when an event negatively affects the market value of the victim (Advisers, 2018). For instance, Equifax’s stock price dropped by more than 35% within 7 days of last year’s massive data breach disclosure. The emergence of cyber insurance is anticipated to provide cover against some of these financial losses. Various vendors are already providing cyber insurance products and it is expected this market will grow to over $7 billion within the next three years (PwC, 2015).

References

Accenture. (2017). Cost of Cybercrime Study. Retrieved from Accenture: https://www.accenture.com/au-en/insight-cost-of-cybercrime-2017

Advisers, W. H. (2018, February 16). Cost of malicious cyber activity to the US economy. Retrieved from https://www.whitehouse.gov/articles/cea-report-cost-malicious-cyber-activity-u-s-economy/

FS-ISAC. (2018, February 12). FS-ISAC Unveils 2018 Cybersecurity Trends According to Top Financial CISOs. Retrieved from FS-ISAC: https://www.fsisac.com/article/fs-isac-unveils-2018-cybersecurity-trends-according-top-financial-cisos

PwC. (2015). Insurance 2020 and beyond: Reaping the dividends of cyber resilience. Retrieved from https://www.pwc.com/gx/en/industries/financial-services/publications/insurance-2020-cyber.html

Verizon. (2017). Verizon Data Breach Investigation Report. Retrieved from Verizon: http://www.verizonenterprise.com/verizon-insights-lab/dbir/2017/

 

Why is Roman concrete more durable than modern concrete?

Jacob Evans, Risk Frontiers (jacob.evans@riskfrontiers.com)

Modern concrete is porous and degrades in contact with seawater. Seawater can seep into its pores and, when it dries out, the salts crystallize. The crystallization pressure of the salts produces stresses that can result in cracks and spalls. There are also other chemical processes, such as sulphate attack, lime leaching and alkali-aggregate expansion, all of which degrade modern concrete. Some submerged concrete objects may last only 10 years; meanwhile, 2000-year-old concrete constructed during the Roman Empire is still going strong (Figure 1). Why this is so is a question an international research team led by geologist Marie Jackson of the University of Utah sought to answer.

Figure 1: Erosion due to sea water on concrete pylons. Image: Brian Robinson.

The composition of Roman concrete has been long known, being a mixture of volcanic ash, quicklime (calcium oxide) and volcanic rock, but the science behind its resilience to seawater remained unknown until recently. It is thought volcanic material was used after the Romans observed ash from volcanic eruptions crystallize to form durable rock.

The research team discovered that while modern concrete is made to be inert, the Roman version interacts with the environment. When seawater interacts with the mixture, it forms the rare minerals aluminous tobermorite and phillipsite, which are believed to strengthen the material. This discovery could lead to the development of more resilient concrete for use in coastal environments.

Modern concrete is generally limestone mixed with other ingredients such as sandstone, ash, chalk, iron and clay. The mixture is designed to be inert and not interact with the environment. In coastal environments, building regulations govern the type of concrete used and the water-cement ratio, but the concrete is still porous: seawater can pass through the material, leading to corrosion and structural degradation.

As well as salt crystallization, the process whereby dried out salts within the concrete lead to a buildup of pressure, other chemical reactions can affect the integrity of concrete. These include sulphate attack, lime leaching and alkali-aggregate expansion (Figure 2). Sulphate attack occurs when sulphates in the water react with the hydrated calcium aluminate within the concrete. This changes the microstructure and leads to an increase in volume within the concrete, resulting in physical stress and potential cracking. Lime leaching is the simple process of water passing through the concrete and dissolving calcium hydroxide from the concrete. (Calcium hydroxide is formed from the action of calcium oxide and water.) This is often seen as white patches or stalactites on the exterior of the concrete and reduces its strength. Alkali-aggregate expansion is when aggregates, such as silica, decrease the alkalinity of the cement paste, resulting in the expansion of minerals and cracking of the cement.

Figure 2: A 2000 year old Roman jetty. Image: Art853.

Roman concrete however does not appear susceptible to any of these processes. The research team found that seawater, the kryptonite to modern concrete, was the magic ingredient responsible for the structural stability of the Roman mixture. The Roman concrete samples were found to contain rare aluminous tobermorite and phillipsite crystals. It is believed that with long-term exposure to seawater, tobermorite crystallizes from the phillipsite as it becomes more alkaline. This crystallization is thought to strengthen the compound, as tobermorite has long plate-like crystals that allow the material to bend rather than crack under stress. Pliny the Elder in the first century CE exclaimed “that as soon as it [concrete] comes into contact with the waves of the sea and is submerged [it] becomes a single stone mass (fierem unum lapidem), impregnable to the waves and every day stronger.”

Figure 3: The research group led by Marie Jackson obtaining samples from the Portus Cosanus pier in Orbetello, Italy. Image: Marie Jackson.

To arrive at these conclusions, Jackson et al. (2017) performed scanning electron microscopy (SEM), micro X-ray diffraction (XRD), Raman spectroscopy and electron probe microanalysis at the Advanced Light Source at the Lawrence Berkeley National Laboratory. Samples were obtained by drilling Roman harbour structures, and were compared with volcanic rock (Figure 3). The combination of these techniques, in conjunction with in situ analysis, provided evidence of crystallized aluminous tobermorite and phillipsite within Roman marine concrete (Figure 4). These crystals formed long after the original setting of the concrete. This finding was surprising, as tobermorite typically forms only at temperatures above 80 °C, though there is one occurrence of it forming at ambient temperature in the Surtsey volcano.

Figure 4: SEM image showing the presence of aluminous tobermorite and phillipsite within Roman marine concrete. Image from Jackson et al., Figure 6.

After this discovery, there is now a desire to develop a concrete mixture which replicates ancient Roman marine concrete. It could result in more environmentally friendly concrete construction, and would provide a mixture resilient to seawater and advantageous to coastal defence.

References

Jackson, M.D. et al. (2017). Phillipsite and Al-tobermorite mineral cements produced through low-temperature water-rock reactions in Roman marine concrete. American Mineralogist: Journal of Earth and Planetary Materials, 102(7), pp. 1435-1450.

Jackson, M.D. et al. (2013). Unlocking the secrets of Al-tobermorite in Roman seawater concrete. American Mineralogist, 98(10), pp. 1669-1687.

Suprenant, B.A. (1991). Designing concrete for exposure to seawater. Concrete Construction Magazine, pp.814-816.

 

 

Updated GNS Central New Zealand Earthquake Forecast

Paul Somerville, Risk Frontiers

Until now, GNS Science earthquake forecasts have been mainly focused on aftershocks occurring within the region affected by mainshock events.  This has been the case for the 2010 Mw 7.1 Darfield and 2011 Mw 6.2 Christchurch earthquakes as well as the 2016 Mw 7.8 Kaikoura earthquake. However, these events have the potential to trigger large earthquakes in adjacent regions (as described in Briefing Note 332). Now, GNS Science and an international group of earthquake scientists have developed a forecast, excerpts of which are reproduced below, that accounts in part for such large events. 

The probability of an earthquake with magnitude 7.8 and higher has doubled compared with that in the National Seismic Hazard Model, while that for an earthquake of magnitude 7.0 and higher has only increased by 20%.  Although the information released by GNS does not indicate which earthquake sources are contributing to these increases, we can deduce, with reference to Briefing Note 332, that the Hikurangi subduction zone is making the largest contribution, because the Wairarapa fault is the only crustal fault that is thought to be capable of producing an earthquake with Mw larger than 7.8.  The Hikurangi subduction zone may be capable of generating earthquakes as large as Mw 9.0.

GNS Science have taken considerable care to lucidly explain the forecast to the general public, but as in previous forecasts this new one is characterized by sanguine verbal descriptions of probabilities, such as “We estimate that there is a 2% to 14% chance – in verbal likelihood terms this is a very unlikely chance – of a magnitude 7 or above earthquake occurring within the next year in central New Zealand.”


GeoNet – Geological hazard information for New Zealand

Published: Tue Dec 19 2017 11:45 AM

https://www.geonet.org.nz/news/5JBSbLk9qw8OU4uWeI86KG

Updated Forecasts

We estimate that there is a 2% to 14% chance – in verbal likelihood terms this is a very unlikely chance – of a magnitude 7 or above earthquake occurring within the next year in central New Zealand.

The area inside the yellow box in the map below indicates the area of the earthquake forecasts that we refer to in this story. Our best estimate is a 6% (very unlikely) chance, which is about a 1 in 16 chance. This has decreased over the last year (in December 2016 it was greater than 20% within the next year), but it is still a higher chance than before the 2016 Kaikōura earthquake.

The table below shows the estimated chance of large earthquakes within the next year, and within the next 10 years. For example, within the next 10 years, there is a 10% to 60% chance (best estimate is 30%, unlikely) of a magnitude 7 or higher earthquake occurring in the area shown on the map (the map below shows what we mean by central New Zealand).

Updated probabilities table for central New Zealand.

The magnitude ranges are for a magnitude 7.8 or greater and magnitude 7 or greater within the next year and within the next 10 years.

How did we come up with these numbers?

Scientists from Japan, Taiwan, and USA met with our scientists to estimate the chance of a large earthquake occurring in central New Zealand. Together they assessed all the earthquake models, plus newly developed models of how slow slip events impact the probability of future earthquakes.

The results of these models were then combined with other information, including observations of how the numbers of earthquakes change during slow slip events, and evidence of earthquake clustering over the past few thousand years to estimate revised probabilities for large events in central New Zealand.

How does this forecast compare to before the Kaikoura earthquake?

The best estimate over the next year for a magnitude 7.0 or higher earthquake is 6%. This is an increase of 20% over the long-term estimates from the National Seismic Hazard Model (i.e., it is 1.2 times higher).

The best estimate for a magnitude 7.8 and higher earthquake is 1% within the next year. This is double the long-term estimates (i.e. it is twice as likely to happen now as it was before November 2016). The upper bounds for both magnitude range estimates are much higher than the long-term estimates.

That being said, the chance of a very big earthquake has been going down over the past year, since we first estimated the numbers following the Kaikoura earthquake.

Back in December 2016, there was a 5% chance of a M7.8+ earthquake within the coming year (December 2016 to November 2017), now the best estimate is 1% within the next year (December 2017 to November 2018).

This exercise has been focused on earthquake forecasts for larger magnitude earthquakes over central New Zealand rather than the Kaikōura aftershock sequence (those forecasts will still be regularly updated here).

Science contact: Matt Gerstenberger m.gerstenberger@gns.cri.nz.

Science input received from Matt Gerstenberger and Sally Potter (GNS Science), as well as valuable contributions from our colleagues at MCDEM and USGS. This research was funded by the Natural Hazards Research Platform Kaikoura Earthquake short-term research projects.

 

Newsletter Volume 17, Issue 2

Weather-related natural disasters 2017: was this a reversion to the mean?

Professor Roger Pielke Jr (University of Colorado, Boulder)

Last July, I observed here that the world had recently experienced an era of unusually low disasters and that streak of good luck was going to end sometime. Little could I know that less than one month later the United States would be hit by Hurricane Harvey, which was soon followed by Hurricanes Irma and Maria. Not only did these three major hurricanes emphatically break the more than decade-long drought in major hurricane landfalls in the US but, according to Aon Benfield (PDF), together they resulted in > $220 billion in total losses and >$80 billion in insured losses.

In this column I take a look back at 2017 and put its catastrophes into longer-term historical perspective. Media reports have sent mixed messages about the catastrophes of 2017. On the one hand, there have been headlines about the record insured catastrophe losses of 2017. On the other hand, the impact of record losses on pricing in insurance and reinsurance has been less than many had expected or hoped for. How might we reconcile these two perspectives?

The short answer is that 2017 did indeed result in record weather-related catastrophe losses, but understanding the significance of losses requires understanding the inexorable growth in global wealth in addition to patterns in weather extremes. Total global losses in 2017 were $344 billion worldwide according to Aon Benfield. In terms of total catastrophe losses, 2017 trails only 2011 which had $486 billion in losses. Insured losses followed a similar pattern, with $134 billion in total losses (almost all of which were weather-related), just below that of 2011 and just above 2005.

The figure below places total weather-related catastrophe losses into the context of increasing global GDP. The graph presents data on losses from Munich Re (1990-2017) and Aon Benfield (2000-2017) in relation to global GDP (World Bank), all expressed in constant 2017 dollars (US Office of Management and Budget, OMB). The data show clearly that 2017 was indeed an extreme year, with losses exceeding 0.4% of global GDP.

Yet, at the same time, since 1990 total global catastrophe losses as a proportion of global GDP are down by about one third, based on a simple linear trend. Over the past decade, reinsurance capacity, according to Aon Benfield, has increased by almost 80% (to $605 billion in 2017), whereas global GDP increased by about 24%. Simple math helps to explain why reinsurance market pricing did not respond as much as some thought despite the 2017 record losses: (1) global GDP has increased, (2) reinsurance capacity has increased much faster than global GDP and (3) catastrophe losses have decreased as a proportion of global GDP. These dynamics explain why, even with losses in 2017 at record levels, the market is nonplussed.
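
As a rough consistency check of this argument, the figures quoted above can be compared directly in the minimal Python sketch below; the global GDP value is an approximation (roughly USD 81 trillion in 2017) introduced for illustration, not a number taken from the article’s dataset.

# Rough consistency check using the loss and capacity figures quoted above.
total_losses_bn = 344.0           # Aon Benfield 2017 total economic losses, USD billion
insured_losses_bn = 134.0         # Aon Benfield 2017 insured losses, USD billion
reinsurance_capacity_bn = 605.0   # Aon Benfield 2017 reinsurance capacity, USD billion
global_gdp_bn = 81_000.0          # approximate 2017 global GDP, USD billion (assumption)

print(f"total losses as a share of global GDP: {100 * total_losses_bn / global_gdp_bn:.2f}%")   # ~0.4%
print(f"insured losses as a share of reinsurance capacity: {100 * insured_losses_bn / reinsurance_capacity_bn:.0f}%")

Even a record insured-loss year therefore absorbed only around a fifth of available reinsurance capacity, which is consistent with the muted pricing response described above.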

The majority of 2017 catastrophe losses and vast majority of insured losses resulted from the three major Atlantic hurricanes. How should we understand the 2017 Atlantic hurricane season?

According to Phil Klotzbach of Colorado State University, 2017 saw the most active North Atlantic hurricane season since 2005. (This assessment uses a metric called ACE, accumulated cyclone energy.) Three of the previous four years were well below average. These data reinforce what I wrote last July: “A simple regression to the mean would imply disasters of a scale not seen worldwide in more than a decade.” The active 2017 hurricane season reminds us that catastrophe luck cuts both ways.

Interestingly, in addition to the three major hurricanes that made landfall in the North Atlantic, there was only one other intense landfall worldwide (tropical cyclone Enawo, which struck Madagascar, killing 81 and causing >$20 million in damage). The figure below (based on updated data provided by Ryan Maue, @ryanmaue, and on our 2012 study) shows global tropical cyclone landfalls since 1970.

In 2017 there were 18 total landfalls at hurricane strength, above the long-term average of 15.3 (median = 15, record = 30 in 1971), but the four major landfalls were below the long-term average of 4.8 (median = 4; record = 9 (five times)). Overall, 2009 to 2016 were all below average for global landfalls, which helps to explain the good fortune experienced with respect to global weather catastrophe losses.

Despite the record catastrophe losses in 2017, according to Aon Benfield, the year continued a streak of well-below average (and below median) loss of life, according to longer-term data provided by Max Roser and Hannah Ritchie at Oxford University. However, large loss of life in 2004, 2008 and 2010 (>200,000 in each year) reminds us that the challenge of protecting lives in the face of disasters remains a crucial priority.

2017 saw a range of other catastrophes, including notable severe weather and wildfire events, together totaling more than $50 billion in losses, whereas flood losses were well below a longer-term average. However, despite these various catastrophes and associated losses, 2017 was notable primarily due to the three major hurricanes in the North Atlantic.

What does 2017 portend for 2018?

My advice has not changed: Even with the record losses of 2017, over more than a decade the world has had a run of good luck when it comes to weather disasters. The hurricanes of 2017 show how quickly good luck can come to an end.

Understanding loss potential in the context of inexorable global development and long term climate patterns is hard enough. It is made even more difficult with the politicized overlay that often accompanies the climate issue. Fortunately, there is good science and solid data available to help cut through the noise. 2017 was far from the worst we will see: even bigger disasters are coming – will you be ready?

The Hawaii nuclear alert: how did people respond?

Andrew Gissing & Ashley Avci

Nuclear tensions between the United States and North Korea have been extensively reported as both sides continue to posture via threats and propaganda and North Korea continues its missile tests. North Korea’s leader Kim Jong-Un has promised to decimate the US and has referred to President Trump as mentally ‘deranged’. A story in the New York Times based upon consultations with leading security experts recently suggested that the chance of war breaking out was between 15 and 50 percent (Kristof, 29/11/2017). Given the threat of an attack, U.S. government officials have encouraged residents to be prepared and have commenced monthly drills to test warning systems.

Within this environment of heightened geopolitical tensions, a single text message was sent in error to people in Hawaii on the 13th of January at 8.07am, warning of an imminent ballistic missile strike. The message read:

Emergency Alert. BALLISTIC MISSILE THREAT INBOUND TO HAWAII. SEEK IMMEDIATE SHELTER. THIS IS NOT A DRILL.

Officials alerted the public to the error via social media 13 minutes later, but it took 38 minutes to send a follow-up text message. In the meantime, the community was left to react as if a real missile was to strike Hawaii within twelve to fifteen minutes. It has been revealed that the delays were the result of local officials believing they required federal approval to cancel the alert.

The alert presents an opportunity to improve the understanding of how people react to warnings of extreme events. Risk Frontiers researchers conducted an analysis of media interviews with 207 individuals (respondents) who received the warnings to identify people’s attitudes and responses after the alert was received. The media interviews were sourced from a search of global online media outlets that had reported on the false alarm. Interview responses were coded, analysed and are reported in this article.

Results

Respondents commonly spoke of where they were when they received the alert. Locations varied, highlighting the importance of considering the many likely locations of people when an alert is issued. Most frequently respondents were in a hotel (n=39) or awake at home (n=38). Others were at home, but in bed (n=11); at work (n=10), in a car (n=10), at the beach (n=7) or in the ocean (n=3).

Most respondents received the alert via the official text message issued by the State (n=89), but a minority were informed by someone else: for example, a family member (n=17). Some respondents, however, spoke of being spared the stress of the false alarm as they did not receive the initial warning (Hawaii News Now, 16/1/2018).

Respondents often spoke about how they had trusted the alert because they had interpreted it in the context of existing North Korea and United States tensions (n=36) and therefore believed the alert to be plausible.

Those that chose to validate the warning did so through a multitude of different channels including social media (n=26), making contact with others (n=15), searching websites (n=16), listening for sirens (n=16), watching TV (n=11) or calling authorities (n=3). Based on interview statements in which residents stated how they had immediately responded to the warning, we estimate that a large number of residents may not have attempted to validate the warning (n=64).

Respondents often spoke about how they felt when they received the alert. Most often people described their emotions as fearful (n=51), concerned (n=23), panicked (n=21), upset (n=13) or calm (n=13).

Most respondents undertook protective actions in response to the warning (n=136), most often stating that they attempted to seek shelter within the building they were located in (n=43); called or texted others to alert them (n=23) or called or texted others to express their emotions (n=22). Other actions included packing emergency items (n=17); gathering family members (n=16); attempting to leave a building to seek shelter elsewhere (n=15) and leaving an open space to seek shelter (n=12). Eighteen respondents stated that they did not know what to do when they received the alert.

Respondents also commented on what they observed other people doing. Most commonly others were observed attempting to seek shelter (n=50), crying (n=26), running (n=25) or calling or messaging others (n=13).

When seeking shelter, respondents most often stated that they had attempted to shelter within their home (n=34), frequently within the bathroom (n=18). In addition, nineteen respondents spoke about sheltering within their hotel. Some commented that they did not know where to seek shelter (n=18).

A small number of respondents stated that they did not take any action (n=16). Reasons for not responding were that respondents thought that there was nothing that could be done (n=7); the warning was false as sirens did not sound (n=4); the missile would be shot down or would miss (n=2); or the warning was a joke or hoax (n=2).

Those who mentioned how they had discovered the alert was false found this information through social media (n=21) or via a text message from authorities (n=12). On discovering that the alert was a false alarm, respondents described their emotions as relieved (n=23), concerned (n=7) or upset (n=7).

Respondents commented on how the situation was handled or how warnings could be improved in the future. Most often, respondents were concerned about the lack of safeguards to avoid such a false alarm and that it took too long for authorities to notify the public that the alert was false. In some cases, respondents reflected on their own personal disaster preparedness, noting specific actions that they had not undertaken to be prepared.

Discussion and Conclusion

The Hawaii missile false alarm provides numerous insights into how people behave when warned of an extreme event. Practitioners should note the importance of social media as a communications mechanism, particularly for people to validate warnings and share with others.

The case study demonstrates the role of informal networks in both communicating and validating warnings. Hotels were clearly an important node of communication with their guests, and should always be considered an important network in communicating warnings in at-risk areas with large tourist populations.

Interestingly, it would appear that the population had been primed to respond to such an alert by their knowledge of, or concerns regarding, tensions between North Korea and the United States. This demonstrates the importance of communicating long-range risk information to build the community’s awareness of a threat so that individuals will recognise and respond to a warning when it is issued.

Given that the official advice as to what to do in the event of a real alert is for “all residents and visitors to immediately seek shelter in a building or other substantial structure”, it appears that most respondents reacted appropriately. However, consistent with previous Risk Frontiers briefings on community responses to warnings, not everyone responded or knew how to respond. This is a further demonstration that, even in extreme circumstances, emergency warnings cannot be relied upon to achieve full compliance by communities. This finding should be considered when warning systems are relied upon to justify permitting development in high-risk locations.

As for improving warning technologies, the Hawaii Emergency Management Agency has suspended all future drills until a review of the event has been completed, instituted a two-person activation/verification rule for all tests and actual alerts, and introduced a cancellation command that can be activated within seconds of a false alarm.

References

HAWAII NEWS NOW. 16/1/2018. If you didn’t get the false alert about an inbound missile, this might be why. Available: http://www.hawaiinewsnow.com/story/37269695/if-you-didnt-get-the-false-missile-alert-this-might-be-why [Accessed 27/1/2018].

KRISTOF, N. 29/11/2017. Are we headed toward a new Korean War? New York Times.

Risk Frontiers’ Multi-Peril Workbench 2.4 has now been released!

Workbench 2.4:

  • features a major update to our HailAUS hail loss model to national coverage
  • includes our Demand Surge model that can be applied to all Australian Perils
  • contains updates to FloodAUS, FireAUS, CyclAUS as well as many enhancements to the Workbench itself

Changes are coming to QuakeAUS … have you heard?

Prof. Paul Somerville of Risk Frontiers has been participating in the Geoscience Australia update of the seismic hazard model for Australia through the National Seismic Hazard Assessment (NSHA18) project. We have commenced preparations to update our Australian earthquake loss model, QuakeAUS, and expect to have preliminary results in the first quarter of this year!

Flood Deaths in the Northern Territory

Alice Carney1, Lucinda Coates1,2,3 and Katharine Haynes1,3

1 Macquarie University
2 Risk Frontiers
3 Bushfire and Natural Hazards Cooperative Research Centre

Risk Frontiers recently examined the circumstances surrounding deaths from flood events in Australia as part of a wider Bushfire and Natural Hazards CRC (BNHCRC)-funded project, An analysis of human fatalities and building losses from natural disasters. One of the findings was a heightened level of risk in the Northern Territory, which we decided to investigate more closely.

Introduction

In the previous Risk Frontiers research project, 1,859 individually identified flood-related deaths were recorded in Australia from 1900 to 2015; of these, 79% were males (Haynes et al., 2016). Death rates showed a steep, statistically significant decline up to 1960, with a lesser, steadier decline over the most recent 55 years (Haynes et al., 2016).

Queensland and New South Wales accounted for 75% of the total fatalities across Australia (Haynes et al., 2016). However, when deaths were examined in relation to population size, a heightened level of risk in the Northern Territory (NT) was revealed, with a death rate almost double that of the jurisdiction with the next-highest fatality rate (Haynes et al., 2016). When fatalities in the various jurisdictions were examined longitudinally, the expected downward trend in deaths over time was observed everywhere except the NT, where, particularly in more recent years, an increasing proportion of flood deaths occurred (Haynes et al., 2016).
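As a simple illustration of this population-adjusted comparison, a per-capita death rate can be computed as sketched below; the figures used are placeholders, not the Haynes et al. (2016) data.

# Illustrative only: the death and population figures below are placeholders,
# not the Haynes et al. (2016) data.

def death_rate_per_100k(deaths, population, years):
    """Average annual flood deaths per 100,000 residents over a study period."""
    person_years = population * years
    return deaths / person_years * 100_000

# A sparsely populated jurisdiction can record far fewer deaths than a populous
# one and still have a much higher population-adjusted death rate.
small = death_rate_per_100k(deaths=38, population=200_000, years=55)
large = death_rate_per_100k(deaths=900, population=7_000_000, years=55)
print(f"small jurisdiction: {small:.3f} deaths per 100,000 per year")
print(f"large jurisdiction: {large:.3f} deaths per 100,000 per year")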

This warranted further investigation. This briefing note summarises the results obtained when the demographic characteristics of flood-related deaths occurring in the NT from 1960 to 2015 were examined.

Fatality totals and trends

From 1960 to 2015 there were at least 27 fatal floods in the NT, claiming 38 lives. Annual flood fatalities are increasing with time, while death rates have remained constant (Figure 1; note: zero deaths from 1960 to 1964). Males accounted for 74% of the fatalities. The numbers of both male and female flood fatalities are increasing and, although the male-to-female ratio of flood deaths across Australia is decreasing, the gap between male and female flood deaths in the Northern Territory is growing, showing no sign of reaching parity in the near future.

 

Figure 1: Flood fatalities in the Northern Territory, 1960-2015

Males tend to be more at risk in flood events due to their risk-taking behaviour: for example, males are over-represented in attempting to cross floodwaters (67%) and in undertaking an activity near (100%) or in (80%) floodwaters. In all of these activities, males are aware of the flood and undertake the activity nonetheless. Females, on the other hand, are less likely to take these risks and are over-represented only in activities not near usual watercourses, such as staying at home (71%). These statistics suggest that gender-specific approaches must be developed to address the clear differences in causes of death.

With regard to age, males are over-represented in most age brackets. Most (98%) decedents were aged 0-59 years, with the age group most at risk being those aged 30-39 years.

The Daly River Drainage Basin has claimed the most lives, accounting for 34% of flood fatalities in the NT (Figure 2). There have been five fatal floods there since 1960, three of which were high-fatality (≥3 deaths) events. The Todd River Drainage Basin is the second most dangerous, accounting for 21% of fatalities and having also experienced five fatal flood events.

Figure 2: Location of flood fatalities in NT by drainage basin, 1960-2015

Indigeneity was investigated from 2000 onwards. A clear inequity is evident: indigenous people account for 65% of fatalities from 2000 to 2015, and indigenous males account for over half (52%) of all flood fatalities in the NT. This is alarmingly high in comparison with the group least at risk, non-indigenous females, who account for only 4% of fatalities. In terms of age, those most at risk are the 0-9 year-old non-indigenous and 10-19 year-old indigenous groups. It is clear that, as with the male population generally, indigenous persons are more prone to risk-taking activities: the majority (71%) of those crossing a flooded watercourse and 67% of those engaged in an activity in a flooded watercourse were indigenous. Research suggests indigenous persons are more likely to exhibit risk-taking behaviours due to poor education on the risks (Atkinson, 2012). A key to reducing flood fatalities in the NT is, therefore, training in flood-safe behaviours targeted at the indigenous population.

The riskiest “activity prior to death” was found to be crossing flooded watercourses, which accounts for over a third (35%) of fatalities in the NT; 67% of these decedents were male. A total of 57% of female decedents were attempting to cross floodwaters. The second most risky activity (21%) was being engaged in an activity near floodwaters, in which males were over-represented (100%). [The results for those engaged in an activity not near a usual watercourse – e.g., being at home – are skewed by one large event in 1977, in which five people drowned at a cattle station.]

The familiarity of the decedent with the death location was investigated. The term “familiar” as used in this research refers to being within 10 km of one’s house. Locals accounted for 87% of flood fatalities. In relation to activity prior to death, locals were most likely to be crossing floodwaters (29%), engaged in an activity in or near floodwaters (21%) or at home (21%). An analysis of those decedents who died at home showed clearly that they chose not to evacuate when warnings were received. This suggests that behavioural change must be pursued through education on the risks of ignoring flood warnings. [Note: a relatively small dataset means these results should be treated with some caution.]
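For illustration, the sketch below shows one way the 10 km “familiarity” rule could be applied, assuming coordinates are available for both the decedent’s residence and the incident location; the study’s actual geocoding method is not described here, and the coordinates used are hypothetical.

import math

# Illustrative only: the study defines "familiar" as being within 10 km of
# one's house; how locations were actually determined is not described here.

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    r = 6371.0  # mean Earth radius, km
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def is_local(home, incident, threshold_km=10.0):
    """Classify a decedent as 'local' if the incident was within 10 km of home."""
    return haversine_km(*home, *incident) <= threshold_km

# Hypothetical coordinates, roughly in the Alice Springs area.
print(is_local((-23.70, 133.88), (-23.75, 133.90)))  # True: a few km apart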

Some 25% of decedents from 2000 to 2015 were intoxicated; of those, 80% were attempting to cross a flooded river and 20% were engaged in activities in floodwaters. Eighty percent were male and 80% were indigenous. [Note: a relatively small dataset means these results should be treated with some caution.]

There are a few take-home messages around mitigation and education strategies for the Northern Territory. Appropriate strategies must be developed targeting, especially, indigenous males. The aim should be to educate on the risks floods present and on the measures that should be taken to avoid them, such as not attempting to cross, or engage in activities in or near, floodwaters. Haynes et al. (2016) give insight into potential mitigation strategies, which should be adapted to best suit the target population. The three key strategies should be to educate, impose consequences and apply structural interventions.

Acknowledgements

Risk Frontiers employed Macquarie University climate science PACE student Alice Carney to investigate the circumstances surrounding flood deaths in the Northern Territory (NT). PACE is Macquarie University’s Professional and Community Engagement program, which gives students the chance to explore key economic, social and ethical challenges by seeing first-hand how contemporary organisations (such as Risk Frontiers) address them, allowing them to develop new knowledge and skills and to explore future career opportunities.

The work utilised Risk Frontiers’ database PerilAUS and the National Coronial Information System (NCIS) database of coronial data, sourced from the Department of Justice and Regulation, Victoria: a resource of coronial records across Australia from July 2000 onwards.

References

Atkinson, J. 2012. Anthropometric correlates of reproductive success, facial configuration, risk taking and sexual behaviors among indigenous and Western populations: the role of hand-grip strength and wrist width. In: GALLUP, G. G. & SVARE, B. (eds.). ProQuest Dissertations Publishing.

Haynes, K., Coates, L., Van Den Honert, R., Gissing, A., Bird, D., Dimer De Oliveira, F., D’Arcy, R., Smith, C. & Radford, D. 2016. Exploring the circumstances surrounding flood fatalities in Australia—1900–2015 and the implications for policy and practice. Environmental Science & Policy, 76, 165-176.

National Coronial Information System. 2017. Home – National Coronial Information System. Available: http://www.ncis.org.au/ [Accessed 4/10/2017].

Hawaii False Alarm Hints at Thin Line Between Mishap and Nuclear War

The following article, by Max Fisher, appeared in The Interpreter, New York Times, on January 14, 2018, the day after state emergency officials in Hawaii issued a false warning to take shelter from an inbound missile threat. Three days later, Japan’s public broadcaster, NHK, accidentally sent news alerts that North Korea had launched a missile and that citizens should take shelter. NHK corrected itself five minutes later and apologized for the error on its evening news, initially blaming the J-Alert system but later conceding that the system was not at fault. NHK’s swift rectification of its error stands in contrast to the 38-minute delay by officials in Hawaii on Saturday in cancelling the warning of an incoming ballistic missile threat. Notwithstanding common misperceptions to the contrary, there is no legal means of preventing a launch once the President of the U.S. has given the order, which can be issued without consultation and which, under a “launch on launch (by the enemy)” posture, is timed to pre-empt the destruction of ground-based missiles by the enemy. Today’s issue of The Independent has an interesting take on how the Hawaii alarm may have been interpreted in Pyongyang:

http://www.independent.co.uk/voices/hawaii-missile-alert-nuclear-war-text-stop-north-korea-us-norad-defence-pacific-a8161501.html

and a very close call is described by former U.S. Secretary of Defense William Perry on this website:

https://www.npr.org/2018/01/16/578247161/ex-defense-chief-william-perry-on-false-missile-warnings


Nuclear experts are warning, using some of their most urgent language since President Trump took office, that Hawaii’s false alarm [on January 13, 2018], in which state agencies alerted locals to a nonexistent missile attack, underscores a growing risk of unintended nuclear war with North Korea. To understand the connection, which might not be obvious, you need to go back to the tragedy of Korean Air Lines Flight 007.

In 1983, a Korean airliner bound from Anchorage to Seoul, South Korea, strayed into Soviet airspace. Air defense officers, mistaking it for an American spy plane that had been loitering nearby, tried to establish contact. They fired warning shots. When no response came, they shot it down, killing all 269 people on board.

But the graver lesson may be what happened next. Though it was quickly evident that the downing had been a mistake, mutual distrust and the logic of nuclear deterrence — more so than the deaths themselves — set Washington and Moscow heading toward a conflict neither wanted. The story illustrated how imperfect information, aggressive defense postures and minutes-long response times brought both sides hurtling toward possible nuclear war — a set of dynamics that can feel disconcertingly familiar today.

Ronald Reagan had taken office in 1981 pledging to confront the Soviet Union. Though he intended to deter Soviet aggression, Moscow read his threats and condemnations — he had declared its government an “evil empire” that must be brought to an end — as preludes to war. Mr. Trump’s White House has issued its own threats against North Korea, suggesting that it might pursue war to halt the country’s nuclear weapons development.

The 1983 shooting down, on its own, might have passed as a terrible mistake. But the superpowers had only fragmentary understanding of something that had happened on the far fringes of Soviet territory. In an atmosphere of distrust, technical and bureaucratic snafus drove each to suspect the other of deception.  Moscow received contradictory reports as to whether its pilots had shot down an airliner or a spy plane, and Soviet leaders were biased toward trusting their own. So when they declared it a legal interception of an American military incursion, American leaders, who knew this to be false, assumed Soviet leaders were lying. Moscow had downed the airliner deliberately, some concluded, in an act of undeclared war.

At the same time, Washington made a nearly perfect mirror-image set of mistakes — suggesting that such misreadings are not just possible, but dangerously likely.  Mr. Reagan, furious at the loss of life, accused Moscow of deliberately targeting the civilian airliner. He denounced Soviet society itself as rotten and in pursuit of world domination.  In fact, a C.I.A. assessment, included in the president’s daily briefing that morning, had concluded the incident was likely an error. Mr. Reagan appeared to have simply missed it.

But Soviet leaders had never considered this; they assumed Mr. Reagan was lying about their intentions. Some concluded he had somehow lured the Soviet Union into downing the aircraft as cover for a massive pre-emptive attack, which they feared might come at any moment.  Each read the other’s blundering and dissembling as intentional, deepening suspicions among hard-liners that the other side was laying the groundwork for war. And if war was coming, the logic of nuclear deterrence all but required firing first.

Nuclear-armed missiles had recently achieved a level of speed and capability so that one power could completely disarm another in a matter of minutes. This created something called first-strike instability, in which firing first — even if you think you might be firing in error — is the only way to be sure of preventing your own obliteration.  The result was that the United States and the Soviet Union repeatedly went to the brink of war over provocations or even technical misreadings. Often, officials had mere minutes to decide whether to retaliate against seemingly real or impending attacks without being able to fully verify whether an attack was actually underway. In the logic of nuclear deterrence, firing would have been the rational choice.

That dynamic is heightened with North Korea, which is thought to have only a few dozen warheads and so must fire them immediately to prevent their destruction in the event of war.  “Today’s false alarm in Hawaii a reminder of the big risks we continue to run by relying on nuclear deterrence/prompt launch nuclear posture,” Kingston Reif, an analyst with the Arms Control Association, wrote on Twitter, referring to the strategy of firing quickly in a war. “And while deterring/containing North Korea is far preferable to preventive war, it’s not risk free. And it could fail.”

If similar misunderstandings seem implausible today, consider that an initial White House statement called Hawaii’s alert an exercise — though state officials say it was operator error. Consider that 38 minutes elapsed before emergency systems sent a second message announcing the mistake. If even Washington was misreading events, the confusion in Pyongyang must have been far greater. Had the turmoil unfolded during a major crisis or period of heightened threats, North Korean leaders could have misread the Hawaiian warning as cover for an attack, much as the Soviets had done in 1983. American officials have been warning for weeks that they might attack North Korea. Though some analysts consider this a likely bluff, officials in Pyongyang have little room for error.

Vipin Narang, a nuclear scholar at the Massachusetts Institute of Technology, suggested another possible scenario, using shorthand terms to refer to the president and his nuclear command systems, which Mr. Trump has nearby at all times. “POTUS sees alert on his phone about an incoming toward Hawaii, pulls out the biscuit, turns to his military aide with the football and issues a valid and authentic order to launch nuclear weapons at North Korea,” Mr. Narang wrote on Twitter, adding, “Think it can’t happen?”

Unlike in 1983, no one died in Hawaii’s false alarm. But deaths are not necessary for a mistake to lead to war. Just three months after the airliner was shot down, a Soviet early warning system falsely registered a massive American launch. Nuclear war may have only been averted because the Soviet officer in charge, operating purely on a hunch, reported it as an error.

North Korea is far more vulnerable than the Soviet Union was to a nuclear strike, giving its officers an even narrower window to judge events and even greater incentive to fire first. And, unlike the Soviets, who maintained global watch systems and spy networks, North Korea operates in relative blindness. For all the power of nuclear weapons, scholars say their gravest dangers come from the uncertainty they create and the fallibility of human operators, who must read every signal perfectly for mutual deterrence to hold.

In 1983, Washington and Moscow took steps that heightened the uncertainty, darkly hinting at each other’s illegitimacy and threats of massive retaliation, in a contest for nuclear supremacy, and survival. Each was gambling they could go to the brink without human error pushing them over. William J. Perry, a defense secretary under President Bill Clinton, called the false alarm in Hawaii a reminder that “the risk of accidental nuclear war is not hypothetical — accidents have happened in the past, and humans will err again.”

Mr. Reagan concluded the same, writing in his memoirs, “The KAL incident demonstrated how close the world had come to the nuclear precipice and how much we needed nuclear arms control.” Mikhail Gorbachev, who soon after took over the Soviet Union, had the same response, later telling the journalist David Hoffman, “A war could start not because of a political decision, but just because of some technical failure.”  Mr. Gorbachev and Mr. Reagan reduced their country’s stockpiles and repeatedly sought, though never quite reached, an agreement to banish nuclear weapons from the world. But Mr. Trump and North Korea’s leader, Kim Jong-un, remain locked in 1983, issuing provocations and threats of nuclear strikes on push-button alert, gambling that their luck, and ours, will continue to hold.