Australia’s largest hailstorm disaster

By Andrew Gissing, Chas Keys, Ryan Crompton

The 14th of April marked the 20th anniversary of the 1999 Sydney hailstorm. The storm is considered Australia’s most expensive insured natural disaster, with insurers paying out claims worth around 5.5 billion dollars in today’s terms. But such a hailstorm cannot be regarded as a once-in-a-lifetime event.

The storm occurred outside the typical “storm season” usually taken to occur between September and March. It forced a rethink on how the season should be defined.

The storm pelted thousands of homes and vehicles in Sydney’s eastern suburbs with cricket-ball and grapefruit-sized hailstones, along with heavy rain and strong winds. The hail was estimated to have weighed some 500,000 tonnes. Over 100,000 people were affected: one person died and several were injured and treated in hospital. But the real story was the damage caused and the disruption to people’s lives.

Some 24,000 homes, 70,000 vehicles, 60 schools and 23 aircraft were damaged. The financial scale of the disaster surpassed more recent extreme events such as Queensland’s 2011 floods and Victoria’s Black Saturday bushfires of 2009. Severe storms, especially those that bring large hail, are among Australia’s most costly natural perils.


Where large hailstones fell there was substantial damage to roof tiles and windows. In the worst-hit areas of Rosebery and Kensington almost every dwelling in entire street blocks was damaged. Some hailstones were confirmed to have had diameters of more than nine centimetres.

Damaged roofs and windows allowed heavy rain to enter homes, causing extensive water damage to contents. In the most extreme cases ceilings collapsed under the weight of saturated insulation batts. Hailstones punched holes through pergolas and outdoor furniture and shredded gardens. Vehicles suffered dented bodywork and broken windows; about one third of the insurance payout went to cover this kind of damage.

Some motorists became trapped in floodwaters. Days after the storm, some elderly people were found in a state of shock still living in their homes.

Many homes were rendered uninhabitable, and some remained so for months because the scale of the event delayed repairs; many people had to be given emergency shelter for long periods at public expense. The construction industry was placed under stress and interstate resources were needed to meet demand. An unusually wet and windy autumn and winter slowed the emergency response and the completion of repairs.

The State Emergency Service (SES), as the lead agency for storm response in NSW, formed the front line in working with households to make temporary building repairs. SES volunteers from across the nation travelled to help and were supported by crews from the then NSW Fire Brigades, the Rural Fire Service, the Volunteer Rescue Association and many other organisations. At the peak of the response some 3,000 personnel were deployed in the field, and around 12,450 personnel were involved over the course of the operation, attending some 44,000 calls for help across around 20,000 properties.

Severe hailstorms have always been a feature of Sydney’s climate, and the costs associated with the worst of them have been huge. The hailstorm that hit Sydney in December 2018 caused losses that have now surpassed one billion dollars, and the storms that struck the Blacktown-Baulkham Hills area over the summer of 2007-08 had a similar insured cost. Severely damaging hailstorms also struck Auburn and nearby suburbs in 1990 and the upper north shore in 1991, and one of the most damaging occurred in January 1947.

Though the influence of climate change on hailstorms in Australia is uncertain, with only a few projection studies undertaken, increases in wealth and in the size of Sydney and other capital cities mean that metropolitan areas are more exposed to severe storm events. Since 1999 Greater Sydney has grown by over 1.3 million people and the number of dwellings has increased by around 30%.

With increasing exposure will come increased losses, but models exist to assess hailstorm risk on a national scale. For those with access, the likelihood of a repeat of the April 1999 Sydney insured loss, or something similar, is readily quantifiable.

Risk Frontiers offers a national hail loss model for Australia. Contact us at info@riskfrontiers.com

APRA’s increased pressure on major financial institutions to manage the financial risks of climate change

Ryan Crompton, Thomas Mortlock and Paul Somerville, Risk Frontiers

In the Reserve Bank’s first substantial comments on the topic, on March 12, deputy governor Guy Debelle warned that climate change could cause financial shocks if companies did not take these risks seriously in their planning (see Risk Frontiers Briefing Note 391: ‘Change now or pay later’: RBA’s stark warning on climate change). The following week, on March 20, the Australian Prudential Regulation Authority (APRA) stepped up pressure on our biggest financial institutions with its commitment to increase scrutiny of how they are ‘managing the financial risks of climate change to their businesses’ and its call for them to ‘move from gaining awareness of the financial risks to taking action to mitigate against them’.

The announcements were made in conjunction with the release of APRA’s Information Paper, Climate Change: Awareness to Action, which provides insights into its first climate change survey, undertaken in mid-2018 and designed to align with the framework recommended by the Task Force on Climate-related Financial Disclosures (TCFD). The survey covered 38 large banks, insurers and superannuation trustees and analysed themes including risk awareness and management, governance, strategy, metrics and targets, and disclosure.

The survey found that a substantial majority of regulated entities were taking steps to increase their understanding of climate-related financial risks, including all of the banks, general insurers and superannuation trustees surveyed. Among many other findings, it also showed that:

  • A third of respondents viewed climate change as a “material” risk to their businesses right now. A further half thought it would be a “material” risk at some point in the future.
  • A majority of banks considered climate-related financial risks as part of their risk management frameworks as did most general insurers that indicated climate change risks to be material to their business.
  • About 70% of general insurers indicated that they were undertaking financial analysis on key risks whereas no life insurers indicated they were doing so.

The most common types of climate-related financial risks identified across all survey respondents were damage to reputation, flood, regulatory action, cyclone, energy, and bushfire with the full list shown in Figure 1.

Risk Frontiers’ range of models and tools covering flood, tropical cyclone, bushfire, heatwaves and sea-level rise can assist organisations in assessing their current and future risks to these perils.

Figure 1. Top five climate-related financial risks. Number of respondents who selected each risk among the top five risks relevant to their organisation. Source: APRA (2019).

Risk Frontiers offers its Australian Hail Model via ModEx

SIMPLITIUM PRESS RELEASE – 26.03.2019

Risk Frontiers’ HailAUS 7.0 model is now available on ModEx®, the independent multi-vendor catastrophe modelling platform for the insurance industry. Risk Frontiers is the seventh model vendor on ModEx.

Risk Frontiers specialises in disaster risk assessment and management across the Asia-Pacific region. HailAUS 7.0 is a fully stochastic loss model for hail and covers all of Australia. The model includes a catalogue of hail storms reflecting activity from local radar station data and the frequency and severity of ‘high storm potential days’ derived from reanalysis data and the observed historical record. It calculates losses for residential, commercial and industrial property, as well as motor portfolios.

Foster Langbein, Chief Technology Officer, Risk Frontiers comments:

“ModEx complements our Multi-Peril Workbench offering perfectly and is a compelling solution for firms wishing to use our models but have neither time nor resources to engage with our complete native software solution. This implementation leverages the Oasis Loss Modelling Framework’s new complex model wrapper capabilities, enabling the integration of our native model engine which ensures events and losses are consistent and directly comparable between the Oasis Loss Modelling Framework and Multi-Peril Workbench.”

James Lay, Commercial Director, ModEx:

“With hail being Australia’s most costly natural hazard, responsible for the country’s most expensive insured natural disaster ever, the importance of making this model more readily accessible is clear. We are proud to be able to offer it to our clients through ModEx, as we continue our mission of delivering greater choice of cat modelling services to the industry.”

ModEx provides a vibrant catastrophe modelling ecosystem for the (re)insurance industry, uniting multiple catastrophe models, hazard maps and data enhancement services through one platform.

For further information, please contact:

John Yonker
CEO
Simplitium
+44 (0)20 3872 1943
john.yonker@simplitium.com


About ModEx®

ModEx is the only independent multi-vendor catastrophe risk modelling platform for the (re)insurance industry. Powered by the Oasis LMF, ModEx delivers a hosted and fully managed catastrophe risk modelling service that offers a new and cost-effective way for firms to meet their modelling requirements. The platform creates an ecosystem where model vendors make their models available to the industry via a single user interface, improving the quality and choice of models available in the market. For further information, please visit www.simplitium.com/modex

About Risk Frontiers

Risk Frontiers specialises in the assessment and management of disaster risk across the Asia-Pacific region. For almost 25 years, Risk Frontiers has been developing a range of probabilistic natural catastrophe loss models by combining local expertise, the latest science and innovative modelling techniques. Its current modelling suite covers the major perils in Australia – bushfire, earthquake, flood, hail, tropical cyclone – as well as New Zealand earthquake. Its models are currently licensed to a range of domestic insurers, global reinsurers and reinsurance brokers. For further information, please visit: www.riskfrontiers.com

Why are we not taking climate change more seriously?

Thomas Mortlock, Jonathan van Leeuwen and Paul Somerville, Risk Frontiers

Figure 1. Robert FitzRoy. Source: Wikipedia (2019a)

Robert FitzRoy was an English officer of the Royal Navy, best known for captaining HMS Beagle during Darwin’s voyage around the world and for serving as the second Governor of New Zealand. He was also perhaps the world’s first modern weather forecaster: he coined the word “forecast” and set up what is now the UK Met Office.

FitzRoy recognised the need for weather prediction after a series of storms on the English coast shipwrecked passenger ships with the loss of many lives. He petitioned the government of the day and received funding to operationalise his daily weather forecasts. However, there was widespread public reluctance to accept his forecasts, and FitzRoy was ridiculed for their apparent lack of accuracy.

The fascinating life of Robert FitzRoy came to a tragic end in April 1865, when he took his own life after suffering from depression believed to be associated with having to defend himself against public attacks on his weather forecasts.

While FitzRoy left an important legacy for weather forecasting in the UK, his story resonates with the still sizeable lack of public acceptance of climate change today. Why are we not taking climate change more seriously, given there is ample empirical evidence that anthropogenic warming is occurring? A recent study by Frances Moore and colleagues in the journal PNAS tries to explain why this might be.

The boiling frog effect

Moore et al. (2019) used a sample of over 2 billion social media posts from Twitter in the US to investigate the drivers behind public perception of climate change. The research suggests that experience of weather in recent years – rather than longer historical periods – determines the climatic baseline against which current weather is evaluated, potentially obscuring public recognition of anthropogenic climate change.

The metaphor of a “boiling frog”[1] describes the phenomenon whereby the negative effects of a gradually changing environment become normalised so that corrective measures are never adopted. In this instance, the declining noteworthiness of historically extreme temperatures is not accompanied by a decline in the negative sentiment that they induce, indicating the social normalization of extreme weather conditions.

The study shows that, despite large increases in absolute temperature, anomalies relative to a shifting baseline are small and not clearly distinguishable from zero through the 21st century (Figure 2).

Figure 2. Effect of shifting baselines on the remarkability of temperature anomalies. Population-weighted annual average temperature anomalies over the US under the IPCC’s (most extreme case) RCP 8.5 with 40 realizations of internal variability. Anomalies are defined relative to a fixed 30-yr period (1981-2010, red line) and relative to a shifting baseline (blue line). A shifting baseline reduces the remarkability of increased temperatures to near zero. Source: Moore et al. (2019).
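To make the shifting-baseline effect concrete, the short Python sketch below (our illustration, not code from the study) compares anomalies measured against a fixed 1981-2010 reference with anomalies measured against the mean of the preceding two to eight years; the synthetic warming series and window length are assumptions chosen only for the example.

import numpy as np

# Synthetic annual temperature series with roughly 3.5 C of warming by 2100.
years = np.arange(1981, 2101)
rng = np.random.default_rng(42)
temps = 0.03 * (years - 1981) + rng.normal(0.0, 0.3, years.size)

# Anomaly relative to a fixed 1981-2010 baseline (the red line in Figure 2).
fixed_baseline = temps[(years >= 1981) & (years <= 2010)].mean()
fixed_anomaly = temps - fixed_baseline

# Anomaly relative to a shifting baseline: each year is judged against the
# mean of the years two to eight years earlier (the blue line in Figure 2).
shifting_anomaly = np.full(temps.size, np.nan)
for i in range(8, temps.size):
    shifting_anomaly[i] = temps[i] - temps[i - 8:i - 1].mean()

# The fixed-baseline anomaly keeps growing; the shifting-baseline anomaly
# stays near zero even as absolute temperatures rise.
print(round(fixed_anomaly[-1], 2), round(shifting_anomaly[-1], 2))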

This is a hugely important notion for government, as public policy tends to advance during windows of opportunity provided by focused public attention. Without public perception of a problem, the ability of policy-makers to advance an agenda is limited.

Moore et al. conclude that it is unlikely that rising temperatures alone will be sufficient to produce widespread support for mitigation policies.

It is important to highlight that this is a US-based study, so the results are biased toward domestic sentiment about (and overt political denial of) climate change in the US. However, there are certainly similarities between the public discourse on climate change in the US and in Australia. The problem of a ‘shifting baseline’ is also potentially exacerbated in Australia, where interannual climate variability such as the El Niño-Southern Oscillation (ENSO) plays an important role in modulating the behaviour of extreme weather events. Unfortunately, the reference point for socialised ‘normal’ conditions appears to be weather experienced between two and eight years ago, which coincides with the timeframe on which ENSO fluctuates.

The authors also point out that their results relate to ambient average temperatures only. It may well be that more acute extreme events are both more consequential and more salient and therefore less prone to normalisation.

Taking a longer-term view

So, what could the answer be to communicating climate change risk to a public with a constantly shifting baseline? A powerful approach to overcoming short-termism is presenting the long-term picture and contextualising the present-day climate within this.

The recently published State of the Climate 2018 report (CSIRO/BoM, 2019) is unequivocal about the unprecedented nature of today’s levels of atmospheric CO2, their anthropogenic cause and the warming trajectory we are locked into for the coming decades.

Two well-known figures are particularly useful in this regard. The first (Figure 3) combines a paleo-reconstruction of atmospheric CO2 concentrations, derived from air trapped in Antarctic ice, for the past 800,000 years (left panel) with historical observations of CO2 levels over almost the past 200 years, including measurements at the Bureau’s observation station at Cape Grim in Tasmania (right panel).

Figure 3. Long-term variability in atmospheric CO2 from Antarctic ice core records (green line, left panel and black line, right panel), and historical observations of CO2 in Tasmania to present day (blue line, right panel).

As can be seen, in the last 800,000 years CO2 varied with the very long glacial-interglacial cycles (a periodicity of around 100,000 years) and was generally between 170 and 300 ppm (parts per million). In just the last 200 years, CO2 as measured in Tasmania has increased from around 280 ppm (largely typical of an interglacial) to 400 ppm.

In other words, we are about 75% above the natural variability of CO2 in the atmosphere over almost the last one million years. This is hugely significant because atmospheric CO2 concentrations and temperature are highly correlated. We may not yet be seeing the full effects of high atmospheric CO2 because of inertia and heat retention in the climate-ocean system, but for the same reason we are locked into a warming trend for decades to come.
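One plausible reading of the 75% figure, offered here as our assumption about the arithmetic rather than something spelt out in the report, is the exceedance above the natural maximum expressed as a fraction of the glacial-interglacial range: (400 - 300) ppm divided by (300 - 170) ppm gives roughly 0.77, or about 75%.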

Some may note from Figure 3 that we are ‘due’ for another glacial period, but how this will play out with levels of atmospheric CO2 that are unprecedented relative to the last million years is unknown. This introduces the realm of feedback loops and ‘abrupt climate change’, a topic we touched on in Briefing Note No. 374.

The next question, if it still needs asking, is: how do we know the recent exceedance of CO2 levels above the long-term natural envelope is anthropogenic, i.e., human-induced?

Figure 4 illustrates an Australian-based modelling study which addresses this question. The grey line represents Australian temperature observations since 1910, with the black line the ten-year running mean. The shaded grey and blue bands are the 10-90% range of the 20-year running mean temperatures simulated from the latest generation of Global Climate Models (CMIP5). The grey band shows simulations that include the observed conditions of greenhouse gases, aerosols, solar input and volcanoes. The blue band shows simulations of observed conditions but not including human emissions of greenhouse gases or aerosols. The red band shows simulations projecting forward into the future (including all IPCC emissions scenarios).

Figure 4. Observations and modelled reconstruction of temperature anomalies in Australia over the past 100 years both with (grey band) and without (blue band) human emissions included.

The grey band shows that global climate models that include human emissions of greenhouse gases and aerosols provide a reasonable reconstruction of temperature changes over the last 100 years. The blue band demonstrates that, without these human influences included, the simulated temperature change over this period is negligible.

By inference, this suggests that the recent warming in Australia can only be explained by human influence. The future trajectory of warming over the next two decades continues this trend – however much we mitigate global carbon emissions – because of the slow response of the climate-ocean system to elevated greenhouse gases.

The past repeating; the future uncertain

Notwithstanding the passage of 150 years since Robert FitzRoy’s time, forecasts of future weather conditions are still not wholly accurate, and the lack of public acceptance of climate change in some quarters today is reminiscent of the public’s reaction to FitzRoy’s pioneering forecasts in the mid-1800s. Recent research suggests this response may be related in part to an ever-changing perception of what environmental conditions are ‘normal’, hence the metaphor of the boiling frog. The scientific evidence, however, is irrefutable: today’s levels of CO2 in the atmosphere are well outside the envelope of natural variability experienced for almost the past one million years.

References

Commonwealth Scientific and Industrial Research Organisation and Bureau of Meteorology [CSIRO and BoM] (2019). State of the Climate 2018. A report prepared by CSIRO and Bureau of Meteorology, Commonwealth of Australia, 2018, pp 24.

Moore, F.C., Obradovich, N., Lehner, F. and Baylis, P. (2019). Rapidly declining remarkability of temperature anomalies may obscure public perception of climate change. Proceedings of the National Academy of Sciences of the United States of America (PNAS), 116(11), 4905-4910.

Wikipedia (2019a). Robert FitzRoy. Available here, accessed 14 March 2019.

Wikipedia (2019b). Boiling frog. Available here, accessed 14 March 2019.

[1] The boiling frog is a fable describing a frog being slowly boiled alive. The premise is that if a frog is put suddenly into boiling water, it will jump out, but if the frog is put in tepid water which is then brought to a boil slowly, it will not perceive the danger and will be cooked to death. Source: Wikipedia (2019b).

Risk Frontiers Newsletter Volume 18, Issue 2

Sydney Hailstorm: December 20, 2018

by Salomé Hussein and Foster Langbein

On December 20, 2018, a severe hailstorm struck the greater Sydney region in the mid-afternoon. The most impacted areas were Liverpool in Sydney’s southwest and further to the north, Castle Hill and Berowra. Hail sizes as large as 8cm diameter were reported (examples shown in Figure 1) and Chipping Norton, near Liverpool, experienced up to 10cm. More minor damage was reported over much of metropolitan Sydney.

The event caused an estimated $1.04 billion in damage (Insurance Council of Australia (ICA), as at February 14, 2019). This ranks as the 8th most costly hailstorm on the ICA Disaster List in terms of normalised insured losses (2017/18 dollars, Table 1), just above the Melbourne 2011 Christmas storm, and the 3rd costliest for the Sydney region. The largest volume of claims was for motor vehicles, which contributed around 30% of the total loss. Figure 2 shows the distribution of claim numbers across residential, commercial and motor vehicle lines of business.

The event generated a substantial emergency response with the SES receiving 3600 calls, of which 1100 came from the Liverpool area alone. There were 2400 jobs attended by 600 volunteers, mainly to place tarps on roofs.

Figure 2: Chart showing proportion of numbers of claims made, using data from the ICA Disaster List.

Conditions at the time of the event were favourable for hail. The synoptic weather pattern was described as a ‘southeasterly change’, and a sea breeze was also present on the day, as determined from our analysis of local weather station data (specifically wind direction, wind speed and relative humidity). This combination of synoptic pattern and sea-breeze occurrence was found to be the most conducive to severe hail in southeast Queensland in research by Soderholm et al. (2017).
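Purely as an illustration of the kind of screening involved (this is not the analysis code used for the event, and the file and column names are hypothetical), hourly station observations could be checked for a sea-breeze signature along these lines:

import pandas as pd

# Hypothetical hourly observations for 20 December 2018 with columns:
# time, wind_dir_deg, wind_speed_ms, rel_humidity_pct
obs = pd.read_csv("station_obs_20181220.csv", parse_dates=["time"])

onshore = obs["wind_dir_deg"].between(45, 135)     # roughly NE through SE, onshore for Sydney
afternoon = obs["time"].dt.hour.between(11, 18)    # sea breezes typically develop by afternoon
rh_jump = obs["rel_humidity_pct"].diff() > 5       # step increase in humidity with marine air
breezy = obs["wind_speed_ms"] > 3                  # sustained onshore flow

sea_breeze_hours = obs[onshore & afternoon & rh_jump & breezy]
print(sea_breeze_hours[["time", "wind_dir_deg", "wind_speed_ms", "rel_humidity_pct"]])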

Our post-event analysis included estimating a damage footprint using Bureau of Meteorology (BoM) radar station data (recently made open source) and the Maximum Estimated Size of Hail (MESH) algorithm (originally due to Witt et al. 1998). The MESH algorithm estimates hail sizes from radar reflectivity measurements combined with temperatures over the scanned altitudes. We applied it to data from the nearby Wollongong Radar (Appin station) and, by combining all frames of MESH output over the event period and averaging the maximum expected hail size, produced a spatial map of hail intensity (Figure 3). This intensity map was then used as input for further analysis, allowing image thresholding techniques to be used to obtain damage footprint contours over the affected areas (see Figure 4). Fitting ellipses to these contours then allowed a direct comparison with our HailAUS CAT loss model, a fully stochastic hail loss model covering all of Australia.
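The thresholding and ellipse-fitting steps can be sketched in a few lines. The example below is a minimal illustration on a synthetic grid rather than the production workflow: the mesh_mm array stands in for the accumulated MESH field, and the 35 mm threshold mirrors the cell boundaries used in Figure 4.

import numpy as np
from skimage import measure

# Stand-in for the gridded mean MESH output (hail diameter in mm) over the event.
rng = np.random.default_rng(0)
mesh_mm = rng.gamma(shape=2.0, scale=10.0, size=(200, 200))

THRESHOLD_MM = 35.0  # lower bound for a damaging hail cell, as in Figure 4

# Extract damage-footprint contours at the threshold.
contours = measure.find_contours(mesh_mm, THRESHOLD_MM)

# Fit an ellipse to each sufficiently large contour for comparison with the
# elliptical storm footprints used in HailAUS.
ellipses = []
for contour in contours:
    if len(contour) < 20:          # skip small speckle contours
        continue
    xy = contour[:, ::-1]          # find_contours returns (row, col); flip to (x, y)
    model = measure.EllipseModel()
    if model.estimate(xy):         # returns False if the fit fails
        xc, yc, a, b, theta = model.params
        ellipses.append({"centre": (xc, yc), "axes": (a, b), "rotation": theta})

print(f"{len(ellipses)} elliptical footprints above {THRESHOLD_MM} mm")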

Figure 3: Mean Maximum Estimated Size of Hail from 02:00 to 10:00 UTC using the Wollongong Radar and Joshua Soderholm’s (BoM) PyHail software. White solid lines are postcode boundaries. Analysis used the second tilt (0.9 degrees from horizontal).
Figure 4: Storm footprints extracted from contours of mean Maximum Estimated Size of Hail algorithm output over the entire event. Dashed lines represent contour levels of 30mm diameter. The maximum predicted over the entire event was 104mm diameter hail. Grey solid lines are postcode boundaries. Red solid lines are extracted hail cell boundaries with a lower threshold of 35mm, and the overlain blue ellipses are fitted to those boundaries for comparison with storm events within HailAUS.

HailAUS 7.0 includes a catalogue of hailstorms reflecting activity from local radar station data and the frequency and severity of ‘high storm potential days’ derived from reanalysis data and the observed historical record. It calculates losses for residential, commercial, industrial and motor portfolios using an approximation of elliptical storm footprints. Taking the approximated ellipses in Figure 4 for this event, the estimated loss from HailAUS is $1.6 billion using the PERILS Hail Industry Exposure Database for 2018 and the Redbook motor portfolio.
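To illustrate how an elliptical footprint can be intersected with an exposure portfolio, the sketch below tests which locations fall inside a rotated ellipse; the coordinates and ellipse parameters are hypothetical, not the values fitted for this event.

import numpy as np

def inside_ellipse(x, y, xc, yc, a, b, theta):
    """Return True where (x, y) lies inside an ellipse centred at (xc, yc)
    with semi-axes a and b, rotated by theta radians."""
    dx, dy = x - xc, y - yc
    u = dx * np.cos(theta) + dy * np.sin(theta)    # rotate into the ellipse frame
    v = -dx * np.sin(theta) + dy * np.cos(theta)
    return (u / a) ** 2 + (v / b) ** 2 <= 1.0

# Hypothetical exposure locations (easting/northing in km) and storm footprint.
rng = np.random.default_rng(3)
sites = rng.uniform(0, 50, size=(1000, 2))
hit = inside_ellipse(sites[:, 0], sites[:, 1], xc=25, yc=25, a=12, b=4, theta=np.deg2rad(30))
print(f"{hit.sum()} of {len(sites)} locations fall inside the footprint")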

Several factors limit the accuracy of the HailAUS modelled loss estimate. The first two relate to the Wollongong Radar being a single-polarisation instrument (it transmits only horizontally polarised signals). This required us to use the MESH algorithm rather than more modern hail size estimation algorithms, such as the Hail Size Discrimination Algorithm (HSDA; Ortega et al. 2016), which can be applied to dual-polarisation stations (which use both horizontal and vertical polarisations) such as the main Sydney (Terrey Hills) radar. We also expect that the sea breeze produced drift in the falling hail, which in turn influenced the location of the damage footprint. Although not available for our analysis, the Terrey Hills radar data is likely to be released in the near future.

Other limiting factors include the consistency and reliability of the motor vehicle market portfolio, and the fact that the damage footprint for cars is likely to be larger than the Figure 4 contours because hailstones can damage cars at a smaller size threshold. The latter would act to increase the amount of loss attributed to cars.

Finally, while the elliptical damage footprints in HailAUS are a very reasonable representation, they limit the accuracy of the distribution of damage compared to what is observed in the radar data and we plan to improve this in a future model update.


Disclosure of climate-related financial risk

by Stuart Browning

In light of underwhelming progress at COP-24 (the 2018 Conference of the Parties (COP) to the United Nations Framework Convention on Climate Change (UNFCCC), held in Katowice), it is increasingly improbable that the Paris Agreement’s ambitions will be achieved. Instead, it seems more likely that recommendations from the Financial Stability Board (FSB) will be the primary catalyst for effective action on climate change mitigation. Projections of the economic cost of climate change have always been somewhat dire (e.g. Stern 2006) and have been mostly ignored by policy makers. However, the FSB has recommended that financial risks due to climate change be disclosed by all publicly listed companies. This is driving the financial sector to seriously consider the implications of climate change, and the results are likely to be sobering. With an understanding of risk comes investor pressure to minimise that risk, and this may well drive mitigation efforts above and beyond those achieved via the ‘heads-of-state’ level Paris Agreement. In Australia, this was manifested most recently in the Reserve Bank of Australia’s stark warning last week to, in effect, “change now or pay later” (see Risk Frontiers Briefing Note 391).

Publicly listed companies are legally required to disclose material risks to their investors. This disclosure is especially relevant for banks, insurance companies, asset owners and managers when evaluating the allocation of trillions of dollars in investor capital. In 2017 the FSB released the final report of the Task Force on Climate-related Financial Disclosures (TCFD), which stresses that climate change is a material risk (and/or opportunity) that should be disclosed—preferably alongside other risks in annual reporting. The TCFD proposes a framework for climate risk determination and disclosure (Figure 1) in which risk is classified into two main types: transitional and physical. Transitional risks are those that may impact business models through changing technologies and policies: examples are a carbon tax, or stranded assets associated with redundant fossil fuel exploration and extraction. Physical risks are those associated with climate change itself: these could be chronic risks such as sea-level rise, or acute risks such as more extreme storms, floods or droughts.

Figure 1: Factors identified in the TCFD report contributing to financial risk and opportunities under climate change (TCFD 2017).

While climate change is expected to impact most businesses, even current exposure and vulnerability are not being adequately disclosed by most organisations. A 2018 report by the Australian Securities and Investments Commission (ASIC) looked at climate risk disclosure by Australian companies and found that very few were providing adequate disclosure, thereby exposing themselves to legal consequences and, more importantly, potentially putting investor capital at risk by failing to consider climate change as a risk. Companies that are attempting to disclose climate risk are typically doing so inconsistently and with high-level statements of little use for investor decision-making (ASIC 2018). Quantifying organisational vulnerability and risk under climate change is a non-trivial task, and adequate implementation of the TCFD recommendations will likely occur over a timeframe of more than five years (Figure 2). Initially, companies are expected to develop high-level information on general risk under climate change; as research progresses, disclosure should become more specific.

Figure 2: Milestones in the implementation of the TCFD (TCFD 2017).

Understanding risk in terms of weather and climate has long been of interest to the insurance sector, but it is now something all sectors are expected to understand and disclose. The Actuaries Institute has recently developed the Australian Actuaries Climate Index, which tracks the frequency of extremes in variables of interest such as temperature, precipitation, wind speed and sea level. The index provides a general level of information drawn from the distribution of observed variability. However, climate change will shift the distribution of events, meaning this information is of limited use for projections. The relationship between a warming climate and the frequency of extreme weather events is likely to be complex and peril- and location-specific. Quantifying physical climate risk requires an understanding of the physical processes driving climate variability, the technical expertise to work with petabytes of available data, and the capacity to run regional climate models for dynamical downscaling; these skills are typically restricted to research organisations and universities.

Useful risk disclosure will come from using the best available information to represent both past and projected climate variability. This means using a combination of observational and model based data. Exposure and vulnerability will need to be determined using weather station observations and reanalysis data. This will need to be organisation-specific and developed within the context of assets, operations, and physical locations. Risk projections can then be developed, and this should be done using scenario analysis across multiple time horizons: short, medium and long term. Short-term projections can be developed using established vulnerability together with seasonal forecasts. Medium- and long-term projections should be based on global climate model (GCM) projections developed within the framework of the Coupled Model Intercomparison Project (CMIP). These are the scenario-based industry-standard climate model projections used for the IPCC reports. The IPCC Fifth Assessment Report (AR5) was based on the CMIP5 suite of simulations. The next generation of simulations (CMIP6) is underway and should become publicly available from 2019-20 onwards. Projections of organisation-specific risk will need to be developed by downscaling GCM projections. The best results are likely to be achieved through a combination of statistical downscaling, dynamical downscaling, and machine learning.
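As a concrete example of the statistical-downscaling step mentioned above, the sketch below applies empirical quantile mapping to synthetic data. It is included only to illustrate the kind of bias correction involved in relating GCM output to station observations; it is not a description of Risk Frontiers’ own method.

import numpy as np

def quantile_map(model_hist, obs_hist, model_future, n_quantiles=100):
    """Map future model values onto the observed distribution using the
    quantile-quantile relationship fitted over the historical period."""
    q = np.linspace(0.01, 0.99, n_quantiles)
    model_q = np.quantile(model_hist, q)
    obs_q = np.quantile(obs_hist, q)
    # Each future value is located within the historical model distribution
    # and replaced by the observed value at the same quantile.
    return np.interp(model_future, model_q, obs_q)

# Synthetic example: a model that runs 2 degrees too warm over the historical period.
rng = np.random.default_rng(1)
obs_hist = rng.normal(18.0, 4.0, 10_000)       # station observations
model_hist = rng.normal(20.0, 4.0, 10_000)     # GCM, historical period
model_future = rng.normal(23.0, 4.0, 10_000)   # GCM, future scenario

corrected = quantile_map(model_hist, obs_hist, model_future)
print(round(corrected.mean(), 1))  # about 21: the +3 degree signal is kept, the +2 bias removed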

Risk Frontiers utilises projections within its suite of natural catastrophe (CAT) loss models to investigate how losses may change in the future under different climate scenarios. Risk Frontiers adapts its CAT models, developed for the insurance industry to assist decision makers in estimating and managing catastrophe risk, to assess the impact of projected changes in weather-related hazard activity due to climate change, as well as changes in vulnerability and exposure (Walker et al. 2016). In November 2018, The Geneva Association reported on the benefits of integrating climate science and catastrophe modelling to understand the impacts of climate change, stating that “Cat modelling is more relevant than ever”. With CAT models being the ideal tool for this type of analysis, Risk Frontiers is strongly positioned to address the need for physical climate risk disclosure.

References

ASIC (2018) REPORT 593: Climate risk disclosure by Australia’s listed companies.

Risk Frontiers (2019a). ‘Change now or pay later’: RBA’s stark warning on climate change. Briefing Note 391.

Risk Frontiers (2019b), Disclosure of climate-related financial risk. Briefing Note 386.

The Geneva Association (2018) Managing Physical Climate Risk: Leveraging Innovations in Catastrophe Modelling.

Stern, N. (2006) “Stern Review on The Economics of Climate Change (pre-publication edition). Executive Summary”. HM Treasury, London. Archived from the original on 31 January 2010. Retrieved 31 January 2010.

TCFD (2017) Financial Stability Board, Final Report: Recommendations of the Task Force on Climate-related Financial Disclosures.

TCFD (2017) Financial Stability Board, Final Report: Implementing the Recommendations of the Task Force on Climate-related Financial Disclosures.

Walker, G. R., M. S. Mason, R. P. Crompton, and R. T. Musulin, 2016. Application of insurance modelling tools to climate change adaptation decision-making relating to the built environment. Struct Infrastruct E., 12, 450-462.

‘Change now or pay later’: RBA’s stark warning on climate change

by Ryan Crompton, Andrew Gissing, Thomas Mortlock and Paul Somerville, Risk Frontiers


The following article, by Eryk Bagshaw and Nick Bonyhady, appeared in the Sydney Morning Herald on 12 March 2019. The last line notes that “companies disclosing climate risks need to adopt a level of commonality or risk that information not being useful to investors.”

It is worth noting that there are two types of climate change risk posed to business: the physical risk to direct business operations and supply chains, and the transitional risk of adapting operations to a climate-changed future. Climate change risk disclosure is still at an early stage in Australia, with no regulation at present. Most disclosures focus on the immediate physical risks to business and do not include transitional risk.

A recent paper by Allie Goldstein and co-authors looked at the private sector’s climate change risk and adaptation blind spots by reviewing more than 1,600 corporate adaptation strategies in the US. Some interesting findings from the paper, relevant for Australia, are:

  1. The magnitude and costs of physical climate change risks are being underestimated by companies. Companies need further guidance on estimating more realistic costs.
  2. Climate change risks to business beyond direct operations are not being considered.
  3. The costs associated with climate change adaptation strategies are being under-reported.
  4. Non-linear climate impacts, and extreme climate scenarios, are not being considered by companies in disclosures.

Risk Frontiers’ goal is to provide an objective assessment of these risks to assist companies (including those in the insurance industry) and governmental organisations in achieving that level of commonality mentioned in the Sydney Morning Herald article, reproduced in part below.


The Reserve Bank has warned climate change is likely to cause economic shocks and threaten Australia’s financial stability unless businesses take immediate stock of the risks.  The central bank became the latest Australian regulator to tell business that they must analyse their investments on Tuesday, as the Coalition grapples with an internal battle over taxpayer-funded coal fired power and energy policy.

In a speech to the Centre for Policy Development in Sydney, the Reserve’s deputy governor Guy Debelle said challenges for financial stability may arise from both physical and transition risks of climate change. “What if droughts are more frequent, or cyclones happen more often?” he asked. “The supply shock is no longer temporary but close to permanent.  That situation is more challenging to assess and respond to.”

Financial stability could be put at risk if businesses remained unaware of anticipated insurance payouts, pollution-driven reputational damage, legal liability and regulation changes that could cause valuable assets to become uneconomic. “All of these consequences could precipitate sharp adjustments in asset prices, which would have consequences for financial stability,” he said.

Dr Debelle said the increasing number of extreme climate events was also changing public opinion. “One of the things that is causing change in public opinion around this is just the straight-up occurrence of extreme events,” he said. “It’s not the way you would actually like this to come about unfortunately … [but] it has changed the general public view.”

Dr Debelle said the bank was speaking about the issue because of the size of the impact climate change would have on the economy. “Some of these developments are actually happening now,” he said. Dr Debelle said the current drought across large swathes of the eastern states has already reduced farm output by around 6 per cent and total economic growth by about 0.15 per cent. “We need to think in terms of trend rather than cycles in the weather. Droughts have generally been regarded as cyclical events that recur every so often. In contrast, climate change is a trend change.”

That has an impact on monetary policy, Dr Debelle said, citing the temporary shock of banana prices surging after Cyclone Yasi in 2011, which in turn boosted inflation by 0.7 percentage points. But he said future events may not be so one-off, with repeated climate events and the transition of the economy likely to have a longer-term impact. “We need to be aware that decisions taken now by businesses and government may have a sizeable influence on that transition path,” he said.

Dr Debelle said the transition posed challenges and opportunities. Industries especially exposed to the consequences of changes in the climate will face lower costs if there is an early and orderly transition; some will bear greater costs from the transition to a lower carbon economy, while others, such as the renewables sector, may benefit. “There has been a marked pick-up in investment spending on renewable energy in recent years,” he said. “It has been big enough to have a noticeable impact at the macro-economic level and affect aggregate output and hence the monetary policy calculus.”

In comments that are likely to be used against some pro-coal Nationals MPs urging the Coalition to build a taxpayer-funded power station, the deputy governor said the renewable sector was a good example where price signals have caused significant behavioural change. “There has been a rapid decline in the cost of renewable energy sources,” he said. Dr Debelle said the cost of generating electricity has declined in the case of wind and solar to the point where they are now cost-effective sources of generation. He added that storage and transmission remained relevant costs.

Despite coal being one of Australia’s top exports, Dr Debelle said opportunities remained as China transitioned away from coal. “Natural gas is expected to account for a larger share of its energy mix, and Australia is well placed to help meet this demand,” he said.

He endorsed comments by Australian Prudential Regulation Authority executive Geoff Summerhayes in London in January, which warned tackling climate change had become a “financial necessity”. In the speech to the UN’s sustainable insurance forum, Mr Summerhayes lashed government inaction, arguing the summer’s extreme weather, severe drought and floods were all fuelled by climate change, but Australia still lacked the political consensus needed to respond to the threat.

Giving the example of data on when different parts of the Gold Coast would stop being viable, Blair Comley, a former secretary of the federal Department of Climate Change and Energy Efficiency, said the lack of data on the impact of climate change made it harder to plan for. Dr Debelle said while the Reserve Bank was not responsible for developing climate policy, it had a role to play in ensuring there is adequate data.

Where there is inadequate data for the bank to make the decisions it needs to, “we can call out that,” Dr Debelle said. And he emphasised that companies disclosing climate risks need to adopt a level of commonality or risk that information not being useful to investors.

References

Goldstein, A., Turner, W.R., Gladstone, J., and Hole, D.G. (2019). The private sector’s climate change risk and adaptation blind spots. Nature Climate Change, 9, 18-25.

Sydney Morning Herald (2019). ‘Change now or pay later’: RBA’s stark warning on climate change. Available here, accessed 14 March 2019.

 

 

Cyber Attack on the Australian Parliament and the Lessons Learned

The following article was published by the Australian Outlook on March 4th, 2019. It highlights some of the most important technical and political points regarding the recent cyber attack against the Australian Parliament Network and other political parties.

Risk Frontiers is a partner in the Optus Macquarie University Cyber Security Hub, focusing on quantitative modelling of cyber risks.


Synopsis:

In the lead up to the federal election, the Australian Parliament and multiple political parties have been hit by a sophisticated cyber attack. Experts are divided on who is to blame but the attackers had clear motives and there are some key lessons to learn from this incident.

By Associate Professor Christophe Doche, Dr Stephen McCombie and Dr Tahiry Rabehaja

On February 8, reports emerged regarding an attempt to infiltrate the Australian Parliament network, which is primarily used to exchange emails and store data. On February 18, Prime Minister Scott Morrison and Opposition Leader Bill Shorten addressed the Parliament to acknowledge the attack. The next day, the Australian Cyber Security Centre (ACSC), which is now part of the Australian Signals Directorate (ASD), confirmed that a cyber actor gained illegal access to the networks of the Liberal, Nationals and Labor parties.

Since then, investigations have revealed that the attack was sophisticated and most likely state-sponsored. It is understood the initial breach was the result of a phishing campaign, in which a staff member opened an infected document attached to an email. Once the attackers got a foothold on a computer attached to the network, they scanned and infected other targets, including intranet servers. They were then able to redirect network traffic in order to exfiltrate data. They also erased logs to cover their tracks and placed additional malware to maintain control of the infected systems for later use.

Digital forensics analysis has shown that the attack relied on a series of malware and exploits which in several cases were slight modifications of existing open source tools; this is what fooled most anti-virus software. Many of these open source tools are, ironically, used by the ethical hacking community to find vulnerabilities in computers and systems with the aim of reporting and, ultimately, fixing them. They are written in the popular language C# for the .NET framework. All these factors indicate a clear desire by the attackers to remain undetected for as long as possible and to make attribution, the identification of the perpetrators of the attack, a difficult task.

Figure 1: Reverse engineering of parts of the malware used by the hackers shows that it leverages well-known penetration testing tools (source: Yoroi).

Although there is no clear evidence, at least none that has been released, media speculation is that China is most likely behind this attack. China has a long history of cyber espionage operations globally, and locally against the Australian Government, our defence sector, mining industries and even universities. This incident happened on the back of the banning of Huawei from Australia’s 5G network, recent tensions in regard to trade and multiple claims of improper Chinese influence on Australian political parties. There have also been reports that Iran may have been the perpetrator, but it is difficult to see what Iran would gain in Australia from such an action. It has been active in recent times against US targets and may see Australia as a way into the Five Eyes intelligence alliance; alternatively, our close relationship with Israel (its bitter enemy) and plans to formally recognise West Jerusalem as the capital of Israel may have made us a target.

Perhaps most surprising is that this attack succeeded in penetrating the Parliament and Australia’s major parties, despite ample warning of the potential for such attacks to occur. The 2016 attacks on the Democratic National Committee in the United States by Russian Military Intelligence (GRU), which accessed multiple email accounts including that of Hillary Clinton’s campaign director, are well known and documented. In the aftermath, members of the Democratic Party visited a number of European countries and spoke to political parties to specifically warn of the risk of such cyber breaches. Similarly, the ASD briefed Australian political parties on threats to our elections in 2017. In July 2018, the Australian Government also offered $300,000 to help political parties shore up their cyber security. In addition, the Government has significantly grown the scope and size of the ACSC and other cyber capabilities. Despite this, these attacks penetrated our Parliament and major political parties just months before a highly contested election in which relations with China are likely to be debated.

One key observation here is that the Government has a very large cyber risk footprint. It employs tens of thousands of people, and human beings have always been part of both cyber security problems and their solutions; this incident is no exception. Governmental networks are complex, shared and scaled infrastructures, which greatly increases the chance of overlooking security lapses and allows attacks to propagate and replicate to other agencies cheaply and quickly. Government agencies are also very attractive targets: they hold a large volume of confidential and personally identifiable information, they are the top target for politically motivated attackers and cyber warfare, and they are amongst the main victims of cyber espionage. This means they attract multiple categories of threat actor, ranging from organised cyber criminals looking for financial gain to advanced persistent threats backed by state actors. The Australian Parliament network incident emphasises these points, but also highlights the Government’s large cyber attack surface, since such an attack could have occurred in any one of the many interlinked agencies’ digital information systems and infrastructure.

Although the response to this incident has been swift and there is no evidence that any data has been leaked, the ACSC has warned that the actor, whoever it may be, will probably further target other Australian Government departments. The Government needs to understand, build and protect its digital infrastructure, and associated exposure, with the appropriate controls and responses. The NSW Government and the Government Chief Information Security Officer have taken a leading role in this area by releasing in February 2019 the NSW Cyber Security Policy. Among other measures, this policy mandates every agency to identify its crown jewels – its most valuable or operationally vital systems or information – and implement regular cyber security education for all employees, contractors and outsourced ICT service providers. These two measures alone will go a long way to improve the cyber resilience of NSW Government agencies.

 

Associate Professor Christophe Doche is executive director of the Optus Macquarie University Cyber Security Hub, the first initiative of this kind in Australia, linking academics in information security, business, criminology, intelligence, law and psychology together with cyber security experts from industry. As part of his role, he oversees research, education and thought leadership activities in cyber security.

Dr Stephen McCombie is a senior lecturer in Cyber Security at Macquarie University. His current research interests are in digital forensics, cyber threat intelligence and information warfare. His research draws on a diverse background in policing, security and information technology over the last 30 years. He has also held senior positions in information security with IBM, RSA, National Australia Bank and most recently SecureWorks.

Dr Tahiry Rabehaja is a Software Engineer at Risk Frontiers and research fellow at the Optus Macquarie University Cyber Security Hub specialising in quantitative risk modelling. He has a background in information security and formal program verification and, in particular, the development of mathematical models for quantifying confidentiality in programs. His current research is on the quantification of cyber security risk.

 

Townsville 2019 flood – insights from the field

By Andrew Gissing, James O’Brien, Salomé Hussein, Jacob Evans and Thomas Mortlock

Flooding impacted large areas of Townsville from Wednesday 30 January 2019 as a consequence of heavy rainfall across the north of Queensland. The Bureau of Meteorology (BoM) noted that 370mm of rain fell within 24 hours at Paluma, near Townsville. Almost 3,300 properties were damaged, thousands of people were asked to evacuate and there were widespread blackouts. The flooding came in waves, with the initial rainfall causing around 30 cm of flooding in the worst affected areas. This subsided somewhat before more rain fell in the catchment, necessitating the release of water from the Ross River Dam, which led to flood depths of up to 1.6 m above floor height.

These were the highest rainfall volumes on record for this area, with most rain gauges suggesting the rainfall was at least a 1-in-200-year event; in some areas, such as Mt Margaret, the event was much rarer than this. The rain was produced by the southern arm of a low pressure trough centred over the Gulf of Carpentaria, drawing moisture in from the Coral Sea. The low pressure system was part of the monsoonal trough that occurs around this time of year. The same system was also responsible for the generation of Cyclone Oma, which concerned many people following the Bureau of Meteorology’s regular updates.

While it is not possible to say whether climate change played a role in the intensity of this event, it occurred after the final breakdown of the blocking high pressure system in the Tasman Sea that had caused the sustained heatwave over much of southeast Australia in January and delayed the onset of the monsoon. There is some theoretical basis to suggest that the stationarity of synoptic weather systems may increase as the summertime circulation weakens under anthropogenic warming, although there is little coherent evidence for this at present.

Risk Frontiers and Insurance Council of Australia staff visited Townsville on 11th and 12th February, supported by a grant from the Bushfire and Natural Hazards CRC. The aim was to undertake unstructured interviews with residents and business operators to gain preliminary insights into impacts and responses to warnings and to examine initial recovery. In total, more than 20 residents and six business operators were spoken to. This briefing note highlights key preliminary themes that arose from this research.

Research Preparation

Before arriving in Townsville, Risk Frontiers acquired the Townsville City Council map indicating the likely flooding from a 2,000 m3/s release from the Ross River Dam and georeferenced that image onto the city. This was done using the coordinates published on the frame of the map (the graticule) and linking them to points of reference on the ground (see Figure 1).

Image classification was then used to extract the colours from the map which corresponded to the different depths of potential inundation and a GIS layer corresponding to those depths was created. Depths were then determined for every G-NAF (Geocoded National Address File) point which fell within the inundation layer and were mapped accordingly. OpenStreetMap data was also incorporated for a reference to the streets and other points of interest in the city, and the landuse data from ABS meshblocks (the smallest statistical area containing around 20 households) was used to quickly separate residential from commercial and industrial building types.
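A simplified sketch of the point-sampling step is shown below, under assumed file names: a georeferenced flood-depth raster (standing in for the classified council map) is sampled at each G-NAF address point to attach a depth to every address inside the inundation extent.

import rasterio
import geopandas as gpd

# Hypothetical inputs: a G-NAF address extract and the classified depth raster.
addresses = gpd.read_file("gnaf_townsville.gpkg")
with rasterio.open("ross_river_2000cumec_depths.tif") as src:
    addresses = addresses.to_crs(src.crs)
    coords = [(geom.x, geom.y) for geom in addresses.geometry]
    addresses["depth_m"] = [sample[0] for sample in src.sample(coords)]

flooded = addresses[addresses["depth_m"] > 0]
print(f"{len(flooded)} addresses fall within the modelled inundation extent")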

The purpose of this map was to identify the most affected areas in order to prioritise the investigation and to assist in validating the modelled depths. The modelled depths appeared to be very accurate; the minor exceptions were usually lower elevations in parkland, or were likely due to very small shifts (under 1 m) in the registration of the image prior to the analysis.

Figure 1: Sample Townsville map showing modelled flood extents and depths from Townsville City Council and inferred property depths

Impacts

Residential flood damage in the main appears to have been restricted to the ground-storey areas of raised dwellings, with peak flood heights reaching roughly halfway up these lower-level or understorey living areas. In many cases it appears these spaces were occupied at the time of the flood and, in some instances, rented to others; the majority were certainly used for extensive storage. A smaller number of dwellings, approximately one quarter, were lower-set slab-on-ground homes in which flooding impacted the main living areas.

Almost every home and business on the floodplain had a large muddy pile of possessions stacked by the roadside awaiting council pick-up [figure 2]. Common residential items damaged were carpets; household appliances such as fridges, washing machines, dryers; cupboards and drawers; fabric lounge chairs, chairs and tables; hardware; bedding; doors and outdoor furniture. Some residents mentioned stacking goods on tables or on shelves within the ground storey to attempt to put goods above the floodwaters or to relocate smaller, valuable items to the upper storey (where possible). At least one resident employed the creative solution of placing valuable items on inflatable platforms.

As many living spaces were spared damage on upper floors, the majority of people appeared to have remained living in their homes. Those whose dwelling was not habitable reported staying with friends.

Commercial damages largely varied with the type of business. We observed a number of businesses that had suffered significant losses. For example, the Townsville RSL suffered a total loss downstairs due to the floodwaters and was also in the process of stripping the upstairs due to mould that developed following the flooding. The RSL noted that they were receiving support from other clubs (e.g., supplying the RSL with their surplus equipment) and expected to have the upstairs of their business operating again within four weeks. However, they faced longer lead times for suppliers to fully refit the downstairs and were estimating business interruption of some six months.

Some businesses reported that they could not move large pieces of equipment to protect them in time. Most reported that they were insured, and some said they had had sufficient warning time to relocate equipment, including stock and computers, and suffered only minor damage. An electrical and solar installation business lost around $10,000 worth of stock after up to 1.5 m of water affected its premises. The manager said he had redeployed half of his workforce to make safe existing solar installations where equipment (inverters or isolators) may have been damaged by brackish water, while the other half completed new installations. He estimated that, with 400 installations to inspect and make safe, it would be many weeks before his workforce was back to business as usual.

Most flood-affected businesses had closed for a week to enable clean-up and restoration, with some reporting slightly longer shutdowns because they had made preparations on the Thursday and Friday before the flooding. Businesses generally operated without electricity for four or more days (Monday 4 February to Thursday 7 February) but continued to clean up. The majority had restarted trading once power was reconnected and had not suffered significant losses (the local Ford dealer, automotive workshops and electrical wholesalers, for example, were operational), but a number of restaurants and cafes along Charters Towers Road in south Townsville remained shut, presumably because of a lack of electricity, food spoiled through loss of refrigeration and perhaps mould in their kitchens. Outside the flood-affected areas, a café operator reported having lost their food supplies and still working to get back to being fully stocked.

Figure 2: Roller doors in eastern Idalia potentially damaged by an electrical short. High-flow floodwaters were ruled out as a cause based on the lack of debris and vegetation.

While significant flood velocities were not reported in Idalia, some structural damage was observed to a few electrically operated roller doors. It was surmised that an electrical short may have caused the motor to attempt to open a door while it was locked in place, twisting and bending the door upwards within its frame. Figure 2 shows an example of this. There was also one occurrence of a tree having fallen on a building, seen in Figure 3.

Figure 3: Debris and furniture in the foreground. In the background, a tree has fallen on the roof of what appears to be a childcare centre.
Figure 4: Damage to a commercial property’s roller doors in Hyde Park.

Across the wider Townsville community, many schools had been closed as flooding was occurring and have now reopened, but a number of early-childhood centres remain closed. Several parks with play equipment have also been closed.

Recent commercial developments were also subject to flooding. These buildings have floor levels set above the 1-in-100-year flood level, but that was not sufficient to prevent significant depths of water flowing through them. This included a large number of shops in Fairfield, where BP, Bunnings, the Fairfield Central Shopping Centre (Woolworths, Kmart and a number of smaller businesses) and the Fairfield Homemaker Centre (Petbarn, Pillowtalk, Godfreys etc.) were all still closed a week after floodwaters had subsided.

Community response to warnings

The Bureau of Meteorology, Townsville City Council and Queensland Fire and Emergency Services (QFES) provided warnings and information to the community throughout the event via websites, traditional media, door-knocking and social media. The Council also used text messages as well as social and traditional media to convey information during the flooding and the dam release.

Many in the community appeared to have been caught off guard by the scale and speed at which the flood occurred; others suggested that residents simply did not believe a flood of that magnitude would eventuate. Residents described how their decision-making had been influenced by a number of past flood events, and many spoke of their memories of previous floods and of the realisation that this would be a larger event once the local landmarks that had marked previous flood extents were submerged.

Overall, people described the flood warnings as ‘okay’. Some implied they had found the warnings, and particularly the maps, difficult to understand and, as a result, had misinterpreted the potential level of floodwaters at their house. Others, however, noted that, while the text message warnings were vague, they had prompted them to seek further information from the range of sources available and to “take responsibility” for what might happen to them. Suggestions for improvement included providing warnings more regularly and, in regard to the dam release warning, earlier. The suggestion that “if council knew there was a hard limit and the gates would open automatically that should have been conveyed” was repeated a number of times. There was limited criticism of the dam operators, with the majority feeling that “they had done a good job” under difficult circumstances and that, had the water not been released, “it would have been a lot worse”. A dissenting opinion was that, if the dam is to be used for flood control, it should be largely empty before the wet season to maximise its ability to retain flood waters.

There was significant local flood experience among the worst affected areas of Hermit Park and Rosslea, with many locals stating they had lived in the area for a long time (some with family experience stretching back to the 1940s) and were well aware of the nature of flooding there. They hypothesised that some development had made the flooding worse (the infilling of an old rubbish dump, with a retaining wall that acted as a dam or levee, at Bicentennial Park, for example); recollections of watching floodwaters overtop what is now Idalia while remaining dry in Rosslea were also common. The refrain “how could they have allowed that development” was heard from a number of long-term residents.

Initial Recovery

Both formal and informal mechanisms were observed to have assisted recovery efforts. Emergency services, defence personnel and council staff were assisting with the clean-up. Others brought assistance for those affected on an informal level, and family and friends assisted in the clean-up. In general, the mood among those we spoke to was upbeat, with the majority having insurance and stating “it could have been worse” or “I’m lucky, others have it worse than me” – often while standing beside a pile of ruined belongings on their lawn. The generosity of the flood victims was also apparent, with most people offering us water, food, a spare hat etc. despite having had a difficult time already and with likely more hard work ahead of them.

The resilience of the community was reassuring and inspiring.

Discussion and conclusions

Though Townsville had just experienced a significant and very damaging event, we were left with a sense that the community was functioning, and that there was resilience amongst community members, who seemed to be getting on with the job of cleaning up despite significant uncertainty over the coming weeks through the recovery.

Despite commentary about the size of this flood being unprecedented, bigger floods are certainly possible in Townsville: even treating this event as a 1-in-500-year event, it falls far below the extents and depths likely to be experienced in a Probable Maximum Flood (a 1-in-10,000-year event), and there is much to be learnt from it. The physical and social impacts would have been far greater had the floodwaters been only a little higher, as they would have inundated the upper living spaces of two-storey homes, making them entirely uninhabitable, doubling (or worse) the losses for families and the debris to be collected, and dramatically increasing the displaced population.

There are significant opportunities to better understand community risk perceptions, responses to warnings, sheltering behaviours and flood damages, as well as gaining evidence of the effectiveness of flood mitigation and flood warning systems.

Several policy and communication issues are already apparent, including:

  • what should be done to reduce flood damages in enclosed ground floor areas of raised dwellings? At the least these areas should not be rented as habitable space to others
  • while the Townsville community is fortunate to have the resources of the Australian Defence Force nearby, a larger flood would have necessitated many more rescues which might have overwhelmed their capability. In any case, without local defence resources, a much wider emergency response would have been required
  • as raised in our previous briefing note on land-use planning in flood prone areas, it is essential to adopt a risk-based approach to floodplain management and to ensure that the disclosure of risk considers all event magnitudes.

 

To build or not build: that is the Townsville question

Andrew Gissing

Many would remember the computer game SimCity, an opportunity to build fictitious cities with the aim of being re-elected as mayor and generating enough tax revenue to maintain vital community infrastructure. Although the advanced levels required some consideration of fires, alien attacks and other hazards, for the average player it was all about city growth. In real life, however, hazards occur, and we need to plan for them while balancing numerous competing priorities. How to do so is often a hotly debated topic.

Planning for floods

Media criticism has been levelled at the development of flood prone areas in Townsville with some of the flooded areas described as ‘newly built’, implying that they were approved in the modern era when there should have been a good understanding of the flood problem. Land use planning is an essential component of disaster risk management and hence is vital in managing existing, residual and future flood risks.

Australian Defence Force members assist with the clean-up of a newly developed Townsville Estate

Many areas of Australia have adopted land use planning policies for residential buildings based on the 1% Annual Exceedance Probability (AEP) flood with an additional level of freeboard applied (a safety factor). There is no national standard defining flood planning levels, and such policies must be suited to individual communities ((National guidance regarding the floodplain management process, including key considerations for managing flood risk, can be found at knowledge.aidr.org.au/media/3521/adr-handbook-7.pdf)).

A community survey undertaken by Townsville City Council in 2015 identified the risk appetite of residents for different classes of development. Flooding of residential and commercial buildings in the 1% AEP event (1-in-100-year Average Recurrence Interval (ARI)) was viewed as unacceptable, but flooding in the 0.2% AEP event (1-in-500-year ARI) was acceptable to most.
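
For readers unfamiliar with the two measures, AEP and ARI are related but not identical. Under the conversion commonly used in Australian flood guidance (which assumes Poisson event arrivals), a 1-in-100-year ARI corresponds to an AEP of just under 1%, and for rare events the two are close enough to be used interchangeably, as in the pairings above. A small sketch:

import math

def aep_from_ari(ari_years: float) -> float:
    """Annual exceedance probability implied by an average recurrence interval."""
    return 1.0 - math.exp(-1.0 / ari_years)

def ari_from_aep(aep: float) -> float:
    """Average recurrence interval implied by an annual exceedance probability."""
    return -1.0 / math.log(1.0 - aep)

print(aep_from_ari(100))   # ~0.00995 -> just under 1% AEP
print(ari_from_aep(0.01))  # ~99.5 years
print(aep_from_ari(500))   # ~0.002  -> the 0.2% AEP event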

For many years land use planning in Townsville was based on the 2% AEP event (1-in-50-year ARI) ((The 2009 Townsville Natural Disaster Risk Assessment Study says that the 1% AEP standard was only recently introduced.)). Other Queensland communities, for example Bundaberg, have also used this level in the past ((Since changed to the 2013 flood level, which is equivalent to the 1% AEP event and the largest flood on record.)). In recent times Townsville City Council adopted the 1% AEP event as the defined flood level, with four classes of flood hazard used to establish development controls, as shown in Appendix 1. Given that the flooding experienced in the 2019 event was significantly greater than the 1% AEP event, it is not surprising that many newly developed suburbs were affected.

Whilst development in High hazard areas is avoided, development in areas of Medium hazard within the extent of the 1% AEP flood ((eplanning.townsville.qld.gov.au)) appears allowable, provided buildings have floor levels above the 1% AEP flood level ((Essential infrastructure at a minimum is required to be developed above the 0.5% AEP level. Higher requirements are set for hospitals, emergency service facilities and major electricity infrastructure, which are restricted to areas above the 0.2% AEP event.)) to limit flood damage. Such an approach is not uncommon, but it should require an assessment of access and egress safety, and continued policing of regulations to prevent development below approved floor levels. Previous Risk Frontiers flood investigations have observed the development of ground floor spaces for habitation and, in some instances, the renting of these spaces to vulnerable or low-income tenants (e.g. in Lismore).

In communities that may become isolated in frequent events but inundated in rarer ones, or for which the duration of isolation is intolerable, consideration should be given to the feasibility of community evacuation in events rarer than the 1% AEP flood. This should avoid the creation of low flood islands, where evacuation access is lost early in a flood only for residents to experience inundation later. Adequate warning time for residents to evacuate is, of course, essential.

Existing land-use planning policies in Australia are largely probability based and reliant on set thresholds; they do not fully account for the level of flood risk, which would require wider consideration of possible flood consequences above a defined flood level. After the 2011 Queensland floods, the Queensland Chief Scientist stated ((www.chiefscientist.qld.gov.au/publications/understanding-floods/)):

Currently nearly everywhere in Australia the 1% AEP event, or ‘1 in 100 year flood’, with an appropriate additional height (or freeboard) for buildings is designated as having an ‘acceptable’ risk for planning purposes, regardless of the potential consequences of the flood.

Other countries such as the United Kingdom and the Netherlands ((Other cities globally have very few building controls, as was apparent after Hurricane Harvey in Houston – www.washingtonpost.com/graphics/2017/investigations/harvey-urban-planning/?noredirect=on&utm_term=.5a3d5e8c66ac)) have adopted higher standards. For other hazards in Australia more stringent regulations have been adopted: building standards for earthquakes, for example, are based on a 1-in-475-year ARI event. Floodplain Management Australia (the peak body for floodplain management practitioners in Australia) has long supported the adoption of a risk-based approach. Some south-east Queensland councils have adopted such an approach, including the application of building controls above the 1% AEP flood. The national flood manual states ((knowledge.aidr.org.au/media/3521/adr-handbook-7.pdf)):

Considering the full range of flood risk in zonings can encourage development in locations where it is compatible with flood function and flood hazard, and where emergency response arrangements are sustainable.

As Townsville recovers and continues to grow as a major Australian regional city it will need to balance multiple competing interests. There is an opportunity cost involved in prohibiting development that must be balanced against the level of flood risk. In NSW, for example, this balance has long been referred to as involving a ‘merits-based’ approach that requires the balancing of social, economic, ecological and flooding factors.

Policy makers should also consider whether existing policies are consistent with the risk appetite of local communities, which is often not well defined. The Queensland Floods Commission of Inquiry ((www.floodcommission.qld.gov.au/publications/final-report/)) stated:

Whether the 1% AEP flood constitutes an acceptable level of risk for development, and in particular residential development, is a vexed issue. The consequences of flooding are likely to be at their most disastrous for residents and homeowners. Floodplain Management in Australia recognises this: according to it, the community must play a role in determining what level of flood risk it is prepared to live with.

The 1% AEP flood level is not necessarily fixed and should be expected to evolve over time: the introduction of the new Australian Rainfall and Runoff guidelines and the collection of new flood and rainfall data may alter understanding of flood risk. Climate change impacts must also be considered, as flood frequency may change in the future.

There is a need to inform residents of the full extent of the risk

Though it is obviously possible to identify flood levels beyond the 1% AEP event, the flood mapping available online through Townsville City Council (“Townsville Maps Flooding”) does not provide this information. This is not uncommon in Australia, where practices concerning the disclosure of flood risk information are inconsistent across local authorities. Often risk disclosure is limited to areas subject to planning overlays, typically defined by the 1% AEP flood. Without risk disclosure, residents living in areas susceptible to rarer events may be unaware of their risk, and may opt out of flood insurance believing their property to be flood free.

Eburn and Handmer (2012) ((Eburn, M. and Handmer, J., ‘Legal Issues and Information on Natural Hazards’ (2012) 17 Local Government Law Journal 19-26)) suggest that the reluctance to disclose risk information is, at least anecdotally, driven by concerns about legal liability. They argue that the risk involved in disclosing reasonably accurate hazard information in a planned manner is less than that of deliberately withholding it.

This issue requires further consideration and action. The Victorian Government, for example, has committed to ensuring the full disclosure of flood risks beyond the 1% AEP event through the Victorian Floodplain Management Strategy ((www.water.vic.gov.au/managing-floodplains/new-victorian-floodplain-management-strategy)), and some local councils in other areas already disclose the risks associated with extreme flood events.

Of course, consideration must be given to the most effective manner of communicating such information so that it is easily understood. The ‘1-in-100-year flood’ is well known to be a widely misunderstood concept amongst community members, and further risk communication efforts are necessary in this regard.
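
One way to make the concept more concrete is to express it as the chance of experiencing at least one such flood over a period of interest rather than as a return period. Assuming each year is independent, a 1% AEP flood has roughly an even chance of occurring at least once over 70 years, and the 1-in-475-year earthquake standard mentioned above corresponds to about a 10% chance of exceedance over 50 years. A short sketch:

def chance_of_at_least_one(aep: float, years: int) -> float:
    """Probability of at least one event of the given annual exceedance
    probability over a number of years, assuming independent years."""
    return 1.0 - (1.0 - aep) ** years

print(round(chance_of_at_least_one(0.01, 70), 2))     # ~0.50 over a 70-year occupancy
print(round(chance_of_at_least_one(1 / 475, 50), 2))  # ~0.10, the basis of the 475-year design event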

Risk Frontiers regularly undertakes post-event research to inform future policy and to improve the estimation of damages. Risk Frontiers visited Townsville this week with the support of the Bushfire and Natural Hazards Cooperative Research Centre. A further Briefing Note is under preparation to outline our key findings. Please contact Andrew Gissing for further detail (andrew.gissing@riskfrontiers.com).

Appendix 1


 

Global Tropical Cyclone Landfalls, 1970 to 2018

Roger Pielke, Jr. (University of Colorado and Associate of Risk Frontiers) and Ryan Maue (Cato Institute and Weather.us)

In 2012 we (along with Jessica Weinkle) published a time series of historical global tropical cyclone landfalls (available here in PDF). Much to our surprise at the time, no such database had previously been assembled. Since then we have updated our dataset on an annual basis, and we report here some details of our 2019 update, which extends our global time series to cover 1970 to 2018.

We employ the definition of a tropical cyclone landfall used by the U.S. National Hurricane Center: “the intersection of the surface center of a tropical cyclone with a coastline.” We include all major land areas and islands, but not some tiny islands (see our paper for details). Although landfall data are available for many basins prior to 1970, we employ that year as the starting date for our comprehensive, homogeneous, global dataset. Finally, for consistency we categorize tropical cyclones using the Saffir/Simpson scale, recognizing that other metrics of intensity are used around the world.
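
As an illustration of the categorization step, the sketch below maps a landfall's 1-minute maximum sustained wind (in knots) to a Saffir/Simpson category using the standard National Hurricane Center thresholds; systems below hurricane strength return 0 and are not counted as hurricane-strength landfalls. This is a simplified sketch of the general approach, not our exact processing code.

def saffir_simpson_category(max_wind_kt: float) -> int:
    """Saffir/Simpson category from 1-minute maximum sustained winds in knots.
    Returns 0 for systems below hurricane strength."""
    for lower_bound, category in [(137, 5), (113, 4), (96, 3), (83, 2), (64, 1)]:
        if max_wind_kt >= lower_bound:
            return category
    return 0

# A landfall is counted at the storm's intensity as its surface center crosses
# the coastline, e.g. a 100 kt landfall counts as a Category 3+ ("major") landfall:
print(saffir_simpson_category(100))  # 3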

The figure below shows the total number of tropical cyclone landfalls at hurricane strength for 1970 through 2018. Note that the 2018 data are preliminary and will be finalized when each reporting agency finalizes its “best track” data. There is no obvious or simple trend in the data, and one can generate upward or downward trends by picking and choosing the dates examined.

Similarly, the figure below shows these data separating out S/S category 1 and 2 storms (black bars) from those at S/S category 3+. Again, there are no simple trends observable over this period.

Here are some summary statistics for these data (a short sketch of how such figures can be reproduced from an annual count series follows the list):

  • All landfalls: 15 (median), 15.3 (average), 4.4 (sd)
  • Categories 1 & 2 at landfall: 10, 10.5, 3.8
  • Category 3+ at landfall: 4, 4.8, 2.5
  • Most total landfalls in one year: 30 (1970)
  • Fewest total landfalls in one year: 7 (1978)
  • Most Category 3+ landfalls in one year: 9 (1999, 2004, 2005, 2007, 2008)
  • Fewest Category 3+ landfalls in one year: 0 (1981)
  • Most total landfalls over a 10-year period: 177 (1988-1997)
  • Fewest total landfalls over a 10-year period: 120 (1975-1984)
  • Total landfalls 2009-2018: 140
  • Most Category 3+ landfalls over a 10-year period: 65 (1999-2008)
  • Fewest Category 3+ landfalls over a 10-year period: 33 (1978-1987)
  • Total Category 3+ landfalls 2009-2018: 44
  • Total landfalls 1970-2018: 750 (516 were Categories 1 & 2, 234 were Category 3+)
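
As a sketch of how figures like these can be reproduced from an annual count series, the snippet below computes the median, mean, standard deviation and rolling 10-year totals. The counts shown are placeholders, not the published 1970-2018 series.

import statistics

# One total per year, in order; placeholder values only, not the published data
annual_landfalls = [30, 18, 22, 15, 12, 17, 14, 9, 7, 16, 21, 13]

print(statistics.median(annual_landfalls))           # median annual landfalls
print(round(statistics.mean(annual_landfalls), 1))   # average
print(round(statistics.stdev(annual_landfalls), 1))  # sample standard deviation

# Rolling 10-year totals underpin the "most/fewest landfalls over a 10-year period" figures
window = 10
totals = [sum(annual_landfalls[i:i + window])
          for i in range(len(annual_landfalls) - window + 1)]
print(max(totals), min(totals))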

While data on global tropical cyclone occurrence are of utmost importance in understanding storm dynamics and how they may be changing, now and in the future, landfall data are of particular importance to those whose focus is on damage, including insurance and reinsurance.

To that end, Aon has also started to publish annual statistics on global landfalls (available here in PDF). The Aon dataset, which uses slightly different definitions and methods from ours, covers 1980 through 2018 and is correlated with our dataset at 0.96 (more precisely, our counts differ in only 6 of 39 years, in most cases by just one storm).

Both datasets indicate that, for the world as a whole and for each of its ocean basins that experience tropical cyclones, there is at present little empirical evidence to support claims that land-falling tropical cyclones have increased in number or intensity on climate time scales.

In an era where the weather is often the subject of contentious political debate and modern communication technologies can bring every disaster to our living rooms, it remains important to maintain an empirical perspective on long-term trends in those extreme events which cause death and destruction around the world.