Risk Frontiers offers Australian Hail Model via ModEx

SIMPLITIUM PRESS RELEASE – 26.03.2019

Risk Frontiers’ HailAUS 7.0 model is now available on ModEx®, the independent multi-vendor catastrophe modelling platform for the insurance industry. Risk Frontiers is the seventh model vendor on ModEx.

Risk Frontiers specialises in disaster risk assessment and management across the Asia-Pacific region. HailAUS 7.0 is a fully stochastic loss model for hail and covers all of Australia. The model includes a catalogue of hail storms reflecting activity from local radar station data and the frequency and severity of ‘high storm potential days’ derived from reanalysis data and the observed historical record. It calculates losses for residential, commercial and industrial property, as well as motor portfolios.

Foster Langbein, Chief Technology Officer, Risk Frontiers comments:

“ModEx complements our Multi-Peril Workbench offering perfectly and is a compelling solution for firms wishing to use our models but have neither time nor resources to engage with our complete native software solution. This implementation leverages the Oasis Loss Modelling Framework’s new complex model wrapper capabilities, enabling the integration of our native model engine which ensures events and losses are consistent and directly comparable between the Oasis Loss Modelling Framework and Multi-Peril Workbench.”

James Lay, Commercial Director, ModEx, comments:

“With hail being Australia’s most costly natural hazard, responsible for the country’s most expensive insured natural disaster ever, the importance of making this model more readily accessible is clear. We are proud to be able to offer it to our clients through ModEx, as we continue our mission of delivering greater choice of cat modelling services to the industry.”

ModEx provides a vibrant catastrophe modelling ecosystem for the (re)insurance industry, uniting multiple catastrophe models, hazard maps and data enhancement services through one platform.

For further information, please contact:

John Yonker
CEO
Simplitium
+44 (0)20 3872 1943
john.yonker@simplitium.com


About ModEx®

ModEx is the only independent multi-vendor catastrophe risk modelling platform for the (re)insurance industry. Powered by the Oasis LMF, ModEx delivers a hosted and fully managed catastrophe risk modelling service that offers a new and cost-effective way for firms to meet their modelling requirements. The platform creates an ecosystem where model vendors make their models available to the industry via a single user interface, improving the quality and choice of models available in the market. For further information, please visit www.simplitium.com/modex

About Risk Frontiers

Risk Frontiers specialises in the assessment and management of disaster risk across the Asia-Pacific region. For almost 25 years, Risk Frontiers has been developing a range of probabilistic natural catastrophe loss models by combining local expertise, the latest science and innovative modelling techniques. Its current modelling suite covers the major perils in Australia – bushfire, earthquake, flood, hail, tropical cyclone – as well as New Zealand earthquake. Its models are currently licensed to a range of domestic insurers, global reinsurers and reinsurance brokers. For further information, please visit: www.riskfrontiers.com

Why are we not taking climate change more seriously?

Thomas Mortlock, Jonathan van Leeuwen and Paul Somerville, Risk Frontiers

Figure 1. Robert FitzRoy. Source: Wikipedia (2019a)

Robert FitzRoy was an English officer of the Royal Navy, best known for captaining HMS Beagle during Darwin’s voyage around the world and for serving as the second Governor of New Zealand. He was also perhaps the world’s first weather forecaster in the modern sense (he coined the word “forecast” and established what is now the UK Met Office).

FitzRoy recognised the need for weather prediction after a series of storms on the English coast wrecked passenger ships with the loss of many lives. He petitioned the Government of the day and received funding to operationalise his daily weather forecasts. However, his forecasts met with widespread public scepticism, and FitzRoy was ridiculed for their apparent lack of accuracy.

The fascinating life of Robert FitzRoy ended tragically in April 1865, when he took his own life after suffering from depression believed to be associated with having to defend his weather forecasts against public attack.

While FitzRoy left an important legacy for weather forecasting in the UK, his story resonates with the still sizeable lack of public acceptance of climate change today. Why are we not taking climate change more seriously? There is certainly sufficient empirical evidence that anthropogenic climate warming is occurring. A recent study by Frances Moore and colleagues in the journal PNAS tries to explain why acceptance nevertheless lags.

The boiling frog effect

Moore et al. (2019) used a sample of over 2 billion social media posts from Twitter in the US to investigate the drivers behind public perception of climate change. The research suggests that experience of weather in recent years – rather than longer historical periods – determines the climatic baseline against which current weather is evaluated, potentially obscuring public recognition of anthropogenic climate change.

The metaphor of a “boiling frog”[1] describes the phenomenon whereby the negative effects of a gradually changing environment become normalised so that corrective measures are never adopted. In this instance, the declining noteworthiness of historically extreme temperatures is not accompanied by a decline in the negative sentiment they induce, indicating the social normalisation of extreme weather conditions.

The study shows that, despite large increases in absolute temperature, anomalies relative to a shifting baseline are small and not clearly distinguishable from zero through the 21st century (Figure 2).

Figure 2. Effect of shifting baselines on the remarkability of temperature anomalies. Population-weighted annual average temperature anomalies over the US under RCP 8.5 (the IPCC’s highest-emissions scenario) with 40 realizations of internal variability. Anomalies are defined relative to a fixed 30-yr period (1981-2010, red line) and relative to a shifting baseline (blue line). A shifting baseline reduces the remarkability of increased temperatures to near zero. Source: Moore et al. (2019).
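To make the baseline effect concrete, the short Python sketch below contrasts the two definitions of an anomaly using a synthetic warming series. All numbers are illustrative, and the eight-year window is an assumption loosely motivated by the two-to-eight-year reference period discussed above; it is not Moore et al.’s exact method.

    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic annual-mean temperatures, 1981-2100: a warming trend plus noise.
    years = np.arange(1981, 2101)
    temps = 14.0 + 0.03 * (years - 1981) + rng.normal(0.0, 0.4, years.size)

    # Fixed baseline: the 1981-2010 mean (the red line in Figure 2).
    fixed_base = temps[(years >= 1981) & (years <= 2010)].mean()
    fixed_anomaly = temps - fixed_base

    # Shifting baseline: the mean of the preceding eight years (the blue line).
    window = 8
    shifting_anomaly = np.full(years.size, np.nan)
    for i in range(window, years.size):
        shifting_anomaly[i] = temps[i] - temps[i - window:i].mean()

    print("2100 anomaly, fixed 1981-2010 baseline: %.2f C" % fixed_anomaly[-1])
    print("2100 anomaly, shifting baseline:        %.2f C" % shifting_anomaly[-1])

Against the fixed baseline the end-of-century anomaly grows steadily, while against the shifting baseline it hovers near zero; that contrast is the boiling frog effect shown in Figure 2.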

This is a hugely important notion for government, as public policy tends to advance during windows of opportunity provided by focused public attention. Without public perception of a problem, the ability of policy-makers to advance an agenda is limited.

Moore et al. conclude that it is unlikely that rising temperatures alone will be sufficient to produce widespread support for mitigation policies.

It is important to highlight that this is a US-based study, so the results are biased toward domestic sentiment (and overt political denial) of climate change in the US. However, there are certainly similarities between the public discourse on climate change in the US and in Australia. The problem of a ‘shifting baseline’ is also potentially exacerbated in Australia, where modes of interannual climate variability such as the El Niño-Southern Oscillation (ENSO) play an important role in modulating the behaviour of extreme weather events. Unfortunately, the reference point for socialised ‘normal’ conditions appears to be the weather experienced between two and eight years ago, which coincides with the timescale on which ENSO fluctuates.

The authors also point out that their results relate to ambient average temperatures only. It may well be that more acute extreme events are both more consequential and more salient and therefore less prone to normalisation.

Taking a longer-term view

So how can climate change risk be communicated to a public with a constantly shifting baseline? A powerful approach to overcoming this short-termism is to present the long-term picture and contextualise the present-day climate within it.

The recently published State of the Climate 2018 report by CSIRO/BoM (2019) is unequivocal about the unprecedented nature of today’s levels of atmospheric CO2, their anthropogenic cause and the warming trajectory that we are locked into for the coming decades.

Two well-known figures are particularly useful in this regard. The first (Figure 3) combines a paleo-reconstruction of atmospheric CO2 concentrations, derived from air trapped in Antarctic ice, for the past 800,000 years (left panel) with historical observations of CO2 levels measured at the Bureau’s observation station at Cape Grim in Tasmania, covering almost the past 200 years (right panel).

Figure 3. Long-term variability in atmospheric CO2 from Antarctic ice core records (green line, left panel and black line, right panel), and historical observations of CO2 in Tasmania to present day (blue line, right panel).

As can be seen, over the last 800,000 years CO2 varied with the long glacial-interglacial cycles (a periodicity of around 100,000 years) and generally stayed between 170 and 300 ppm (parts per million). In just the last 200 years, CO2 as measured in Tasmania has increased from around 280 ppm (typical of an interglacial) to 400 ppm.

In other words, atmospheric CO2 is now about 75% above its natural range of variability over almost the last one million years. This is hugely significant because atmospheric CO2 concentrations and temperature are strongly correlated. We may not yet be seeing the full effects of elevated CO2 because of inertia and heat retention in the climate-ocean system, but for the same reason we are locked into a warming trend for decades to come.

Some may note from Figure 3 that we are ‘due’ for another glacial period, but how this will play out with atmospheric CO2 at levels unprecedented in the last million years is unknown. This introduces the realm of feedback loops and ‘abrupt climate change’, a topic we touched on in Briefing Note No. 374.

The next question, if it still needs asking, is: how do we know the recent exceedance of CO2 levels above the long-term natural envelope is anthropogenic, i.e., human-induced?

Figure 4 illustrates an Australian-based modelling study that addresses this question. The grey line represents Australian temperature observations since 1910, with the black line showing the ten-year running mean. The shaded grey and blue bands are the 10-90% range of 20-year running mean temperatures simulated by the latest generation of global climate models (CMIP5). The grey band shows simulations that include the observed forcings from greenhouse gases, aerosols, solar input and volcanoes. The blue band shows simulations of observed conditions but excluding human emissions of greenhouse gases and aerosols. The red band shows simulations projecting forward into the future (including all IPCC emissions scenarios).

Figure 4. Observations and modelled reconstruction of temperature anomalies in Australia over the past 100 years both with (grey band) and without (blue band) human emissions included.

The grey band shows that global climate models that include human emissions of greenhouse gases and aerosols provide a reasonable reconstruction of temperature changes over the last 100 years. The blue band demonstrates that, without these human influences included, the simulated temperature change over this period is negligible.

By inference, this suggests that the recent warming in Australia can only be explained by human influence. The future trajectory of warming over the next two decades continues this trend – however much we mitigate global carbon emissions – because of the slow response of the climate-ocean system to elevated greenhouse gases.
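The construction behind Figure 4 can be sketched in a few lines: build the 10-90% band of 20-year running means from an ensemble with and without an anthropogenic trend, and compare where the bands end up. The Python snippet below uses purely synthetic stand-in data (real analyses use CMIP5 output), so it illustrates only how the bands are built, not the actual result.

    import numpy as np

    rng = np.random.default_rng(1)
    years = np.arange(1910, 2021)

    def running_mean(x, n=20):
        return np.convolve(x, np.ones(n) / n, mode="valid")

    # Synthetic 40-member ensembles of annual temperature anomalies: one with an
    # imposed warming trend ("all forcings") and one without ("natural only").
    # These stand in for the CMIP5 simulations used in Figure 4.
    natural = rng.normal(0.0, 0.2, (40, years.size))
    forced = natural + 0.01 * (years - 1910)

    # 10-90% range of the 20-year running means across each ensemble,
    # analogous to the blue and grey bands of Figure 4.
    natural_band = np.percentile(np.apply_along_axis(running_mean, 1, natural), [10, 90], axis=0)
    forced_band = np.percentile(np.apply_along_axis(running_mean, 1, forced), [10, 90], axis=0)

    print("natural-only band, final window: %.2f to %.2f C" % tuple(natural_band[:, -1]))
    print("all-forcings band, final window: %.2f to %.2f C" % tuple(forced_band[:, -1]))

If the observed running mean sits inside the forced band but well outside the natural-only band, as it does in Figure 4, the observed warming is consistent only with the simulations that include human influence.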

The past repeating; the future uncertain

Notwithstanding the passage of 150 years since Robert FitzRoy’s time, forecasts of future weather conditions are still not wholly accurate, and the lack of public acceptance of climate change in some quarters today is reminiscent of the public’s reaction to FitzRoy’s pioneering forecasts in the mid-1800s. Recent research suggests this response may be related in part to an ever-changing perception of what environmental conditions are ‘normal’, hence the metaphor of the boiling frog. The scientific evidence, however, that today’s levels of atmospheric CO2 are well outside the envelope of natural variability of almost the past one million years is irrefutable.

References

Commonwealth Scientific and Industrial Research Organisation and Bureau of Meteorology [CSIRO and BoM] (2019). State of the Climate 2018. A report prepared by CSIRO and Bureau of Meteorology, Commonwealth of Australia, 2018, pp 24.

Moore, F.C., Obradovich, N., Lehner, F., Baylis, P. (2019). Rapidly declining remarkability of temperature anomalies may obscure public perception of climate change. Proceedings of the National Academy of Sciences of the United States of America (PNAS), 116(11), 4905-4910.

Wikipedia (2019a). Robert FitzRoy. Available here, accessed 14 March 2019.

Wikipedia (2019b). Boiling frog. Available here, accessed 14 March 2019.

[1] The boiling frog is a fable describing a frog being slowly boiled alive. The premise is that if a frog is put suddenly into boiling water, it will jump out, but if the frog is put in tepid water which is then brought to a boil slowly, it will not perceive the danger and will be cooked to death. Source: Wikipedia (2019b).


Sydney Hailstorm: December 20, 2018

by Salomé Hussein and Foster Langbein

On December 20, 2018, a severe hailstorm struck the greater Sydney region in the mid-afternoon. The most impacted areas were Liverpool in Sydney’s southwest and, further north, Castle Hill and Berowra. Hail as large as 8 cm in diameter was reported (examples shown in Figure 1), and Chipping Norton, near Liverpool, experienced hail up to 10 cm. More minor damage was reported over much of metropolitan Sydney.

The event caused an estimated $1.04 billion in damage (Insurance Council of Australia (ICA), as at February 14, 2019). This ranks as the 8th most costly hailstorm in the ICA Disaster List in terms of normalised insured losses (2017/18 dollars; Table 1), just above the Melbourne 2011 Christmas storm, and the 3rd costliest for the Sydney region. The largest volume of claims was for motor vehicles, which contributed around 30% of the total loss. Figure 2 shows the distribution of the numbers of claims made for residential, commercial and motor vehicle lines of business.

The event generated a substantial emergency response, with the SES receiving 3,600 calls, of which 1,100 came from the Liverpool area alone. Some 600 volunteers attended 2,400 jobs, mainly placing tarpaulins on roofs.

Figure 2: Chart showing proportion of numbers of claims made, using data from the ICA Disaster List.

Conditions at the time of the event were favourable for hail. The synoptic weather pattern was described as a ‘southeasterly change’, and a sea breeze was also present on the day, as determined from our analysis of local weather station data (specifically wind direction, wind speed and relative humidity). This combination of synoptic pattern and sea-breeze occurrence was found to be the most conducive to severe hail in southeast Queensland in research conducted by Soderholm et al. (2017).

Our post-event analysis included estimating a damage footprint using Bureau of Meteorology (BoM) radar station data (recently made open source) and the Maximum Estimated Size of Hail (MESH) algorithm (originally due to Witt et al. 1998). The MESH algorithm estimates hail sizes from radar reflectivity measurements combined with temperatures over the scanned altitudes. We applied it to data from the nearby Wollongong radar (Appin station); by combining all frames of MESH output over the event period and averaging the maximum expected hail size, we produced a spatial map of hail intensity (Figure 3). This intensity map was then used as input for further analysis, allowing image-thresholding techniques to be used to obtain damage footprint contours over the affected areas (see Figure 4). Fitting ellipses to these contours then allowed a direct comparison with our Risk Frontiers HailAUS CAT loss model, a fully stochastic loss model for hail covering all of Australia.

Figure 3: Mean Maximum Estimated Size of Hail from 02:00 to 10:00 UTC using the Wollongong Radar and Joshua Soderholm’s (BoM) PyHail software. White solid lines are postcode boundaries. Analysis used the second tilt (0.9 degrees from horizontal).
Figure 4: Storm footprints extracted from contours of mean Maximum Estimated Size of Hail algorithm output over the entire event. Dashed lines represent contour levels of 30mm diameter. The maximum predicted over the entire event was 104mm diameter hail. Grey solid lines are postcode boundaries. Red solid lines are extracted hail cell boundaries with a lower threshold of 35mm, and the overlain blue ellipses are fitted to those boundaries for comparison with storm events within HailAUS.
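For readers interested in the mechanics, the Python sketch below outlines this post-processing chain. It is a simplified illustration rather than our production code: the function name and parameters are ours for illustration, the per-scan MESH grids are assumed to have already been computed on a common grid (e.g. with PyHail), and scikit-image is used for the contour extraction and ellipse fitting.

    import numpy as np
    from skimage import measure

    def hail_footprint_ellipses(mesh_frames, threshold_mm=35.0):
        """Derive hail-cell ellipses from a stack of per-scan MESH grids.

        mesh_frames : array of shape (n_frames, ny, nx) holding MESH values in mm,
                      assumed pre-computed on a common grid for the event period.
        """
        # Mean over all frames of the (per-scan maximum) hail size, as in Figure 3.
        intensity = np.nanmean(mesh_frames, axis=0)

        # Threshold the intensity map and extract cell boundaries, as in Figure 4.
        contours = measure.find_contours(np.nan_to_num(intensity), threshold_mm)

        ellipses = []
        for contour in contours:
            if len(contour) < 10:                  # ignore tiny speckle cells
                continue
            model = measure.EllipseModel()
            if model.estimate(contour[:, ::-1]):   # (row, col) -> (x, y)
                ellipses.append(model.params)      # (xc, yc, a, b, theta)
        return intensity, ellipses

The fitted ellipse parameters (centre, semi-axes and orientation) take the same form as the elliptical storm footprints used by HailAUS, which is what makes the direct comparison possible.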

HailAUS 7.0 includes a catalogue of hailstorms reflecting activity from local radar station data and the frequency and severity of ‘high storm potential days’ derived from reanalysis data and the observed historical record. It calculates losses for residential, commercial, industrial and motor portfolios using an approximation of elliptical storm footprints. Taking the fitted ellipses in Figure 4 for this event, the estimated loss from HailAUS is $1.6 billion using the PERILS Hail Industry Exposure Database for 2018 and the Redbook motor portfolio.

Several factors limit the accuracy of the HailAUS modelled loss estimate. The first two relate to the Wollongong radar being a single-polarisation instrument (it transmits only horizontally polarised radio waves). This required us to use the MESH algorithm rather than more modern hail-size estimation algorithms, such as the Hail Size Discrimination Algorithm (HSDA, Ortega et al. 2016), which can be applied to dual-polarisation stations (using both horizontal and vertical polarisations of the radar signal) such as the main Sydney (Terrey Hills) radar. We also expect that the sea breeze produced drift of the falling hail, which in turn influenced the location of the damage footprint. Although not available for our analysis, the Terrey Hills radar data is likely to be released in the near future.

Other limiting factors include the consistency and reliability of the motor vehicle market portfolio, and the fact that the damage footprint relevant to cars is likely larger than the Figure 4 contours because car-damaging hailstones have a lower size threshold than the one used there. The latter would act to increase the amount of loss attributed to cars.

Finally, while the elliptical damage footprints in HailAUS are a very reasonable representation, they limit the accuracy of the spatial distribution of damage compared with what is observed in the radar data; we plan to improve this in a future model update.


Disclosure of climate-related financial risk

by Stuart Browning

In light of underwhelming progress at COP-24 (the annual United Nations Framework Convention on Climate Change (UNFCCC) Conference of the Parties, held in Katowice in 2018), it is increasingly improbable that the Paris Agreement’s ambitions will be achieved. Instead, it seems more likely that recommendations from the Financial Stability Board (FSB) will be the primary catalyst for effective action on climate change mitigation. Projections of the economic cost of climate change have always been somewhat dire (e.g. Stern 2006), and have been mostly ignored by policymakers. However, the FSB has recommended that financial risks due to climate change be disclosed by all publicly listed companies. This is driving the financial sector to seriously consider the implications of climate change, and the results are likely to be sobering. With an understanding of risk comes investor pressure to minimise that risk, and this may well drive mitigation efforts above and beyond those achieved via the ‘heads-of-state’ level Paris Agreement. In Australia, this was manifested most recently in the Reserve Bank of Australia’s stark warning last week to, in effect, “change now or pay later” (see Risk Frontiers Briefing Note 391).

Publicly listed companies are legally required to disclose material risks to their investors. This disclosure is especially relevant for banks, insurance companies, asset owners and managers when evaluating the allocation of trillions of dollars in investor capital. In 2017 the FSB released the final report of the Task Force on Climate-related Financial Disclosures (TCFD), which stresses that climate change is a material risk (and/or opportunity) that should be disclosed—preferably alongside other risks in annual reporting. The TCFD proposes a framework for climate risk determination and disclosure (Figure 1) in which risk is classified into two main types: transitional and physical. Transitional risks are those that may impact business models through changing technologies and policies: examples are a carbon tax, or stranded assets associated with redundant fossil fuel exploration and extraction. Physical risks are those associated with climate change itself: these could be chronic risks such as sea-level rise, or acute risks such as more extreme storms, floods or droughts.

Figure 1: Factors identified in the TCFD report contributing to financial risks and opportunities under climate change (TCFD 2017).
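As a purely illustrative example of how an organisation might begin cataloguing risks against this taxonomy, a simple risk register could be structured as below. The categories follow the TCFD’s transitional/physical (acute and chronic) split described above, but the field names and example entries are hypothetical and are not part of the TCFD recommendations.

    from dataclasses import dataclass
    from enum import Enum

    class RiskType(Enum):
        TRANSITIONAL = "transitional"          # policy, technology and market shifts
        PHYSICAL_ACUTE = "physical_acute"      # storms, floods, hail
        PHYSICAL_CHRONIC = "physical_chronic"  # sea-level rise, warming trends

    @dataclass
    class ClimateRisk:
        description: str
        risk_type: RiskType
        time_horizon: str      # "short", "medium" or "long"
        assessment: str        # qualitative now, quantitative as analysis matures

    register = [
        ClimateRisk("Carbon price applied to direct emissions", RiskType.TRANSITIONAL, "medium", "high"),
        ClimateRisk("Hail damage to vehicle stock at open-air sites", RiskType.PHYSICAL_ACUTE, "short", "medium"),
        ClimateRisk("Sea-level rise affecting coastal assets", RiskType.PHYSICAL_CHRONIC, "long", "high"),
    ]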

While climate change is expected to impact most businesses, even current exposure and vulnerability is not being adequately disclosed by most organisations. A 2018 report by the Australian Securities and Investments Commission (ASIC) looked at climate risk disclosure by Australian companies and found that very few were providing adequate disclosure, exposing themselves to legal consequences and, more importantly, potentially putting investor capital at risk by failing to consider climate change as a risk. Companies that are attempting to disclose climate risk are typically doing so inconsistently, and with high-level statements of little use for investor decision-making (ASIC 2018). Quantifying organisational vulnerability and risk under climate change is a non-trivial task, and adequate implementation of the TCFD recommendations will likely occur over a timeframe of more than five years (Figure 2). Initially, companies are expected to develop high-level information on general risk under climate change. As research progresses, disclosure should become more specific.

Figure 2: Milestones in the implementation of the TCFD (TCFD 2017).

Understanding risk in terms of weather and climate has long been of interest to the insurance sector, but it is now something all sectors are expected to understand and disclose. The Actuaries Institute has recently developed the Australian Actuaries Climate Index, which tracks the frequency of extremes in variables of interest such as temperature, precipitation, wind speed and sea level. The index provides a general level of information drawn from the distribution of observed variability. However, climate change will shift the distribution of events, so this information is of limited use for projections. The relationship between a warming climate and the frequency of extreme weather events is likely to be complex, and both peril- and location-specific. Quantifying physical climate risk requires an understanding of the physical processes driving climate variability, the technical expertise to work with petabytes of available data, and the capacity to run regional climate models for dynamical downscaling; these skills are typically restricted to research organisations and universities.
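As a rough illustration of how an extremes-frequency index of this kind can be built (a simplified sketch only, with an illustrative function name; it is not the Actuaries Institute’s methodology), the snippet below computes, for each year, the fraction of daily values exceeding the 99th percentile of a fixed reference period.

    import pandas as pd

    def extreme_frequency_index(daily, ref_start="1981-01-01", ref_end="2010-12-31", q=0.99):
        """Fraction of days per year exceeding the reference-period percentile.

        daily : pandas Series of daily observations (e.g. maximum temperature)
                indexed by a DatetimeIndex. A rising index means extremes are
                becoming more frequent relative to the fixed reference climate.
        """
        threshold = daily.loc[ref_start:ref_end].quantile(q)
        exceedances = daily > threshold
        return exceedances.groupby(exceedances.index.year).mean()

Because the threshold is fixed to the reference period, an index like this describes observed changes in extremes; projecting it forward requires climate model output, as discussed next.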

Useful risk disclosure will come from using the best available information to represent both past and projected climate variability. This means using a combination of observational and model-based data. Exposure and vulnerability will need to be determined using weather station observations and reanalysis data, developed within the context of an organisation’s specific assets, operations and physical locations. Risk projections can then be developed, and this should be done using scenario analysis across multiple time horizons: short, medium and long term. Short-term projections can be developed using established vulnerability together with seasonal forecasts. Medium- and long-term projections should be based on global climate model (GCM) projections developed within the framework of the Coupled Model Intercomparison Project (CMIP). These are the scenario-based, industry-standard climate model projections used for the IPCC reports. The IPCC Fifth Assessment Report (AR5) was based on the CMIP5 suite of simulations; the next generation of simulations (CMIP6) is underway and should become publicly available from 2019-20 onwards. Projections of organisation-specific risk will need to be developed by downscaling GCM projections. The best results are likely to be achieved through a combination of statistical downscaling, dynamical downscaling and machine learning.
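To give a flavour of the statistical-downscaling step, the sketch below implements simple empirical quantile mapping, one common bias-correction approach. It is a generic illustration under simplifying assumptions (single site, single variable, no seasonal stratification), not Risk Frontiers’ production method, and the function name is ours: GCM output for a future period is adjusted so that the model’s historical-period quantiles match station observations.

    import numpy as np

    def quantile_map(obs_hist, gcm_hist, gcm_future, n_quantiles=100):
        """Empirical quantile mapping of GCM output onto an observed distribution.

        obs_hist, gcm_hist : observed and modelled values over a common historical
                             period (e.g. daily temperature at a site / grid cell).
        gcm_future         : modelled values for a future scenario period.
        """
        probs = np.linspace(0.0, 1.0, n_quantiles)
        obs_q = np.quantile(obs_hist, probs)
        gcm_q = np.quantile(gcm_hist, probs)

        # Locate each future value within the GCM's historical distribution,
        # then map it onto the corresponding observed quantile.
        future_probs = np.interp(gcm_future, gcm_q, probs)
        return np.interp(future_probs, probs, obs_q)

In practice the mapping is usually applied per season or per month, and values beyond the historical range need special treatment; dynamical downscaling and machine-learning approaches address some of these limitations.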

Risk Frontiers utilises projections within its suite of natural catastrophe (CAT) loss models to investigate how losses may change in the future under different climate scenarios. Risk Frontiers adapts its CAT models, developed for the insurance industry to assist decision-makers in estimating and managing catastrophe risk, to assess the impact of projected changes in weather-related hazard activity due to climate change, as well as changes in vulnerability and exposure (Walker et al. 2016). In November 2018, The Geneva Association reported on the benefits of integrating climate science and catastrophe modelling to understand the impacts of climate change, stating that “Cat modelling is more relevant than ever”. With CAT models being the ideal tool for this type of analysis, Risk Frontiers is strongly positioned to address the need for physical climate risk disclosure.

References

ASIC (2018) REPORT 593: Climate risk disclosure by Australia’s listed companies.

Risk Frontiers (2019a). ‘Change now or pay later’: RBA’s stark warning on climate change. Briefing Note 391.

Risk Frontiers (2019b), Disclosure of climate-related financial risk. Briefing Note 386.

The Geneva Association (2018) Managing Physical Climate Risk: Leveraging Innovations in Catastrophe Modelling.

Stern, N. (2006) “Stern Review on The Economics of Climate Change (pre-publication edition). Executive Summary”. HM Treasury, London. Archived from the original on 31 January 2010. Retrieved 31 January 2010.

TCFD (2017) Financial Stability Board, Final Report:
Recommendations of the Task Force on Climate-related Financial Disclosures.

TCFD (2017) Financial Stability Board, Final Report:
Implementing the Recommendations of the Task Force on Climate-related Financial Disclosures.

Walker, G. R., M. S. Mason, R. P. Crompton, and R. T. Musulin, 2016. Application of insurance modelling tools to climate change adaptation decision-making relating to the built environment. Struct Infrastruct E., 12, 450-462.

‘Change now or pay later’: RBA’s stark warning on climate change

by Ryan Crompton, Andrew Gissing, Thomas Mortlock and Paul Somerville, Risk Frontiers


The following article, by Eryk Bagshaw and Nick Bonyhady, appeared in the Sydney Morning Herald on 12 March 2019. The last line notes that “companies disclosing climate risks need to adopt a level of commonality or risk that information not being useful to investors.”

It is worth noting that there are two types of climate change risk posed to business: the physical risk to direct business operations and supply chains, and the transitional risk of adapting operations to a climate-changed future. Climate change risk disclosure is still at an early stage in Australia, with no regulation at present. Most current disclosures focus on the immediate physical risks to business and do not include transitional risk.

A recent paper by Allie Goldstein and co-authors looked at the private sector’s climate change risk and adaptation blind spots by reviewing more than 1,600 corporate adaptation strategies in the US. Some interesting findings from the paper, relevant for Australia, are:

  1. The magnitude and costs of physical climate change risks are being underestimated by companies. Companies need further guidance on estimating more realistic costs.
  2. Climate change risks to business beyond direct operations are not being considered.
  3. The costs associated with climate change adaptation strategies are being under-reported.
  4. Non-linear climate impacts, and extreme climate scenarios, are not being considered by companies in disclosures.

Risk Frontiers’ goal is to provide an objective assessment of these risks to assist companies (including those in the insurance industry) and governmental organisations in achieving that level of commonality mentioned in the Sydney Morning Herald article, reproduced in part below.


The Reserve Bank has warned climate change is likely to cause economic shocks and threaten Australia’s financial stability unless businesses take immediate stock of the risks.  The central bank became the latest Australian regulator to tell business that they must analyse their investments on Tuesday, as the Coalition grapples with an internal battle over taxpayer-funded coal fired power and energy policy.

In a speech to the Centre for Policy Development in Sydney, the Reserve’s deputy governor Guy Debelle said challenges for financial stability may arise from both physical and transition risks of climate change. “What if droughts are more frequent, or cyclones happen more often?” he asked. “The supply shock is no longer temporary but close to permanent.  That situation is more challenging to assess and respond to.”

Financial stability could be put at risk if businesses remained unaware of anticipated insurance payouts, pollution-driven reputational damage, legal liability and regulation changes that could cause valuable assets to become uneconomic. “All of these consequences could precipitate sharp adjustments in asset prices, which would have consequences for financial stability,” he said.

Dr Debelle said the increasing number of extreme climate events was also changing public opinion. “One of the things that is causing change in public opinion around this is just the straight-up occurrence of extreme events,” he said. “It’s not the way you would actually like this to come about unfortunately … [but] it has changed the general public view.”

Dr Debelle said the bank was speaking about the issue because of the size of the impact climate change would have on the economy. “Some of these developments are actually happening now,” he said. Dr Debelle said the current drought across large swathes of the eastern states has already reduced farm output by around 6 per cent and total economic growth by about 0.15 per cent. “We need to think in terms of trend rather than cycles in the weather. Droughts have generally been regarded as cyclical events that recur every so often. In contrast, climate change is a trend change.”

That has an impact on monetary policy, Dr Debelle said, citing the temporary shock of banana prices surging after Cyclone Yasi in 2011, which in turn boosted inflation by 0.7 percentage points. But he said future events may not be so one-off, with repeated climate events and the transition of the economy likely to have a longer-term impact. “We need to be aware that decisions taken now by businesses and government may have a sizeable influence on that transition path,” he said.

Dr Debelle said the transition posed challenges and opportunities. Industries especially exposed to the consequences of changes in the climate will face lower costs if there is an early and orderly transition; some will bear greater costs from the transition to a lower carbon economy, while others, such as the renewables sector, may benefit. “There has been a marked pick-up in investment spending on renewable energy in recent years,” he said. “It has been big enough to have a noticeable impact at the macro-economic level and affect aggregate output and hence the monetary policy calculus.”

In comments that are likely to be used against some pro-coal Nationals MPs urging the Coalition to build a taxpayer-funded power station, the deputy governor said the renewable sector was a good example where price signals have caused significant behavioural change. “There has been a rapid decline in the cost of renewable energy sources,” he said. Dr Debelle said the cost of generating electricity has declined in the case of wind and solar to the point where they are now cost-effective sources of generation. He added that storage and transmission remained relevant costs.

Despite coal being one of Australia’s top exports, Dr Debelle said opportunities remained as China transitioned away from coal. “Natural gas is expected to account for a larger share of its energy mix, and Australia is well placed to help meet this demand,” he said.

He endorsed comments by Australian Prudential Regulation Authority executive Geoff Summerhayes in London in January, which warned tackling climate change had become a “financial necessity”. In the speech to the UN’s sustainable insurance forum, Mr Summerhayes lashed government inaction, arguing the summer’s extreme weather, severe drought and floods were all fuelled by climate change, but Australia still lacked the political consensus needed to respond to the threat.

Giving the example of data on when different parts of the Gold Coast would stop being viable, Blair Comley, a former secretary of the federal Department of Climate Change and Energy Efficiency, said the lack of data on the impact of climate change made it harder to plan for. Dr Debelle said while the Reserve Bank was not responsible for developing climate policy, it had a role to play in ensuring there is adequate data.

Where there is inadequate data for the bank to make the decisions it needs to, “we can call out that,” Dr Debelle said. And he emphasised that companies disclosing climate risks need to adopt a level of commonality or risk that information not being useful to investors.

References

Goldstein, A., Turner, W.R., Gladstone, J., and Hole, D.G. (2019). The private sector’s climate change risk and adaptation blind spots. Nature Climate Change, 9, 18-25.

Sydney Morning Herald (2019). ‘Change now or pay later’: RBA’s stark warning on climate change. Available here, accessed 14 March 2019.

 

 

Cyber Attack on the Australian Parliament and the Lessons Learned

The following article was published by the Australian Outlook on March 4th, 2019. It highlights some of the most important technical and political points regarding the recent cyber attack against the Australian Parliament Network and other political parties.

Risk Frontiers is a partner in the Optus Macquarie University Cyber Security Hub, focusing on quantitative modelling of cyber risk.


Synopsis:

In the lead up to the federal election, the Australian Parliament and multiple political parties have been hit by a sophisticated cyber attack. Experts are divided on who is to blame but the attackers had clear motives and there are some key lessons to learn from this incident.

By Associate Professor Christophe Doche, Dr Stephen McCombie and Dr Tahiry Rabehaja

On February 8, reports emerged regarding an attempt to infiltrate the Australian Parliament network, which is primarily used to exchange emails and store data. On February 18, Prime Minister Scott Morrison and Opposition Leader Bill Shorten addressed the Parliament to acknowledge the attack. The next day, the Australian Cyber Security Centre (ACSC), which is now part of the Australian Signals Directorate (ASD), confirmed that a cyber actor gained illegal access to the networks of the Liberal, Nationals and Labor parties.

Since then, investigations have revealed that the attack was sophisticated and most likely state-sponsored. It is understood the initial breach was the result of a phishing campaign, where a staff member opened an infected document attached to an email. Once the criminals got a foothold on a computer attached to the network, they scanned and infected other targets, including intranet servers. They were then able to redirect network traffic in order to exfiltrate data. They also erased logs to cover their tracks and placed additional malware to maintain control of the infected systems for later use.

Digital forensics analysis has shown that the attack relied on a series of malware and exploits, several of which were slight modifications of existing open-source tools; this is what allowed them to evade mainstream anti-virus software. Ironically, many of these open-source tools are used by the ethical hacking community to find vulnerabilities in computers and systems with the aim of reporting and, ultimately, fixing them. They are written in the popular language C# for the .NET framework. All these factors indicate a clear desire by the attackers to remain undetected for as long as possible and to make attribution (the identification of the perpetrators of the attack) a difficult task.

Figure 1: Reverse engineering parts of the malware used by the hackers shows that it leverages well-known penetration testing tools (source: Yoroi).

Although there is no clear evidence, at least none that has been released, media speculation is that China is most likely behind this attack. China has a long history of cyber espionage operations globally, and locally against the Australian Government, our defence sector, mining industries and even universities. This incident happened on the back of the banning of Huawei from Australia’s 5G network, recent trade tensions and multiple claims of improper Chinese influence on Australian political parties. There have also been reports that Iran may have been the perpetrator, but it is difficult to see what it would gain from such an action in Australia. Iran has been active against US targets in recent times and perhaps sees Australia as a way into the Five Eyes intelligence alliance; alternatively, our close relationship with Israel (its bitter enemy) and plans to formally recognise West Jerusalem as the capital of Israel may have made us a target.

Perhaps most surprising is that this attack actually succeeded in penetrating the Parliament and Australia’s major parties, despite ample warning of the potential for such attacks. The 2016 attacks by Russian military intelligence (GRU) on the Democratic National Committee in the United States, which accessed multiple email accounts including that of Hillary Clinton’s campaign director, are well known and documented. In the aftermath, members of the Democratic Party visited a number of European countries and spoke to political parties to warn specifically of the risk of such cyber breaches. Similarly, the ASD briefed Australian political parties on threats to our elections in 2017. In July 2018, the Australian Government also offered $300,000 to help political parties shore up their cyber security. In addition, the Government has significantly grown the scope and size of the ACSC and other cyber capabilities. Despite this, these attacks penetrated our Parliament and major political parties just months before a highly contested election in which relations with China are likely to be debated.

One key observation is that the Government has a very large cyber risk footprint. It employs tens of thousands of people, and human beings have always been part of both cyber security problems and solutions; this incident is no exception. Governmental networks are complex, shared and scaled infrastructures, which greatly increases the chance of overlooking security lapses and allows attacks to propagate and replicate across agencies cheaply and quickly. Government agencies are also very attractive targets: they hold a large volume of confidential and personally identifiable information, they are the top target for politically motivated attackers and cyber warfare, and they are amongst the main victims of cyber espionage. They therefore attract multiple categories of threat actor, from organised cyber criminals looking for financial gain to advanced persistent threats backed by state actors. The Australian Parliament network incident emphasises these points, and also highlights the Government’s large cyber attack surface, since such an attack could have occurred in any one of the many interlinked agencies’ digital information systems and infrastructure.

Although the response to this incident has been swift and there is no evidence that any data has been leaked, the ACSC has warned that the actor, whoever it may be, will probably target other Australian Government departments. The Government needs to understand, build and protect its digital infrastructure, and the associated exposure, with appropriate controls and responses. The NSW Government and its Government Chief Information Security Officer have taken a leading role in this area by releasing the NSW Cyber Security Policy in February 2019. Among other measures, this policy requires every agency to identify its crown jewels (its most valuable or operationally vital systems or information) and to implement regular cyber security education for all employees, contractors and outsourced ICT service providers. These two measures alone will go a long way towards improving the cyber resilience of NSW Government agencies.

 

Associate Professor Christophe Doche is executive director of the Optus Macquarie University Cyber Security Hub, the first initiative of this kind in Australia, linking academics in information security, business, criminology, intelligence, law and psychology together with cyber security experts from industry. As part of his role, he oversees research, education and thought leadership activities in cyber security.

Dr Stephen McCombie is a senior lecturer in Cyber Security at Macquarie University. His current research interests are in digital forensics, cyber threat intelligence and information warfare. His research draws on a diverse background in policing, security and information technology over the last 30 years. He has also held senior positions in information security with IBM, RSA, National Australia Bank and most recently SecureWorks.

Dr Tahiry Rabehaja is a Software Engineer at Risk Frontiers and research fellow at the Optus Macquarie University Cyber Security Hub specialising in quantitative risk modelling. He has a background in information security and formal program verification and, in particular, the development of mathematical models for quantifying confidentiality in programs. His current research is on the quantification of cyber security risk.