Newsletter Volume 16, Issue 3

Better Managing New Zealand’s Earthquake Risks

This speech was given by Hon Dr Nick Smith, Minister for Building and Construction, Minister for the Environment, on 25 January 2017. Dr Smith’s background as a civil engineer is evident in this lucid analysis of policy issues related to earthquake risk in New Zealand.

“A big worry in this Trump era of modern politics is that complex issues are dumbed down to 140-character tweets. The beauty of this annual opportunity you give me as Nelson’s MP is to give a far more considered and thorough account of a topical issue. The focus of this 22nd Rotary address is the steps we are taking to improve New Zealand’s management of earthquake risks.

We were dubbed the Shaky Isles 170 years ago and at two minutes past midnight on November 14 we got another harsh reminder of why. That Kaikoura quake was the largest in New Zealand since 1855. We are one of the most seismically active countries in the world and we need to be at the leading edge of protecting people, infrastructure and the economy from earthquakes.

The challenge in government is that there are all sorts of risks to manage – financial, terrorism, biological, trade, climate change, fire, and cyber-security, as well as the natural risks of floods, volcanic eruptions and cyclones, as well as earthquakes. We cannot pretend that government can eliminate these risks and we will always be limited in the resources we have to reduce them. My long term ambition as a Minister and as a rare engineer in Parliament is to try to ensure as a country we manage these risks and allocate resources based on science-based risk assessment. Politics and rational science are not close relatives, but tonight is an attempt to bring them closer together.

It is worth recalling our history of seismic events. We have had eight fatal earthquakes since 1840, or about one every 20 years. While it is true that two major quakes within six years is unusual, the 40-year lull between the Inangahua and Christchurch earthquakes was also unusually long.

There is no evidence the frequency of earthquakes in New Zealand has changed. GNS measures about 15,000 a year, of which about 150, or roughly one every three days, are felt. What has been unlucky is that we have had major quakes close to major population centres, where the effects are so much greater.

It is useful to compare the risks to life from earthquakes to other risks. Our history points to an average loss of three lives a year from earthquakes, as compared to 300 a year from road accidents, 120 a year from drowning and 30 a year from house fires. You can see in these numbers why I placed huge importance on getting a new law through Parliament last year requiring smoke alarms in rental properties, when the costs are so small in comparison to earthquake strengthening and the number of lives saved so much greater. These stats are not to discount the risks from quakes, but to keep the relative risk in perspective.

Average expected fatalities are just one factor to take into account in determining priorities. Earthquakes will cost New Zealand close to $50 billion in both public and private sector costs this decade, of which the Government’s share is about $20 billion – $18 billion for Christchurch and $2 billion for Kaikoura.

The loss of life from earthquakes in New Zealand pales by comparison internationally. The 185 deaths in Christchurch compares to 230,000 in the 2004 Boxing Day quake and tsunami in Indonesia, the 160,000 killed in Haiti in 2010, the 16,000 killed in the Tohoku quake in Japan of 2011 and the 70,000 killed in Sichuan quake in China in 2008.

It is of note that the last decade has been the deadliest on record for earthquakes globally and that fatalities have been on the rise over the past half century.

The big killers are building failures and tsunamis. The reason for the significant rise is not any increase in seismicity but many more people living in the cities and in coastal areas. Improved building seismic resilience and better managing tsunami risks are the issues we should focus on to reduce future fatalities.

New Zealand’s comparatively low level of fatalities despite being one of the most seismically active areas of the world is due to both our relatively low population density and the huge improvements in building standards over the past century.

The Christchurch and Napier earthquakes were similarly sized quakes but whereas one in 100 died in Napier, in Christchurch one in 2000 died. This 95 percent reduction in fatalities can largely be attributed to the huge improvements in buildings’ seismic resistance. To put it another way, there would have been about 4000 fatalities in Christchurch were building standards left as they were in 1931. The key issue for my Building Minister’s role is how we further improve our engineering and building standards into the future.

It is not my intention to spend too much time on the seismic and engineering sciences, but there are a few core facts needed to explain the Government’s priorities and direction of policy. The first is to communicate the scale of energy release in a seismic event that makes designing and constructing earthquake resistant buildings so challenging. The Richter scale used to report earthquakes is logarithmic. An increase from a 5 to a 6 magnitude quake actually represents a 32-fold increase in the energy being released.

To get some sense of scale, the Christchurch 2011 quake at 6.3 involved a release of energy equivalent to four Hiroshima atomic bombs. The Kaikoura earthquake at 7.8 was 180 times more powerful and the equivalent of 800 Hiroshima bombs. But the magnitude 9 megathrust Tohoku earthquake that struck Japan in 2011 was 80 times stronger again and the equivalent of 60,000 Hiroshima bombs. So my first point is that earthquakes involve the release of phenomenal energy and that we cannot make our buildings totally safe.
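As a back-of-the-envelope check on these comparisons, radiated seismic energy scales roughly as 10 raised to 1.5 times the magnitude, so each whole magnitude step is about a 32-fold jump in energy. The short Python sketch below simply evaluates that scaling for the magnitudes quoted above; small differences from the speech’s rounded figures reflect the approximations involved.

def energy_ratio(m_big: float, m_small: float) -> float:
    """Ratio of radiated seismic energy between two quakes, using the
    standard Gutenberg-Richter scaling: energy ~ 10**(1.5 * magnitude)."""
    return 10 ** (1.5 * (m_big - m_small))

print(round(energy_ratio(6.0, 5.0)))   # ~32: one magnitude unit is a ~32-fold energy jump
print(round(energy_ratio(7.8, 6.3)))   # ~178: Kaikoura vs Christchurch, quoted as ~180 in the speech
print(round(energy_ratio(9.0, 7.8)))   # ~63: Tohoku vs Kaikoura (the speech quotes ~80, reflecting different rounding)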

The Christchurch earthquake was comparatively small and made deadly not by its size but by its location. We need to be prepared for the worst-case scenario of a Kaikoura- or Tohoku-scale quake close to a major city.

The analogy I would make to improved building design is the improvements made in vehicle standards.

Cars today are not 100 per cent safe in a crash but the risk of fatality has been made an order of magnitude better by smart design.

The challenge with buildings is more difficult: cars generally last 15 years whereas buildings last 100; buildings are generally one-off designs whereas cars are mass produced; and crashes occur far more frequently than earthquakes, so design lessons accumulate far more quickly. The common feature is that while we can make buildings a lot safer, a big enough crash or quake will still result in fatalities. My greatest concern is the thousands of vintage buildings still in use that pose the most risk.

The second important scientific fact relates to the cause and probability of earthquakes. We heard all sorts of phantom theories about earthquakes being triggered by the phase of the moon or by oil exploration activity, and from Destiny Church’s Brian Tamaki that sexual sinning was the cause. Earthquakes are caused by the sudden movement along faults of the earth’s tectonic plates, and the timing cannot currently be predicted beyond probability estimates.

I was particularly offended by the moon-man, who caused widespread alarm in 2011 when he publicly predicted a major shake at the Sign of the Kiwi on Christchurch’s Port Hills at a particular date and time. I was part of Skeptics New Zealand’s protest on site to highlight the nonsense of such pseudo-science. Extensive studies have shown no correlation between phases of the moon and earthquakes.

The science does, however, tell us two things about the probability of earthquakes. It is no surprise that the risk of earthquakes varies significantly with geography, i.e. that Wellington is much more prone than Auckland, but the scale of the difference needs highlighting. We would expect a significant earthquake of intensity MM8 in Wellington about once every 120 years, in Christchurch or Nelson every 720 years, in Dunedin every 1700 years and in Auckland once every 7400 years. For the record, the highest-risk earthquake locations are Arthurs Pass, Hanmer Springs, Hokitika, Masterton and Kaikoura. The importance of this is that we need to focus our policies on the areas of greatest risk and avoid imposing excessive costs in areas like Auckland and Dunedin, where the seismic activity is low.
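Return periods of this kind are easier to compare when converted into the chance of at least one such event during a building’s life. The sketch below does this for the cities listed, assuming independent years; the 50-year window is our assumption of a nominal design life, for illustration only.

def prob_in_window(return_period_years: float, window_years: int = 50) -> float:
    """Chance of at least one exceedance in a given window, assuming
    independent years with annual probability 1/return_period."""
    annual_p = 1.0 / return_period_years
    return 1.0 - (1.0 - annual_p) ** window_years

# Return periods for MM8 shaking quoted in the speech.
for city, T in [("Wellington", 120), ("Christchurch/Nelson", 720),
                ("Dunedin", 1700), ("Auckland", 7400)]:
    print(f"{city}: ~{prob_in_window(T):.0%} chance of MM8 shaking in 50 years")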

The second factor we know about the timing of earthquakes is that they are much more likely after a significant quake. One of the worst psychological impacts of earthquakes is the long tail of aftershocks that can last several years. There is nothing more soul-destroying than fixing the sewer pipe or removing the liquefied silt only to have it re-break and re-appear time and time again.

The last technical issue I want to cover is an explanation of why some buildings failed and others did not in the Kaikoura earthquake. People have been both mystified and unnerved by the fact that many older buildings labelled as earthquake prone had minimal, if any, damage in Wellington, while other new modern buildings had life-threatening partial failures. The explanation for this lies in the way the frequency of shaking interacts with the natural frequency of a building.

Every building has a natural frequency. If you give it a strong enough shove, it will rock back and forth at its own particular rate: a short building may have a natural period of around 0.2 seconds, while a tall building may take more than 2 seconds per sway. If the frequency of the earthquake’s shaking coincides with the building’s own natural frequency, the building will experience much more extensive damage.

An earthquake typically releases shaking across a whole range of frequencies, but the short, sharp shaking abates quickly with distance from the quake. In Wellington, the Kaikoura earthquake produced strong shaking with periods in the range of 0.8-1.2 seconds that lasted for an unusually long time. That affected buildings in the five to ten storey range. For these buildings, the earthquake was stronger and longer than the design standards required. But these same buildings would not be the most vulnerable in a major quake close to the city. The one and two storey unreinforced masonry buildings that were untouched by the Kaikoura quake would be more likely to be hugely damaged and cause significant loss of life in a closer quake.
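A common rule of thumb, useful only as a rough illustration, puts a building’s fundamental period at about 0.1 seconds per storey, which is why a band of strong shaking at 0.8-1.2 second periods picks out mid-rise buildings. The sketch below applies that crude rule; real periods depend on height, structural form and materials, which is why the speech quotes a five-to-ten-storey range.

def approx_natural_period(storeys: int) -> float:
    """Crude rule of thumb: fundamental period ~ 0.1 s per storey.
    Real periods depend on height, structural form and materials."""
    return storeys / 10.0

# Which building heights sit inside the 0.8-1.2 s band of strong,
# long-duration shaking that Wellington experienced in the Kaikoura quake?
at_risk = [n for n in range(1, 31) if 0.8 <= approx_natural_period(n) <= 1.2]
print(at_risk)   # roughly 8 to 12 storeys under this crude rule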

The Government has been severely tested by the challenges of the Christchurch and Kaikoura earthquakes and, while some mistakes have been made, I think history will judge our Government well. I particularly pay tribute to Gerry Brownlee who, through the Canterbury and Kaikoura earthquakes, has done the lion’s share of the work.

We have poured in billions of dollars, passed special pragmatic laws to facilitate the rebuild, bailed out failed insurers to protect householders and acted decisively on getting infrastructure quickly fixed.

The responsibility is not just to rebuild but to learn every possible lesson so as to improve our resilience as a country to future earthquakes.

Tonight I want to outline a dozen initiatives we are taking to achieve this.
1. New Earthquake Prone Building Act
2. Adding Natural Hazards To The Resource Management Act
3. Post-Quake Building Act Reform
4. Improving Consistency Of Building Assessments
5. Standards And Training Of Engineers
6. Powers For Addressing Newly Identified Risks
7. Tackling High Risk Parapets And Facades Post Kaikoura
8. Supporting Heritage Buildings Upgrades
9. Improving Tsunami Warning Systems
10. Supporting Innovative Design
11. Investing In Seismic Research
12. National Policy On Natural Hazards”
The full text of these twelve initiatives can be found at:
https://riskfrontiers.com/better-managing-new-zealands-earthquake-risks/


Storm Direction Controls Coastal Erosion Risk in New South Wales

By Thomas Mortlock

Between 3 and 7 June 2016, an East Coast Low (ECL) storm caused widespread flooding, wind damage and coastal erosion along the eastern seaboard of Australia. Wave heights measured offshore of Sydney were not exceptional, but beachfront properties experienced some of the worst erosion losses in 40 years. Recent research suggests the major cause of erosion was the unusual north-easterly wave direction, which may have significant implications for future coastal management along the east coast [1].

Figure 1. Waves at South Narrabeen on Monday 6th June 2016. These conditions don’t occur every other year! Images reproduced with permission from Mark Onorati.

Wave direction controls probability estimates

The peak storm wave height measured offshore is a key design parameter for coastal engineers and is commonly used as an indicator of coastal erosion potential. For example, most seawalls are built to prevent overtopping up to a 1 in 100-year storm wave event and erosion hazard planning uses similar design conditions. However, annual return intervals (ARI) of extreme ocean wave heights can change dramatically according to wave direction.

For example, the peak storm wave height measured at the Sydney buoy on 5 June 2016 was approximately 6.5 m which – based on 35 years of observations – is exceeded, on average, once every two years. Coastal wave conditions were clearly rarer than this (Figure 1), and beach monitoring showed more sand was eroded from Sydney beaches during this storm than during the benchmark ‘1 in 100-year’ event, the ‘Sygna Storm’ of 1974 [2]. Evidently, the peak-wave-height method can grossly under-estimate erosion risk if wave direction is ignored.

When extreme values are recalculated to include wave direction, the June 2016 storm becomes a 1 in 30-year event. While this is a more realistic estimate, it still does not explain why the erosion response in 2016 was larger than other ECL events over (at least) the past 40 years.
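The recalculation referred to here amounts to repeating the extreme value analysis within directional sectors rather than on all storms pooled together. The sketch below shows one simplified way this can be done with a peaks-over-threshold (generalized Pareto) fit; the data are synthetic and the function is illustrative only, not the method of Mortlock et al. [1].

import numpy as np
from scipy.stats import genpareto

def directional_return_level(hs_peaks, directions, sector, threshold,
                             years_of_record, return_period):
    """Peaks-over-threshold estimate of a storm wave height return level
    within one directional sector (a simplified illustration only)."""
    in_sector = (directions >= sector[0]) & (directions < sector[1])
    excesses = hs_peaks[in_sector] - threshold
    excesses = excesses[excesses > 0]
    shape, _, scale = genpareto.fit(excesses, floc=0)   # fit GPD to the exceedances
    rate = len(excesses) / years_of_record              # exceedances per year
    # N-year return level = threshold + GPD quantile at 1 - 1/(rate * N)
    p = 1.0 - 1.0 / (rate * return_period)
    return threshold + genpareto.ppf(p, shape, loc=0, scale=scale)

# Synthetic example (not real buoy data): 350 storm peaks over 35 years.
rng = np.random.default_rng(0)
hs_peaks = rng.gumbel(4.0, 1.0, size=350)        # storm peak Hs in metres
directions = rng.uniform(0.0, 360.0, size=350)   # mean wave direction, degrees

print(directional_return_level(hs_peaks, directions, sector=(22.5, 112.5),
                               threshold=5.0, years_of_record=35.0,
                               return_period=30.0))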

Wave power retained across the shelf

There have been many more powerful ECL storms than the 2016 event, but few have produced waves from north of southeast (only 8% over the past three decades at Sydney). Large storms such as those in 1974, 1997 (the ‘Mother’s Day Storm’), 2001 and 2008 (the ‘Pasha Bulker Storm’) all produced waves from south of southeast because of the cyclonic rotation of the low-pressure systems.

While rarer, easterly and north-easterly storm waves retain a higher proportion of their offshore wave power by the time they arrive at the NSW coast. Because they approach closer to shore-normal, they travel a shorter distance across the continental shelf and dissipate less energy through friction with the seabed than waves approaching from oblique angles.

This means smaller storm waves from the east and northeast can produce more powerful coastal wave conditions – and erosion – than do larger storm waves from the south.

Coastline shape amplifies erosion risk

The present-day NSW coastline has evolved over the past 6,000 years during a period of relatively stable sea level and under a predominantly south-easterly wave climate. As a result, the northern sections of beaches up and down the coast are more exposed to wave energy while the southern ends receive much less, and are often protected by rocky headlands. This south-to-north gradient in wave energy at the coast controls the morphology of the beach and dunes – both being lower and narrower towards the southern ends.

For this reason, NSW beaches are particularly vulnerable to storms from the east and northeast because wave energy is focussed on southern beach sections not equilibrated with high wave exposure under the prevailing south-easterly wave climate.

Collaroy particularly vulnerable

Collaroy, a suburb situated at the south end of the Narrabeen-Collaroy embayment on Sydney’s Northern Beaches, was one of the worst affected areas in June 2016. Six properties, including some multi-unit blocks, were declared structurally unsafe and residents were evacuated.

The shorefront at Collaroy is characterised by years of inappropriate development into the active beach zone, contributing to its reputation as one of the State’s erosion ‘hot-spots’. Our modelling now shows that the geometry of the Collaroy embayment, in particular Long Reef headland, amplifies erosion during east and north-easterly storms.

Figure 2. Modelling of water current speed and direction at Collaroy-Narrabeen during the June storm. A mega-rip cell is formed at Collaroy exactly where the most severe erosion damage occurred. Long Reef is located at the far south of the image.

During most wave conditions, Long Reef shadows Collaroy and Fishermans Beach from wave energy, but during east and north-easterly storms it contributes to a mega-rip circulation which instead focusses erosion at Collaroy (Figure 2). The location of rip currents is well known to correspond to areas of beach erosion as they facilitate the offshore transport of sand during a storm. In areas where the beach is severely lowered, waves can attack adjacent dunes and undermine structural foundations of buildings.

Implications for coastal management

The June 2016 ECL highlighted the importance of storm wave direction for coastal erosion risk in NSW. While the storm was unusual in the context of the past few decades, extreme wave events from this direction are projected to become more common in Southeast Australia with future climate change [3,4].

Regulatory requirements for both shoreline recession and beach erosion planning currently ignore potential changes in wave direction under a warming climate. They also do not consider the significant impact less powerful storms from unusual directions can have on coastal risk. There needs to be a fuller examination of the implications of changes to the storm wave climate for the NSW coast to inform sustainable management practice for the coming decades.

The full research paper on which this article is based can be accessed at: http://www.mdpi.com/2073-4441/9/2/121.

References

1. Mortlock, T.R., et al. (2017), Water, 9(2), 121.
2. Turner, I.L., et al. (2016), Sci. Data, 3, 160024.
3. Hemer, M.A., et al. (2013), Nat. Clim. Chang., 3, 471-476.
4. Goodwin, I.D., et al. (2016), J. Geophys. Res. Oceans, 121, 4833-4853.

Twitter can predict hurricane damage as well as emergency agencies

By John Bohannon  Mar. 11, 2016.


Following the community-initiated Facebook groups that emerged during the 2010/11 Queensland and Victorian floods, Risk Frontiers undertook research into the use of social media as a complementary form of hazard and risk communication. The study, based on an online questionnaire, concluded that social media has value to the emergency services, not only as a tool for disseminating information but also as an important resource for tapping into and reviewing informal communications.

The power of social media and the value it has to emergency services was demonstrated again following Hurricane Sandy, and this Briefing Note discusses the research that showed that social media could be used to rapidly assess damage in the aftermath of an event.


Mapping out the intensity of tweets during and just after a hurricane produced a map of damage on par with the government’s.

In October 2012, meteorologists noticed a massive low-pressure system forming over the waters south of Cuba. In just 5 days, it spun into one of the largest hurricanes on record, cutting a path up the eastern U.S. coast and devastating communities with flooding and 140-kilometer-per-hour winds. Superstorm Sandy posed a massive problem for government clean-up crews. Where should they send their limited emergency supplies and services? A new study suggests a way to get that answer fast: Just listen to Twitter.

Mapping damage is a crucial first step in hurricane response. Inaccurate mapping, as there was with Sandy and even more so with Hurricane Katrina in 2005, can add up to weeks—and in some cases months—before help arrives to those most in need. To predict where the worst damage has occurred, the U.S. Federal Emergency Management Agency (FEMA) puts together models that look at everything from geography to infrastructure to storm characteristics, and then flies over the affected areas to further refine their map. Surveying people on the ground in natural disaster zones is just too difficult.

A team led by Yury Kryvasheyeu, a computational physicist at Australia’s National Information and Communications Technology Research Centre of Excellence in Melbourne, wondered whether better data might already be waiting online. By 2012, people were relying on social media apps such as Twitter to communicate about real-time events. But can a map of tweets be translated to a map of damage?

Kryvasheyeu’s first task was to get the data. Though Twitter opened up its full archive to researchers back in 2014, many academics have been worried about the legal strings that might be attached to using the California-based company’s data. But the team only needed a subset for their experiment, so they bought it from one of the many third-party companies that collect, process, and resell Twitter data. The database included all tweets in the world between 15 October and 12 November 2012. The team then narrowed the set to those with words like “hurricane,” “Sandy,” “frankenstorm,” and “flooding.”

Many tweets already had map coordinates locating their origin. But others did not. So the researchers also analyzed user accounts and message contents to further pin down the location of tweets. All in all, the team mapped out nearly 10 million tweets from more than 2 million user accounts.
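The processing described above reduces to two simple steps: keep tweets containing the storm keywords, then tally them by location. A minimal sketch of that idea follows; the record structure and the 'area' field are our assumptions for illustration, not the study’s actual pipeline.

from collections import Counter

KEYWORDS = ("hurricane", "sandy", "frankenstorm", "flooding")

def is_storm_related(text: str) -> bool:
    """Keep tweets mentioning any of the storm keywords (case-insensitive)."""
    lowered = text.lower()
    return any(word in lowered for word in KEYWORDS)

def tweets_per_area(tweets):
    """Count storm-related tweets per area code.

    `tweets` is assumed to be an iterable of dicts with a 'text' field and an
    already-resolved 'area' key (e.g. a ZIP code); in the study, locations
    came from tweet coordinates, user profiles and message content.
    """
    counts = Counter()
    for tweet in tweets:
        if is_storm_related(tweet["text"]) and tweet.get("area"):
            counts[tweet["area"]] += 1
    return counts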

The first discovery was reassuring. The relevant tweets weren’t just scattered randomly on the map: The closer people were to the hurricane, the more they had to say about it. But does such Twitter activity translate into actual damage? It was possible, for example, that local media coverage could amplify fear, even in areas that weren’t hit hard by the storm. So the researchers obtained data on the true extent of the damage from FEMA and the state governments of New Jersey and New York.

It turns out that Twitter was a remarkably good source of information on hurricane damage. The more damage Sandy actually did to a neighborhood, as measured by the per capita cost of the repairs, the higher the intensity of relevant tweeting from those areas just after the storm. In fact, Twitter was slightly better than FEMA’s own models in predicting the location and severity of damage, the team reports today in Science Advances. The main advantage of the technique is that it is a “virtually zero-cost solution,” says co-author Manuel Cebrian, a computer scientist at the Commonwealth Scientific and Industrial Research Organisation in Clayton, Australia.
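The comparison behind this finding is essentially a rank correlation between per-capita tweet intensity and per-capita repair cost by area. A sketch of that comparison is below, using hypothetical dictionaries keyed by area code rather than the authors’ actual datasets or methodology.

from scipy.stats import spearmanr

def damage_tweet_correlation(tweet_counts, population, damage_per_capita):
    """Rank-correlate per-capita tweet intensity with per-capita damage.

    All three arguments are dicts keyed by area code; only areas present
    in all three are used. Purely illustrative of the kind of comparison
    reported in the paper.
    """
    areas = sorted(set(tweet_counts) & set(population) & set(damage_per_capita))
    tweet_intensity = [tweet_counts[a] / population[a] for a in areas]
    damage = [damage_per_capita[a] for a in areas]
    rho, p_value = spearmanr(tweet_intensity, damage)
    return rho, p_value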

Still, Twitter data have many limitations and pitfalls, says Urbano França, a computational public health researcher at Harvard Medical School in Boston. These include everything from “Twitter-bots” that robotically generate tweets to the quirks of who does and does not use social media. But, he says, the researchers in this case “seem to have thought of most, if not all, issues and potential loopholes.” The next step, he says, is to look for data on other social platforms, like Facebook, which has a much higher user base and “could potentially provide more precise results.” Then again, getting those data may prove even more difficult than dealing with Twitter’s data firehose.

Risk Frontiers’ PhD student takes out second prize

Risk Frontiers’ PhD student, Avianto Amri, takes out second prize for a poster on developing a household preparedness plan at the First Innovation in Flood Resilience Conference, held recently in Jakarta.

The conference aimed to match local innovators with potential funders and other partners to scale up ideas and solutions in flood management.

 

Below is a report from the IFRC website.

“Innovators take to the stage with ideas for building community resilience towards flooding and other disasters

By Ika Koeck, IFRC

From floating structures to sustainable health insurance for flood-affected communities, the First Innovation in Flood Resilience Conference, held recently in Jakarta, saw the presentation of innovative ideas and projects to improve disaster management and build community resilience towards flooding.

The two-day conference was a collaboration between the Indonesian Red Cross Society, the International Federation of Red Cross and Red Crescent Societies (IFRC) and Zurich Insurance, to match local grassroots innovators to potential funders and other partners to scale up ideas and solutions in flood management. Other key partners to the conference included Pulse Lab Jakarta, the Humanitarian Leadership Academy and Hamburg University of Technology.

Indonesian civil society has already been developing its own innovative solutions to improve flood response within the country. However, most innovators struggle to gain the recognition and support needed to take their ideas forward.

Giorgio Ferrario, Head of Delegation for the IFRC Country Cluster Support Team in Indonesia and Timor-Leste, said that a joint effort between private sectors, academic institutions and humanitarian organisations is key to addressing community priorities in a sustainable and effective way.

“We are consciously working in new and innovative ways with non-traditional partners, as the challenges we face with recurring disasters continue to grow in unprecedented ways,” said Ferrario.

The highlight of the conference was when the innovators took to the stage during an Ideas Pitch session to narrate their ideas, and what motivated them to seek change in their own communities, before a panel of judges.

Nine promising and original local solutions from different organisations, including Risk Frontiers, Starside, Mallsampah.com and Garbage Clinical Insurance, took part in the competition. The winners of the session, Mallsampah.com and two other innovative ideas from Risk Frontiers and Telaga, will receive grants to keep developing their innovations.

“Since waste management is a major issue in our village, we decided to create a recycling system where people can exchange their waste for medical services,” explained the creators from Garbage Clinical Insurance.  “Everyone should have access to health insurance.”

Wirahadi Suryana, the Director of Corporate and Commercial in Zurich Insurance Indonesia said that the company has been working together with the Red Cross since 2013, focusing on building community resilience before, during and after disasters.

In line with the conference, Zurich Insurance also highlighted their mobile application for flood response. “Our team has developed a mobile app called Z-Alert, which can be downloaded for free in the AppStore and Google Play. Users will be able to locate disasters around them, ranging from floods, traffic accidents and even power outages,” said Suryana.”

Building evidence for risk-based insurance

Professor John McAneney and Andrew Gissing were invited to contribute to the 2016 World Disaster Report by the International Federation of Red Cross and Red Crescent Societies. Their contribution is provided below.


Improving societal resilience in the face of the growing cost of disasters triggered by natural hazards, and doing so in a fair and affordable manner, is an increasing challenge. Many governments are looking to insurance as a partial solution to this problem.

Insurance is a contract between a policy-holder and a company that guarantees compensation for a specified loss in return for the payment of a premium. Conventional insurance works by pooling risks, an approach that works well for car accidents and house fires but not for the spatially correlated losses arising from disasters caused by natural hazards. It is the global reinsurance market that ultimately accepts much of this catastrophe risk (Roche et al., 2010). Relatively new financial instruments such as Catastrophe Bonds and Insurance-Linked Securities are also being employed to transfer some catastrophe risks to the capital markets.

Insurance is part of the essential infrastructure of a developed economy but it would be a mistake to see it as an instrument of social policy. It cannot in itself prevent flooding or earthquakes. On the other hand, insurance can promote socially desirable outcomes by helping policy-holders fund their post-disaster recovery more effectively. The greater the proportion of home-owners and businesses having insurance against naturally-triggered disasters, the more resilient the community will be.

Insurers can also help promote risk awareness by property owners and motivate them and communities, as well as governments, to take mitigation actions to reduce damaging losses (McAneney et al., 2016). The mechanism for doing this is by way of insurance premiums that properly reflect risk. Insurance is not the only means of providing transparency on the cost of risk, but private insurers are the only ones with a financial incentive to acknowledge such costs. Moreover, they are the only entities that can reward policy-holders when risks are reduced (Kunreuther, 2015; McAneney et al., 2016).

It is in the interest of communities to have a viable private sector insurance market and, arguably, governments should only become involved in the case of market failure (Roche et al., 2010). Of the government-authorized catastrophe insurance schemes examined by McAneney et al. (2016), many are actuarially unsound and create a continuing liability for governments; by not pricing individual risks correctly, they also encourage property development in risky locations while failing to provide incentives for retrofitting older properties at high risk. In less-developed insurance markets, some government involvement may encourage the uptake of insurance (e.g., Tinh and Hung, 2014).

How do we assemble the evidence to support risk-reflective insurance premiums? New technologies such as catastrophe loss modelling, satellite imagery and improved geospatial tools are proving helpful in allowing insurers to better understand their exposure to natural hazard risks. While these technologies are increasingly available, in some countries the normal outcomes of such data gathering and analysis – insurance premiums – are constrained politically. This is the case in the United States of America where there has been a tendency to keep premiums low across the board and to have policy-holders in low-risk areas cross-subsidizing those at higher risk (Czajkowski, 2012). Such practices do little to constrain poor land-use planning decisions that lie at the heart of many disasters triggered by natural hazards (e.g., Pielke Jr et al., 2008; Crompton and McAneney, 2008). McAneney et al. (2010) show that most of the homes destroyed in the 2009 Black Saturday fires in Australia were located very close to fire-prone bushland with some 25 per cent actually constructed within the bushland. Effectively these homes were part of the fuel load and their destruction was unsurprising.

One way to build a wider evidence base for collective action to support risk-based insurance policies is for governments to share information on risks of disasters related to natural hazards, both with insurers as well as the community. This information might be hazard footprints as well as the likely cost of the damage (The Wharton School, 2016). In Australia, governments have been reluctant to do this. In some developing insurance markets, home-owners or farmers may have a better understanding of the risks than do insurers, who will price this uncertainty into premiums. Unrestricted access to hazard data for all parties would encourage fairer insurance pricing.

Gathering hazard data for building evidence for risk-reflective premiums depends on the type of hazard. For example, the distance of buildings from fire-prone bushland or the local likelihood of flooding are key determinants of vulnerability to these location-specific hazards. In other areas, or within the same areas in some cases, the annual likelihood of exceeding damaging levels of seismic ground-shaking, wind speed or volcanic ash are important metrics, as are distance from the sea and the elevation of a property when it comes to coastal hazards like tsunami and storm surge.
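These hazard metrics feed into a risk-reflective premium through the expected annual loss: the annual exceedance probabilities of damaging intensities are combined with a vulnerability (damage) curve for the property. The sketch below illustrates the idea with hypothetical flood numbers; it is a sketch of the concept, not an actuarial pricing model.

def average_annual_loss(hazard_curve, damage_fraction, sum_insured):
    """Approximate expected annual loss for one property.

    `hazard_curve` is a list of (intensity, annual_exceedance_probability)
    pairs, sorted by increasing intensity; `damage_fraction(intensity)`
    returns the expected fraction of the sum insured lost at that intensity.
    The integration is a simple trapezoid over exceedance probability.
    """
    aal = 0.0
    for (i1, p1), (i2, p2) in zip(hazard_curve, hazard_curve[1:]):
        mean_damage = 0.5 * (damage_fraction(i1) + damage_fraction(i2))
        aal += mean_damage * (p1 - p2)   # probability mass in this intensity band
    return aal * sum_insured

# Hypothetical flood example: depth (m) vs annual exceedance probability,
# with a crude depth-damage curve and a $500,000 sum insured.
hazard = [(0.0, 0.05), (0.5, 0.02), (1.0, 0.01), (2.0, 0.002)]
aal = average_annual_loss(hazard, lambda depth: min(1.0, 0.3 * depth), 500_000)
print(round(aal))   # a risk-reflective premium is roughly this plus expense and capital loadings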

When this risk evidence is established and becomes reflected in national construction standards, improvements in resilience follow. For example, improvements in construction standards introduced in Australia after the destruction of Darwin by Tropical Cyclone Tracy in 1974 have been credited with reducing subsequent losses from tropical cyclones by some 67 per cent (McAneney et al., 2007).

The availability of such data may result in reductions in some insurance premiums, an increase for others, or, in extreme cases, the withdrawal of insurers from areas where the risk is considered to be too high. The latter outcome will send a strong signal to communities and government for investments in mitigation; subsidized insurance is not the answer. Governments should also ensure that humanitarian aid provided after disasters is targeted effectively, in order to avoid creating disincentives for people to purchase insurance.

Lastly, and to return to the issue of poor land-use planning, it is worth remembering that the 1945 thesis of the famous American geographer, Gilbert White, that “Floods are an act of God, but flood losses are largely an act of man”, still rings true and remains applicable to a wider range of disasters triggered by natural hazards than just floods.

A full copy of the report can be found at http://www.ifrc.org/Global/Documents/Secretariat/201610/WDR%202016-FINAL_web.pdf.

 

The June 2016 Australian East Coast Low: Importance of Wave Direction for Coastal Erosion Assessment

by Thomas R. Mortlock, Ian D. Goodwin, John K. McAneney and Kevin Roche.

In June 2016, an unusual East Coast Low storm affected some 2000 km of the eastern seaboard of Australia bringing heavy rain, strong winds and powerful wave conditions. While wave heights offshore of Sydney were not exceptional, nearshore wave conditions were such that beaches experienced some of the worst erosion in 40 years. Hydrodynamic modelling of wave and current behaviour as well as contemporaneous sand transport shows the east to north-east storm wave direction to be the major determinant of erosion magnitude. This arises because of reduced energy attenuation across the continental shelf and the focussing of wave energy on coastal sections not equilibrated with such wave exposure under the prevailing south-easterly wave climate. Narrabeen–Collaroy, a well-known erosion hot spot on Sydney’s Northern Beaches, is shown to be particularly vulnerable to storms from this direction because the destructive erosion potential is amplified by the influence of the local embayment geometry. We demonstrate the magnified erosion response that occurs when there is bi-directionality between an extreme wave event and preceding modal conditions and the importance of considering wave direction in extreme value analyses.

Click on the link to read entire article:  http://www.mdpi.com/2073-4441/9/2/121

 

Crowds are wise enough to know when other people will get it wrong

Unexpected yet popular answers often turn out to be correct.

This article by Cathleen O’Grady was published by Ars Technica on 29 January 2017: https://arstechnica.com/science/2017/01/to-improve-the-wisdom-of-the-crowd-ask-people-to-predict-vote-outcome/
Cathleen O’Grady is Ars Technica’s contributing science reporter. She has a background in cognitive science and evolutionary linguistics.

Image: Flickr user Hsing Wei

The “wisdom of the crowd” is a simple approach that can be surprisingly effective at finding the correct answer to certain problems. For instance, if a large group of people is asked to estimate the number of jelly beans in a jar, the average of all the answers gets closer to the truth than individual responses. The algorithm is applicable to limited types of questions, but there’s evidence of real-world usefulness, like improving medical diagnoses.

This process has some pretty obvious limits, but a team of researchers at MIT and Princeton published a paper in Nature [DOI: 10.1038/nature21054] this week suggesting a way to make it more reliable: look for an answer that comes up more often than people think it will, and it’s likely to be correct.

As part of their paper, Dražen Prelec and his colleagues used a survey on capital cities in the US. Each question was a simple True/False statement with the format “Philadelphia is the capital of Pennsylvania.” The city listed was always the most populous city in the state, but that’s not necessarily the capital. In the case of Pennsylvania, the capital is actually Harrisburg, but plenty of people don’t know that.

The wisdom of crowds approach fails this question. The problem is that questions sometimes rely on people having unusual or otherwise specialized knowledge that isn’t shared by a majority of people. Because most people don’t have that knowledge, the crowd’s answer will be resoundingly wrong.

Previous tweaks have tried to correct for this problem by taking confidence into account. People are asked how confident they are in their answers, and higher weight is given to more confident answers. However, this only works if people are aware that they don’t know something—and this is often strikingly not the case.

In the case of the Philadelphia question, people who incorrectly answered “True” were about as confident in their answers as people who correctly answered “False,” so confidence ratings didn’t improve the algorithm. But when people were asked to predict what they thought the overall answer would be, there was a difference between the two groups: people who answered “True” thought most people would agree with them, because they didn’t know they were wrong. The people who answered “False,” by contrast, knew they had unique knowledge and correctly assumed that most people would answer incorrectly, predicting that most people would answer “True.”

Because of this, the group at large predicted that “True” would be the overwhelmingly popular answer. And it was—but not to the extent that they predicted. More people knew it was a trick question than the crowd expected. That discrepancy is what allows the approach to be tweaked. The new version looks at how people predict the population will vote, looks for the answer that people gave more often than those predictions would suggest, and then picks that “surprisingly popular” answer as the correct one.

To go back to our example: most people will think others will pick Philadelphia, while very few will expect others to name Harrisburg. But, because Harrisburg is the right answer, it’ll come up much more often than the predictions would suggest.
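Stated as an algorithm, the “surprisingly popular” rule compares each answer’s actual vote share with the share respondents predicted it would receive, and selects the answer with the largest positive surprise. A minimal sketch, with hypothetical numbers for the Pennsylvania question:

def surprisingly_popular(votes, predicted_share):
    """Pick the answer whose actual vote share most exceeds the share
    respondents predicted it would receive (a simplified sketch of the
    SP rule described in the paper).

    `votes` maps each answer to its vote count; `predicted_share` maps
    each answer to the mean predicted share of votes for that answer.
    """
    total = sum(votes.values())
    actual_share = {a: n / total for a, n in votes.items()}
    return max(votes, key=lambda a: actual_share[a] - predicted_share[a])

# Hypothetical numbers for "Philadelphia is the capital of Pennsylvania":
votes = {"True": 65, "False": 35}            # the majority gets it wrong
predicted = {"True": 0.75, "False": 0.25}    # but "False" is more common than predicted
print(surprisingly_popular(votes, predicted))   # -> "False", the correct answer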

Prelec and his colleagues derived a statistical theorem suggesting that this process would improve matters and then tested it on a number of real-world examples. In addition to the state capitals survey, they used a general knowledge survey, a questionnaire asking art professionals and laypeople to assess the prices of certain artworks, and a survey asking dermatologists to assess whether skin lesions were malignant or benign.

Across the aggregated results from all of these surveys, the “surprisingly popular” (SP) algorithm had 21.3 percent fewer errors than a standard “popular vote” approach. In 290 of the 490 questions across all the surveys, they also assessed people’s confidence in their answers. The SP algorithm did better here, too: it had 24.2 percent fewer errors than an algorithm that chose confidence-weighted answers.

It’s easy to misinterpret the “wisdom of crowds” approach as suggesting that any answer reached by a large group of people will be the correct one. That’s not the case; it can pretty easily be undermined by social influences, like being told how other people had answered. These failings are a problem, because it could be a really useful tool, as demonstrated by its hypothetical uses in medical settings.

Improvements like these, then, contribute to sharpening the tool to the point where it could have robust real-world applications. “It would be hard to trust a method if it fails with ideal respondents on simple problems like [the capital of Pennsylvania],” the authors write. Fixing it so that it gets simple questions like these right is a big step in the right direction.

 

Estimating building vulnerability to volcanic ash fall for insurance and other purposes

This paper by R. J. Blong, P. Grasso, S. F. Jenkins, C. R. Magill, T. M. Wilson, K. McMullan and J. Kandlbauer was published on 26th January 2017 in the Journal of Applied Volcanology.

Abstract:

Volcanic ash falls are one of the most widespread and frequent volcanic hazards, and are produced by all explosive volcanic eruptions. Ash falls are arguably the most disruptive volcanic hazard because of their ability to affect large areas and to impact a wide range of assets, even at relatively small thicknesses. From an insurance perspective, the most valuable insured assets are buildings. Ash fall vulnerability curves or functions, which relate the magnitude of ash fall to likely damage, are the most developed for buildings, although there have been important recent advances for agriculture and infrastructure.  Read more

Scientists expect sand flow on East Coast to slow

The following news pieces have been picked up by various sources from a paper published late last year in the Journal of Geophysical Research, Oceans titled “Tropical and extratropical-origin storm wave types and their influence on the East Australian longshore sand transport system under a changing climate” by Ian Goodwin, Thomas Mortlock and Stuart Browning. Thomas Mortlock is a member of the Risk Frontiers’ team and Ian Goodwin and Stuart Browning are members of the Marine Climate Risk Group at Macquarie University.  Click here to read entire article.

Aerial view of Byron Bay. Source: Swellnet Analysis

Scientists expect sand flow on East Coast to slow. Swellnet Analysis. https://www.swellnet.com/news/swellnet-analysis/2016/07/27/scientists-expect-sand-flow-east-coast-slow

Why Qld beaches will lose their sand to NSW. The Chronicle. http://www.thechronicle.com.au/news/why-qld-beaches-will-lose-their-sand-nsw/3084034/#/0

Gold Coast at threat of severe erosion and property damage, research shows. Gold Coast Bulletin. http://www.goldcoastbulletin.com.au/lifestyle/beaches-and-fishing/gold-coast-at-threat-of-severe-erosion-and-property-damage-research-shows/news-story/46326698783e430830f4e668b0086191

Solving the Puzzle of Hurricane History

This article was posted on the NOAA website on 11 Feb 2016.

“If you want to understand today, you have to search yesterday.”  ~ Pearl S. Buck

One of the lesser-known but important functions of the NHC [National Hurricane Center, Miami, Florida] is to maintain a historical hurricane database that supports a wide variety of uses in the research community, private sector, and the general public.  This database, known as HURDAT (short for HURricane DATabase), documents the life cycle of each known tropical or subtropical cyclone.  In the Atlantic basin, this dataset extends back to 1851; in the eastern North Pacific, the records start in 1949.  The HURDAT includes 6-hourly estimates of position, intensity, cyclone type (i.e., whether the system was tropical, subtropical, or extratropical), and in recent years also includes estimates of cyclone size.  Currently, after each hurricane season ends, a post-analysis of the season’s cyclones is conducted by NHC, and the results are added to the database.

The Atlantic dataset was created in the mid-1960s, originally in support of the space program to study the climatological impacts of tropical cyclones at Kennedy Space Center.  It became obvious a couple of decades ago, however, that the HURDAT needed to be revised because it was incomplete, contained significant errors, or did not reflect the latest scientific understanding regarding the interpretation of past data.  Charlie Neumann, a former NHC employee, documented many of these problems and obtained a grant to address them under a program eventually called the Atlantic Hurricane Database Re-analysis Project.  Chris Landsea, then employed by the NOAA Hurricane Research Division (HRD) and now the Science and Operations Officer at the NHC, has served as the lead scientist and program manager of the Re-analysis Project since the late 1990s.

In response to the re-analysis effort, NHC established the Best Track Change Committee (BTCC) in 1999 to review proposed changes to the HURDAT (whether originating from the Re-analysis Project or elsewhere) to ensure a scientifically sound tropical cyclone database.  The committee currently consists of six NOAA scientists, four of whom work for the NHC and two who do not (currently, one is from HRD and the other is from the Weather Prediction Center).

Over the past two decades, Landsea, researchers Andrew Hagen and Sandy Delgado, and some local meteorology students have systematically searched for and compiled any available data related to each known storm in past hurricane seasons.  This compilation also includes systems not in the HURDAT that could potentially be classified as tropical cyclones.  The data are carefully examined using standardized analysis techniques, and a best track is developed for each system, many of which would be different from the existing tracks in the original dataset.  Typically, a season’s worth of proposed revised or new tracks is submitted for review by the BTCC.  Fig. 1 provides an example set of data that helped the BTCC identify a previously unknown tropical storm in 1955.

 

Figure 1. Surface plot of data from 1200 UTC 26 Sep 1955, showing a previously unknown tropical storm.

The BTCC members review the suggested changes submitted by the Re-analysis Project, noting areas of agreement and proposed changes requiring additional data or clarification. The committee Chairman, Dr. Jack Beven, then assembles the comments into a formal reply from the BTCC to the Re-analysis Project. Occasionally, the committee’s analysis is presented along with any relevant documentation that would help Landsea and his group of re-analyzers account for the differing interpretation.   The vast majority of the suggested changes to HURDAT are accepted by the BTCC.  In cases where the proposed changes are not accepted, the BTCC and members of the Re-Analysis Project attempt to resolve any disagreements, with the BTCC having final say.

In the early days of the Re-analysis Project, the amount of data available for any given tropical cyclone or even a single season was quite small, and so were the number of suggested changes.  This allowed the re-analysis of HURDAT to progress relatively quickly.  However, since the project has reached the aircraft reconnaissance era (post 1944), the amount of data and the corresponding complexity of the analyses have rapidly increased, which has slowed the project’s progress during the last couple of years.

The BTCC’s approved changes have been significant. On average, the BTCC has approved the addition of one to two new storms per season.  One of the most highly visible changes was made 14 years ago, when the committee approved Hurricane Andrew’s upgrade from a category 4 to a category 5 hurricane.  This decision was made on the basis of (then) new research regarding the relationship between flight-level and surface winds from data gathered by reconnaissance aircraft using dropsondes.

Figure 2 shows the revisions made to the best tracks of the 1936 hurricane season, and gives a flavor of the type, significance, and number of changes being made as part of the re-analysis.  More recent results from the BTCC include the re-analysis of the New England 1938 hurricane, which reaffirmed its major hurricane status in New England from a careful analysis of surface observations.  Hurricane Diane in 1955, which brought tremendous destruction to parts of the Mid-Atlantic states due to its flooding rains, was judged to be a tropical storm at landfall after re-analysis.   Also of note is the re-analysis of Hurricane Camille in 1969, one of three category 5 hurricanes to have struck the United States in the historical record.  The re-analysis confirmed that Camille was indeed a category 5 hurricane, but revealed fluctuations in its intensity prior to its landfall in Mississippi that were not previously documented.

The most recent activity of the BTCC was an examination of the landfall of the Great Manzanillo Hurricane of 1959.  It was originally designated as a category 5 hurricane landfall in HURDAT and was the strongest landfalling hurricane on record for the Pacific coast of Mexico. A re-analysis of ship and previously undiscovered land data, however, revealed that the landfall intensity was significantly lower (140 mph).  Thus, 2015’s Hurricane Patricia is now the strongest landfalling hurricane on record for the Pacific coast of Mexico, with an intensity of 150 mph.

Figure 2. Revisions made to the best tracks of the 1936 hurricane season

The BTCC is currently examining data from the late 1950s and hopes to have the 1956-1960 re-analysis released before next hurricane season.  This analysis will include fresh looks at Hurricane Audrey in 1957 and Hurricane Donna in 1960, both of which were classified as category 4 hurricane landfalls in the United States.   As the re-analysis progresses into the 1960s, the committee will be tackling the tricky issue of how to incorporate satellite images into the re-analysis, including satellite imagery’s irregular frequency and quality during that decade. The long-term plan is to extend the re-analysis until about the year 2000, when current operational practices for estimating tropical cyclone intensity became established using GPS dropsonde data and flight-level wind reduction techniques.

https://noaanhc.wordpress.com/2016/02/11/solving-the-jigsaw-puzzle-of-hurricane-history/