This paper by the Risk Frontiers team has just been published in the journal Environmental Science and Policy. It documents an analysis of the circumstances surrounding fatalities due to flooding in Australia between 1900 and 2015. This longitudinal investigation is important for understanding changing trends in social vulnerability and for informing efficient, strategic risk reduction. The basis of the analysis was PerilAUS, Risk Frontiers’ database of historical natural hazard impacts in Australia. Through funding provided by the Bushfire and Natural Hazards CRC, the data were augmented and verified using coronial inquest records, which provide detailed information on the circumstances of each fatality. Overall, 1,859 fatalities were identified, with distinct trends in gender, age, activity and the reason behind the activity. https://authors.elsevier.com/sd/article/S1462901117301818
Washington Post July 16, 2017.
This article from The Washington Post on July 16 by Brady Dennis nicely highlights the complex political issues facing attempts to reform the US National Flood Insurance Program.
As is discussed in McAneney et al. (2017) (http://www.sciencedirect.com/science/article/pii/S221242091530159X) many government-sponsored insurance schemes end up subsidising homeowners to live in vulnerable locations.
PEQUANNOCK, N.J. — Time after time, as the river has risen and the water has crept up Roosevelt Street, Leni-anne Shuchter has fled the white clapboard home she bought more than four decades ago.
There was the night in 1984 when rescuers plucked her from a neighbour’s roof as floodwaters engulfed her house. And the months in 2011 when she and her husband, John Van Seters, lived in a hotel after torrential rains from Hurricane Irene forced them to gut walls and floors and replace nearly everything they owned.
In between, other storms have forced her to file claim after claim with the troubled National Flood Insurance Program so she could rebuild. Yet the small home remains as vulnerable as ever, a reality reflected by its falling value in recent years.
“If I had a choice, I would sell,” said the 65-year-old Shuchter, who dreams of retiring to Arizona or Nevada. “I don’t need to deal with this anymore. [But] the reality of selling is nil.”
The couple’s house is what the federal government defines as a “severe repetitive loss property” — one of many that have been covered over and over again by taxpayers, the cumulative payouts often far exceeding what the structures are worth. Nationwide, 11,000 such properties dot coastal zones or other low-lying areas, and their numbers continue to grow, in part because of the effects of climate change and ongoing development.
One house outside Baton Rouge, valued at $55,921, has flooded 40 times over the years, amassing $428,379 in claims. A $90,000 property near the Mississippi River north of St. Louis has flooded 34 times, racking up claims of more than $608,000. And an oft-flooded Houston home has received more than $1 million in payouts — nearly 15 times its assessed value of $72,400. The data is collected by the Federal Emergency Management Agency, which oversees the insurance program.
The extreme cases are only a fraction of the NFIP’s 5 million active policies, but they historically have accounted for about 30 percent of its claims. And while they’re a financial albatross for taxpayers, the claims are hardly the program’s only challenge.
The NFIP, which must be reauthorized by the end of September, is nearly $25 billion in the red — a debt that administrator Roy Wright says he sees no way to pay back.
“Only Congress can deal with that past loss,” Wright said last week. “What we’re focused on today is ensuring that going forward, we’re putting ourselves on a sound financial footing.”
On Capitol Hill, lawmakers are scrambling to overhaul the half-century-old program. Allowing it to lapse Sept. 30 would risk disrupting the buying and selling of homes in flood-prone areas across the country.
The NFIP has long enjoyed bipartisan support, if for one simple reason. “Where it rains, it can flood, so no one in the country is insulated,” said Laura Lightbody, who directs an initiative at the Pew Charitable Trusts aimed at helping communities better prepare for flood risks. “It touches all 50 states.”
But not equally. Data shows that some of the worst flooding, and often the most frequent, has occurred along the Gulf Coast of Louisiana and Texas. Houses along the Mississippi River have repeatedly been deluged. And the Atlantic coast from Miami to Boston faces perpetual — and escalating — threats. Although there are certainly beachfront mansions affected, many homes belong to working-class Americans.
Critics have long maintained that although the NFIP was intended to encourage smarter development, its current design too often bails out people in flood-prone areas. In short, it incentivizes staying put, whatever the cost, rather than moving to higher ground. Plus it has had only limited success in discouraging development in questionable areas.
Figuring out how to tackle the program’s problems remains complicated and politically fraught. Lawmakers must decide whether to raise rates — and by how much — on the roughly one in five homeowners who pay below-market premiums mandated by Congress. Making the homeowners pay rates that reflect their true flood risk could shore up the program’s finances; it also could mean sharp premium hikes and a public backlash over affordability.
The same dilemma is part of the reason Congress retreated from its last major effort to fix things five years ago, when a sudden rise in rates caused an outcry in some communities.
“No congressman ever got unelected by providing cheap flood insurance,” said Rob Moore, a senior policy analyst at the Natural Resources Defense Council and an expert on the program.
Some on Capitol Hill are pushing for more private firms to enter the flood insurance market — an idea Wright, the administrator, said he supports — although critics worry that companies could cherry-pick the least troubled properties, leaving the government on the hook for the other addresses.
No matter who the underwriter is, Congress must deal with the thorny question of how best to fund the continued updating of detailed U.S. flood maps. Many are woefully outdated and do not reflect changed flood risks — not to mention future risks from factors such as rising seas. The Trump administration has actually proposed cutting $190 million annually from the mapping work.
Flooding remains the most common and most costly form of natural disaster in the United States, and insurance to protect against it has become increasingly necessary in certain places. A report this month from the Union of Concerned Scientists suggests an ominous future: it forecasts that within the next two decades, nearly 170 U.S. coastal communities will face chronic inundation, defined as flooding at least 26 times a year. That’s almost twice as many at-risk locations as today.
Congress created the flood insurance program in 1968 because the costs of disaster assistance were escalating and private insurers had largely abandoned the market. The program not only requires people purchasing homes in floodplains to take out insurance as a condition of getting a mortgage, but it also provides grants to help mitigate vulnerable properties, either by elevating them or in some cases buying out homeowners and tearing their structures down.
But the latter isn’t happening often enough, according to the NRDC’s Moore.
“It’s helping people stay in places that we know are unwise to stay in,” he said. “The days of flood-rebuild-repeat need to come to an end. We need to do things differently to get out of that cycle.”
The financial woes began when Hurricane Katrina devastated the Gulf Coast in 2005, followed by hurricanes Rita and Wilma. The program paid eight times as many claims that year as in any previous year — and ended up borrowing $17.5 billion from the U.S. Treasury.
Hurricane Sandy in 2012 resulted in 144,000 more claims and another $6.25 billion in debt, as well as allegations that thousands of homeowners were wrongfully denied payouts by companies administering flood insurance on FEMA’s behalf.
Even in 2016, when there was no single catastrophic event, floods in Louisiana, Texas and other states resulted in the third-largest year of payouts in the program’s history.
In a report this spring, the Government Accountability Office detailed the NFIP’s fundamental dilemma, saying it “has experienced significant challenges because FEMA is tasked with pursuing competing programmatic goals — keeping flood insurance affordable while keeping the program fiscally solvent.”
For all its troubles, lawmakers know that the program affects the lives of millions of Americans and that failing to reauthorize it this fall could cause major upheaval for homeowners and the real estate market.
“Flood disasters today would be truly grim but for NFIP,” said Nicholas Pinter, a geology professor at the University of California at Davis and an expert in flood risks. He added, “It definitely has problems. . . . It needs improving. But it’s a hell of a lot better than it was when there was nothing.”
A House committee last month passed legislation to overhaul and reauthorize the program. If adopted, it would compel communities with persistent flooding problems to develop plans to reduce them and would require more transparency about a property’s flood history.
“The American taxpayer [has] been called upon in the past to bail out a program that is currently drowning,” the committee’s chairman, Rep. Jeb Hensarling (R-Tex.), said this spring as lawmakers weighed varying proposals. And although homeowners need to be protected from “sticker shock . . . the program must be made sustainable.”
The Senate also is trying to strengthen the NFIP, with measures proposed to better fund flood mitigation projects, promote the use of high-resolution mapping technology and encourage private insurers to enter the market.
Back on Roosevelt Street in Pequannock, a stone’s throw from the Pompton River, Shuchter and her husband have all but relinquished their dream of retiring and moving, at least for now.
With help from local officials, the couple are in the process of securing a FEMA grant that would raise their 960-square-foot house eight to 10 feet off the ground. The project could begin late this year and cost an estimated $196,000 — $10,000 more than their property’s assessed value.
The work will mean up to six more months living in a hotel. They will return to a home hovering high above its previous site, and stairs Shuchter worries will grow only more daunting as they age.
In the meantime, Shuchter keeps important papers — birth certificates, wills, past flood records — in a waterproof box in the bedroom. She has made digital copies of family pictures. She also has a list of what to quickly grab when the next evacuation call comes, everything from medications to laptops.
She also has bookmarked a National Weather Service website that monitors the flood gauge on the river. On nights when rain is pounding or a storm is swirling, she often stays up late, checking the site to make sure the water hasn’t risen to perilous levels. But experience tells her it’s only a matter of time.
“I do believe it’s when,” she said. “Not if.”
Risk Frontiers, Australia’s longest running natural hazards research centre, is spinning out from Macquarie University after a successful partnership of 23 years.
The ‘new’ Risk Frontiers will continue to provide the rigorous, science-based advice that clients have come to expect. Strong relationships forged with key academics at Macquarie University will be maintained with the creation of a Risk Frontiers Research Fellowship Fund for joint collaborative research in natural hazards, as well as new endeavours in cyber security.
At the time of the 2011 Brisbane floods, few insurance companies offered cover for damage arising from riverine flood: now, because of Risk Frontiers’ involvement in the development of the National Flood Information Database for the Insurance Council of Australia, some 93% of property owners are covered for this risk. The ICA’s General Manager (Risk), Karl Sullivan, sees this ‘sea change’ as a success, not only for the insurance sector, but “in making communities more resilient in the face of a key natural peril threat.” Among Risk Frontiers’ many contributions to the insurance sector, Karl compliments it on its “courageous and insightful” work in understanding the societal drivers of the rising economic costs of natural disasters, the relative contribution of climate change to this increasing cost, and the key role played by poor land-use planning decisions. This body of research laid the foundations for the Productivity Commission’s inquiry into Natural Disaster Funding arrangements.
The company’s CEO, Professor John McAneney, pledges that Risk Frontiers will continue to deliver thought leadership and high-level solutions and services for key industry, government and emergency management clients. In its new guise, there will be an even greater focus on creating innovative commercial solutions to natural peril and other extreme risks.
According to Macquarie University’s Deputy Vice-Chancellor (Research), Professor Sakkie Pretorius, “Risk Frontiers’ cutting-edge catastrophic risk modelling has had a material impact on the way natural disaster risks are priced and managed in this country. It serves as a wonderful example of genuine engagement by academia with the private and government sectors resulting in tangible science-led outcomes.”
Christopher Lee, CEO of Climate-KIC Australia describes Risk Frontiers as “the leading supplier of risk advice, models and data to government, insurance and emergency management. They bring wide-ranging expertise to a suite of complex problems and are able to provide pragmatic solutions that meet multiple stakeholder needs” and he looks forward to “continuing to work with them in this new chapter.”
This sentiment is echoed by Dr Richard Thornton, CEO of the Bushfire & Natural Hazards CRC: “I congratulate Risk Frontiers on its decision to spin out from Macquarie University and look forward to ongoing research relationships with the new Risk Frontiers.”
Sadly, the risks identified in our Briefing Note 315 (April 2016), reproduced below, were realised in the June 14 Grenfell Tower fire in London, with the loss of at least 80 lives. The building had smoke detectors but no automatic sprinklers, and only a single central staircase for access and evacuation.
Immediately after the fire it was clear that the cladding material had amplified the fire risk. It is claimed that during a recent refurbishment of the building, a bid to use higher-quality (less combustible) cladding was rejected on cost grounds.
Fire services had earlier reinforced the message to residents that, in the case of a fire in someone else’s apartment, they were to stay in their home until told otherwise. This advice rested on the presumption that a fire could be contained to a single apartment, which was clearly not the case here: the cladding allowed the fire to engulf the entire building.
The government has taken immediate steps to conduct an audit of high-rise (>18 m) social housing across the country to see just how widespread the problem is and to ensure that any other similar risks are identified and acted upon. It has also set up an expert panel to advise on other urgent steps to improve fire safety.
Housing associations, local authorities and private landlords are currently engaged in a checking and testing process for Aluminium Composite Material (ACM) cladding at the Building Research Establishment. The Department for Communities and Local Government has extended this checking and testing approach to residential tower blocks owned by private landlords and to tall buildings in the public sector, including hospitals and schools.
The current screening programme tests only the filler. We do not yet know the number of buildings involved, but as of the end of Thursday June 29, none of the cladding materials tested had satisfied the limited-combustibility requirement of the building regulation guidance.
Building standards impose two requirements on external walls. First, each individual element of the wall (insulation, filler material, etc.) must be of limited combustibility, and each component must meet set standards for this. Second, the combined elements of a wall, when tested as a whole system, must have sufficient resistance to fire spread to satisfy the set standard. This was clearly not the case for Grenfell Tower, or for any of the other buildings described in the Briefing Note below.
The challenge moving forward is two-fold:
- Ensuring that the use of combustible cladding is limited on new buildings, and
- Addressing the significant risk in the current building stock arising from the use of combustible cladding.
Removing cladding from high-rise buildings will present its own set of risks and costs and may not be the best strategy. Authorities and building owners need to accept that a significant risk exists and then to analyse this to aid decision-making. What is needed is an overall assessment of risk to those living in such buildings. This is not just a fire engineering problem, but a risk problem, and one that also poses a significant issue for the insurance industry.
This is a challenge for many countries, not just the UK and Australia.
More information on the UK Government’s building safety program can be found at:
Role of Composite Sandwich Panel Cladding in Recent Hi-Rise Building Fires
A number of recent fires around the world in which Aluminium Composite Panels (ACP) are implicated are worrying. It is not a new problem, having been an issue for UK companies in the late 1990s and early 2000s, but over the last 12 months or so we have seen several very high-profile fires. Insurers of commercial buildings will be looking closely at their policy wordings.
Last May saw a fire in the 23-storey Lacrosse building in Docklands, Melbourne. Thought to have begun from an un-extinguished cigarette, the fire started on the 8th floor and spread rapidly up the outside of the building to the 21st storey.
A report by the Metropolitan Fire Brigade (MFB) said the burn pattern of the blaze was unusual and the “rapid vertical fire spread up the building appeared to be directly associated with the external facade of the building, rather than … the internal parts or extensive fuel loads stored on many of the balconies.”
The cladding comprises an aluminium sandwich with expanded polystyrene or polyethylene as the filler and is widely employed because of its attractiveness and insulating properties. More expensive products use mineral-based insulation, which is much less combustible. One can only tell the difference between these products by taking a core sample — or perhaps applying a flame to it!
Over the last two decades, buildings have been required by law to be more energy efficient, which has resulted in the increased use of thermal insulation materials. This has had the unintended consequence of increasing the fire risk in buildings.
The City of Melbourne has given the 400 owners of apartments in the Lacrosse residential building in Docklands a year to replace the building’s non-compliant cladding with panels that meet Australian fire-safety standards.
During the holiday period leading up to Chinese New Year in 2009, a 34-storey building, containing a 241-room hotel and a cultural centre, and part of China Central Television’s new headquarters, suffered a similar fate to the Lacrosse building in Melbourne. Fire fighters, their equipment reaching up only a dozen or so floors, could do little to contain the blaze, a spectacular wall of flames reflected in the glass skin of the adjacent CCTV tower, which was untouched by the fire.
Last February another fire, thought to have been started by a barbecue on a balcony on the 51st floor, led to the evacuation of a 79-storey residential apartment building in Dubai’s Marina District. The building, ironically called The Torch, was in 2011 the world’s tallest residential building. The original contract for the cladding cost $20 million; the cost to replace it is unknown.
Then on New Year’s Eve, 2015, another Dubai high-rise (63 storeys) building, The Address Downtown hotel, located close to the world’s highest artificial structure the Burj Khalifa skyscraper, suffered a cladding fire so spectacular that it was said to eclipse the fireworks.
To the best of our knowledge, in none of these fires have there been any casualties. All appear to share a common use of ACP as external cladding. So what’s going on?
Well, aside from the intrinsic combustibility of the polystyrene or polyethylene filler in the ACP, it seems likely that the vertical arrangement of this cladding means that heat from the fire pre-heats the material above it, accelerating the fire’s vertical propagation. The fire runs up the external wall like a wick.
The speed at which these fires have spread vertically may explain the lack of casualties seen to date. Sooner or later, however, the radiant heat flux from an external fire will ignite internal fires that cannot be controlled.
The MFB report after the Lacrosse building argued that the outcome may have been worse if the building’s sprinkler system had not performed beyond its design capacity. It is not hard to imagine a situation with enough fires burning simultaneously in adjacent apartments that a loss of water pressure will mean that sprinklers will not work effectively. Thus there is the potential for even more serious fires and loss of life.
Dubai Civil Defence has recently ordered two new fire trucks with extra-powerful pumps that can deal with fires in buildings as high as 100 floors. At present, the ladders of the fire trucks used by the Civil Defence can only reach up to 18 floors.
And so how widespread is the problem in Australia? At this point we do not know, but a recent audit of 170 Melbourne CBD tower buildings by the Victorian Building Authority is reported to have found that 51% employed similar non-compliant ACP materials. This situation is likely similar in other major cities around the country.
Risk Frontiers spins out from Macquarie University!
As of July 1, after 23 years at Macquarie University, Risk Frontiers will spin out as a private R&D company under the ownership of its existing employees. Below we review this remarkable history as we embark on the next leg of the adventure.
Five of Australia’s six most costly natural hazard events have come from different perils: a tropical cyclone, an earthquake, a flood, a bushfire and a convective storm. Over the last 23 years, a unique approach to understanding these risks has developed in Australia through a close relationship between the insurance and academic sectors. In doing so, Australia has been at the cutting edge in applying advances in technology and science for the benefit of the broader community.
The early players in the catastrophe loss modelling space set up shop in the late 1980s in America, but it was not until Hurricane Andrew made landfall in Florida in 1992 that the true power of such modelling was recognised. Approaches to pricing natural hazard risks at the time relied very much on the proverbial rate maker’s moistened finger in the air and recent experience. The errors in this approach had not been exposed because the previous 20 years had been relatively benign with no intense hurricanes making landfall and afflicting areas of high exposure.
Missing from the traditional approach was the bringing together of the science of the hazard with a geospatial understanding of assets and the structural weaknesses of buildings together with insurance policy conditions. Natural catastrophe loss models, while primitive by today’s standards, did just this.
Within hours of Hurricane Andrew making landfall, modelling pioneer Karen Clark forecast Andrew’s insured losses to be in excess of $13 billion, far more than Lloyd’s of London’s estimate of $6 billion. Months later, when the final loss emerged at $15.5 billion, eleven insurance companies had gone bust. Clark’s early estimate had proven robust, and the utility of catastrophe models was fully apparent.
Meanwhile, on the other side of the world in Australia, farsighted individuals in the insurance sector also saw the benefits of catastrophe modelling, but were well aware that interests in the larger exposures of Europe, America and Japan would capture this development. Australia needed its own R&D capability in this new area of applied science.
With this in mind, parties in the insurance sector in Australia reached out to the academic sector to see if there was interest in developing an independent research centre in the natural hazards space. Macquarie University Professor Russell Blong answered this call, and a unique partnership between industry and academia was born.
Risk Frontiers, born in 1994 under Russell’s leadership, is now the longest-running natural hazards research centre in the country. It was initially funded by a group of sponsoring insurance, reinsurance and reinsurance broking companies, which provided seed capital. Representatives of these companies formed an advisory board that exists to this day and helps set Risk Frontiers’ research agenda. Much of that agenda is now devoted to improving the management of natural hazard risks, including a significant commitment to risk communication.
While its business model has changed somewhat over its 23-year history, Risk Frontiers developed into an independent, self-funded R&D business under the stewardship of Professor John McAneney. It continued to thrive despite the differing incentive structures of academia and the commercial interests of the insurance sector. In spinning out of the University, Risk Frontiers seeks to realise its ambition to become the most credible independent source of risk knowledge, products and services in the natural disaster space across the Asia Pacific.
While the insurance sector still remains a core focus of many of the activities at Risk Frontiers, our multidisciplinary team also works closely with government, disaster management agencies, and supports international efforts to help manage disaster risks and improve the safety of communities. Several staff served as expert witnesses to the Royal Commission into the 2009 Victorian bushfires. The team has also made invited contributions to other key inquiries such as that into the 2010-11 Queensland floods, the Productivity Commission’s review into funding natural disasters and the role of government in the provision of natural catastrophe insurance.
Risk Frontiers will continue to provide evidence-based thought leadership on topics such as the potential for improved building codes and land-use planning guidelines to reduce risk. Our research interests include risk communication, the detection of climate change signals in loss data, post-disaster event investigations, estimating the economic costs of natural disasters, and helping emergency service agencies develop risk management plans. We collaborate closely with other research institutions, including the Bushfire and Natural Hazards Cooperative Research Centre.
Risk Frontiers has served Australia remarkably well and is now set to continue to expand on this legacy of achievement as the world faces new challenges in a warming climate as well as the current threats from natural and man-made risks. In spinning out of the University, Risk Frontiers can be more commercial in some aspects of its business operations, while continuing to provide the rigorous science-based advice that its clients have come to expect. Strong relationships forged with key academics at Macquarie University will be maintained with the creation of a Risk Frontiers Fellowship Fund for joint collaborative research in natural hazards, as well as new endeavours in cyber security and machine learning. Stay tuned . . .
Please visit our website to keep abreast of new developments http://www.riskfrontiers.com. Or contact John McAneney directly (email@example.com).
Risk Frontiers’ Suite of CAT Models to be available on AIR Worldwide’s Touchstone Platform
Risk Frontiers’ suite of Probabilistic Catastrophe Loss Models for Australia and New Zealand will be available on AIR Worldwide’s Touchstone® 5.0 platform for licensing from Risk Frontiers in June 2017. The suite of models comprises the following:
* Tropical Cyclone (Australia) – CyclAUS 3.1
* Earthquake (Australia and New Zealand, post Christchurch) – QuakeAUS 5.1, QuakeNZ 2.0
* Bushfire (Australia) – FireAUS 2.1
* Hail (Australia) – HailAUS 6.2
* Flood (Australia) – FloodAUS 3.1
During a demo at AIR’s Envision Conference in Las Vegas in April, Risk Frontiers models worked seamlessly on a preview release of the Touchstone 5.0 environment.
Starting in June, clients who license both Touchstone 5.0 and Risk Frontiers for Touchstone will be able to run Risk Frontiers’ models on exposures stored in Touchstone directly from the Touchstone user interface.
Risk Frontiers maintains and continues to develop its own Multi-Peril platform, but this new delivery method provides an easy access option for Touchstone users.
Please contact Risk Frontiers or AIR for further information about licensing Risk Frontiers’ models on Touchstone.
Contacts: Dr Ryan Crompton (firstname.lastname@example.org) | Dr Foster Langbein (email@example.com) | Carol Robertson (firstname.lastname@example.org), Telephone: (02) 9850 9683 | Dr Kunal Joarder (email@example.com), Telephone: +1-617-267-6645
Should governments allow fire affected communities to rebuild?
Associate Professor Michael Eburn, email: firstname.lastname@example.org
In January 2017, the ABC’s 7.30 program reported on the rebuilding of Wye River and Separation Creek, two Victorian settlements that had been severely impacted by bushfire on Christmas day 2015. During the course of the program, Michael Buxton, Associate Professor of Planning and Environment at RMIT University, Melbourne, argues against ‘allowing people to rebuild when fire-affected areas are burnt out’.
The 2009 Victorian Bushfires Royal Commission recommended:
The State develop and implement a retreat and resettlement strategy for existing developments in areas of unacceptably high bushfire risk, including a scheme for non-compulsory acquisition by the State of land in these areas.
The State did not adopt that recommendation. The then Premier of Victoria, the Hon John Brumby, said:
We have hundreds of thousands of Victorians who choose to live in our bush and in areas close to our beautiful state and national parks. These places are, by their very definition, in high fire-danger-risk areas, but I will always defend people’s right to live in these areas and enjoy the beauty of our natural bush.
Associate Professor Buxton was ‘disappointed the Government stopped short at implementing’ this recommendation. He says:
Governments just keep allowing people to rebuild when fire-affected areas are burnt out… Every rational factor says, “Don’t do it. Don’t allow people to rebuild in these really dangerous areas.” But governments, um… I think they have this emotional reaction.
In answer to the question ‘Why shouldn’t people be allowed to stay and rebuild…?’ he says, ‘I think governments have a responsibility to prevent people from doing extreme harm or potential harm to themselves’.
This brief review raises several questions such as: Does every rational factor really say ‘Don’t allow people to rebuild in these really dangerous areas’? Is the government response any more, or less, an ‘emotional reaction’? Do governments have a responsibility to prevent people from doing extreme harm or potential harm to themselves? I suggest that the answer to all those questions is ‘no’.
Does every rational factor really say ‘Don’t allow people to rebuild in these really dangerous areas’?
The United Nations defines risk as ‘The combination of the probability of an event and its negative consequences’. That definition of ‘risk’ is largely replicated in risk registers that define a risk as low, medium or high in a matrix with probability on the ‘x’ axis and consequences on the ‘y’ axis.
Assessing and managing risk is not simply a matter of having accurate risk figures. There is nothing inherent in a 2%, 1% or 0.4% probability of flood, fire or other hazard that says one risk is or should be acceptable while another is too high. What is acceptable depends not only on the statistical probability of the hazard event but also on its potential consequences, and the assessment of those consequences depends on the values of those at risk.
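The matrix framing described above can be sketched as a simple lookup. The category labels and rating thresholds below are illustrative assumptions only, not drawn from any particular standard or register:

```python
# Illustrative risk-matrix lookup. The axis labels and the low/medium/high
# thresholds are assumed for illustration; real registers define their own
# scales, and (as the text argues) where a consequence sits depends on values.
LIKELIHOOD = ["rare", "unlikely", "possible", "likely", "almost certain"]
CONSEQUENCE = ["insignificant", "minor", "moderate", "major", "catastrophic"]

def risk_rating(likelihood, consequence):
    """Combine the two axes of the matrix into a low/medium/high rating."""
    score = LIKELIHOOD.index(likelihood) + CONSEQUENCE.index(consequence)
    if score <= 2:
        return "low"
    if score <= 5:
        return "medium"
    return "high"

# The same probability yields a different rating as the valuation of the
# consequence changes -- which is the point made in the text.
print(risk_rating("unlikely", "minor"))         # low
print(risk_rating("unlikely", "catastrophic"))  # medium
```

The point of the sketch is that the probability axis can be fixed while the overall rating still moves, because the consequence axis is a judgement, not a measurement.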
For people who focus on bushfires or natural hazards, the risk of death and destruction from that hazard is axiomatic. A land use planner who takes a broad, landscape view and identifies that one community is at higher risk of bushfire than another may conclude that an effective way to manage the risk is to move people out of the high-risk area to the low-risk one. Such an action is neither risk free nor does it create a risk-free environment. There may be risks to people’s health and wellbeing if they are forced to leave an area that is important to them, and a risk of social dislocation if they are forced to relocate into other small communities that are not resourced to support the newcomers. The risk of loss due to fire may be reduced, but a sociologist or psychologist may see forced relocation as an action that unacceptably increases the risk of other harms. For people who value the life and lifestyle of their community, the loss of homes to fire may be a significant outcome, but being denied the right to continue to live there may also be severe.
People who understand the risk, who place different values on various losses and who have considered those issues are being equally rational when they decide to rebuild their homes. The question of whether rebuilding is a rational response depends on what values one is trying to preserve. To put that another way, it depends on what factors are taken into account when deciding where a consequence sits on the ‘severity’ scale of the risk register.
Is the government response any more, or less, an ‘emotional reaction’?
Which risk is to be prioritised is based not on the risk matrix but on our emotional perception of risk. Risk Frontiers has identified 974 bushfire deaths between 1900 and 2015 (updated from vii). However, bushfires cause fewer deaths than other natural hazards such as floods, cyclones and heat waves, which during the same period (1900-2015) caused 1912, 1216 and at least 4561 deaths respectively. Each year, road accidents kill more people than have ever died in Australian bushfires: in 2015 alone, 1205 people were killed on the roads, more than the total killed in bushfires over the preceding century. Even so, the emotional reaction to bushfire losses is much greater than our reaction to the annual death toll on the road.
A risk that governments seek to manage, apart from the risk of death and destruction due to a hazard such as bushfire, is the risk of being blamed for a disaster. If potential blame is a risk then it is a risk that can be managed.
… experts who are being made increasingly accountable for what they do are now becoming more preoccupied with managing their own risks. Specifically, secondary risks to their reputation are becoming as significant as the primary risks for which experts have knowledge and training.
In order to manage this ‘secondary risk’ governments have to balance the risk to their reputation should a hazard event such as a terrorist attack, a catastrophic bushfire with loss of life, a domestic murder or a fatal car accident occur, with the risk to their electoral standing should they take measures to reduce that risk. Those risks can be reduced – for example, by refusing entry to everyone from a list of proscribed countries, requiring all homes in fire-risk areas to be built as underground concrete bunkers, refusing bail for anyone alleged to have committed domestic violence, or banning private cars. The solutions may be effective, but not at a cost the community is willing to pay.
Not allowing people to rebuild their homes may also be a price that is too high, and the demand that people not be allowed to rebuild is as much an emotional reaction, based on factors other than a quantifiable risk, as the decision to allow people the ‘right to live in these areas and enjoy the beauty of our natural bush’.
The policy of the National Strategy for Disaster Resilience is to build resilient communities and share responsibility for all aspects of disaster management. A disaster-resilient community is one in which people understand the risks that may affect them: they have comprehensive local information about hazards and risks, have taken action to mitigate their risk, and have developed plans to respond should a hazard occur. Compelling people to move out of an area does not create a resilient community. A community that has never faced, and never will face, a bushfire is not ‘resilient’ to bushfire. Forcing people to move away from an area that they love, and from the neighbours and relationships they have established, destroys a community rather than making it resilient. If governments are going to share the responsibility for risk management with individuals or communities, then there has to be room for those individuals and communities to prioritise their own values in ways that are both rational and informed, even if others, including governments, would prefer to give greater priority to other values such as individual safety.
Do governments have a responsibility to prevent people from doing extreme harm or potential harm to themselves?
In the High Court of Australia, Crennan and Kiefel JJ said:
The common law generally does not impose a duty upon a person to take affirmative action to protect another from harm… So far as concerns situations brought about by the action of the person at risk, it is the general view of the common law that such persons should take responsibility for their own actions…
Governments may not have an obligation ‘to prevent people from doing extreme harm or potential harm to themselves’ but they may have an obligation to prevent them doing harm to others. Building codes can ensure that developers and landlords don’t expose subsequent purchasers or tenants to undue risk. Prohibition of building in high-risk areas may be necessary to protect vulnerable people, such as children, who cannot make an informed choice to accept a risk. Restrictions may be justified on the basis that the cost of providing necessary infrastructure, such as evacuation routes and firefighting services, imposes too great a cost on the broader community.
There may be good grounds for refusing to allow communities to rebuild after they have been razed by fire, but the claim that these are ‘really dangerous areas’ is not sufficient. Danger and risk are in the eye of the interest holder. Experts in fire, flood or hazard management may well be able to determine the probability of a hazard event, that is, the relevant point on the ‘x’ axis of the risk matrix. Where a consequence sits on the ‘y’ axis, in the range from minor to extensive, depends upon the interests and values that the person making the assessment chooses to prioritise. In these days of ‘shared responsibility’ and ‘resilient communities’, acceptable risk should be a matter for negotiation.
If individuals and communities are to take responsibility for their own risk, then governments, insurers, and other communities must accept that those individuals and communities are free to make choices that others would not make, or would prefer them not to make. If, on the other hand, governments believe that individuals or communities are actually incapable of making informed risk decisions and determining for themselves what is an acceptable risk, then it is time to rethink the National Strategy for Disaster Resilience.
The research that informs this paper has been supported by the Bushfire and Natural Hazards Cooperative Research Centre. The author also acknowledges the Disaster and Development Network, Northumbria University, Newcastle-upon-Tyne (UK) for allowing space and time to prepare this paper during the author’s sabbatical leave from the Australian National University.
i ‘Bushfire-ravaged towns should not be rebuilt, planning expert’ 7.30 (6 January 2017) <http://www.abc.net.au/7.30/content/2017/s4601010.htm>.
ii Victoria, 2009 Victorian Bushfires Royal Commission, Final Report (2010) Recommendation 46.
iii Victoria, Parliamentary Debates, Legislative Assembly, August 10, 2010, 2984 (the Hon John Brumby, Premier).
iv United Nations International Strategy for Disaster Risk Reduction, 2009 UNISDR Terminology on Disaster Risk Reduction (2009), 25.
v CGE Risk Management Solutions, Risk Matrices (2017) <http://www.cgerisk.com/knowledge-base/risk-assessment/risk-matrices>.
vi Michael Eburn, ‘Bushfires and Australian emergency management law and policy: Adapting to climate change and the new fire and emergency management environment’ in Lloyd Burton and Lisa Sun (eds.) Cassandra’s Curse: Law and Foreseeable Future Disasters (2015, Studies in Law, Politics and Society; Elsevier).
vii Coates L, Haynes K, O’Brien J, McAneney J, Dimer de Oliveira, F. 2014. Exploring 167 years of vulnerability: An examination of extreme heat events in Australia 1844-2010. Environmental Science & Policy, 42:33-44.
viii Michael Power, The Risk Management of Everything (2004, Demos), 14.
ix Council of Australian Governments, National Strategy for Disaster Resilience (2011, Commonwealth of Australia), 5.
x Stuart v Kirkland-Veenstra (2009) 237 CLR 215.
Risk Frontiers’ Thomas Mortlock presented recent work on calibrating a global storm surge model, developed in collaboration with Deltares, at the 2017 Coasts and Ports conference in Cairns this week.
The work combines Deltares’ capabilities in global ocean modelling with Risk Frontiers’ knowledge of coastal cyclone risk in Australia.
In particular, the presentation demonstrated the importance of assimilating high-resolution coastal bathymetries into surge models in the context of the recent Severe Tropical Cyclone Debbie.
For further information please contact Thomas Mortlock at email@example.com.
Chas Keys and Andrew Gissing
This week sees a significant but little-heralded anniversary in New South Wales: 150 years ago, on the 23rd of June, a devastating flood peaked at Windsor on the Hawkesbury River. For height reached and area inundated, that event has not been matched on the river since. Indeed no other flood since European settlement has come within 4 metres of that one at the Windsor gauge. The 1867 flood reached 19.7 metres: by comparison, the 1961 flood (the highest in living memory today) peaked at only 15.1 metres. The approximate extent of the 1867 flood is shown in Figure 1.
For context, the river in non-flood times reaches only about 1 metre at the gauge. So the 1867 flood peaked more than 18 metres above low-flow level.
Windsor became two small islands. Had the flood risen a further 3 metres, the town would have been completely inundated and many people would have been swept away.
Along the river, much of Windsor and Richmond and substantial tracts of farmland were flooded. Twelve people died, many dwellings were destroyed and hundreds of settlers made destitute.
Decades before, Governor Lachlan Macquarie had implored people to make their homes not on the river flats but on the high ground of the five ‘Macquarie towns’ he designated nearby. His advice was little heeded over following decades: people did not want to commute to their plots or have difficulty protecting their crops and livestock.
Unbeknownst to Macquarie, the town sites he had selected were within the reach of genuinely big floods. The 1867 flood proved as much.
The flooding was not confined to the Hawkesbury-Nepean catchment: it spread across NSW, affecting areas such as Parramatta, Liverpool, Bankstown, Wollongong, Nowra, Moruya, Tamworth, Bathurst, Mudgee, Dubbo, Forbes and Wagga Wagga. Such a spread and intensity of impacts would no doubt stretch today’s emergency services (Yeo et al., 2017).
Over the following decades, population growth continued along the Hawkesbury, the area eventually becoming part of Sydney’s sprawl. Many houses were built within reach of much less severe floods than the 1867 event: McGraths Hill is a case in point. Even more dwellings were not far above the ‘shoreline’ of that event.
Here it must be appreciated that the highest flood possible at Windsor is estimated to reach about 26 metres on the local gauge. All of Windsor would be inundated well before this level was reached. The islands of 1867 would disappear.
Such a big flood would occur only very rarely, but something like the flood of 1867 or higher must be expected at some stage. Adding height to Warragamba Dam as is proposed will not eliminate this potential.
By the late 1990s it was clear that the roads by which people evacuate would be cut by floodwaters well before a genuinely big flood reached its peak. All means of escape by road would be lost in a flood reaching a gauge height of only 14 metres at Windsor, with many thousands of residents cut off and at risk should the event develop into megaflood proportions as in 1867.
It would be utterly impossible to rescue all the trapped people by boat or helicopter. The death toll in a really big flood could be huge.
The state government’s strategy to avoid such an outcome was to build a high bridge between Windsor and Mulgrave. That structure was completed in 2007 at a cost of $120 million, its deck at a level equivalent to 17 metres at the Windsor gauge.
The bridge was intended to make it possible to get the potentially trapped people of Windsor and surrounding areas (90,000 of them today) to safety in the face of severe flooding. It does not fully ‘solve’ the problem, though: inevitably, some people will not accept the recommendation to evacuate.
This reluctance to evacuate happens in every serious flood. Lismore, in late March 2017, was just the latest manifestation of this worrying response to flood in Australia. Scores of people ignored the warnings, failed to evacuate and had to be rescued. People underestimate the flood danger and put their own safety and that of emergency responders at risk.
This is the equivalent of the early settlers’ refusal to move their homes off the lower floodplain of the Hawkesbury.
The Windsor-Mulgrave bridge represents a belatedly learned lesson of the 1867 flood. The necessity for it grew from decades of residential development that ignored the reality of big floods. The other lesson, still not well appreciated despite many big floods, is that people should avoid being in the path of a severe flood. They need to understand, on the infrequent occasions when one of those is developing, that they should evacuate; otherwise, there is a real likelihood of deaths among themselves, their families and friends.
The NSW Government has recently completed a review of Hawkesbury-Nepean flood management and has released a new Hawkesbury-Nepean Flood Risk Management Strategy entitled “Resilient Valley, Resilient Communities”. The strategy outlines key outcomes including raising the Warragamba Dam wall; preparation of a Regional Evacuation Road Master Plan and a Regional Land Use Planning Framework; raising community flood awareness; improving flood predictions; upgrading local evacuation routes and maintaining emergency plans. Continued Government support to ensure prudent management of current and future flood risk throughout the catchment is of utmost importance. The strategy can be downloaded at www.infrastructure.nsw.gov.au/expert-advice/hawkesbury-nepean-flood-risk-management-strategy.aspx.
Chas Keys is a former Deputy Director General of the NSW State Emergency Service and an Honorary Associate of Risk Frontiers at Macquarie University. For more information please contact Chas at firstname.lastname@example.org or Andrew Gissing at email@example.com.
YEO, S., BEWSHER, D., ROBINSON, J. & CINQUE, P. 2017. The June 1867 floods in NSW: causes, characteristics, impacts and lessons. Floodplain Management Australia National Conference. Newcastle, NSW.
In partnership with the NSW State Emergency Service, Risk Frontiers conducted a survey of twenty participants at the Floodplain Management Australia conference held in May 2017. Participants were from the floodplain risk management industry and represented Local and State Government, emergency services, research groups and private sector consultants from NSW, QLD and VIC.
Overall, there was clear recognition of the importance of involving community members in floodplain risk management and emergency planning processes, with acknowledgment that community members have valuable knowledge about local flood risks and vulnerable people in their communities. Though participants believed that community members should be active in decision-making processes, there were mixed views as to whether the community should have the ultimate say about how floods are managed.
The largest group of participants (40%) indicated that communities are currently consulted in the development of floodplain risk management plans and emergency plans. Others said the community is able to comment on draft plans, or is not involved at all. Only three respondents indicated that communities work in collaboration with local authorities to develop floodplain risk management plans, and one indicated that the community works collaboratively with emergency services to develop joint emergency plans.
The largest barrier to the involvement of community members in floodplain risk management and emergency planning was said to be a lack of practitioner skills or confidence to effectively engage with communities. Other barriers included lack of community interest in participation, over confidence in the ability of experts to make decisions on behalf of communities, time and budget pressures and the inertia of existing historical practices. There was acknowledgement that engagement needed to be well facilitated to achieve effective and inclusive outcomes that did not just recognise the loudest voices in the room.
Over 90% of respondents believed that processes to involve communities in floodplain risk management and flood emergency planning needed to be improved. Participants nominated the following ways to improve community participation:
- Ensure enough time is allocated to enable community involvement, while recognising that the community has its own timelines and that ignoring these will dampen its willingness to engage;
- Tailor engagement approaches for each community, including understanding the unique needs of the community and the ways in which they want to be involved;
- Communicate real-world flood experiences, giving flood risk a sense of realism;
- Build critical flood awareness amongst community members and then seek their involvement; and
- Attempt to communicate technical concepts in plain English.
With disaster management policies in Australia placing greater emphasis on community participation in emergency management, this limited research suggests a need to develop further skills and experience among practitioners in effective community engagement. Further research should also focus on identifying community members’ motivations for, and barriers to, involvement in disaster risk management planning across multiple community contexts.
Thanks to those who took part in the survey. For further information on recent research into community involvement in emergency planning see ajem.infoservices.com.au/items/AJEM-32-02-15.
This article by James Foster, Associate Researcher, University of Hawaii, appeared in The Conversation on March 15, 2016. As shown in the last figure, cargo ship routes provide much better coverage of the northern hemisphere than the southern hemisphere.
Racing across ocean basins at speeds over 800 kilometres per hour, tsunamis can wreak devastation along coastlines thousands of miles from their origin. Our modern tsunami detection networks reliably detect these events hours in advance and provide warning of their arrivals, but predicting the exact size and impact is more difficult. Evacuating coastal zones can cost millions of dollars. To reliably predict whether a tsunami is large enough to require evacuations, many more observations from the deep ocean are needed.
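The quoted speed is consistent with the standard shallow-water approximation, in which a tsunami travels at c = √(g·d) for ocean depth d. A minimal check (the depths chosen here are illustrative, not values from the article):

```python
import math

# Shallow-water wave speed: c = sqrt(g * d). For wavelengths far longer than
# the ocean depth -- true of tsunamis -- speed depends only on depth.
g = 9.81  # gravitational acceleration, m/s^2

def tsunami_speed_kmh(depth_m):
    """Phase speed of a shallow-water wave over the given depth, in km/h."""
    return math.sqrt(g * depth_m) * 3.6  # convert m/s to km/h

# Illustrative depths: an open Pacific basin and a deeper region.
print(round(tsunami_speed_kmh(4000)))  # ~713 km/h over a 4 km deep basin
print(round(tsunami_speed_kmh(6000)))  # ~873 km/h over 6 km of water
```

This depth dependence is also why tsunamis slow down and steepen dramatically as they approach shallow coastlines.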
Researchers from the University of Hawai‘i (including me), funded by the National Oceanic and Atmospheric Administration (NOAA), are partnering with the Matson and Maersk shipping companies and the World Ocean Council to equip 10 cargo ships with real-time high-accuracy GPS systems and satellite communications. Each vessel will act as an open-ocean tide gauge. Data from these new tsunami sensors are streamed, via satellite, to a land-based data center where they are processed and analyzed for tsunami signals. It is a pilot project to turn the moving ships into a distributed network of sensors that could give coastal communities more time to evacuate.
Monitoring the world’s oceans
Despite the advances in tsunami monitoring and modeling technology over the last decade, it remains difficult for hazard response agencies to get enough information about potential tsunami threats. The problem is that there are too few observations of tsunamis to provide sufficiently accurate predictions about when, where and how severely tsunamis might occur.
In particular, there are very few sensors in the deep oceans that often lie between tsunami sources – usually earthquakes occurring under the ocean trenches that mark where tectonic plates meet – and the distant coastlines that might be threatened. Gaps in the coverage of the network, as well as routine outages of instruments, limit the ability of the current detection system to accurately assess the hazard posed by each event.
The deep ocean sensor networks that do exist are expensive to build and maintain, so only a limited number are deployed, at locations chosen based on our best current understanding of the hazards. But the unexpectedly huge 2011 Tohoku, Japan, earthquake and the unanticipated type of fault slip that caused the 2012 event at Queen Charlotte Islands, Canada, highlighted weaknesses in this approach.
For the 2012 tsunami from the Queen Charlotte Islands earthquake, the lack of deep ocean data meant a tsunami warning and evacuation was issued for some of Hawaii’s coastlines, though the event turned out to be smaller than predicted. This emphasized the need for more densely spaced deep ocean observing capabilities. Even just a few more observations in the right places would have enabled the scientists to improve their estimates of the tsunami size.
A solution arrives by chance
The potential solution to this problem came about by chance. In 2010, I was running an experiment with colleagues using high-accuracy GPS on the UH research vessel Kilo Moana. On its way to Guam, the Kilo Moana was passed by the tsunami generated by the magnitude 8.8 earthquake in Maule, Chile, on February 27 of that year.
In the deep ocean this tsunami wave was only about 10 cm (about 4 inches) high with a wavelength of more than 300 miles. Its passage would normally have remained undetected, lost amid the several meters of heave of the ship in the regular waves. However, careful analysis of the data collected by the GPS proved that the system we had in place accurately recorded the tsunami signal.
The ability of the GPS-based system to detect tsunamis among the much larger ocean waves comes from the distinct difference between their respective intervals, called periods. Ocean swells that rock even the largest ships come at intervals of 15 to 20 seconds. Tsunami swells, however, take 10 to 30 minutes to pass – or even longer. Looking at the height of the ocean’s surface – and of a ship afloat – over this longer time period, the normal fluctuations of ocean swells cancel each other out. The data then reveal the long-period perturbations caused by a passing tsunami.
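The period-based separation described above can be sketched numerically. This is a toy model, not the UH team’s actual processing chain: the amplitudes, periods and averaging window below are all assumed for illustration.

```python
import math

def sea_surface_height(t):
    """Synthetic record: 2 m swell (16 s period) + 0.1 m tsunami (20 min period)."""
    swell = 2.0 * math.sin(2 * math.pi * t / 16.0)
    tsunami = 0.1 * math.sin(2 * math.pi * t / 1200.0)
    return swell + tsunami

def windowed_mean(signal, window):
    """Boxcar average over `window` consecutive samples (1 sample per second)."""
    return [sum(signal[i:i + window]) / window
            for i in range(len(signal) - window)]

# One hour of 1 Hz height samples.
heights = [sea_surface_height(t) for t in range(3600)]

# Averaging over 160 s (ten full swell cycles) cancels the short-period swell,
# while the 20-minute tsunami wave is barely attenuated by the window.
smoothed = windowed_mean(heights, 160)

raw_peak = max(abs(h) for h in heights)        # dominated by the 2 m swell
smoothed_peak = max(abs(h) for h in smoothed)  # close to the 0.1 m tsunami
print(round(raw_peak, 2), round(smoothed_peak, 2))
```

The same principle, with far more careful filtering, is what lets a 10 cm tsunami signal emerge from metres of ship heave.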
The recognition that tsunamis can be detected from ships is a game-changer. There are thousands of large cargo ships sailing the shipping lanes across the world. Rather than building and deploying many more of the expensive traditional sensors to try to fill gaps in coverage, it makes sense to use the ships that are already out there. This new approach offers a cost-effective way of acquiring many more observations to augment the current detection networks. While these new observations will not necessarily lead to quicker detection of tsunamis, they will lead to more accurate predictions being made more quickly.
Working with the NOAA Tsunami Warning Centers ensures that the newly installed network provides their scientists with the most useful data to help with their predictions. Collaborating with industry partners, we will be developing a new version of the shipboard package that can be deployed easily on a much greater number of ships.
The new ship-based detection network is the first step toward the creation of the dense global observing network needed to support the efforts of all tsunami warning centers to provide the best possible predictions of tsunami hazard to coastal communities.