Risk Frontiers shares insights at the 2017 Coast and Ports conference

Risk Frontiers’ Thomas Mortlock presented this week at the 2017 Coast and Ports conference in Cairns on recent work, undertaken in collaboration with Deltares, to calibrate a global storm surge model.

The work combines Deltares’ capabilities in global ocean modelling with Risk Frontiers’ knowledge of coastal cyclone risk in Australia.

In particular, the presentation demonstrated the importance of assimilating high-resolution coastal bathymetries into surge models, in the context of the recent Severe Tropical Cyclone Debbie.

For further information please contact Thomas Mortlock at thomas.mortlock@riskfrontiers.com.

The Great Hawkesbury Flood turns 150 today

Chas Keys and Andrew Gissing

This week sees a significant but little-heralded anniversary in New South Wales: 150 years ago, on the 23rd of June, a devastating flood peaked at Windsor on the Hawkesbury River. For height reached and area inundated, that event has not been matched on the river since. Indeed no other flood since European settlement has come within 4 metres of that one at the Windsor gauge. The 1867 flood reached 19.7 metres: by comparison, the 1961 flood (the highest in living memory today) peaked at only 15.1 metres. The approximate extent of the 1867 flood is shown in Figure 1.

For context, the river in non-flood times reaches only about 1 metre at the gauge. So the 1867 flood peaked more than 18 metres above low-flow level.

Windsor became two small islands. Had the flood risen a further 3 metres, the town would have been completely inundated and many people would have been swept away.

Along the river, much of Windsor and Richmond and substantial tracts of farmland were flooded. Twelve people died, many dwellings were destroyed and hundreds of settlers made destitute.

Decades before, Governor Lachlan Macquarie had implored people to make their homes not on the river flats but on the high ground of the five ‘Macquarie towns’ he designated nearby. His advice was little heeded over the following decades: people did not want to commute to their plots or struggle to protect their crops and livestock.

Unbeknownst to Macquarie, the town sites he had selected were within the reach of genuinely big floods. The 1867 flood proved as much.

Flooding was not confined to the Hawkesbury-Nepean catchment but spread across NSW, affecting areas such as Parramatta, Liverpool, Bankstown, Wollongong, Nowra, Moruya, Tamworth, Bathurst, Mudgee, Dubbo, Forbes and Wagga Wagga. Such widespread and intense impacts would no doubt stretch today’s emergency services (Yeo et al., 2017).

Over the following decades, population growth continued along the Hawkesbury, the area eventually becoming part of Sydney’s sprawl. Many houses were built within reach of much less severe floods than the 1867 event: McGraths Hill is a case in point. Even more dwellings were not far above the ‘shoreline’ of that event.

Here it must be appreciated that the highest flood considered possible at Windsor is estimated to reach about 26 metres on the local gauge. All of Windsor would be inundated well before this level was reached. The islands of 1867 would disappear.

Such a big flood would occur only very rarely, but something like the flood of 1867 or higher must be expected at some stage. Adding height to Warragamba Dam as is proposed will not eliminate this potential.

By the late 1990s it was clear that the roads by which people evacuate would be cut by floodwaters well before a genuinely big flood reached its peak. All means of escape by road would be lost in a flood reaching a gauge height of only 14 metres at Windsor, with many thousands of residents cut off and at risk should the event develop into megaflood proportions as in 1867.

It would be utterly impossible to rescue all the trapped people by boat or helicopter. The death toll in a really big flood could be huge.

The state government’s strategy to avoid such an outcome was to build a high bridge between Windsor and Mulgrave. That structure was completed in 2007 at a cost of $120 million, its deck at a level equivalent to 17 metres at the Windsor gauge.

The bridge was intended to make it possible to get the potentially trapped people of Windsor and surrounding areas (90,000 of them today) to safety in the face of severe flooding. It does not fully ‘solve’ the problem, though: inevitably, some people will not accept the recommendation to evacuate.

This reluctance to evacuate happens in every serious flood. Lismore, in late March 2017, was just the latest manifestation of this worrying response to flooding in Australia. Scores of people ignored the warnings, failed to evacuate and had to be rescued. People underestimate the flood danger and put their own safety, and that of emergency responders, at risk.

This is the equivalent of the early settlers’ refusal to move their homes off the lower floodplain of the Hawkesbury.

The Windsor-Mulgrave bridge represents a belatedly learned lesson of the 1867 flood. The necessity for it grew from decades of residential development that ignored the reality of big floods. The other lesson, still not well appreciated despite many big floods, is that people should avoid being in the path of a severe flood. They need to understand, on the infrequent occasions when such a flood is developing, that they should evacuate; otherwise there is a real likelihood of deaths among themselves, their families and friends.

The NSW Government has recently completed a review of Hawkesbury-Nepean flood management and has released a new Hawkesbury-Nepean Flood Risk Management Strategy entitled “Resilient Valley, Resilient Communities”. The strategy outlines key outcomes including raising the Warragamba Dam wall; preparation of a Regional Evacuation Road Master Plan and a Regional Land Use Planning Framework; raising community flood awareness; improving flood predictions; upgrading local evacuation routes; and maintaining emergency plans. Continued Government support to ensure prudent management of current and future flood risk throughout the catchment is of utmost importance. The strategy can be downloaded at www.infrastructure.nsw.gov.au/expert-advice/hawkesbury-nepean-flood-risk-management-strategy.aspx.

Figure 1 – Approximate extent of the 1867 flood. Source: NSW State Emergency Service

Chas Keys is a former Deputy Director General of the NSW State Emergency Service and an Honorary Associate of Risk Frontiers at Macquarie University. For more information please contact Chas at chas.keys@keypoint.com.au or Andrew Gissing at andrew.gissing@riskfrontiers.com.

References

Yeo, S., Bewsher, D., Robinson, J. & Cinque, P. (2017). The June 1867 floods in NSW: causes, characteristics, impacts and lessons. Floodplain Management Australia National Conference, Newcastle, NSW.

Risk Frontiers complete modelling study to help understand long-term stability of the NSW coast

 
Risk Frontiers, in association with the Marine Climate Risk Group at Macquarie University, have delivered a modelling study for the NSW Office of Environment and Heritage, as part of the NSW Adaptation Research Hub, to help understand the long-term stability of the NSW coast.
 
As beaches respond to rising sea levels, an offshore transport of sand is generally expected (leading to erosion), but an onshore supply of sand from deep water has the potential to offset some of the impacts of sea-level rise. The rate of sand supply to the NSW coast from deep water is a major uncertainty in projecting the future coastal response to sea-level rise.
 
This work used a combination of wave, hydrodynamic and sediment transport modelling to look at sand supply rates over long time-scales (decades to millennia). Modelling suggests that over these time-scales, sand is being transported from water depths of up to 40 m – much deeper than is currently accounted for in coastal engineering studies. 
 
This may have implications for coastal management over the long-term, including how we nourish beaches with sand from offshore sources.
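For context, here is a minimal worked example (not part of the study) of the kind of limit coastal engineering studies typically assume: the Hallermeier (1981) depth of closure, computed below with assumed, roughly representative NSW wave parameters, gives an active-transport limit of about 12 m, far shallower than the roughly 40 m indicated by this modelling.

```python
# Hedged illustration: the Hallermeier (1981) "inner" depth of closure that
# coastal engineering studies commonly use as the seaward limit of significant
# sand transport. Wave-climate values below are assumed, not from the study.
import math

def hallermeier_depth_of_closure(He, Te, g=9.81):
    """He: nearshore significant wave height exceeded 12 h/yr (m);
    Te: associated wave period (s). Returns closure depth in metres."""
    return 2.28 * He - 68.5 * He**2 / (g * Te**2)

# Roughly representative open-coast NSW storm wave parameters (assumed).
He, Te = 6.0, 12.0
print("Depth of closure ~ %.1f m" % hallermeier_depth_of_closure(He, Te))
# ~12 m: far shallower than the ~40 m depths from which the modelling
# suggests sand is being supplied over decadal to millennial time-scales.
```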
 
Risk Frontiers operate a suite of wave, flow and sediment transport models to simulate a range of hazards in the coastal zone and to develop solutions for managing their impacts.

Floodplain manager skillsets are key to effectively growing community engagement practice in disaster resilience

Andrew Gissing

In partnership with the NSW State Emergency Service, Risk Frontiers conducted a survey of twenty participants at the Floodplain Management Australia conference held in May 2017. Participants were drawn from the floodplain risk management industry and represented local and state government, emergency services, research groups and private sector consultants from NSW, QLD and VIC.

Overall, there was clear recognition of the importance of involving community members in floodplain risk management and emergency planning processes, with acknowledgment that community members have valuable knowledge about local flood risks and vulnerable people in their communities. Though participants believed that community members should be active in decision-making processes, there were mixed views as to whether the community should have the ultimate say about how floods are managed.

The most common response (40% of participants) was that communities are currently consulted in the development of floodplain risk management plans and emergency plans. Others said the community is able to comment on draft plans, or is not involved at all. Only three respondents indicated that communities work in collaboration with local authorities to develop floodplain risk management plans, and one respondent indicated that the community works collaboratively with emergency services to develop joint emergency plans.

The largest barrier to the involvement of community members in floodplain risk management and emergency planning was said to be a lack of practitioner skills or confidence to engage effectively with communities. Other barriers included a lack of community interest in participation, overconfidence in the ability of experts to make decisions on behalf of communities, time and budget pressures, and the inertia of existing historical practices. There was acknowledgement that engagement needed to be well facilitated to achieve effective and inclusive outcomes that did not just recognise the loudest voices in the room.

Over 90% of respondents believed that processes to involve communities in floodplain risk management and flood emergency planning needed to be improved. Participants nominated the following ways to improve community participation:

  • Ensure enough time is allocated to enable community involvement, whilst recognising that the community has its own timelines and that ignoring these will reduce its willingness to engage;
  • Tailor engagement approaches for each community, including understanding the unique needs of the community and the ways in which they want to be involved;
  • Communicate real-world flood experiences to give flood risk a sense of realism;
  • Build critical flood awareness amongst community members and then seek their involvement; and
  • Attempt to communicate technical concepts in plain English.

With disaster management policies in Australia placing greater emphasis on community participation in emergency management, this limited research suggests a need to further develop practitioners’ skills and experience in effective community engagement. Further research should also focus on identifying community members’ motivations for, and barriers to, involvement in disaster risk management planning across multiple community contexts.

Thanks to those who took part in the survey. For further information on recent research into community involvement in emergency planning see ajem.infoservices.com.au/items/AJEM-32-02-15.

 

A new way to detect tsunamis: cargo ships

This article by James Foster, Associate Researcher, University of Hawaii, appeared in The Conversation on March 15, 2016.  As shown in the last figure, cargo ship routes provide much better coverage of the northern hemisphere than the southern hemisphere.


Racing across ocean basins at speeds over 800 kilometres per hour, tsunamis can wreak devastation along coastlines thousands of miles from their origin. Our modern tsunami detection networks reliably detect these events hours in advance and provide warning of their arrivals, but predicting the exact size and impact is more difficult. Evacuating coastal zones can cost millions of dollars. To reliably predict whether a tsunami is large enough to require evacuations, many more observations from the deep ocean are needed.
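For readers wondering where the 800 kilometres per hour figure comes from: a tsunami’s wavelength is so long that even the deep ocean behaves as shallow water, so its speed is approximately the square root of gravity times depth. The short sketch below (illustrative depths only, not tied to any particular event) reproduces that order of magnitude.

```python
# Standard shallow-water wave speed, c = sqrt(g * h): a tsunami's wavelength is
# so long that even the deep ocean behaves as "shallow" water. Depths are
# illustrative abyssal values, not tied to a particular event.
import math

g = 9.81                                  # gravitational acceleration, m/s^2
for depth_m in (2000, 4000, 5000):
    c = math.sqrt(g * depth_m)            # phase speed in m/s
    print(f"depth {depth_m:>4} m -> {c * 3.6:5.0f} km/h")
# ~500 km/h at 2 km depth, ~700-800 km/h over abyssal plains.
```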

Researchers from the University of Hawai‘i (including me), funded by the National Oceanic and Atmospheric Administration (NOAA), are partnering with the Matson and Maersk shipping companies and the World Ocean Council to equip 10 cargo ships with real-time high-accuracy GPS systems and satellite communications. Each vessel will act as an open-ocean tide gauge. Data from these new tsunami sensors are streamed, via satellite, to a land-based data center where they are processed and analyzed for tsunami signals. It is a pilot project to turn the moving ships into a distributed network of sensors that could give coastal communities more time to evacuate.

Monitoring the world’s oceans

Despite the advances in tsunami monitoring and modeling technology over the last decade, it remains difficult for hazard response agencies to get enough information about potential tsunami threats. The problem is that there are too few observations of tsunamis to provide sufficiently accurate predictions about when, where and how severely tsunamis might occur.

In particular, there are very few sensors in the deep oceans that often lie between tsunami sources – usually earthquakes occurring under the ocean trenches that mark where tectonic plates meet – and the distant coastlines that might be threatened. Gaps in the coverage of the network, as well as routine outages of instruments, limit the ability of the current detection system to accurately assess the hazard posed by each event.

NOAA tsunami detection buoys can come unmoored in the deep ocean, as this one did, requiring recovery and replacement – and not detecting any usable data until it’s back in place. LCDR Mark Wetzler, NOAA

The deep ocean sensor networks that do exist are expensive to build and maintain, so only a limited number are deployed, at locations chosen based on our best current understanding of the hazards. But the unexpectedly huge 2011 Tohoku, Japan, earthquake and the unanticipated type of fault slip that caused the 2012 event at Queen Charlotte Islands, Canada, highlighted weaknesses in this approach.

For the 2012 tsunami from the Queen Charlotte Islands earthquake, the lack of deep ocean data meant a tsunami warning and evacuation was issued for some of Hawaii’s coastlines, though the event turned out to be smaller than predicted. This emphasized the need for more densely spaced deep ocean observing capabilities. Even just a few more observations in the right places would have enabled the scientists to improve their estimates of the tsunami size.

A solution arrives by chance

The potential solution to this problem came about by chance. In 2010, I was running an experiment with colleagues using high-accuracy GPS on the UH research vessel Kilo Moana. On its way to Guam, the Kilo Moana was passed by the tsunami generated by the magnitude 8.8 earthquake in Maule, Chile, on February 27 of that year.

In the deep ocean this tsunami wave was only about 10 cm (about 4 inches) high with a wavelength of more than 300 miles. Its passage would normally have remained undetected, lost amid the several meters of heave of the ship in the regular waves. However, careful analysis of the data collected by the GPS proved that the system we had in place accurately recorded the tsunami signal.

A tsunami hides in the waves of the open ocean: At top, GPS-determined ocean surface height changes due to both regular waves and the tsunami wave. Inset: A close-up to show ocean wave fluctuation. At bottom, just the tsunami wave, with the GPS observations in blue and the red dashed line representing a model for the tsunami that has been updated using the GPS data. James Foster, CC BY-ND

The ability of the GPS-based system to detect tsunamis among the much larger ocean waves comes from the distinct difference between their respective intervals, called periods. Ocean swells that rock even the largest ships come at intervals of 15 to 20 seconds. Tsunami swells, however, take 10 to 30 minutes to pass – or even longer. Looking at the height of the ocean’s surface – and of a ship afloat – over this longer time period, the normal fluctuations of ocean swells cancel each other out. The data then reveal the long-period perturbations caused by a passing tsunami.
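A minimal sketch of that principle, using synthetic numbers rather than the project’s actual processing chain: a simple moving average a few minutes long removes the 15–20 second swell almost entirely while leaving a 10–30 minute tsunami perturbation largely intact.

```python
# Minimal sketch: separating a long-period tsunami signal from short-period
# swell in a ship-height record by low-pass (moving-average) filtering.
# All numbers are illustrative, not from the University of Hawai'i project.
import numpy as np

rng = np.random.default_rng(0)
fs = 1.0                                   # 1 Hz GPS height solutions (assumed)
t = np.arange(0, 3 * 3600, 1 / fs)         # three hours of record, in seconds

swell   = 1.5 * np.sin(2 * np.pi * t / 17)       # ~17 s ocean swell, 1.5 m amplitude
tsunami = 0.05 * np.sin(2 * np.pi * t / 1200)    # ~20 min tsunami, 5 cm amplitude
height  = swell + tsunami + 0.02 * rng.standard_normal(t.size)  # observed surface height

window = int(300 * fs)                     # 5-minute moving average
kernel = np.ones(window) / window
filtered = np.convolve(height, kernel, mode="same")

# The 17 s swell averages out over 5 minutes, while the 20 min tsunami
# perturbation passes through almost unchanged.
print("rms error vs. true tsunami : %.3f m" % np.std(filtered - tsunami))
print("recovered signal range     : %.3f m" % np.ptp(filtered))
```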

The recognition that tsunamis can be detected from ships is a game-changer. There are thousands of large cargo ships sailing the shipping lanes across the world. Rather than building and deploying many more of the expensive traditional sensors to try to fill gaps in coverage, it makes sense to use the ships that are already out there. This new approach offers a cost-effective way of acquiring many more observations to augment the current detection networks. While these new observations will not necessarily lead to quicker detection of tsunamis, they will lead to more accurate predictions being made more quickly.

Working with the NOAA Tsunami Warning Centers ensures that the newly installed network provides their scientists with the most useful data to help with their predictions. Collaborating with industry partners, we will be developing a new version of the shipboard package that can be deployed easily on a much greater number of ships.

The new ship-based detection network is the first step toward the creation of the dense global observing network needed to support the efforts of all tsunami warning centers to provide the best possible predictions of tsunami hazard to coastal communities.

Cargo and commercial shipping lanes (shown here in purple) crisscross much of the world. T. Hengl, CC BY-SA

 

 

Preservation of thin tephra

by Russell Blong, Neal Enright and Paul Grasso.

Abstract: The preservation of thin (<300 mm thick) tephra falls was investigated at four sites in Papua New Guinea (PNG), Alaska and Washington, USA. Measurements of the variations in the thickness of: (i) Tibito Tephra, 150 km downwind from its source, Long Island (PNG), erupted in the mid-seventeenth century; (ii) St Helens W tephra (erupted 1479–80 A.D.) on the slopes of the adjacent Mt. Rainier in Washington State; (iii) Novarupta (1912) tephra preserved on Kodiak Island (Alaska, USA); and (iv) an experimentally placed tephra at a site near Mt. Hagen (PNG) allow tentative conclusions to be drawn about the relative importance to tephra preservation of slope gradients, vegetation cover and soil faunal activity. Results for the experimental tephra suggest that compaction occurs rapidly post-deposition and that estimates of tephra thickness and bulk density need to indicate how long after deposition thickness measurements were made. These studies show that erosional reworking of thin tephra is not rapid, even on steeper slopes in high-rainfall environments. In Papua New Guinea a 350-year-old tephra is rarely present under forest but is well preserved under alpine grasslands. On Mt. Rainier a 500-year-old tephra is readily preserved under forest but absent under grasslands as a result of gopher activity. Despite the poor relationship between tephra thickness and slope steepness, the thickness of thin tephras is highly variable. On Kodiak Island thickness variability across a few metres is similar to that observed across the whole northeast of the island. The measured variability reported here indicates that large sample sizes are necessary to adequately estimate the mean thickness of these thin tephras. These results have implications for the preparation of isopach maps, the estimation of tephra volumes and the elaboration of the potential consequences of tephra falls.
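As a back-of-envelope illustration of the sample-size point (the coefficient-of-variation values below are hypothetical, not measurements from the paper), the number of thickness measurements needed to estimate a mean to a given relative precision grows with the square of the variability:

```python
# Rough illustration of why highly variable thin-tephra thicknesses demand
# large sample sizes. Standard normal-approximation result; the coefficient
# of variation (CV) values are hypothetical, not taken from the paper.
import math

def samples_needed(cv, rel_error, z=1.96):
    """Measurements needed so the 95% confidence half-width is rel_error * mean,
    given a coefficient of variation cv (standard deviation / mean)."""
    return math.ceil((z * cv / rel_error) ** 2)

for cv in (0.3, 0.6, 1.0):                # hypothetical thickness variability
    n = samples_needed(cv, rel_error=0.2)
    print(f"CV = {cv:.1f}: n = {n} measurements for +/-20% precision")
```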

Read entire article

An analysis of human fatalities from cyclones, earthquakes and severe storms in Australia

A new report on natural hazard fatalities has been produced by Risk Frontiers, which undertook the project “An analysis of human fatalities and building losses from natural disasters in Australia” for the Bushfire and Natural Hazards CRC (BNHCRC). The BNHCRC is conducting end-user-driven research to improve disaster resilience and reduce the human, social, economic and environmental costs of natural hazards. This new report is the second deliverable of this important project, which aims to provide an evidence base for future emergency management policy, practice and resource allocation and to enable efficient and strategic risk reduction. The current focus was to measure and better understand the impacts of tropical cyclones, earthquakes and severe storms in terms of the toll on human life. Please click here to read the full report: Coates, L., Haynes, K., Radford, D., D’Arcy, R., Smith, C., van den Honert, R. & Gissing, A. (2016). An analysis of human fatalities from cyclones, earthquakes and severe storms in Australia. Report for the Bushfire and Natural Hazards Cooperative Research Centre.

Read more

Risk Frontiers to present at forum on impacts of Tropical Cyclone Debbie

Severe Tropical Cyclone Debbie made landfall near Airlie Beach on the north Queensland coast at midday on 28 March 2017. A team from Risk Frontiers travelled to the landfall site and surrounds a day later to assess the damage. Their findings on coastal and flood impacts, and lessons learnt in terms of warnings and communications, are to be presented at the Tropical Cyclone Debbie forum hosted by the Bureau of Meteorology in Brisbane on 6 June. Below is a selection of images taken by the team during their visit.

1. The team prior to the aerial damage survey.
2. Evidence of storm surge and overwash
3. Seaforth post-Debbie
4. Flooding around the Bruce Highway outside Proserpine
5. Water quality impacts with floodwater discharge at Hamilton Island
6. Airlie Beach post-Debbie
7. South Molle Island post-Debbie
8. Water quality impacts with floodwater discharge at Whitehaven Beach
9. Boat damage
10. The clean-up operation at Seaforth
11. The beach at Seaforth post-Debbie
12. The beach at Seaforth post-Debbie (2)
13. The beach at Seaforth post-Debbie (3)
14. Displaced pontoons at Laguna Quay Marina
15. Displaced pontoons at Laguna Quay Marina (2)
16. Yacht washed up at Airlie Beach
17. Example of the impact of local topography on wind damage

Improving Decision Making about Natural Disaster Mitigation Funding in Australia—A Framework

This article by Robin C. van den Honert appeared in MDPI Resources 2016, 5(3), 28; doi:10.3390/resources5030028.

Abstract:

Economic losses from natural disasters pose significant challenges to communities and to the insurance industry. Natural disaster mitigation aims to reduce the threat to people and assets from natural perils. Good decisions relating to hazard risk mitigation require judgements both about the scientific and financial issues involved, i.e., the efficacy of some intervention, and the ethical or value principles to adopt in allocating resources. A framework for selecting a set of mitigation options within a limited budget is developed. Project selection about natural disaster mitigation options needs to trade off benefits offered by alternative investments (e.g., fatalities and injuries avoided, potential property and infrastructure losses prevented, safety concerns of citizens, etc.) against the costs of investment. Such costs include capital and on-going operational costs, as well as intangible costs, such as the impact of the project on the visual landscape or the loss of societal cohesion in the event of the relocation of part of a community. Furthermore, dollar costs of any potential project will need to be defined within some prescribed budget and time frame. Taking all of these factors into account, this paper develops a framework for good natural hazard mitigation decision making and selection. View Full-Text
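As a toy illustration of the selection problem the abstract describes, and emphatically not the paper’s framework, a 0/1 knapsack formulation chooses the subset of mitigation options with the greatest total benefit that fits within a fixed budget. The project names, costs and benefit scores below are invented:

```python
# Toy 0/1 knapsack: choose mitigation projects maximising a single benefit
# score within a fixed budget. Projects, costs ($M) and scores are invented;
# the paper's framework also weighs intangible and value-based criteria.
from itertools import combinations

projects = {                      # name: (cost in $M, benefit score)
    "levee upgrade":       (40, 55),
    "buy-back scheme":     (70, 80),
    "flood warning tech":  (10, 25),
    "community education": (5, 15),
    "road raising":        (30, 35),
}
budget = 100

best_score, best_set = 0, ()
for r in range(1, len(projects) + 1):
    for subset in combinations(projects, r):
        cost = sum(projects[p][0] for p in subset)
        score = sum(projects[p][1] for p in subset)
        if cost <= budget and score > best_score:
            best_score, best_set = score, subset

print("Selected:", ", ".join(best_set), "| total benefit =", best_score)
```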