Forecast of Increased Earthquakes due to Slowing of Earth’s Rotation

Paul Somerville, Risk Frontiers.

In the past few weeks there have been sensational reports about a forecast accelerated rate of occurrence of large earthquakes in 2018.  Fortunately, one of the authors of the work that lies behind these reports has explained her calm view of the situation.  The following article, written by Sarah Kaplan, appeared in the Washington Post, last updated 22 November 2017.

Rebecca Bendick would like you to not panic. The University of Montana geophysicist knows you may have read the articles warning about “swarms of devastating earthquakes” that will allegedly rock the planet next year thanks to a slowdown of the Earth’s rotation. And she feels “very awful” if you’ve been alarmed. Those dire threats are based on Bendick’s research into patterns that might predict earthquakes – but claims of an impending “earthquake boom” are mostly sensationalism.

There is no way to predict an individual earthquake. Earthquakes occur when potential energy stored along cracks in the planet’s crust gets released, sending seismic waves through the Earth.  Since scientists know where those cracks exist, and how they are likely to convulse, they can develop forecasts of the general threat for an area. But the forces that contribute to this energy buildup and trigger its release are global and complex, and we still cannot sort out exactly how it might unfold.

In a paper published in August in the journal Geophysical Research Letters, Bendick and colleague Roger Bilham, a geophysicist at the University of Colorado, Boulder, did find a curious correlation between clusters of certain earthquakes and periodic fluctuations in the Earth’s rotation. By examining the historic earthquake record and monitoring those fluctuations, scientists might be able to forecast years when earthquakes are more likely to occur, they suggest.

“Something that people have always hoped to find . . . is some kind of a leading indicator for seismicity, because that gives us a warning about these events,” Bendick said. But that conclusion is by no means set in stone. It hasn’t been demonstrated in the lab or confirmed by follow-up studies. Several scientists have said they’re not yet convinced by Bendick and Bilham’s research. “The main thing I came away thinking was real old-fashioned scientific ‘let’s check this’ kind of thoughts,” research geophysicist Ken Hudnut told Popular Science. Hudnut, who works on earthquake-risk programs at the US Geological Survey, was not involved in the paper. And that reaction is okay with Bendick. That’s how these things are supposed to go: “Someone says something kind of marginally outlandish, and everyone checks their work and that’s how science progresses,” she said.

Historically, the field of earthquake forecasting has seen some particularly outlandish claims. People have tried to predict temblors based on the behaviour of animals, gas emissions from rocks, low-frequency electric signals rippling through the Earth – all without much success.  For that reason, Bendick said, “it’s a little bit scary to get into the game.” But getting a prediction right can mean the difference between life and death for countless people. The stakes are too high not to try.

For their recent paper, she and Bilham looked through the century-long global earthquake record to see if they could spot any signs that temblors around the world are linked. Initially, the data appeared completely random. But then Bendick and Bilham added a new number to their analysis: the “renewal interval,” or the amount of time a given earthquake zone requires to build up potential energy for a really big quake. “Basically you can think of earthquakes as something like a battery or a neuron; they have a certain amount of time they need to be charged up,” Bendick said.

A certain class of earthquakes – those with a magnitude of 7.0 or more, and a short renewal interval between 20 and 70 years – seemed to cluster in the historic record. Every three decades or so, the planet seemed to experience a bunch of them – as many as 20 per year, instead of the typical 8 to 10. It was as if something was causing the earthquakes to synchronise, even though they were happening in spots scattered around the globe. Contrary to some reports on the study, “it’s not exactly the case that every 32 years we have a bad patch,” Bendick said. “If it were that, people would have found [the pattern] ages ago. That would be super obvious in the record.” Instead, she explained, “events with that renewal interval happen together more often than they happen at random, and that pattern is statistically significant.” Sure, it’s a less flashy finding than, “we know when earthquakes will happen,” she acknowledged. But that’s geophysics for you. “We’re scientists, not magicians,” she said.
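Bendick’s distinction between “a bad patch every 32 years” and “statistically significant clustering” can be made concrete with a back-of-envelope Poisson check. This is not the statistical test used in the paper, just a sketch of why 20 such quakes in one year would be surprising if years were independent draws around the typical rate of 8 to 10:

```python
import math

def poisson_sf(k, mu):
    """P(X >= k) for a Poisson variable with mean mu (complementary CDF)."""
    return 1.0 - sum(math.exp(-mu) * mu**i / math.factorial(i) for i in range(k))

# Typical background rate quoted in the article: 8-10 such quakes per year.
# How surprising would a year with 20 be, under an independent-years model?
p = poisson_sf(20, 9.0)
print(f"P(>= 20 events in a year | mean 9/yr) = {p:.5f}")
```

The tail probability is well under one percent, which is why runs of ~20 events per year stand out against a random background even though no fixed 32-year cycle is visible.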

Next, Bendick and Bilham tried to figure out what mechanism might explain these earthquake clusters. They studied a wide range of global phenomena that unfold over the same time scales: sloshing of the molten rock in the mantle, ocean circulation changes, momentum transfer between the Earth’s core and the lithosphere (the planet’s solid, outermost shell).

The best fit were tiny, cyclical changes in the speed of the Earth’s rotation. The planet slows down infinitesimally every 30 years or so, and roughly five years later, a cluster of these severe, short-interval earthquakes appears. Russian geophysicists Boris Levin and Elena Sasorova have pointed out this correlation before, Bendick noted. So she and Bilham tried to take it a step further: They found a mechanism that might link the Earth’s rotation and clusters of quakes.

See, when the Earth’s rotation rate changes, its shape shifts. As the planet speeds up, mass moves toward the equator, much the way a dancer’s skirt flares out when she spins. When it slows, that mass shifts back toward the poles. The cumulative effect is tiny – a millimetre difference in the width of the globe. But if potential energy has already built up at a number of faults – “if they’re locked and loaded, as we’d say in Montana,” Bendick noted – “that tiny change is enough to kick some proportion of the faults over into their failure mode, which is earthquakes.”

Earth is currently at the end of a slowing period, Bendick pointed out, and the historic record would indicate another “cluster” may be on its way. She and Bilham hope the pattern might help scientists and public officials make some sense of the Earth’s unpredictable shaking. If disaster planners can say with some assurance that the planet is entering a period in which quakes are more likely, they might have an easier time making the case for preparedness measures.

But that doesn’t necessarily mean 2018 will be a particularly devastating year. For one thing, the kinds of temblors Bendick and Bilham analysed happen in areas that are already earthquake-prone – Japan, New Zealand, the west coast of the United States. For people who live in those regions, there is always a risk of a quake, and it is always good to be prepared.

Their study is about probabilities, not predictions, Bendick cautioned. Earth’s slowing does not mean that a quake will happen in the next year or so, just that the likelihood may have gone up. Moreover, this pattern of earthquake occurrence is definitely not the only factor influencing the Earth’s behaviour – if it were, scientists would have noticed the pattern a long time ago. There are doubtless other earthquake cycles on the planet, driven by phenomena not considered in the paper.

The research got a lot of attention after Bilham presented it at the October meeting of the Geological Society of America. Several critics noted that correlation is not causation – earthquake clusters and fluctuations of Earth’s rotation might happen on the same time scales, but that does not mean they are linked. Bendick acknowledged that there is less evidence for the proposed mechanism than for the pattern itself. But she’s confident the pattern is there. “I think this is likely to inspire many people to look at this pattern, and it’s possibly someone will come up with an even better explanation,” she said.

Notes by Paul Somerville

The following is excerpted from the abstract of Bilham and Bendick (2017).

On five occasions in the past century a 25-30% increase in annual numbers of Mw≥7 earthquakes has coincided with a slowing in the mean rotation velocity of the Earth, with a corresponding decrease at times when the length-of-day (LoD) is short. The correlation between Earth’s angular deceleration (d[LoD]/dt) and global seismic productivity is yet more striking, and can be shown to precede seismicity by 5-6 years, permitting societies at risk from earthquakes an unexpected glimpse of future seismic hazard.

The cause of Earth’s variable rotation is the exchange of angular momentum between the solid and fluid Earth (atmospheres, oceans and outer core). Maximum LoD is preceded by an angular deceleration of the Earth by 6-8 years. We show delayed (increase in) global seismic productivity is most pronounced at equatorial latitudes 10°N-30°S.

The observed relationship is unable to indicate precisely when and where these future earthquakes will occur, although we note that most of the additional Mw>7 earthquakes have historically occurred near the equator in the West and East Indies. A striking example is that since 1900 more than 80% of all M≥7 earthquakes on the eastern Caribbean plate boundary have occurred 5 years following a maximum deceleration (including the 2010 Haiti earthquake).

The 5-6 year advanced warning of increased seismic hazards afforded by the first derivative of the LoD is fortuitous, and has utility in disaster planning. The year 2017 marks six years following a deceleration episode that commenced in 2011, suggesting that the world has now entered a period of enhanced global seismic productivity with a duration of at least five years.

The correlation between the change in Earth’s rotation rate and the frequency of Mw>7 earthquakes from Bendick and Bilham (2017) is shown in Figure 1.  I have not seen the Bilham and Bendick (2017) presentation.

Figure 1. Changes in the length of the day correlate with decadal fluctuations in annual M ≥ 7 earthquakes, smoothed with 10 year running mean. Peak seismic activity and rotational acceleration occur at 15, 33, 60, and 88 year intervals. Source: Bendick and Bilham, 2017.


Bendick, R., and R. Bilham (2017). Do weak global stresses synchronize earthquakes? Geophys. Res. Lett., 44, 8320–8327, doi:10.1002/2017GL074934.

Bilham, R. and R. Bendick (2017). A five year forecast for increased global seismic hazard.  Invited presentation, Geological Society of America Meeting, Seattle, Washington.

Victoria on alert for worst floods in over 20 years

This article by Anna Prytz was published in today’s issue of The Age.

Heavy rain is forecast to arrive in Melbourne on Friday. Photo: AAP

Record-breaking rain is bearing down on Victoria, triggering warnings of dangerous flash flooding across the state.

After a scorching end to spring, Melbourne is set to get one month’s worth of rain in just two days, and possibly an entire summer’s worth of rain in the season’s first three days.

A severe weather warning has been issued for all of Victoria as the Bureau of Meteorology prepares for what could be the state’s most significant rain event in over 20 years.

Changes in Earthquake Hazard Levels in the draft Geoscience Australia National Seismic Hazard Assessment (NSHA18)

Paul Somerville, Risk Frontiers


Geoscience Australia (GA) has embarked on a project to update the seismic hazard model for Australia through the National Seismic Hazard Assessment (NSHA18) project.  The following information is excerpted from Allen et al. (2017) and from discussions that took place at the Annual Conference of the Australian Earthquake Engineering Society (AEES) in Canberra, November 24-26, 2017 and a pre-conference workshop organised by GA on the NSHA18 project held on November 23.

The draft NSHA18 update yields many important advances on its predecessors, including:

  1. calculation in a full probabilistic framework using the Global Earthquake Model’s OpenQuake-engine;
  2. consistent expression of earthquake magnitudes in terms of moment magnitude, Mw;
  3. inclusion of epistemic uncertainty through the use of alternative source models;
  4. inclusion of a national fault-source model based on the Australian Neotectonic Features database;
  5. the use of modern ground-motion models; and
  6. inclusion of epistemic uncertainty on seismic source models, ground-motion models and fault occurrence and earthquake clustering models.

The draft NSHA18 seismic design ground motions are significantly lower than those in the current (1991-era) Standards Australia AS1170.4:2007 hazard map at the 1/500-year annual ground-motion exceedance probability (AEP) level. The large reduction in seismic hazard at the 1/500-year AEP level has led engineering design professionals to question whether the new draft design values will provide enough structural resilience to potential seismic loads from rare large earthquakes. These professionals are planning to use a seismic design factor of 0.08g as a minimum design level for the revised AS1170.4 standard, due to be released in 2018, and are discussing the idea of transitioning to a 1/2475-year AEP in the longer term, consistent with the trend in other countries including Canada and the United States.
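The difference between the 1/500-year and 1/2475-year AEP levels discussed above is easiest to appreciate as the chance of exceedance over a structure’s design life. A minimal sketch, assuming independent years and a nominal 50-year design life:

```python
# Chance of at least one exceedance of a given return-period motion
# during a design life, assuming independent years.
def prob_exceedance(return_period_years, design_life_years):
    annual_p = 1.0 / return_period_years
    return 1.0 - (1.0 - annual_p) ** design_life_years

# The 1/500-year AEP level over a 50-year design life:
print(f"1/500-yr over 50 yr:  {prob_exceedance(500, 50):.1%}")
# The rarer 1/2475-year level corresponds to roughly 2% in 50 years,
# the convention used in Canada and the United States:
print(f"1/2475-yr over 50 yr: {prob_exceedance(2475, 50):.1%}")
```

The 1/500-year level equates to roughly a 10% chance of exceedance in 50 years, versus about 2% for the 1/2475-year level, which is why the latter targets much rarer, larger motions.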

The primary reason for the significant drop in seismic hazard is adjustments to earthquake catalogue magnitudes. Prior to the early 1990s, most Australian seismic observatories relied on the Richter (1935) local magnitude (ML) formula developed for southern California. At regional distances (where many earthquakes are recorded), the Richter formula tends to overestimate ML relative to modern Australian magnitude formulae, so pre-1990 local magnitude estimates required correction. A process was employed that systematically corrected local magnitudes using the difference between the original (inappropriate) magnitude formula (e.g., Richter, 1935) and Australian-specific correction curves (e.g., Michael-Leiba and Malafant, 1992), evaluated at a distance determined by the nearest recording station likely to have recorded a specific earthquake (Allen, 2010).

Another important factor determining the reduction in hazard is the conversion of catalogue magnitudes so that they are consistently expressed in terms of moment magnitude, MW. Moment magnitude is the preferred magnitude type for probabilistic seismic hazard analyses (PSHAs), and all modern ground-motion models (GMMs) are calibrated to this magnitude type. Relationships between MW and other magnitude types were developed for the NSHA18. The most important of these is the relationship between ML and MW, because of the abundance of local magnitudes in the Australian earthquake catalogue. The preferred bi-linear relationship indicates that MW is approximately 0.3 magnitude units lower than ML for moderate-to-large earthquakes (4.0 < MW < 6.0). Together, the ML corrections and the subsequent conversions to MW effectively halve the number (and hence the annual rates) of earthquakes exceeding magnitudes 4.0 and 5.0. This has downstream effects on hazard calculations when forecasting the rate of rare large earthquakes using Gutenberg-Richter magnitude-frequency distributions in PSHA.
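As a toy illustration of the conversion step (this is not the NSHA18 bi-linear relationship itself, whose coefficients are not reproduced here), the roughly 0.3-unit offset quoted above can be applied as follows:

```python
def ml_to_mw_illustrative(ml):
    """Illustrative ML -> MW conversion.

    The NSHA18 relationship is bi-linear, with coefficients not given in
    this article; this sketch simply applies the ~0.3-unit offset quoted
    for moderate-to-large events (roughly 4.0 < MW < 6.0).
    """
    return ml - 0.3

for ml in (4.5, 5.0, 5.5):
    print(f"ML {ml:.1f} -> MW ~{ml_to_mw_illustrative(ml):.1f}")
```

Because Gutenberg-Richter rates fall off steeply with magnitude, shifting the whole catalogue down by ~0.3 units is enough to roughly halve the count of events above any fixed threshold.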

A secondary effect of the ML corrections and MW conversions is that they tend to increase the number of small and moderate-sized earthquakes relative to large earthquakes. This increases the Gutenberg-Richter b-value, which in turn further decreases the relative annual rates of larger, potentially damaging earthquakes (Allen et al., 2017).
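The sensitivity of large-earthquake rates to the b-value can be sketched directly from the Gutenberg-Richter relation. The a and b values below are hypothetical, chosen only to illustrate how strongly the rate of M ≥ 6 events responds to modest changes in b:

```python
# Gutenberg-Richter: log10 N(>=M) = a - b*M, so the annual rate of
# M >= 6 events falls quickly as b rises. The a and b values here are
# hypothetical, chosen only to show the sensitivity.
def annual_rate(mag, a, b):
    return 10 ** (a - b * mag)

a = 3.0  # hypothetical activity level
for b in (0.9, 1.0, 1.1):
    print(f"b = {b:.1f}: N(M>=6) per year ~ {annual_rate(6.0, a, b):.4f}")
```

A change of 0.2 in b alters the M ≥ 6 rate by roughly a factor of four in this sketch, which is why the catalogue revisions feed through so strongly to the computed hazard.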

The final main factor driving the reduction of calculated seismic hazard in Australia is the use of modern ground motion models (GMMs). While seismologists in stable continental regions (SCRs) worldwide recognise the complexity in characterising the likely ground motions from rare large earthquakes, more abundant ground-motion datasets of moderate-magnitude earthquakes are emerging. The NSHA18 hazard values are based on modern GMMs with improved understanding of instrumental ground-motion source amplitudes and attenuation in Australia and analogue regions. The peak ground accelerations (PGAs) predicted by these modern models in general are up to a factor of two lower than the Gaull et al. (1990) peak ground velocity (PGV)-based relationships at distances of engineering significance (generally less than 100 km). At larger distances, the lower rates of attenuation of the Gaull et al. (1990) relationships yield ground-motion values up to factors of 10 higher than modern GMMs (Allen et al., 2017).

It is anticipated that the National Seismic Hazard Assessment (NSHA18) project will be completed in mid-2018, at which time Geoscience Australia has agreed in principle to provide a briefing on it in Sydney for the insurance industry. The updated version of AS1170.4 will be released in 2018.


Allen, T., J. Griffin, M. Leonard, D. Clark and H. Ghasemi (2017). An updated National Seismic Hazard Assessment for Australia: Are we designing for the right earthquakes? Proceedings of the Annual Conference of the Australian Earthquake Engineering Society in Canberra, November 24-26, 2017.

Standards Australia (2007). Structural Design Actions, Part 4 Earthquake Actions in Australia. AS1170.4:2007.

The Mw 7.1 Puebla, Mexico Earthquake of 19 September 2017 – the anniversary of the Mw 8.0 Michoacan earthquake of 1985.

By Paul Somerville, Risk Frontiers

As reported by the USGS, the September 19, 2017, Mw 7.1 Puebla earthquake in Central Mexico occurred as the result of faulting within the subducted Cocos plate at a depth of approximately 50 km and about 120 km southeast of Mexico City. At least 220 people were killed in Mexico City, 74 in Morelos, 45 in Puebla, 13 in Estado de Mexico, 6 in Guerrero and 4 in Oaxaca. At least 6,000 people were injured. At least 44 buildings collapsed in Mexico City, and many others were damaged. Many other buildings were damaged or destroyed in the surrounding area. Significant damage occurred to the electrical grid in Estado de Mexico, Guerrero, Mexico City, Morelos, Oaxaca, Puebla and Tlaxcala.

This earthquake occurred on the anniversary of the devastating Mw 8.0 Michoacan earthquake of 19 September 1985, which caused extensive damage to Mexico City and the surrounding region. That event occurred as the result of thrust faulting on the plate interface between the Cocos and North America plates, about 450 km to the west of the September 19, 2017 earthquake.

Most of Mexico City is founded on a clay-filled lake. The clay has a resonant period of 1 to 2 seconds and has very unusual properties – it is very elastic (has low damping), which allows a very large resonance to build up due to the trapping of energy within this shallow sedimentary basin (Figures 1 and 2).  This resonance caused the collapse of buildings, especially ones having natural periods of 1 to 2 seconds, and generated a seiche in Lake Chapultepec (part of the original lake that has not been filled in) seen in a widely viewed video, in which the waves have a period of about 2 seconds.

The 1 to 2 second resonance of the lakebed can also be set up by marching soldiers. This occurred exactly 33 years earlier to the day, when I was on holiday in Mexico City. It was September 19, Independence Day, and the soldiers were marching down Reforma Avenue. I was standing on the roof of my ten-storey hotel, which was swaying noticeably. One year to the day later, at 07:17 am on 19 September 1985, the Mw 8.0 Michoacan earthquake occurred. I doubt that my hotel survived the earthquake.

After the 1985 earthquake I spoke with my colleague, Lloyd Cluff, who had been at a meeting with Mexican government officials on the day of the earthquake to discuss seismic issues for nuclear power plants.  The meeting was held on the edge of Mexico City outside the lakebed area (blue area of Figure 1).  After he returned to his hotel that evening he turned on the TV and saw photos of a disastrous earthquake.  It took him some time to recognise the scene of the disaster as Mexico City.  No one at the meeting had known that it had occurred early that morning in Mexico City, because the shaking outside the lakebed area had been so weak.

Figure 1. Response spectral accelerations at 1 second period in Mexico City from the Mw 7.1 Puebla, Mexico earthquake of 19 September 2017. The largest ground motions, shown in the red and yellow colours, occurred in the parts of the city founded on the lakebed. Source: UNAM.
Figure 2. Locations of damaged buildings in Mexico City from the 2017 Puebla earthquake showing correlation with the western part of the ground motion map in Figure 1.

It doesn’t always take superstorms to get supersurges

By Thomas Mortlock

Severe Tropical Cyclone Debbie made landfall at Airlie Beach on the Whitsunday Coast earlier in the year, with an estimated property insurance market loss of over AUD 1.6 billion (PERILS, 2017). Debbie had all the ingredients for a large storm surge: a low and falling pressure before landfall (down to 943 hPa), high and sustained onshore wind speeds (landfalling as a Cat 4 system), a track perpendicular to the coast, and a very slow forward speed (7 km/h at landfall).

Debbie also coincided with a relatively high state of tide (landfall occurring 2 hours after high water) and large waves (> 9 m), to produce a storm tide inundation, according to Risk Frontiers’ own survey estimates, of around 5 m above mean sea level. This was roughly equivalent to the height of most coastal foredunes, meaning direct coastal inundation damage to property was limited.
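The ~5 m storm tide reported above is the sum of several components. The split below is purely illustrative (the article gives only the total, not the individual contributions), but it shows how the components are usually combined:

```python
# A storm tide is conventionally the sum of the predicted tide, the surge
# (wind and pressure setup) and wave setup. The numbers below are
# hypothetical -- the survey estimate above gives only the ~5 m total.
predicted_tide = 2.0   # m above mean sea level (hypothetical)
surge          = 1.5   # m, wind and pressure contribution (hypothetical)
wave_setup     = 1.5   # m, breaking-wave contribution (hypothetical)

storm_tide = predicted_tide + surge + wave_setup
print(f"storm tide ~ {storm_tide:.1f} m above mean sea level")
```

Framed this way, the question posed below is why the middle term, the surge itself, was smaller than Debbie’s wind and pressure fields would suggest.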

While the storm tide inundation could have been much higher had Debbie made landfall two hours earlier, the storm surge itself (the elevation of coastal water levels by high winds and low atmospheric pressure, excluding tides and waves) should have been bigger. The open question posed since March has been: why was it not?

A similar question has been asked of storms and surges in the Wadden Sea, a fringe basin in the North Sea between the Netherlands and Denmark. The article below, written by Giordano Lipari in the Netherlands, makes the point that superstorms don’t always lead to supersurges, especially in coastal areas fronted by islands. In the case of the Wadden Sea, and the article below, these are “barrier [sand] islands”, but in the case of Debbie and the Whitsunday region, the same effect may also be caused by the numerous rock and coral islands and reefs that fringe the mainland coast.

Below is an edited version of Lipari’s article. The original version can be found at

Figure 1 The Wadden Sea

1. Big can fail, little can hit 

The Wadden Sea (Figure 1) is a fringe basin of the North Sea delimited by a strip of barrier islands. When it comes to storms and surges, the Wadden Sea stages an intriguing three-way interaction between physiographic features (in plain language: the water container), atmospheric systems (weather), and flow patterns (water). In the Dutch part of the basin, in particular, this interplay defeats the intuition that the most severe surges are caused by the most severe storms when ranked by wind speed alone.

2. Back by the beach

When raging winds raise the water against the coast, it is generally taken as ground truth that the higher the peak wind speed, the higher the peak water level. Some tide gauges in the Dutch Wadden Sea, however, showed that record-breaking surges were not caused by the most severe winds in the same control period.

The unlimited presence of water is self-evident on a shore squarely facing the ocean’s expanse. In contrast, the water volume contained in the Wadden Sea depends on the course of the waters flowing in and out across its several tidal inlets. However hard the wind pushes water in across one tidal inlet, some water may still escape through another, leading to no noteworthy accumulation inside the basin. In the extreme, there is no surge unless extra water stays in, and for long enough. Hence, there can be much barking in the wind, little biting in the water.

Figure 2 Hydrodynamic modelling in the Wadden Sea – the fringing barrier islands and the complex flow directions behind may be an indication of the type of situation that occurred to the lee of the Whitsunday Islands during Debbie

In a basin delimited by barrier islands [or rock islands and reefs, in the case of the Whitsunday coastline], the surges are significantly modulated by the physical geography. Only those storms that cause a substantial piling-up of water behind the islands can cause severe surges, and only once they have managed to bring in the excess water in the first place.

The arrows in the picture above, based on computer simulations, indicate qualitatively where water is going in and out at a given moment of the storm: clearly, it is not the same everywhere, nor will it stay unchanged as the storm unfolds.

In sum, the Wadden Sea evidence shows that high wind speeds alone are neither necessary nor sufficient to cause, or to expect, record-breaking surges. Since the container drives both water storage and motion, the Wadden Sea itself effectively determines which storms result in a surge with a certain level of flood hazard, with possibly counter-intuitive outcomes. The extent to which this cautionary tale applies to other situations is a matter for orderly scientific discourse. Certainly, the severity of storm surges is very much a situation-specific matter, and it cannot be reduced to a single number defining the storm alone, such as the Beaufort or Saffir-Simpson scale, except in the simplest configurations.

3. Thinking onwards and upwards

There are many ways in which the Wadden Sea insights might be helpful beyond the specifics. Societal concerns for coastal areas are justified owing to the growing concentration of the global population there, to the weather anomalies and outliers expected to become more frequent with climate change, and to the land subsidence aggravating flood-proneness.

The investigations of the Wadden Sea have at least made clear that oversimplifying the superstorm-supersurge expectation could misjudge the exposure and vulnerability of certain coastal areas.

As is often said, and for good measure attributed to Albert Einstein, every problem statement should be as simple as possible, but not simpler. In the case of storm surge prediction, it may not be that simple.

Risk Frontiers staff and associates have significant experience in coastal process and hydrodynamic modelling, in particular, understanding the dynamics and impacts of extreme waves and water levels in Australia. In association with a consortium of eight other research institutes and government agencies, we are coordinating the analysis of an unprecedented number of coastal impact observations post-Debbie, to be published soon.


Have We Increased our Vulnerability to Big Floods?

By Chas Keys.

In New Orleans: 11 Years after Katrina (Briefing Note 317: May 2016), John McAneney and Foster Langbein cite an observation from sociologist Shirley Laska, Professor Emerita at the University of New Orleans. Laska argues that decisions and actions taken over three centuries had reduced the vulnerability of the city of New Orleans to small and moderate floods but increased its vulnerability to very large ones. The reduction of sediment loads in the Mississippi River had led to the destruction of the marshlands and barrier islands that once protected the city from storm surges. The import of this was that ‘routine’ floods were kept out of built-up areas by the levees but the reduction of the coastal landmass increased the likelihood of the embankments being overwhelmed when very big floods struck. The disaster that was Hurricane Katrina, which led to the breaching of the levees and well over a thousand deaths, supports Laska’s point.

Laska piqued a thought that I have long harboured about flood management in NSW. We have, over the past 60 years, invested heavily in mitigating the effects of the flood threat. Much has been achieved, and dozens of communities are now better off than they once were in terms of exposure to flooding. Structural protection, by way of levees, flood bypasses, the rock-armouring of stream banks and the construction of mitigation dams and retarding basins, has been much improved, as have warning services, rescue and other response capabilities, land use management, flood modelling, insurance coverage and, to a degree, community education about flooding and the steps people can take to manage it in their own interests. Without doubt, communities are better able to live with the flood threat created by their locations and developmental histories.

But what, precisely, is the nature of the improvement? Laska’s thinking provides a clue. What NSW has done since the late 1950s has contributed greatly to mitigating the effects of modest-sized floods, which, for many urban communities, have ceased to exist: they have been contained to nearby rural areas. But bigger, less frequent floods are the most consequential in terms of loss of life and damage to private and public assets. No levee can be guaranteed to keep these out, partly because there is never complete certainty about levee integrity and partly because very few levees are built to exclude floods larger than the 1-in-100 year event. At the same time, we have not accompanied our engineering efforts with measures to ensure that community members understand the level of protection provided and what incomplete protection inevitably means.

In effect we have allowed, indeed encouraged, people to believe that the levees have overcome the flood problem and made it benign. This is not true.

Take the case of Maitland. On the Hunter River, its Central Business District, and more than 3000 residents in central and South Maitland, Horseshoe Bend, Lorn and part of East Maitland, are protected by levees. With the exception of Lorn, which has experienced inundation only once in more than a century, these areas have long flood histories with many killed and much property damage over the decades. The period 1949-55 saw parts of these areas flooded several times, catastrophically so in 1955, and Maitland became part of the reason that drew the state government and then the commonwealth into the field of flood mitigation. Hitherto, flood mitigation (along with warning and response) had been the responsibility of private and local council efforts and local funding. Often, it was managed poorly.

Maitland’s modern flood mitigation scheme was completed in about 1970 and has done a fine job of protecting the community from floods that have flowed past, rather than into the built-up areas. Some areas would have experienced inundation several times had the primitive levees of previous times not been superseded by well-engineered ones after the floods of the 1950s.

None of these recent floods, however, has come close in peak height or volume to the February 1955 flood. That event, if repeated today, would overtop some of the levees and inundate much of urban Maitland. The so-called ‘ring levee’, which partly surrounds the town on its southern edge, is designed to be overtopped in a 1-in-50 year flood, half a metre lower than the 1955 flood, which is thought likely to be equalled or exceeded only in a 1-in-200 year flood (AEP = 0.5%).

The problem is that there is a strong feeling in the community that the levees have rendered Maitland flood-free. (Andrew Gissing and his team at Risk Frontiers have seen the same sentiments expressed in Lismore.) The fact that the ring levee is designed to admit floodwaters in floods much smaller than the 1955 event is unknown to many residents, probably most. One indication of this came in 2007, when a flood thought for a time by the Bureau of Meteorology likely to be the highest since 1955 produced a rather desultory property protection and evacuation response from many members of the community.

During the 1950s, with flood after flood assailing them, Maitlanders became expert at neighbourhood-level self-help endeavours like lifting furniture in situ, trucking it to the nearby hill suburbs of East Maitland and Telarah and evacuating people to safety. Some had to evacuate eight times between 1949 and 1955. Of necessity a strong flood culture existed in those days as it had in earlier times. When floods were approaching, groups of men would move from house to house, helping residents lift furniture and other home contents or carrying them out to drays and trucks for transporting to the nearby hill suburbs. Families followed, staying with friends and relatives while waiting for the floodwaters to recede.

Since the 1950s only two floods have produced forecasts that would have justified raising or removing belongings and leaving for high ground. These were the floods of 1971 (peaking nearly a metre lower than in 1955 and coming close to overtopping the ring levee) and 2007 (when the peak, initially thought likely to slightly exceed that of 1971, turned out to have been substantially over-predicted). On the evidence of 2007, a big flood now would see a substantial under-response by community members, with material damage and perhaps deaths higher than would have been the case had the behavioural modes of the 1950s been in place.

And there is further reason for pessimism about Maitland: the local council, in its concern about the commercial viability of the Central Business District, has sought to reverse some of the land use restrictions that have been in place since the 1950s. The ‘old city’ has lost population steadily over the decades through out-migration and the expansion of commercial land uses into residential areas, and the CBD’s market has shrunk considerably. To bolster the viability of the CBD the council now seeks to restore the residential population of nearby areas to the level of 1954, when well over 5000 people lived in them compared with fewer than 1800 today.

Faced with severe state-instituted restrictions on building in these areas, the council proposed in 2015 that the restrictions on residential floor height construction in levee-protected areas be abandoned, provided that new dwellings were built with at least 50% of their habitable floor space above the flood standard (the modelled 1-in-100 year flood level plus half a metre of freeboard). The reasoning was that residents, on hearing a flood forecast and being advised to raise items of value, would have the opportunity to move items from the lower floors of their dwellings to the higher floors.

An appeal to the state Minister for Planning for a relaxation of the building restrictions was, however, rejected. The Minister’s decision implied a recognition that the proposed change might easily have led to increased flood damage. Now the Office of Environment and Heritage has argued that new residential development, even development which adheres to the existing planning restrictions, should not be undertaken until the road infrastructure that will support evacuation from the ‘old city’ is upgraded. The council’s ambitions for population growth in the old city are being thwarted. Had the council’s preferred solution to the woes of the CBD been implemented, the community’s vulnerability to flooding would have been increased simply by virtue of many more people becoming residents of flood-prone areas.

Given the reality of the flood situation, there is a strong case for community flood education to include messages about the inevitability, in very large floods, of inundation of areas behind the levees. One initiative, undertaken in the early 1980s by the Department of Public Works, involved the fixing to power poles of markers indicating the heights reached in the flood of 1955. A few of these in the built-up areas were more than four metres above ground level; many were more than two metres above. The council was never enthusiastic about the markers, and when Public Works vacated the field of flood education, they slowly disappeared: victims of power pole replacement, of people concerned about the value of their properties, and of the rusting of the nails that held them in place. Today fewer than ten of the several dozen original markers remain. An inexpensive, easy-to-maintain means of reminding or informing residents and others of the potential for severe flooding has gradually disappeared.

Since 2001, the State Emergency Service has assumed the role of providing community flood education. Yet despite these worthy efforts, there is much to suggest that the community at large does not comprehend the flood risk implied by the potential for levee failure or overtopping. What we have in today’s Maitland is levees designed to let water into built-up areas in big floods; a community that is inexperienced in flood management; many residents who are oblivious to the threat that big floods pose; and a council that seeks to increase the population in areas that will be severely affected by big floods and which shows little interest in flood education. This is a potentially lethal combination.

The levees have done an excellent job in protecting the community but carry the downside of an altered perception of the flood risk. The policy message is that in building levees we should also build an understanding of their limitations and stress that they can only mitigate, not eliminate, the flood threat. Co-ordinated, properly resourced and appropriately evaluated programmes seeking to do this do not exist in Australia.

The education needs to extend to elected councillors to help them understand that their decisions can contribute greatly to the oft-demonstrated ‘levee paradox’ in which the provision of structural protection too easily leads to intensified development in the protected areas. Councils are accustomed to dealing with the tension between developmental and environmental considerations, but less so in managing conflict between community safety and developmental objectives.

In New Orleans it was largely the progressive erosion of natural coastal defences that increased the city’s vulnerability to big floods, and many died in Hurricane Katrina as a consequence. In Maitland we risk the same effect being wrought. By pursuing a land use management policy that will put more people in harm’s way when big floods occur, and at the same time by not making a fully-fledged effort to ensure that people comprehend the nature of the threat posed by such floods, we have increased the community’s flood vulnerability. A big flood, even one not as big as the flood of 1955, will demonstrate this some day. Maitland’s story also has the potential to be reproduced in many other leveed areas in Australia.

Happy Graduates – Stuart and Tetsuya

From left: Christina Magill, Stuart Mead, Tetsuya Okada, Kat Haynes

Congratulations to Stuart Mead and Tetsuya Okada pictured here with their supervisors Christina Magill and Kat Haynes.

Tetsuya Okada’s PhD investigated recent disaster recovery and risk reduction processes in Australia and Japan. Read more.

Stuart Mead’s PhD developed and integrated computational models of lahar hazard in order to quantify the risk and potential losses caused by lahars. Read more.

Where, Why And How Are Australians Dying In Floods?

This article by Freya Jones, published in Asia Pacific Fire Magazine, October 3, 2017, refers to research undertaken by the CRC research team led by Katharine Haynes, Risk Frontiers.

Floods are a major cause of natural hazard deaths around the globe. Here in Australia, floods rank second only to heatwaves in the total number of natural hazard fatalities since 1900. Recent events, such as the June 2016 floods in New South Wales and Tasmania and the aftermath of Severe Tropical Cyclone Debbie in northern NSW in April this year, highlight the significant dangers of floodwaters; as the research suggests, many flood deaths are avoidable.

To gain a greater understanding of human behaviour and why people choose to enter floodwaters, the CRC research project Analysis of human fatalities and building losses from natural disasters has measured the impacts of floods. The research looks at the toll on human life, injuries and building damage while analysing trends over time. Read more.

A Machine Learning Model Of Tropical Cyclone Wind Risk

This article by Thomas Loridan, Risk Frontiers, was published in Asia Pacific Fire Magazine, October 3, 2017.

Extreme winds from tropical cyclones (TCs) regularly threaten communities worldwide. In recent decades significant efforts have been put towards improving our understanding of the mechanisms involved. In particular, detailed analysis of satellite imagery and observations from aircraft reconnaissance missions have allowed the formulation of a well-accepted framework whereby asymmetries in the TC wind field structure are attributed to the forward motion of the system: stronger winds occur to the right (left) of a moving TC in the northern (southern) hemisphere, with the magnitude of the left/right asymmetry increasing as the storm moves faster. Read more.
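The asymmetry framework described above can be illustrated with a toy parametric sketch; this is not the machine learning model of the article, and the function name, the alpha parameter and its value of 0.5 are all assumptions chosen purely for illustration. The idea is that a fraction of the storm’s translation speed reinforces the symmetric wind field, peaking 90 degrees to the right of the direction of motion in the northern hemisphere and to the left in the southern hemisphere:

```python
import math

def asymmetric_wind(theta_deg, v_sym, v_translation, hemisphere="N", alpha=0.5):
    """Illustrative wind-speed asymmetry sketch (hypothetical, not the
    published model).

    theta_deg: bearing of the point relative to storm motion, in degrees
               clockwise from the direction of travel (0 = directly ahead).
    v_sym: symmetric (storm-relative) wind speed, m/s.
    v_translation: forward speed of the storm, m/s.
    alpha: assumed fraction of the translation speed added to the field.
    """
    # Peak reinforcement 90 degrees right of motion in the northern
    # hemisphere, 90 degrees left in the southern hemisphere.
    peak = 90.0 if hemisphere == "N" else 270.0
    return v_sym + alpha * v_translation * math.cos(math.radians(theta_deg - peak))

# Right of a northern-hemisphere storm: 50 + 0.5 * 5 = 52.5 m/s
print(asymmetric_wind(90, 50, 5, "N"))
# Left of the same storm: 50 - 0.5 * 5 = 47.5 m/s
print(asymmetric_wind(270, 50, 5, "N"))
```

Consistent with the framework, doubling `v_translation` doubles the right/left difference, reproducing the observation that faster-moving storms exhibit stronger asymmetry.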

Disaster Risk Management: Australian Challenges

This article by Andrew Gissing, Risk Frontiers, was published in Asia Pacific Fire Magazine, October 3, 2017.

Australia is exposed to a variety of natural and technological disaster risks, which vary in their significance across the nation. Communities are faced with the increasing costs of disaster losses due to higher wealth and the increasing development of hazardous areas, whilst Government budgets are under pressure. Climatic, demographic, economic, political and technological changes are acting to shape future disaster risks.

Internationally, the Sendai Framework for Disaster Risk Reduction exists with the goal to: “prevent new and reduce existing disaster risk through the implementation of integrated and inclusive economic, structural, legal, social, health, cultural, educational, environmental, technological, political and institutional measures that prevent and reduce hazard exposure and vulnerability to disaster, increase preparedness for response and recovery, and thus strengthen resilience”.  Read more.