When Rising Seas Transform Risk Into Certainty


This well-researched article by Brooke Jarvis appeared in The New York Times Magazine on April 23, 2017, under the headline "Under Water". It illustrates many of the issues discussed by John McAneney and co-authors in McAneney et al. (2016), Government-sponsored insurance schemes: a view from down under, and by other scholars. In particular it points out the role of sea level rise and sinking coastal landscapes, in some cases both together, in changing the vulnerability of coastal communities to flooding; the ‘immoral’ hazard of subsidizing people to live in harm’s way; the folly of slab-on-ground construction in flood plains; the reality signal sent by risk-informed insurance premiums; and how homes with repeated claims are threatening the viability of the National Flood Insurance Program in the US.


 

Norfolk flooding
Larchmont-Edgewater, a Norfolk, Va., neighborhood frequently plagued by floods. The house in the center has been raised above flood levels; the one at left has not. Credit Benjamin Lowy for The New York Times

In 1909, a group of Virginia developers placed an ad in The Norfolk Ledger-Dispatch announcing the creation of a subdivision that — because it was built on a pair of peninsulas where the Lafayette and Elizabeth Rivers poured into Chesapeake Bay — came to be known as Larchmont-Edgewater. The developers set up private jitney service to downtown and advertised the area as “Norfolk’s only high-class suburb.” People flocked to live by the water’s edge.

Today the neighborhood is known for the venerable crepe myrtles that line its streets, for its fine houses and schools and water views and for the frequency with which it is not just edged by, but inundated with, water. Melting ice and warming water are raising sea levels everywhere. But because the land in the Hampton Roads area of Virginia (which includes Norfolk) is also sinking, relative sea levels there are rising faster than anywhere on the Atlantic coast. Water levels are already as much as 18 inches higher than they were when the developers created Larchmont-Edgewater a century ago, and they are still rising. As a result, it’s much easier for winds, storms and tides to push flood water into streets, yards and homes that once stood high and dry.

When Elisa Staton found a small house a block from the water in Larchmont-Edgewater in 2005, she was thinking of the neighborhood’s grand trees and Tudor-style houses, of the elementary school she hoped to send her kids to, once she had them. She wasn’t thinking much about flooding, though she knew the house was in a hundred-year flood zone, which meant that to get a federally backed mortgage, she was required to pay for flood insurance through the National Flood Insurance Program (N.F.I.P.), a government-subsidized system overseen by the Federal Emergency Management Agency. The insurance was reasonable, and there was no record of the house ever being flooded before. She bought it for $320,000.

A “hundred-year flood” sounds like a factor of time, as if the land were expected to flood only once every 100 years, but what it’s really meant to express is risk — the land has a 1 percent chance of flooding each year. As waters rise, though, flooding in low-lying places without sea walls, like Larchmont-Edgewater, will become more and more common until the presence of water is less about chance and more about certainty. And few insurers are willing to bet against a certainty.
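To make that compounding concrete, the short sketch below (our illustration, not the article’s; it assumes a constant 1 percent annual exceedance probability and independence between years) shows how quickly a “hundred-year” risk accumulates over horizons like the life of a mortgage.

```python
# Illustrative sketch (not from the article): how a constant 1% annual
# exceedance probability compounds over longer horizons, assuming
# independence between years.

def chance_of_at_least_one_flood(annual_prob: float, years: int) -> float:
    """P(at least one flood in `years` years) = 1 - (1 - p) ** years."""
    return 1.0 - (1.0 - annual_prob) ** years

for years in (1, 10, 30, 100):
    print(f"{years:3d} years: {chance_of_at_least_one_flood(0.01, years):.1%}")
# 30 years (a typical mortgage term) gives roughly a 26% chance; rising
# relative sea levels push the effective annual probability, and these
# odds, higher still.
```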

Ten years later, Staton’s rec room had been flooded twice, and her insurance premiums, like those of many coastal property owners, had skyrocketed. She was seeing the effects not only of those local floods but also of rising waters elsewhere. As storm damage becomes more costly, it has left the N.F.I.P. tens of billions of dollars in debt and federal officials scrambling to bridge the divide between the rapidly growing expense of insuring these properties and the comparatively tiny, taxpayer-subsidized premiums that support it.

In 2012 and 2014, Congress responded to the N.F.I.P.’s troubles with bills known, thanks to the accidental aptness of their sponsors’ names, as Biggert-Waters and Grimm-Waters. The first law cut subsidies and phased out grandfathered rates so that premiums would start to reflect the true risk that properties like Staton’s face — reaching what the N.F.I.P. calls “actuarial soundness.” The second tried to slow the rate of those increases when it became clear how hard they would hit property owners.

Staton married and left Norfolk, renting out her house as she followed her husband’s job in the military. But eventually she was paying nearly $6,000 in flood premiums on top of her mortgage every year, nearly always more than she could make in rent. “I decided to cut my losses and get out,” she said. “The flood insurance kept going up, and I was drowning in it.” A real estate agent she consulted told her that she’d be lucky to sell the house for $180,000, barely more than half of what she paid for it and significantly less than what she still owed on the mortgage. Everyone looking at places near the river, the agent said, asked about flood insurance first. It wasn’t the risk of high waters that spooked buyers; it was the certainty of high premiums.

Staton lay awake at night wondering what to do. “I hate that house — that house has been my nightmare for 10 years,” she said last month, on a day when the dogwood and quince were bursting into flower in the front yard and the sun was sparkling off the calm, tidal river biding its time a block away. “I never got to get my head back above water.”

Insurance serves as a bulwark, both financial and mental, against the fact that we live in a fundamentally uncertain and dangerous world. “The revolutionary idea that defines the boundary between modern times and the past,” the financial historian Peter L. Bernstein wrote in his 1996 book, “Against the Gods,” “is the mastery of risk: the notion that the future is more than a whim of the gods and that men and women are not passive before nature.” Calamity can come for us all, but by bundling enough separate peril together we manage to form a general stability, a collective hedge against helplessness. As climate insecurity mounts, though, that math will get harder.

Frank Nutter, president of the Reinsurance Association of America, put it in more direct terms: “Constant risk — that’s not what insurance is about.”

Flooding is the most common, and most expensive, natural disaster in the United States. Private insurers have long declined to cover it, leaving the government on the hook for disaster assistance after floods. (Hence the famous lawsuits after Hurricane Katrina, when people who came home to empty slabs were asked to prove that their losses were a result of wind and not waves.) Congress created the N.F.I.P. in the late 1960s in response to a series of expensive floods caused by hurricanes and overflowing rivers. It offers insurance coverage, some of it subsidized, to communities that meet floodplain-management requirements; requires people who want loans to buy houses in dangerous places to buy it; and also provides grants for mitigation projects meant to reduce flooding damage, like elevating houses or buying out the owners of flood-prone homes. Private insurers including Farmers, Allstate and 68 other companies also sell and administer the policy on the government’s behalf — and take a sizable cut of the premium. If floods do come, though, it’s still the government that’s on the hook.

The N.F.I.P. was meant to encourage safer building practices. Critics argue that instead it created a perverse incentive — a moral hazard — to build, and to stay, in flood-prone areas by bailing people out repeatedly and by spreading, and in that way hiding, the true costs of risk. (In 1998, “repetitive-loss properties,” buildings that flood over and over, accounted for 2 percent of N.F.I.P.’s insured properties but 40 percent of its losses; since then, such losses have only increased.) As Larry Filer, an economist at the Center for Economic Analysis and Policy at Norfolk’s Old Dominion University, explains, “Somebody on a mountain in Colorado is helping the person in Virginia Beach live on the waterfront.”

Mike Vernon
Mike Vernon, an insurance agent in the Hampton Roads area of coastal Virginia who brands himself as “the Flood Insurance Guy.” Credit Benjamin Lowy for The New York Times

And then came Hurricanes Katrina, Wilma and Rita, which in 2005 left the N.F.I.P. with claims six times higher than it had seen in any previous year. To cover them, it borrowed $17.3 billion from the Treasury. Hurricane Sandy in 2012 meant another $6.25 billion in debt, along with allegations that insurance companies distributing FEMA funds were shorting policyholders; 2016, when there were floods in Louisiana, Texas, Virginia and elsewhere, managed to be the third-most-expensive year in the N.F.I.P.’s history even with no single standout catastrophe, deepening the hole further. Servicing the debt is expensive, but FEMA sees no way to repay it, Roy Wright, the N.F.I.P. administrator, told Congress last month.

More losses loom. A single major storm-and-flooding event could cause $10 billion in damage in Hampton Roads alone, according to one planning report. AIR Worldwide, which models the risks of catastrophic events for insurance companies and governments, found that $1.1 trillion in property assets along the Eastern Seaboard lie within the path of a hundred-year storm surge. “That’s a very staggering number,” says AIR’s chief research officer, Jayanta Guin — and it represents only the risk on that coast, and only under current sea levels. By the 2030s, according to a 2008 analysis by Risk Management Solutions (R.M.S.) and Lloyd’s of London, annual losses from storm surges in coastal areas around the world could double.

In 2015, the N.F.I.P. asked R.M.S. and AIR Worldwide to update its modeling by running thousands of computer simulations to show what possible storms might mean for the properties it insures, helping it to quantify its financial exposure across the country. In 2016 and 2017, the N.F.I.P. — in a first-of-its-kind action for a federal program — transferred some of its risk to large, private companies known as reinsurers, which pool risk on gigantic scales: insurance for insurance companies.

Although Katrina and Sandy “felt like once-in-a-lifetime events,” Wright wrote in a recent blog post explaining the decision, “there is actually a 50 percent chance within a 10-year period the N.F.I.P. will once again experience Hurricane Sandy-size losses.” Removing subsidies is one partial solution, he told me — “There is no greater risk-communication tool than a pricing system” — but more hard decisions are coming. The N.F.I.P. is up for a reauthorization vote in September, its first since Biggert-Waters was passed; Wright believes the time has come to start limiting coverage for properties that are flooded over and over, a significant shift from the past. Multiple losses, he told me, “should force us to shift our position where we make an offer of mitigation to a homeowner, and if they do not choose to take it, we don’t renew their policy.”
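As a rough check on that figure (our back-of-the-envelope arithmetic, not FEMA’s), a 50 percent chance over ten years corresponds to a constant annual probability of roughly 7 percent:

```python
# Back-of-the-envelope sketch: the constant annual probability p implied by
# a 50% chance of at least one Sandy-size loss within 10 years satisfies
# 1 - (1 - p) ** 10 = 0.5, assuming independence between years.
implied_annual_prob = 1.0 - 0.5 ** (1.0 / 10.0)
print(f"implied annual probability: {implied_annual_prob:.1%}")  # about 6.7%
```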

After Biggert-Waters, some private insurers began showing an interest in covering flood insurance for the first time. A major factor is the end of subsidized coverage: As premiums increase, private insurers have a greater incentive to compete. Another, Guin says, is that risk analysis can be much more accurate than even a few years ago, thanks to powerful computers able to run more simulations that include more variables. Making money on insurance, after all, is a game of timing, and most policies are rewritten each year.

Evan Hecht, chief executive of the Flood Insurance Agency, based in Florida, read the details of Biggert-Waters and decided to expand his business. He had sold N.F.I.P. policies for years, but in 2013 he and his wife, Tiara, went out on their own, seeking private underwriting from Lloyd’s of London and an A.I.G. subsidiary. The vast majority of their policies — now totaling 19,000 in 37 states, including some in the Norfolk area, according to Hecht — are on properties that require flood coverage because of their locations, and on which FEMA is raising rates. On average, he estimates, premiums from the Flood Insurance Agency cost 30 to 35 percent less than those bought through FEMA. And the agency plans to offer further discounts for properties with waterproof alternatives to easily damaged materials like wood floors and Sheetrock.

At a congressional hearing on flood insurance reform in March, Hecht asked lawmakers to approve legislation that makes it simpler for private flood insurance to satisfy mortgage requirements. FEMA supports this move as a way of spreading out risk — the bottom line, Wright says, is that “we need more people covered for their flood peril” — but also cautions that it could make things worse for taxpayers if, with the help of better data, private insurers are willing to cover only lower-risk properties, or purposefully price themselves out of high-risk ones, leaving FEMA with an even more dangerous portfolio than it started with.

Hecht believes his company’s interest in policies FEMA considers underpriced for their risk is evidence that such an outcome won’t occur. But private insurance, he noted in the hearings, is “of course” not interested in covering severe-repetitive-loss properties or buildings whose exposure is higher than what can be recouped in premiums. What will happen, I asked him, to houses that flood too often? “Insurance policies aren’t written for 100 years,” he replied, “so we’ll react as it happens.” He described a driver who has had so many speeding tickets and accidents that his auto insurance skyrockets: “Those houses will not exist, just like that driver will no longer have a car. There’s no magic answer.”

Elisa Staton still owns the house in Larchmont-Edgewater. After delivering the painful estimate of the house’s new value, Staton’s real estate agent suggested she call a man named Mike Vernon, an insurance agent in Hampton Roads who brands himself as “the Flood Insurance Guy.” His specialty is finding clever ways to reduce flood premiums. When Vernon visited Staton’s house, he saw a solution right away: The rec room, once a garage, sat lower than the first floor, lowering the minimum elevation level of livable space inside the house, which FEMA uses to calculate premiums. By converting it back to “low-value storage space,” lifting the electrical system to a higher elevation and adding flood vents, Staton could get her premiums close to $800 a year. She paid for the work, Vernon updated her policy and she put the house on the market for $100,000 more than the agent first advised — but it has yet to sell.

“We’re often actually making the building worse to bring down premiums,” Vernon told me: filling in basements, or preparing a house to let water flow through it instead of keeping it out (yes, the house may be damaged by moisture, but at least it won’t be pushed off its foundation). “Or we’re eliminating something good, like a sunroom on a slab.”

Vernon’s business is flourishing. A former consultant, he got the idea for his own venture after advising a flood-vent inventor around the time federal flood premiums began to increase: “Biggert-Waters passed, and I’m seeing dollar signs.” He’s hardly alone in looking for the financial silver linings of rising seas — local universities and the city itself are pointing to their growing expertise in flood mitigation and adaptation as a source of future revenue. Vernon gets most of his business from referrals from real estate agents, whose clients, unable to sell their houses, often come to him in tears. “People are getting killed,” he said. “To an appraiser it’s still worth $300,000, but to the real world it ain’t worth nothing, because it’s not going to sell.”

On a recent Wednesday morning, Vernon, seeking new business, described his work in the packed beige meeting room of a Hampton Roads real estate agency. He showed the agents a slide that listed the threats facing the area: changing weather patterns; bigger, stronger storms; rising sea levels; long-term erosion; sinking land mass; and poor building decisions. He got a laugh with a line about the absurdity of building houses with basements in Norfolk. “Was it a bad building decision back in 1900?” he continued. “Probably not, but it has turned into one.”

floor elevation
Two houses, one raised with a garage, the other with a higher foundation, sandwich a student rental property with a first-floor elevation below the flooding safe zone in Norfolk, Va. Credit Benjamin Lowy for The New York Times

Vernon described which problems are fairly easy to remedy and which are not. Houses built directly on slabs, which are especially common in low-income neighborhoods, have the fewest alternatives: Basically, raise it up or raze it down. (“If you ever want to make an enemy, or get back at one,” he’ll tell agents, “just sell them a house on a slab in a required flood zone.”) With flood insurance, Vernon said, the agents should be prepared for the three Fs: frustration, fear and foreclosure. “I’ve seen people, they just walk the keys down to the bank and say, ‘You can have it.’ ”

The biggest reaction came when Vernon explained that, because of the effort to make the N.F.I.P. more financially sound, premiums are set to go up by 18 to 25 percent every year, and cited a study that found that each $500 annual increase in flood insurance lowers a home’s value by $10,000. The room filled with gasps and whistles. “What was that ratio again?” an agent named Carmon Pizzanello called out from the back. “In two years,” Vernon replied, “you’ve lost tens of thousands of dollars on your house.”
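One way to make sense of that ratio (our illustration, not the methodology of the study Vernon cited) is to treat the extra premium as a permanent annual carrying cost that buyers capitalise into the price; a discount rate of about 5 percent reproduces the quoted 20-to-1 relationship, and the same compounding shows how quickly an 18 to 25 percent annual rise escalates a premium.

```python
# Illustrative sketch (our assumptions, not the cited study's method).

def capitalised_value_drop(extra_annual_premium: float, discount_rate: float) -> float:
    """Present value of a perpetual extra annual cost: PV = cost / rate."""
    return extra_annual_premium / discount_rate

def premium_after(years: int, premium: float, annual_increase: float) -> float:
    """A flood premium compounded at a fixed annual rate of increase."""
    return premium * (1.0 + annual_increase) ** years

# A $500/yr premium increase capitalised at a 5% discount rate is $10,000,
# matching the quoted ratio.
print(capitalised_value_drop(500.0, 0.05))

# A hypothetical $3,000 premium growing at 18% and at 25% a year for five years:
print(round(premium_after(5, 3000.0, 0.18)), round(premium_after(5, 3000.0, 0.25)))
```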

Pizzanello volunteered that she’d sent one of her clients, living in a below-flood-elevation house on a slab and paying $3,200 annually for N.F.I.P. coverage, to talk to Vernon. Short of options, they looked into private insurance. The lowest quote that came back was $22,000 a year. It was one of those raise-or-raze situations, Vernon told the gathering, saying, “Elevation certificates are literally about tenths of feet.”

Spend a few days talking about floods and real estate in Norfolk, and you’ll quickly learn the importance of even tiny inclines. Locals know where, on what appears to the uninitiated to be a flat street, to park their cars to keep them from flooding past the axles when the wind pushes the tide up. Landscapers build what are essentially decorative earthen dikes around houses. When I asked one man how close storm and tidal surges come to his front porch, he pointed at the bricks under my feet, which I had taken for the wall of a flower bed. “You’re actually standing on a bulkhead,” he said.

In the coming decades, these fine distinctions will mean little, as the risk of flooding becomes the certainty of it. The operative measurement for rising waters in Norfolk is not inches but feet — as many as six of them by the end of the century, according to the Army Corps of Engineers, though estimates vary. City planners are forthright that they’re preparing for a future in which parts of the city do not survive. “We absolutely cannot protect 200 miles of coastline,” George Homewood, Norfolk’s planning director, says. “We have to pick those areas we should armor, and the places where we’re going to let the water be.”

Norfolk now mandates that new construction be built three feet above current base flood elevation (as if the houses were boats, this distance from the waterline is called freeboard), and 18 inches above what Homewood says is “euphemistically known as the 500-year floodplain.” But Norfolk is an old, established city, where changing new construction can only get you so far. In 2008, the city hired a Dutch engineering firm, experienced with life below sea level, to help develop a plan for adaptation. The firm suggested $1 billion in changes, more than half of which would go to simply updating existing infrastructure.

Like insurers, residents are playing a game of risk and timing. “Adaptation is a range,” says Fred Brusso, a former city flood manager. “Do you need to just move your car? Do you have to put your washer and dryer on cinder blocks? Or do you need to get the heck out of town?” Sean Becketti, the chief economist for Freddie Mac, cautioned in a report last year that economists aren’t sure if coastal property values will decline gradually, as the life expectancy of homes shrinks, or precipitously, “the first time a lender refuses to make a mortgage on a nearby house or an insurer refuses to issue a homeowner’s policy.”

Skip Stiles, the executive director of the local nonprofit Wetlands Watch, took me on a tour of frequently flooded areas of Norfolk — when waters are down, Stiles uses rusty storm drains and marsh plants growing in yards and medians to show where they’ve been — and pointed out buildings that had been elevated. Often their awkwardness made them obvious: ordinary, colorful houses perched uncomfortably atop walls of bare concrete blocks. While FEMA does pay to elevate risky houses, it struggles to keep up with demand: Wetlands Watch compared the number of people on the FEMA waiting list in Norfolk with the number of houses raised in a year, and concluded that it would take 188 years to complete them all. By then, of course, waters would be far higher.

This is the hardest reality to discuss, Stiles said, and a reason flood insurance is serving as a kind of advance scout into a more difficult future. “When you go out to the end of the century, some of these neighborhoods don’t exist, so it’s hard to get community engagement,” he said. “Nobody wants to talk beyond where the dragons are on the map, into uncharted territory.”

Flood Risk Perceptions of Lismore Businesses

By Andrew Gissing and Jonathan Van Leeuwen

Lismore has a long history of flooding, and the community is known for its ‘flood culture’. The areas of North, South and Central Lismore were flooded on Friday 31 March 2017 in the worst flood since 1974. It was the first flood to have overtopped the Central Lismore levee, which protects the CBD and its roughly 400 businesses. Built in 2005, the levee provides protection against floods up to an average recurrence interval of about ten years.

Many people ignored evacuation orders and later needed help when their properties were directly threatened by floodwater. The SES has reported some 400 rescues, many of them from properties where people chose not to evacuate.

Previous Risk Frontiers research has explored evacuation rates during natural hazard events, concluding that they are variable and that complete compliance is very difficult to achieve. As a result, authorities need to plan for large-scale rescue operations. Flood risk perceptions and previous flood experience, as well as the rapid rise of the Wilsons River, which caught some off-guard, have likely played a part in how individuals responded to warnings.

Many previous studies of floods in Lismore have concluded that the community was well prepared for and adjusted to the flood hazard. Smith (1981), for example, found that previous flood experience and flood warnings had resulted in relatively low damages during the 1974 flood.

After the business district last flooded in 2001, Risk Frontiers identified that businesses had avoided major losses as a consequence of preparedness measures they had undertaken and activated once flood warnings were received. These included mitigation measures such as lifting fittings and equipment, use of mezzanine floors and implementation of Flood Action Plans (Gissing and Leigh, 2001).

Recent Research on Lismore Business Flood Risk Perceptions

Risk Frontiers undertook a telephone survey of 50 business operators in the Lismore CBD in November 2016 and February 2017 to understand the impact the Central Lismore levee has had on the risk perception and preparedness of businesses in the area.

Over 70% of businesses surveyed had operated in the Lismore CBD for more than 10 years, with some 56% of respondents reporting that they had experience of flooding. Wilsons River floods recalled were those of 1954, 1974, 1984, 1987, 1989 and 2001.

Almost all respondents (95%) were aware of the Lismore CBD levee. Perceptions of how often they could expect to be flooded varied from never to once every year: 43% believed they would be flooded on average once every ten years and 22% once every five years, while 32% overestimated the protection afforded by the levee, believing they would be flooded less often than once every ten years on average.

Some businesses acknowledged that they had been lucky not to have experienced flooding and recognised the value of SES and council efforts to educate communities:

We’re massively lucky that we haven’t had any big floods for so long, new business owners don’t really know what to expect.

 I went to an SES meeting for business owners about a year ago about floods and learnt a lot, I think it should be mandatory to go to things like that and listen to what people who have had to deal with full on floods have to say.

Some held unrealistic beliefs about the protection offered by the levee:

A flood would now have to be of biblical proportions with all the work done on the levee.

The levee protects us from floods so we haven’t had to deal with any since it was built, I wouldn’t expect to be flooded in the near future.

Others felt that:

The levee gives a sense of false security, people aren’t really packing up any more when we get flood warnings. In the future when a big flood comes, people might lose a lot.

The majority of respondents (75%) believed that the levee provides more time for people to evacuate the CBD in a flood event.

With respect to flood preparedness, the majority of respondents believed that, despite the levee, it was still necessary to be prepared for floods, with only 9% believing it unnecessary. However, 34% of respondents believed it is less important to be prepared now than was the case before the levee was constructed, while 31% cited global warming as a reason it was even more important to be prepared for worse floods.

80% of respondents had a Flood Action Plan. Uptake varied slightly, with businesses that had experienced flooding about 10% more likely to have developed a plan than businesses without prior flood experience. Of those businesses with plans, many had had them in place for some time, with respondents stating ‘since moving in’ or ‘forever’. Only 37% of respondents had documented their Flood Action Plan, however, meaning it would be difficult for a new employee to respond effectively to flooding.

The majority of businesses either did not have flood insurance cover (56%) or were unsure if they did (31%). Those that did not have flood insurance believed it was not available to them or that it would be too expensive.

Discussion and conclusions

The construction of the levee in 2005 has likely affected perceptions of flood risk, as evidenced by the number of respondents who believed the levee provided more protection than it was designed for, and the number who believed it was less important to be prepared for floods than before the levee was constructed.

In 2002, Gissing (2003) undertook a similar study. In comparison to the 78% of businesses identified as having Flood Action Plans today, almost all Lismore businesses in 2002 had Flood Action Plans (97%). This comparison may imply a decline in flood preparedness across business operators following construction of the levee, a decline that might have been worse in the absence of the flood education programs offered by the NSW SES and Lismore City Council.

There are already concerns from business owners and residents about the length of time the levee has held floodwater within the CBD area, with some suggesting that consideration must be given to raising the levee to protect against future flooding.

Over the coming weeks Risk Frontiers will visit the area and provide further briefing notes on the outcomes of research.

We wish business operators well in their recovery and acknowledge the incredible efforts of all involved especially emergency service volunteers.

References

GISSING, A. 2003. Flood action plans: making loss reduction more effective in the commercial sector. The Australian Journal of Emergency Management, 18, 46.

GISSING, A. & LEIGH, R. 2001. February 2001 Lismore Flood. Natural Hazards Research Centre Newsletter.

SMITH, D. I. 1981. Actual and potential flood damage: a case study for urban Lismore, NSW, Australia. Applied Geography, 1, 31-39.

Twitter can predict hurricane damage as well as emergency agencies

By John Bohannon, Mar. 11, 2016.


Following the community-initiated Facebook groups that emerged during the 2010/11 Queensland and Victorian floods, Risk Frontiers undertook research into the use of social media as a complementary form of hazard and risk communication. The study, based on an online questionnaire, concluded that social media has value to the emergency services, not only as a tool to disseminate information but also as an important resource for tapping into and reviewing informal communications.

The power of social media and the value it has to emergency services was demonstrated again following Hurricane Sandy, and this Briefing Note discusses the research that showed that social media could be used to rapidly assess damage in the aftermath of an event.


Mapping out the intensity of tweets during and just after a hurricane produced a map of damage on par with the government’s.

In October 2012, meteorologists noticed a massive low-pressure system forming over the waters south of Cuba. In just 5 days, it spun into one of the largest hurricanes on record, cutting a path up the eastern U.S. coast and devastating communities with flooding and 140-kilometer-per-hour winds. Superstorm Sandy posed a massive problem for government clean-up crews. Where should they send their limited emergency supplies and services? A new study suggests a way to get that answer fast: Just listen to Twitter.

Mapping damage is a crucial first step in hurricane response. Inaccurate mapping, as there was with Sandy and even more so with Hurricane Katrina in 2005, can add weeks, and in some cases months, to the time before help arrives for those most in need. To predict where the worst damage has occurred, the U.S. Federal Emergency Management Agency (FEMA) puts together models that look at everything from geography to infrastructure to storm characteristics, and then flies over the affected areas to further refine its map. Surveying people on the ground in natural disaster zones is just too difficult.

A team led by Yury Kryvasheyeu, a computational physicist at Australia’s National Information and Communications Technology Research Centre of Excellence in Melbourne, wondered whether better data might already be waiting online. By 2012, people were relying on social media apps such as Twitter to communicate about real-time events. But can a map of tweets be translated to a map of damage?

Kryvasheyeu’s first task was to get the data. Though Twitter opened up its full archive to researchers back in 2014, many academics have been worried about the legal strings that might be attached to using the California-based company’s data. But the team only needed a subset for their experiment, so they bought it from one of the many third-party companies that collect, process, and resell Twitter data. The database included all tweets in the world between 15 October and 12 November 2012. The team then narrowed the set to those with words like “hurricane,” “Sandy,” “frankenstorm,” and “flooding.”

Many tweets already had map coordinates locating their origin. But others did not. So the researchers also analyzed user accounts and message contents to further pin down the location of tweets. All in all, the team mapped out nearly 10 million tweets from more than 2 million user accounts.

The first discovery was reassuring. The relevant tweets weren’t just scattered randomly on the map: The closer people were to the hurricane, the more they had to say about it. But does such Twitter activity translate into actual damage? It was possible, for example, that local media coverage could amplify fear, even in areas that weren’t hit hard by the storm. So the researchers obtained data on the true extent of the damage from FEMA and the state governments of New Jersey and New York.

It turns out that Twitter was a remarkably good source of information on hurricane damage. The more damage Sandy actually did to a neighborhood, as measured by the per capita cost of the repairs, the higher the intensity of relevant tweeting from those areas just after the storm. In fact, Twitter was slightly better than FEMA’s own models in predicting the location and severity of damage, the team reports today in Science Advances. The main advantage of the technique is that it is a “virtually zero-cost solution,” says co-author Manuel Cebrian, a computer scientist at the Commonwealth Scientific and Industrial Research Organisation in Clayton, Australia.
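The underlying analysis is essentially a correlation between per-capita tweet intensity and per-capita damage across areas. The toy sketch below (invented numbers, not the study’s data or code) illustrates the basic idea with a simple rank correlation.

```python
# Toy illustration (invented numbers, not the study's data or code) of the
# basic idea: rank-correlate per-capita tweet intensity with per-capita
# repair costs across neighbourhoods.

def ranks(values):
    """Return the rank (1 = smallest) of each value; ties are not handled."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0] * len(values)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical per-capita tweet intensity and per-capita damage by area.
tweets_per_capita = [0.8, 2.1, 0.3, 1.5, 3.0, 0.9]
damage_per_capita = [120, 480, 40, 300, 650, 95]
print(f"rank correlation: {spearman(tweets_per_capita, damage_per_capita):.2f}")
```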

Still, Twitter data have many limitations and pitfalls, says Urbano França, a computational public health researcher at Harvard Medical School in Boston. These include everything from “Twitter-bots” that robotically generate tweets to the quirks of who does and does not use social media. But, he says, the researchers in this case “seem to have thought of most, if not all, issues and potential loopholes.” The next step, he says, is to look for data on other social platforms, like Facebook, which has a much higher user base and “could potentially provide more precise results.” Then again, getting those data may prove even more difficult than dealing with Twitter’s data firehose.

Building evidence for risk-based insurance

Professor John McAneney and Andrew Gissing were invited to contribute to the 2016 World Disaster Report by the International Federation of Red Cross and Red Crescent Societies. Their contribution is provided below.


Improving societal resilience in the face of the growing cost of disasters triggered by natural hazards, and doing so in a fair and affordable manner, is an increasing challenge. Many governments are looking to insurance as a partial solution to this problem.

Insurance is a contract between a policy-holder and a company that guarantees compensation for a specified loss in return for the payment of a premium. Conventional insurance works by pooling risks, an approach that works well for car accidents and house fires but not for the spatially correlated losses arising from disasters caused by natural hazards. It is the global reinsurance market that ultimately accepts much of this catastrophe risk (Roche et al., 2010). Relatively new financial instruments such as Catastrophe Bonds and Insurance-Linked Securities are also being employed to transfer some catastrophe risks to the capital markets.

Insurance is part of the essential infrastructure of a developed economy but it would be a mistake to see it as an instrument of social policy. It cannot in itself prevent flooding or earthquakes. On the other hand, insurance can promote socially desirable outcomes by helping policy-holders fund their post-disaster recovery more effectively. The greater the proportion of home-owners and businesses having insurance against naturally-triggered disasters, the more resilient the community will be.

Insurers can also help promote risk awareness by property owners and motivate them and communities, as well as governments, to take mitigation actions to reduce damaging losses (McAneney et al., 2016). The mechanism for doing this is by way of insurance premiums that properly reflect risk. Insurance is not the only means of providing transparency on the cost of risk, but private insurers are the only ones with a financial incentive to acknowledge such costs. Moreover, they are the only entities that can reward policy-holders when risks are reduced (Kunreuther, 2015; McAneney et al., 2016).

It is in the interest of communities to have a viable private sector insurance market and, arguably, governments should only become involved in the case of market failure (Roche et al., 2010). Of those government-authorized catastrophe insurance schemes examined by McAneney et al. (2016), many are actuarially unsound and end up creating a continuing liability for governments, and/or, in not pricing individual risks correctly, they encourage property development in risky locations while failing to provide incentives for retrofitting older properties at high risk. In less-developed insurance markets some government involvement may encourage the uptake of insurance (e.g., Tinh and Hung, 2014).

How do we assemble the evidence to support risk-reflective insurance premiums? New technologies such as catastrophe loss modelling, satellite imagery and improved geospatial tools are proving helpful in allowing insurers to better understand their exposure to natural hazard risks. While these technologies are increasingly available, in some countries the normal outcomes of such data gathering and analysis – insurance premiums – are constrained politically. This is the case in the United States of America where there has been a tendency to keep premiums low across the board and to have policy-holders in low-risk areas cross-subsidizing those at higher risk (Czajkowski, 2012). Such practices do little to constrain poor land-use planning decisions that lie at the heart of many disasters triggered by natural hazards (e.g., Pielke Jr et al., 2008; Crompton and McAneney, 2008). McAneney et al. (2010) show that most of the homes destroyed in the 2009 Black Saturday fires in Australia were located very close to fire-prone bushland with some 25 per cent actually constructed within the bushland. Effectively these homes were part of the fuel load and their destruction was unsurprising.

One way to build a wider evidence base for collective action to support risk-based insurance policies is for governments to share information on risks of disasters related to natural hazards, both with insurers as well as the community. This information might be hazard footprints as well as the likely cost of the damage (The Wharton School, 2016). In Australia, governments have been reluctant to do this. In some developing insurance markets, home-owners or farmers may have a better understanding of the risks than do insurers, who will price this uncertainty into premiums. Unrestricted access to hazard data for all parties would encourage fairer insurance pricing.

Gathering hazard data for building evidence for risk-reflective premiums depends on the type of hazard. For example, the distance of buildings from fire-prone bushland or the local likelihood of flooding are key determinants of vulnerability to these location-specific hazards. In other areas, or within the same areas in some cases, the annual likelihood of exceeding damaging levels of seismic ground-shaking, wind speed or volcanic ash are important metrics, as are distance from the sea and the elevation of a property when it comes to coastal hazards like tsunami and storm surge.

When this risk evidence is established and becomes reflected in national construction standards, improvements in resilience follow. For example, improvements in construction standards introduced in Australia after the destruction of Darwin by Tropical Cyclone Tracy in 1974 have been credited with reducing subsequent losses from tropical cyclones by some 67 per cent (McAneney et al., 2007).

The availability of such data may result in reductions in some insurance premiums, an increase for others, or, in extreme cases, the withdrawal of insurers from areas where the risk is considered to be too high. The latter outcome will send a strong signal to communities and government for investments in mitigation; subsidized insurance is not the answer. Governments should also ensure that humanitarian aid provided after disasters is targeted effectively, in order to avoid creating disincentives for people to purchase insurance.

Lastly, and to return to the issue of poor land-use planning, it is worth remembering that the 1945 thesis of the famous American geographer Gilbert White, that “Floods are an act of God, but flood losses are largely an act of man”, still rings true and applies to a wider range of disasters triggered by natural hazards than just floods.

A full copy of the report can be found at http://www.ifrc.org/Global/Documents/Secretariat/201610/WDR%202016-FINAL_web.pdf.

 

Crowds are wise enough to know when other people will get it wrong

Unexpected yet popular answers often turn out to be correct.

This article by Cathleen O’Grady was published by Ars Technica on 29 January 2017 (https://arstechnica.com/science/2017/01/to-improve-the-wisdom-of-the-crowd-ask-people-to-predict-vote-outcome/). Cathleen O’Grady is Ars Technica’s contributing science reporter. She has a background in cognitive science and evolutionary linguistics.

Credit: Flickr user Hsing Wei

The “wisdom of the crowd” is a simple approach that can be surprisingly effective at finding the correct answer to certain problems. For instance, if a large group of people is asked to estimate the number of jelly beans in a jar, the average of all the answers gets closer to the truth than individual responses. The algorithm is applicable to limited types of questions, but there’s evidence of real-world usefulness, like improving medical diagnoses.

This process has some pretty obvious limits, but a team of researchers at MIT and Princeton published a paper in Nature [Nature, 2017. DOI: 10.1038/nature21054] this week suggesting a way to make it more reliable: look for an answer that comes up more often than people think it will, and it’s likely to be correct.

As part of their paper, Dražen Prelec and his colleagues used a survey on capital cities in the US. Each question was a simple True/False statement with the format “Philadelphia is the capital of Pennsylvania.” The city listed was always the most populous city in the state, but that’s not necessarily the capital. In the case of Pennsylvania, the capital is actually Harrisburg, but plenty of people don’t know that.

The wisdom of crowds approach fails this question. The problem is that questions sometimes rely on people having unusual or otherwise specialized knowledge that isn’t shared by a majority of people. Because most people don’t have that knowledge, the crowd’s answer will be resoundingly wrong.

Previous tweaks have tried to correct for this problem by taking confidence into account. People are asked how confident they are in their answers, and higher weight is given to more confident answers. However, this only works if people are aware that they don’t know something—and this is often strikingly not the case.

In the case of the Philadelphia question, people who incorrectly answered “True” were about as confident in their answers as people who correctly answered “False,” so confidence ratings didn’t improve the algorithm. But when people were asked to predict what they thought the overall answer would be, there was a difference between the two groups: people who answered “True” thought most people would agree with them, because they didn’t know they were wrong. The people who answered “False,” by contrast, knew they had unique knowledge and correctly assumed that most people would answer incorrectly, predicting that most people would answer “True.”

Because of this, the group at large predicted that “True” would be the overwhelmingly popular answer. And it was—but not to the extent that they predicted. More people knew it was a trick question than the crowd expected. That discrepancy is what allows the approach to be tweaked. The new version looks at how people predict the population will vote, looks for the answer that people gave more often than those predictions would suggest, and then picks that “surprisingly popular” answer as the correct one.

To go back to our example: most people will think others will pick Philadelphia, while very few will expect others to name Harrisburg. But, because Harrisburg is the right answer, it’ll come up much more often than the predictions would suggest.
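In other words, the decision rule compares each answer’s actual share of votes with its predicted share and selects the answer that outperforms expectations. Below is a minimal sketch of that rule, with invented numbers and our own paraphrase of the method rather than the authors’ code.

```python
# Minimal sketch of the "surprisingly popular" idea (our paraphrase of the
# rule described above, not the authors' code): pick the answer whose actual
# share of votes exceeds the share respondents predicted it would get.

def surprisingly_popular(votes, predicted_shares):
    """
    votes: dict mapping answer -> number of respondents choosing it.
    predicted_shares: dict mapping answer -> average predicted fraction of
        the crowd expected to choose that answer (should sum to ~1).
    Returns the answer that is more popular than predicted.
    """
    total = sum(votes.values())
    actual_shares = {a: n / total for a, n in votes.items()}
    return max(actual_shares, key=lambda a: actual_shares[a] - predicted_shares[a])

# Hypothetical numbers for "Philadelphia is the capital of Pennsylvania":
votes = {"True": 65, "False": 35}          # most people vote True (wrong)
predicted = {"True": 0.75, "False": 0.25}  # but the crowd expects True to do even better
print(surprisingly_popular(votes, predicted))  # -> "False", the correct answer
```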

Prelec and his colleagues constructed a statistical theorem suggesting that this process would improve matters and then tested it on a number of real-world examples. In addition to the state capitals survey, they used a general knowledge survey, a questionnaire asking art professionals and laypeople to assess the prices of certain artworks, and a survey asking dermatologists to assess whether skin lesions were malignant or benign.

Across the aggregated results from all of these surveys, the “surprisingly popular” (SP) algorithm had 21.3 percent fewer errors than a standard “popular vote” approach. In 290 of the 490 questions across all the surveys, they also assessed people’s confidence in their answers. The SP algorithm did better here, too: it had 24.2 percent fewer errors than an algorithm that chose confidence-weighted answers.

It’s easy to misinterpret the “wisdom of crowds” approach as suggesting that any answer reached by a large group of people will be the correct one. That’s not the case; it can pretty easily be undermined by social influences, like being told how other people had answered. These failings are a problem, because it could be a really useful tool, as demonstrated by its hypothetical uses in medical settings.

Improvements like these, then, contribute to sharpening the tool to the point where it could have robust real-world applications. “It would be hard to trust a method if it fails with ideal respondents on simple problems like [the capital of Pennsylvania],” the authors write. Fixing it so that it gets simple questions like these right is a big step in the right direction.

 

Solving the Puzzle of Hurricane History

This article was posted on the NOAA website on 11 Feb 2016.

“If you want to understand today, you have to search yesterday.” ~ Pearl S. Buck

One of the lesser-known but important functions of the NHC [National Hurricane Center, Miami, Florida] is to maintain a historical hurricane database that supports a wide variety of uses in the research community, private sector, and the general public. This database, known as HURDAT (short for HURricane DATabase), documents the life cycle of each known tropical or subtropical cyclone. In the Atlantic basin, this dataset extends back to 1851; in the eastern North Pacific, the records start in 1949. The HURDAT includes 6-hourly estimates of position, intensity, cyclone type (i.e., whether the system was tropical, subtropical, or extratropical), and in recent years also includes estimates of cyclone size. Currently, after each hurricane season ends, a post-analysis of the season’s cyclones is conducted by NHC, and the results are added to the database.

The Atlantic dataset was created in the mid-1960s, originally in support of the space program to study the climatological impacts of tropical cyclones at Kennedy Space Center. It became obvious a couple of decades ago, however, that the HURDAT needed to be revised because it was incomplete, contained significant errors, or did not reflect the latest scientific understanding regarding the interpretation of past data. Charlie Neumann, a former NHC employee, documented many of these problems and obtained a grant to address them under a program eventually called the Atlantic Hurricane Database Re-analysis Project. Chris Landsea, then employed by the NOAA Hurricane Research Division (HRD) and currently the Science and Operations Officer at the NHC, has served as the lead scientist and program manager of the Re-analysis Project since the late 1990s.

In response to the re-analysis effort, NHC established the Best Track Change Committee (BTCC) in 1999 to review proposed changes to the HURDAT (whether originating from the Re-analysis Project or elsewhere) to ensure a scientifically sound tropical cyclone database.  The committee currently consists of six NOAA scientists, four of whom work for the NHC and two who do not (currently, one is from HRD and the other is from the Weather Prediction Center).

Over the past two decades, Landsea, researchers Andrew Hagen and Sandy Delgado, and some local meteorology students have systematically searched for and compiled any available data related to each known storm in past hurricane seasons. This compilation also includes systems not in the HURDAT that could potentially be classified as tropical cyclones. The data are carefully examined using standardized analysis techniques, and a best track is developed for each system, many of which differ from the existing tracks in the original dataset. Typically, a season’s worth of proposed revised or new tracks is submitted for review by the BTCC. Fig. 1 provides an example set of data that helped the BTCC identify a previously unknown tropical storm in 1955.

 

Figure 1. Surface plot of data from 1200 UTC 26 Sep 1955, showing a previously unknown tropical storm.

The BTCC members review the suggested changes submitted by the Re-analysis Project, noting areas of agreement and proposed changes requiring additional data or clarification. The committee Chairman, Dr. Jack Beven, then assembles the comments into a formal reply from the BTCC to the Re-analysis Project. Occasionally, the committee’s analysis is presented along with any relevant documentation that would help Landsea and his group of re-analyzers account for the differing interpretation. The vast majority of the suggested changes to HURDAT are accepted by the BTCC. In cases where the proposed changes are not accepted, the BTCC and members of the Re-analysis Project attempt to resolve any disagreements, with the BTCC having final say.

In the early days of the Re-analysis Project, the amount of data available for any given tropical cyclone or even a single season was quite small, and so was the number of suggested changes. This allowed the re-analysis of HURDAT to progress relatively quickly. However, since the project reached the aircraft reconnaissance era (post-1944), the amount of data and the corresponding complexity of the analyses have rapidly increased, which has slowed the project’s progress during the last couple of years.

The BTCC’s approved changes have been significant. On average, the BTCC has approved the addition of one to two new storms per season.  One of the most highly visible changes was made 14 years ago, when the committee approved Hurricane Andrew’s upgrade from a category 4 to a category 5 hurricane.  This decision was made on the basis of (then) new research regarding the relationship between flight-level and surface winds from data gathered by reconnaissance aircraft using dropsondes.

Figure 2 shows the revisions made to the best tracks of the 1936 hurricane season, and gives a flavor of the type, significance, and number of changes being made as part of the re-analysis. More recent results from the BTCC include the re-analysis of the 1938 New England hurricane, which reaffirmed its major hurricane status in New England based on a careful analysis of surface observations. Hurricane Diane in 1955, which brought tremendous destruction to parts of the Mid-Atlantic states due to its flooding rains, was judged to be a tropical storm at landfall after re-analysis. Also of note is the re-analysis of Hurricane Camille in 1969, one of three category 5 hurricanes to have struck the United States in the historical record. The re-analysis confirmed that Camille was indeed a category 5 hurricane, but revealed fluctuations in its intensity prior to its landfall in Mississippi that were not previously documented.

The most recent activity of the BTCC was an examination of the landfall of the Great Manzanillo Hurricane of 1959.  It was originally designated as a category 5 hurricane landfall in HURDAT and was the strongest landfalling hurricane on record for the Pacific coast of Mexico. A re-analysis of ship and previously undiscovered land data, however, revealed that the landfall intensity was significantly lower (140 mph).  Thus, 2015’s Hurricane Patricia is now the strongest landfalling hurricane on record for the Pacific coast of Mexico, with an intensity of 150 mph.

Figure 2. Revisions made to the best tracks of the 1936 hurricane season

The BTCC is currently examining data from the late 1950s and hopes to have the 1956-1960 re-analysis released before next hurricane season.  This analysis will include fresh looks at Hurricane Audrey in 1957 and Hurricane Donna in 1960, both of which were classified as category 4 hurricane landfalls in the United States.   As the re-analysis progresses into the 1960s, the committee will be tackling the tricky issue of how to incorporate satellite images into the re-analysis, including satellite imagery’s irregular frequency and quality during that decade. The long-term plan is to extend the re-analysis until about the year 2000, when current operational practices for estimating tropical cyclone intensity became established using GPS dropsonde data and flight-level wind reduction techniques.

https://noaanhc.wordpress.com/2016/02/11/solving-the-jigsaw-puzzle-of-hurricane-history/

 

The Mw 7.9 Taron, PNG Earthquake of 17 December 2016

by Paul Somerville, Risk Frontiers.

The December 17, 2016, M 7.9 earthquake originated about 46 km east of Taron, New Ireland, Papua New Guinea at a depth of 103 km (Figure 1). It occurred as the result of reverse faulting at an intermediate depth. At the location of the earthquake, the Australia plate converges with and subducts beneath the Pacific plate at a rate of about 105 mm/yr towards the east-northeast. The earthquake occurred within the interior of the subducted Australia plate lithosphere, rather than on the shallow thrust interface between these two plates.  The detailed slip map and time function of the earthquake show that it had a source duration of 80 seconds (Figures 2 and 3).

Figure 1. Slip map of the Taron Earthquake.  Source: USGS

Figure 2. Detailed slip map of the Taron earthquake.  Source: USGS

Figure 3. Source time function, describing the rate of moment release with time after the Taron earthquake origin. Source: USGS.

Because it occurred at a depth of about 100 km, this earthquake did not trigger a significant tsunami; a tsunami measuring less than 1 metre struck the coast of New Ireland shortly after the earthquake. Many residents in the northern parts of the autonomous region of Bougainville sought higher ground amid warnings that tsunami waves were possible. A nurse at Buka General Hospital in Bougainville said the quake was so strong it felt like the building she was sleeping in would topple. She said patients were being moved a few kilometres to higher ground. Due to the earthquake's remoteness from land, there were no immediate reports of damage, although it caused a blackout in the town of Kokopo at the northeastern tip of New Britain. Nevertheless, despite their considerable depth, earthquakes of this kind occurring within subducted slabs can generate damaging levels of ground motion, as occurred during the 1993 Mw 7.9 Kushiro earthquake beneath southeastern Hokkaido, Japan, which generated peak ground accelerations of about 0.35g in Kushiro. The large spatial extent of strong shaking is indicated in the shakemap shown in Figure 4, with MMI intensity VIII (PGA of 22%g) extending from Rabaul, New Britain, across southern New Ireland to Sahano, Bougainville.

Figure 4. Shakemap of the Taron earthquake.  Source: USGS.

The Mw 7.8 November 14, 2016 Kaikoura Earthquake: Briefing 5

by John McAneney.

This fifth briefing contains observations arising from a visit to Wellington and Blenheim (Dec. 5 – 12, 2016).

Briefing by Dr Kelvin Berryman (General Manager: Natural Hazards Strategic Relationships, GNS)

As shown in previous Risk Frontiers Briefing Notes, the NE quarter of the South Island is marked by a complex network of faults. Many of these, both onshore and offshore, were implicated in the November 14 event, but there were two main ruptures, which unzipped over a total event duration of about 90-120 seconds. The main energy release occurred some 120 km northeast of the epicentre, accompanied by surface fault displacements of about 5-13 m, with energy directivity towards the north. The earthquake did not rupture the ground surface along the entire length of the faults.

The Kekerengu fault had a surface rupture of 70 km, about half of which occurred under the sea. It has ruptured three times in the past 1250 years, and its current slip rate of ~20–26 mm/yr is comparable to that of the Alpine fault. It is capable of a Mw 7.6 earthquake on its own. Its vertical movements raised the seabed, pushed the Cape Campbell Lighthouse 2 m northward, diverted streams and caused landslides. The high-tide mark at Kaikoura now sits 6 m lower than before the earthquake. Further north, the Hope and Awatere faults moved but did not rupture the ground surface. A previously unknown fault, the Papatea fault, caused land movements and damage near Clarence.
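
As a back-of-envelope check (not from the GNS briefing), the Kekerengu numbers quoted above hang together reasonably well: a 70 km rupture with slip in the quoted 5–13 m range implies a moment magnitude of about 7.5, close to the quoted Mw 7.6, and the quoted slip rate and rupture history imply a comparable amount of slip accumulating between events. In the Python sketch below, the rigidity and down-dip rupture width are assumed values, not numbers from the briefing.

    import math

    # Assumed values (not from the briefing):
    mu = 3.3e10       # crustal rigidity, Pa
    width_m = 15e3    # down-dip rupture width, m
    # Values quoted in the briefing:
    length_m = 70e3   # Kekerengu surface rupture length, m
    slip_m = 6.0      # representative slip, within the quoted ~5-13 m range

    M0 = mu * length_m * width_m * slip_m       # seismic moment, N*m
    Mw = (2.0 / 3.0) * (math.log10(M0) - 9.1)   # Hanks-Kanamori moment magnitude
    print(f"Implied magnitude: Mw ~ {Mw:.1f}")  # ~7.5

    # Slip budget: ~22 mm/yr over roughly 1250/3 ~ 420 years per event accumulates ~9 m,
    # broadly consistent with the observed ~5-13 m displacements.
    print(f"Slip per cycle: ~{0.022 * 1250 / 3:.0f} m")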

The earlier GNS maps, which show the increased probability of earthquakes as confined to the South Island, assume a conventional decaying aftershock sequence. This ignores the bigger question raised by Paul Somerville (Risk Frontiers) about stress transfer to the Hikurangi Subduction Zone (HSZ) and the degree to which this transfer might increase the likelihood of a larger subduction zone earthquake. The HSZ underlies Wellington at a depth of about 23 km.

Kelvin agrees that there would have been stress transfer, but argues that this has happened often enough in the past with the HSZ remaining firmly locked beneath Wellington for some 500 years. In truth, no one knows the answer to this question.
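
For context, aftershock probability maps of the kind mentioned above are commonly built on a decaying rate model (the modified Omori law) combined with a Gutenberg-Richter magnitude distribution. The Python sketch below is a generic illustration with arbitrary parameter values; it is not GNS’s actual forecast model, and the parameters are assumptions, not fitted values.

    import math

    # Assumed parameters for illustration only (not GNS values):
    K, c, p, b = 150.0, 0.5, 1.1, 1.0   # Omori productivity, offset (days), decay exponent; GR b-value
    m_ref = 4.0                          # K is defined for aftershocks of magnitude >= m_ref

    def expected_aftershocks(t1_days, t2_days, m_min):
        """Expected number of M >= m_min aftershocks between t1 and t2 days after the mainshock."""
        # Integrate the modified Omori rate K/(t + c)^p over the time window ...
        if abs(p - 1.0) < 1e-9:
            n_ref = K * math.log((t2_days + c) / (t1_days + c))
        else:
            n_ref = K / (1.0 - p) * ((t2_days + c) ** (1.0 - p) - (t1_days + c) ** (1.0 - p))
        # ... then scale to the target magnitude with the Gutenberg-Richter relation.
        return n_ref * 10 ** (-b * (m_min - m_ref))

    # Expected number of M >= 6 aftershocks in the first 30 days, under these assumptions:
    print(f"{expected_aftershocks(0.0, 30.0, 6.0):.1f}")  # ~5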

GNS regards a large HSZ earthquake as a tsunamigenic event capable of killing large numbers of people in and around Wellington. In their view, however, it is a rupture on the Wellington fault that is capable of generating the largest Probable Maximum Loss for the city.

 

Briefing by David Brunsdon (Director, Kestrel Group) and damage observations, Wellington, Dec. 7 and 8

At the time of the visit about 40 buildings, most of them occupied by Government departments, had been evacuated. Most but not all of these are located on reclaimed land near the Port and Quays (Figure 1). The evacuations were thought to be largely precautionary, pending more detailed engineering reports, or to reflect non-structural damage such as the failure of roof panels, plumbing and electrical services. As a result about 10% of Wellington’s public servants are currently working offsite. Three buildings – 61 Molesworth Street, Statistics House and the Reading Cinema car park – are badly damaged.

Figure 1: Locations of buildings with confirmed or suspected damage. (http://www.stuff.co.nz/national/nz-earthquake/86510017/questions-asked-in-capital-but-engineers-say-no-such-thing-as-earthquakeproof).

Some of the non-structural damage, such as glass shards embedded in plaster, would have been significant enough to cause injuries had the quake occurred during working hours. It seems that interior fit-outs are not designed to the same stringent engineering specifications as external glazing, which generally seemed to perform well. This may warrant more consideration in the Building Code.

The figure of 40 buildings with confirmed or suspected damage is a lower-bound estimate because, in the absence of a declared state of emergency, there was no obligation on landlords to have buildings inspected or to report any damage found. The Wellington City Council has since introduced emergency legislation requiring owners of commercial properties to do so.

The Council is adopting a business-as-usual approach in the sense of not shutting down large parts of the city unnecessarily. This is a lesson learnt from the experience in Christchurch.

The vulnerability of buildings of a certain height (10 – 15 stories) has been discussed in Risk Frontiers Briefing Note 333. It appears that the damage to buildings in this height range is attributable to construction details. In particular, weaknesses have been identified in concrete frame buildings constructed in the 1980s and 90s that pushed engineering boundaries, with pre-cast floors not tied into the flexible columns. These buildings did not perform well under the high seismic loads of November 14. The former Statistics House building down near the Port is one such example.

Some brief notes on particular buildings follow.

Statistics House (Figure 2): vulnerability to ground shaking had already been recognised and the top story had already been retrofitted. In the November 14 event, two lower floors partially collapsed, and there was a lot of non-structural damage; collapsed floor planking is visible in Figure 3. Four other buildings owned by CentrePort also suffered damage: Customhouse; Shed 39; the BNZ building, which had suffered severe non-structural damage in the 2013 Seddon earthquake; and Shed 35, an unreinforced masonry heritage building.

 

Figure 2: Statistics House – the major structural damage is to the lower left-hand corner of the building, and there is non-structural damage on other floors. (http://www.stuff.co.nz/national/nz-earthquake/86510017/questions-asked-in-capital-but-engineers-say-no-such-thing-as-earthquakeproof). Shed 39 is the masonry building to the left of and behind Statistics House.

Figure 3: Failure of a pre-cast floor element in Statistics House, with extensive non-structural debris.

 

Reading Cinema Car Park: Much of Tory Street is blocked off, cutting access to the adjacent cinema, most shops in Tory Street and some in Courtenay Place, pending a decision about the future of the car park, which was deemed likely to fail in an aftershock. The car park is about 20 years old, of long-span design, and was damaged in the 2013 Mw 6.5 Seddon earthquake. Owners of vehicles have not been allowed entry to retrieve their cars.

61 Molesworth Street: This nominally ‘vacant’ multi-story building (Figure 4) is already being demolished. A column on level 4 sheared, leaving the curtain wall vulnerable. It is believed that the building’s owner, Primeproperty, made a commercial decision to take the insurance payout and demolish, presumably with the intention of rebuilding something taller and better. Tenants who had been living in the building illegally have not been allowed to retrieve personal belongings because of the risk of failure.

Figure 4: 61 Molesworth Street in the process of being demolished.

Queensgate Mall (Lower Hutt): The car park and cinema complex above the car park were already in the process of being demolished. Some of the Mall remains open for business. The Mall had been refurbished in 2006.

Asteron Centre, 1 Featherston Street, opposite the railway station and bus terminus: This near-new, premium-grade 17-story building has the largest office floor area in Wellington (48,000 m2). The issue appears to be with the stairs, and tenants are again taking a precautionary approach. It was not clear whether the stairwell is structurally important.

The Port: The land around the damaged buildings experienced liquefaction and some settlement or uplift; the reclaimed land is likely a smorgasbord of soil conditions. The port seemed to be operating normally as far as we were able to judge, with logs still being delivered and ferries berthing. From the Oriental Bay side of the harbour, many of the stacks of containers (up to six containers high) appeared higgledy-piggledy.

Good News Stories:

Along the quays are a number of unreinforced masonry buildings that have been retrofitted and which performed well. These short, stiff buildings may not have been tested to the same extent as taller buildings because of the character of the seismic demand (Risk Frontiers Briefing Note 333). At the Woolstore Design Centre on Thorndon Quay, a century-old 3-story unreinforced masonry building not far from the damaged buildings in the Port area, and one where a lot of steel retrofitting has been done, shops selling high-end ceramics and glassware did not even have product fall off the shelves. Again, how much of this can be attributed to the local seismic demand and how much to the retrofitting is hard to know. It is likely that much of this retrofitting was encouraged by the Wellington City Council’s subsidies for retrofitting and by its remission of rates during such works and once a building is taken off the Earthquake-prone Buildings List.

A curious news story with implications for seismic design:

The original BNZ Centre at 1 Willis Street, Wellington, was designed with a ductile steel frame; construction commenced in 1974. When completed in 1984, it was NZ’s tallest building at 103 m. Its construction was held up for around nine years by a political standoff between the Muldoon government and the Boilermakers Union, which claimed exclusive rights to weld structural steel. Ongoing strikes and go-slows eventually led to the union’s deregistration, but the experience discouraged steel construction in New Zealand for many years. Most of the reconstruction of Christchurch’s CBD is, however, being done in steel.

Visit to a Blenheim Winery

I was fortunate to gain access to a winery, as many of the big-name enterprises remained coy about their experience. The main damage was not to the buildings themselves but to stainless steel fermentation vats, and I understand this to be common across the region.

Overall, the damage at this winery was not overwhelming, in part because of lessons learnt in the 2013 Seddon earthquake and the subsequent investment in stainless steel tie-downs (Figure 5).

Figure 5: ‘Elephant foot’ bulging in the walls of the fermentation vats. The stainless steel tie-downs are just visible and are spaced at roughly 50 to 75cm around the perimeter at the base of the vat on the concrete pad.

Figure 6: A tank leaning against the catwalk.

These forms of damage have been seen before. First, there was some distortion to the tops of vats where they banged against each other or against the elevated catwalks that allow winemakers to access the tops of the vats (Figure 6). It was incredibly lucky that the earthquake occurred at night, when no one was in the winery.

Second, there was some limited damage to the glycol cooling channels on a few vats, but the most common form of damage was the ‘elephant foot’ bulge at the base of the vats, which occurs when ~150 tonnes of must (the fermenting grape juice), after being lifted by vertical ground accelerations, comes smashing back down (Figure 5).
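
The scale of the loads involved is easy to appreciate with a little arithmetic. The ~150 tonnes of must is the figure quoted above; the 0.5 g peak vertical ground acceleration used in the sketch below is an assumed value for illustration only, not a recorded motion at the winery.

    # Rough illustration of the loads on a full fermentation vat.
    mass_kg = 150_000     # ~150 tonnes of must (quoted above)
    g = 9.81              # gravitational acceleration, m/s^2
    vertical_pga_g = 0.5  # assumed peak vertical ground acceleration, in units of g

    static_load_kN = mass_kg * g / 1e3
    extra_demand_kN = mass_kg * vertical_pga_g * g / 1e3

    print(f"Static load on the vat base: ~{static_load_kN:.0f} kN")       # about 1.5 MN
    print(f"Transient extra demand at 0.5 g: ~{extra_demand_kN:.0f} kN")  # about half the static load again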

The ‘elephant foot’ bulges are repairable by welding in a skirt on the inside. This work was already underway, with tank fitters coming from Timaru.

The winery had lost about 5% of its production, but last year’s vintage was a large one and this loss of wine can likely be tolerated. The biggest concern for the industry is whether there will be sufficient vat storage for next year’s vintage, and enough time to repair the vats that need it.

Acknowledgements

This briefing note would not have been possible without the help of Pat Helm, Robyn Martin, John Hedges, Drs Kelvin Berryman and Mike Trought, Prof. Jason Ingham, Dave Brunsdon and Ben Miliauskas.

 

The Mw 7.8 November 14, 2016 Kaikoura Earthquake: Briefing 4

by Paul Somerville, Risk Frontiers. 

Inspections and Occupancy of Government Buildings in Wellington.

Speaking to Parliament’s government administration committee on November 28, State Services Commissioner Peter Hughes said that the State Services Commission had made it clear from the start that properly qualified structural engineers had to carry out building assessments, and that he had worked with chief executives on how to ensure the checks were up to standard. He had asked government agencies to make early contact with unions and keep them informed of checks and repairs, and he had met the Public Service Association the previous week to review what had been done so far.

The Mw 7.8 14 November Hanmer Springs Earthquake, Briefing No. 1

Paul Somerville, Risk Frontiers

The Mw 7.8 14 November Hanmer Springs earthquake occurred on or near the interface between the Pacific and Australian plates (Figure 1). In the North Island of New Zealand, the Pacific Plate subducts beneath the Australian Plate along the Hikurangi subduction zone, with dip-slip faulting on the plate interface.