Engaging Preparedness for Drought

NASA-generated groundwater drought map from the NIDIS website (https://www.drought.gov).

Drought has historically been the disaster that fails to focus our attention on its consequences until it is too late to take effective action. Other disasters, like earthquakes, hurricanes, tornadoes, and most floods, have a quick onset that signals trouble and a clear end point that signals it is time to recover and rebuild. Drought, by contrast, is the slow-onset event that sneaks up on whole regions and grinds on for months or years, leaving people exhausted, frustrated, and feeling powerless. Yet our species and life itself depend on water for survival.

Over time, our nation has responded to most types of disasters both with an overall framework for response, centered on the Federal Emergency Management Agency, and with resources for specific types of disasters with operations like the National Hurricane Center. It took longer for drought to win the attention of Congress, but in 2006, Congress passed the National Integrated Drought Information System (NIDIS) Act, creating an interagency entity with that name, led by the National Oceanic and Atmospheric Administration. NIDIS was reauthorized in 2014. Its headquarters are in Boulder, Colorado.

For the last five years, I have been involved in various ways with NIDIS and the National Drought Mitigation Center (NDMC), an academic institute at the University of Nebraska–Lincoln. To date, the major product of that collaboration has been the publication by the American Planning Association (APA) of a Planning Advisory Service (PAS) Report, Planning and Drought, released in early 2014. Both NIDIS and NDMC have helped make that report widely available among professionals and public officials engaged in preparing communities for drought. The need to engage community planners in this enterprise has been clear. Much of the Midwest was affected by drought in 2012, at the very time we were researching the report. Texas suffered from prolonged drought within the last few years, and California has yet to fully recover from a multi-year drought that drained many of its reservoirs. And while drought may seem less dangerous than violent weather or seismic disturbances, the fact is that, in the last five years alone, four drought episodes, each exceeding $1 billion in damages, have collectively caused nearly $50 billion in adverse economic consequences. The need to craft effective water conservation measures, and to account adequately for water consumption in reviewing proposed development, has become obvious. We need to create communities that are more resilient in the face of drought.

A subgroup of the NIDIS EPC Working Group discusses ongoing and future efforts during the Lincoln meeting.

Over the past decade, NIDIS has elaborated its mission in a number of directions, including this need to engage communities in preparing for drought. It was this mission that brought me to Lincoln at the end of April for a meeting of the NIDIS Engaging Preparedness Communities (EPC) Working Group. This group brings together the advice and expertise of numerous organizations involved in drought, including not only APA, NOAA, and NDMC, but state agencies like the Colorado Water Conservation Board, tribal organizations such as the Indigenous Waters Network, and academic experts in fields like agriculture and climatology. In the ten years since the creation of NIDIS, this and other working groups have made considerable strides toward better understanding the impact of drought on communities and regions, and toward increasing public access to drought information and predictions that give decision makers a better basis for confronting the problem. NIDIS has conducted a number of training webinars, established online portals for databases and case studies, and otherwise tried to demystify what causes drought and how states and communities can deal with it. Our job for two days in Lincoln was simply to push the ball farther uphill: to help coordinate outreach and resources, especially for communication, to make the whole program more effective over the next few years.

Much of that progress is captured in the NIDIS Progress Report, issued in January of this year. More importantly, this progress, and the need to build further national capability to address drought resilience, captured the attention of the White House. On March 21, President Obama signed a Presidential Memorandum institutionalizing the National Drought Resilience Partnership, which issued an accompanying Federal Action Plan for long-term drought resilience. The plan enhances the existing muscle of NIDIS by laying out six national drought resilience goals: 1) data collection and integration; 2) communicating drought risk to critical infrastructure; 3) drought planning and capacity building; 4) coordination of federal drought activity; 5) market-based approaches for infrastructure and efficiency; and 6) innovative water use, efficiency, and technology.

Drought clearly is a complex topic in both scientific and community planning terms, one that requires the kind of coordination this plan describes in order to alleviate the economic burden on affected states and regions. With the growing impacts of climate change in coming decades, this issue can only become more challenging. We have a long way to go, and many small communities lack the analytical and technical capacities they will need. Federal and state disaster policy should be all about building capacity and channeling help where it is needed most. The institutional willingness of the federal government to at least acknowledge this need and organize to address it is certainly encouraging.

Jim Schwab

Drifting into Disaster

Scene from the Sacramento-San Joaquin Delta

Across the United States, about one in five people lives under the rules and structures of some sort of private association that governs common property interests. These can be condominium associations, homeowners associations, or similar entities responsible for levying fees and maintaining communal property. To a degree they often may not realize, residents are controlled and constrained by the decisions these associations make, which often concern details a local government would not even consider, such as the color of aluminum siding, allowable holiday decorations, and other matters with minor impacts on the quality of life. Many homeowners associations are established by developers at the time they get permits to create a new subdivision. In some states, local governments are happy to offload responsibility for maintaining infrastructure, such as private roads, onto these associations while continuing to collect the property taxes their residents still pay.

The implications of all this were brought to my attention in the past week or two by Chad Berginnis, the executive director of the Association of State Floodplain Managers (ASFPM). He has been working with me on material for a future report we plan to publish at the American Planning Association on subdivision design as it relates to areas with flood hazards. The issue that concerned him, as he wrote a chapter on subdivision standards for local governments (which have the primary responsibility for permitting new development), is how well these private owner associations can sustain over time the financial responsibility for infrastructure designed to protect their properties from disaster, most notably but not exclusively flooding.

Among the items that have come to my attention is a paper by two California attorneys, Tyler P. Berding and Steven S. Weil, disturbingly titled “Disaster! No Reserves. No Insurance. What’s Left if a Natural Disaster Destroys a Community Association?” They begin with a cautionary tale about the Bethel Island Municipal Improvement District, actually a California special district rather than a homeowners association. Its mission is to maintain and improve the levees that surround Bethel Island, a Sacramento Delta island of 2,500 residents whose interior lies 7 to 15 feet below sea level. To say that their survival depends on well-maintained levees is no exaggeration. Moreover, in that part of California, the levees are subject to collapse from earthquake shaking as well as from overtopping in a flood. I have some idea of the residents’ peril because four years ago, a representative of the California Department of Water Resources (DWR) took me on a six-hour guided tour of the levee system in the delta area, plying me with a number of DWR’s background studies of the overall situation. There are hundreds of such islands throughout the Sacramento-San Joaquin Delta, many used for agriculture, and some developed. In their 2012 article, produced about the time of that tour, Berding and Weil note, “But the district is broke.” Voters “soundly” rejected a 2010 parcel tax measure to fund improvements, and much of the district staff was laid off. The levees were deteriorating, to some extent “suffering damage by beavers and rodents.”

It is disturbingly easy for the board members of homeowners or other private associations to take their eyes off the ball of maintaining adequate reserves and resources to address dangers that seem less than imminent, and even to forget why they are responsible for collecting assessments in the first place. It is even easier for residents who must approve some of those assessments to lack meaningful knowledge of the consequences of depleting, or failing to maintain, adequate reserves for natural events like floods, earthquakes, or other disasters. Once they begin sliding down that slippery slope of amnesia and unawareness, it is not long before they have put a good deal of common and individual property at risk. The few who are aware of the long-term consequences often lack the ability to make their case to less concerned neighbors.

This issue is one of concern in the field of urban planning because new subdivisions, in particular, often arise at the edge of metropolitan areas in unincorporated county lands or small towns, where governance capacity may be limited and resistance to government regulation particularly high. The result is that oversight is weakest, and the desire for new development highest, precisely where the need for that oversight may be greatest. In regulatory terms, it is the theory of the weakest link. One of our motives for the new report (underwritten by the Federal Emergency Management Agency) is to help shore up those weak links with stronger guidance about sound practices in reviewing plans for new subdivisions. Berding and Weil were serving a similar purpose, at least in the California context, by describing sound practices for community associations, particularly in sustaining adequate reserves for contingencies such as disasters.

But finances are only part of the problem. Sometimes the leadership of such associations becomes so focused on issues like aesthetics and conformity that it loses sight of larger issues like public safety. The National Fire Protection Association (NFPA), which supports the Firewise Communities initiative, has long trained its attention on covenants that run counter to public safety, for example by inhibiting well-researched methods of containing wildfire threats. Many of these methods involve landscaping or building design, yet some associations have rules limiting the tree trimming or landscaping that would aid in wildfire mitigation. In Safer from the Start, NFPA’s 2009 study of the issues involved in building and maintaining “firewise-friendly developments,” a sidebar notes that Colorado’s then recently passed Common Interest Ownership Act, among other measures, invalidated a number of association covenants and restrictions that inhibited defensible space around private dwellings, in order to advance wildfire safety statewide. In effect, the state was saying “enough” to rules that made wildfire safety harder to achieve. At the same time, the publication as a whole provided a significant amount of sound advice about best practices for wildfire protection in rural subdivisions and new developments.

That seven-year-old NFPA advice recently got a boost from an interesting direction: Green Builder Media issued its own e-brochure, “Design with Fire in Mind: Three Steps to a Safer New Home,” in cooperation with NFPA. Green Builder Media has a more direct avenue for influencing developers who want to build safe, resilient, energy-efficient communities.

The fact that these resources have continued to materialize on a regular basis over the past decade or two indicates, to me, that the subject of good design and homeowner association responsibility is not going away any time soon. It is the job of planners, floodplain managers, and local and state officials to ensure that those responsibilities remain on their radar screens and are taken seriously. One-fifth of the American population depends to a significant degree on the quality of their oversight.

Jim Schwab

That Burning Smell Out West

Although plenty of other issues have competed for our attention in recent weeks, astute observers of the news, in the U.S. at least, have probably noticed that wildfires have been charring much of the landscape in western states, most notably along the Pacific Coast. Both California and Washington are struggling under the burden of numerous fires triggered or helped along by prolonged drought and a hot summer. While some may jump to the conclusion that this is another harbinger of climate change, and it may well have some connection to climate change, it is important to know that other historical factors are even more significant. We have seen them coming but have not done nearly enough to forestall the outcome, which may grow worse in coming years.

Ten years ago, in Planning for Wildfires (PAS Report No. 529/530), Stuart Meck and I noted that, in the 2000 census, the five fastest-growing states all had a high propensity for wildfires. Not much has changed. Texas, which suffered significantly from wildfires during its drought in 2011, made the largest numerical gain in the 2010 census, though it was fifth in percentage gain, behind Nevada (35.1 percent), Arizona (24.6 percent), Utah (23.8 percent), and Idaho (21.1 percent). Of course, many of those people move into larger cities like Las Vegas and Phoenix. More to the point, many continue to move specifically into rural areas with weaker development restrictions and building codes. As their numbers rise in what fire experts call the wildland-urban interface, the area where the built environment meets fire-prone wildlands, so does human and structural vulnerability to wildfires. Why do people choose to live there? Social scientists, including some who work for the U.S. Forest Service, have been examining this question for at least two decades. We noted that the reasons include a desire for proximity to wildlife, privacy, nature, and the love of a rugged lifestyle.

These desires spawn problems, however, if not accompanied by considerable prudence in how and where homes are built, as well as in landscape maintenance once a subdivision exists. Firewise Communities, a program of the National Fire Protection Association, has since the late 1990s sought to educate communities and homeowner associations about the realities of life in the wildland-urban interface: the need for noncombustible roofing materials; the elimination of wildfire pathways to homes and other structures by maintaining a perimeter of “defensible space,” whose radius depends largely on terrain and forest conditions; and other best practices that reduce the impact of wildfires on homes. Still, we live with the legacy of prior development in many areas, and one result is that firefighters are increasingly exposed to lethal risks in trying to protect these homes when wildfire approaches. Every year some lose their lives; 163 have died over the past 10 years. There is a point where some homeowners must be told that more lives cannot be risked to protect every remote structure at any cost.

And those costs are rapidly growing. Just 20 years ago, in 1995, the Forest Service spent 16 percent of its annual budget on fire management. That share has climbed to 52 percent today, and the trend is ever upward, squeezing funds for other functions out of a largely static $5 billion budget. About 90 percent of firefighting expenses involve protecting houses. It would be a far simpler matter to let some fires burn, or to use prescribed burns to reduce flammable underbrush and thereby prevent or mitigate future fires, if fewer of those houses were in the wildland-urban interface. But part of that fire management expense goes to thinning the forest to scale back a problem the Forest Service itself created over the past century, and which modern fire managers have effectively inherited. Put simply, most of the western wildland forest is much denser than it was prior to the 20th century. Not just a little denser, but several times denser in many cases. The result is more intensive, longer-burning wildfires in those cases where the Forest Service is unable to suppress a fire at an early stage.

Full-scale suppression, however, is what brought us to this pass. Stephen J. Pyne has called the era from 1870 to 1920 the “Great Barbecue.” It saw some of the deadliest wildfires in U.S. history, starting with the nearly simultaneous ignitions of the Peshtigo, Wisconsin, firestorm, which killed about 1,500 people, and the Great Chicago Fire, which had more to do with hot weather and conditions in the lumberyards that processed the products of the upper Midwest forests than with Mrs. O’Leary’s cow. One can learn more about the Peshtigo firestorm in a great book, Firestorm in Peshtigo, by Denise Gess and William Lutz. The era was drawing to an end in 1910, when the Big Burn killed 78 people and scarred 3 million acres; the pitched effort to fight it put the Forest Service in the limelight and secured its role as the nation’s wildland firefighting service. Timothy Egan tells that story in The Big Burn. But the collective works of Pyne, an Arizona State University environmental historian who has specialized in the history of fire, can deliver more depth than you may ever desire and fill in the blanks between those two episodes and beyond into recent times.

What we have learned is that over time, as the policy of all-out suppression of wildfires took hold in the federal government, the smaller fires that historically and naturally had served to thin the forest were no longer allowed to do their job. The gradual result was a denser, thicker forest that, when it does catch fire, produces far more dangerous fires than ever before. When drought and bark beetle infestations kill some of that dense forest, there is simply far more kindling than would otherwise be there. Yet, as Forest Service Chief Tom Tidwell notes, the agency currently has the resources to undertake only a fraction of the forest restoration work needed to achieve healthier, less fire-prone forests. The problem will only grow worse with a warming climate, of course, but it did not arise primarily because of it; it arose from past firefighting practices and a more recent history of development in wildland areas. Climate change can nonetheless be counted on to produce rising average temperatures that will vary by location but may reach 4 to 6 degrees Fahrenheit above current levels by 2100. California’s Cal-Adapt has been tracking these changes and producing a stream of research and temperature maps that provide significant perspective on the extent of the problem we face moving into the future. It’s a sobering picture we would all do well to consider.

Jim Schwab

Discussing Drought in California

Drought is just different from other kinds of disasters. It has a very slow onset, so slow that affected regions often do not realize they are ensnared in a prolonged drought until months or even years have passed and water supplies are severely depleted. How do we better plan for these drawn-out, stress-inducing, patience-draining tests of a community’s endurance?

Back in July, I was contacted by host Steve Baker at KVMR Radio in Nevada City, California, to join other experts on a one-hour show exploring this vital question, one that has been testing the patience and resilience of nearly 40 million Californians for the past couple of years. I participated from a hotel in Washington, D.C., following the show online until it was my turn to join the discussion. Subsequently, KVMR shared the recording and allowed its use on the American Planning Association’s Recovery News blog. We posted it two days ago, and I am offering a direct link here so that the more than 7,500 current subscribers to this blog can hear the one-hour show. If the California drought concerns you, or you simply want to learn more, please listen.

Jim Schwab

Water: Our Public Policy Challenge

I grew up in suburban Cleveland. After a seven-year hiatus in Iowa and, briefly, Nebraska, my wife’s home state, we ended up in Chicago. I am unquestionably a Midwesterner, with most of my life lived near the Great Lakes. It will therefore not be surprising that for most of my adult life, I have heard people speculate about moving some of our abundant water to places that have less, mostly in the West. For just about as long, I have been aware that their speculations were merely pipedreams (pun very much intended).

Because most people have at most only a cursory understanding of our nation’s intricate water laws and treaties (in the case of the Great Lakes), to say nothing of the costs and challenges of water infrastructure development, I suppose they can be forgiven for their naivete in even entertaining such notions as piping water from Lake Michigan to California. For both legal and practical reasons, the water is not likely any time soon to leave the Great Lakes Basin, let alone find its way to the West Coast. Enough said.

That is all backdrop to noting that, at the moment, Lakes Michigan and Huron, which share essentially identical water levels because they are joined by a strait, are experiencing rising water levels after declining to levels well below average in 2012, in the midst of a drought and high temperatures. The lack of precipitation and high evaporation reduced the two lakes to 576 feet above sea level, about 2.8 feet below the average since 1918, when the Army Corps of Engineers began keeping records. All the Great Lakes tend to rise and fall over time, and somewhat in tandem, because they are part of a continuous system that flows into the St. Lawrence River and out into the Atlantic Ocean. But Lake Superior sits higher before it drains into Michigan and Huron, which are higher than Lake Erie, and certainly than Lake Ontario, which is on the receiving end of Niagara Falls. Gravity is how all this water finds its way to the sea.

High recent rainfall—in June we had seven inches in Chicago with lower temperatures than normal—has kept the lake levels rising. Colder winters because of the polar vortexes have maintained ice cover, reducing evaporation. As a result, the lakes are now three feet higher than they were in 2012. Amid all this rise and fall, some facts should be noted: These lakes are thousands of years old. They are the result of glacial melt as the Ice Age receded, so most of the water is the result not of precipitation but of ancient glacial retreat. And our record keeping is less than a century old, so what we think we know about the long-term fluctuation in water levels, let alone what we can accurately predict about long-term impacts of climate change in the Midwest, remains far less than what we might ideally like to know. There are big gaps in our knowledge that can only partially be filled with other types of scientific analysis.

Nonetheless, based on such limited knowledge, the urge to build on shoreland exposed by nothing more than historical low-water fluctuations sometimes motivates unwise development. Communities along the Great Lakes need to invest in wise lakefront planning that takes those fluctuations into account and does not create new hazards when lake levels rise again, often faster than we anticipate. Adequate buffers based on such fluctuations must be part of the zoning and development regulations throughout such areas. It is best we approach what we know about Great Lakes water level fluctuations with a dose of humility and caution, lest nature make a fool of our aspirations.

There are resources for that purpose. Many of the state Sea Grant programs, based at state universities, can offer technical assistance. The National Oceanic and Atmospheric Administration, whose Coastal Zone Management Act responsibilities include the Great Lakes, has been developing resources for Great Lakes states. Using NOAA funding, the Association of State Floodplain Managers, with partners like the American Planning Association (APA), has developed, and is still expanding, a website containing its Great Lakes Coastal Resilience Planning Guide. It consists of case studies, a Great Lakes dashboard, and other tools. NOAA’s Digital Coast Partnership has been working with cities like Toledo, Ohio; Duluth, Minnesota; Green Bay, Wisconsin; and Milwaukee to address flooding and development problems along the Great Lakes in varying contexts.

I mention all this because, even amid this temporary abundance of water on the Great Lakes amid a withering drought on the West Coast, water, as always, remains a preoccupying public policy challenge everywhere around the world and across the United States. It is not nearly enough of a focus of public debate, however, and the complexities of the issue seem to evade most people’s attention, including those who ought to be thinking harder about it. Even those who do focus on the question are often siloed into narrow segments of water policy—wastewater, drinking water, flood protection and mitigation, drought planning, coastal zone management, and so forth. We need to approach water challenges more holistically.

APA’s board of directors approached the subject with that larger picture in mind in empowering a special task force to examine the issue. About two months ago, the task force released its report, which began by emphasizing that “water is a central and essential organizing element in a healthy urban environment.” It went on to call for viewing water resource management as “interdisciplinary, not multidisciplinary,” in other words, calling for collaboration among the professions involved. It also called on the planning profession and university planning schools to provide more training, more education, and more resources centered on water and its importance to our society. And it called for APA to “partner with national water service membership agencies” to “foster cross-industry participation and learning opportunities.” It is a far-reaching document that planning leadership in the U.S., including me, is still absorbing. But I commend it as the start of an overdue conversation, so that our future conversations about who uses water how, and for what purpose, can be considerably more sophisticated, as they clearly need to be.

We need to move away from pipedreams to serious conversations. Whether in California, where there is too little, or the Great Lakes, where there is currently plenty, we need to get it right because the stakes are high. Very high.

Jim Schwab

Drone Coverage in Napa

Readers may well be waiting for me to post something substantial soon, and I plan to compose a significant article this Labor Day weekend. It’s been very hectic for me the last two weeks, and I am currently in Washington on a round of ten meetings in two days, pursuing business new and old.

But while all that is happening, Mike Johnson from our IT department at the American Planning Association latched onto something very interesting, I think, and added it to our Recovery News blog. Amid all the debate about the proper and allowable uses of drones, Evan Kilkus in California has found one use that gives us handy new insights into the nature of damage from the recent earthquake in Napa, California. Use the link above to see his drone-filmed video of the damaged buildings from a perspective you won’t get from the street.

Recovery News is a vehicle we created to deliver news and resources pertaining to post-disaster recovery in connection with our project, nearing completion, with FEMA to prepare the Next Generation guidance on planning for post-disaster recovery. Kudos to Mike for turning up the latest innovation in this area, and to Evan Kilkus for getting it done.

Jim Schwab