The Ostrich Paradox

As an urban planner, my entry into the world of disasters has been through the twin portals of public policy and planning practice—how we frame the priorities of government and how we carry out the tasks of community planning. One thing I have learned from years of interaction with other types of professionals is that many other portals exist that provide insights into the nature and causes of disasters, how we define them, and how we prepare for and react to them. The behavioral sciences—including psychology, sociology, communications, and economics—have played a significant role in helping us understand some of these questions. They have helped me understand that what may seem like a straight line in public policy between a problem and a solution can be laden with land mines built into the evolution of the human brain. We are capable, as a species, of contemplating the long-term consequences of our behaviors, but only when we have opportunities to gain some distance between our immediate needs and the problems we are considering. Very often, however, life forces us to react quickly and with inadequate forethought, and our brains reach for more instinctive reactions that our species learned over millennia, even those inherited from the species from which we evolved.

And so there is the proverbial ostrich, putting its head in the sand, supposedly to avoid seeing any painful realities. The authors of a new book, The Ostrich Paradox: Why We Underprepare for Disasters, published by Wharton Digital Press, note near the outset that, despite the widely accepted stereotype, ostriches are in fact “astute escape artists” who use speed to compensate for their inability to fly. They suggest that humans become more like ostriches, not less, by recognizing our own limitations and consciously seeking to address them. But first, we need to know what those shortcomings are and why, because of them, humans routinely fail to anticipate and prepare for disasters.

They start by reviewing a concept of the human brain that Daniel Kahneman discussed at length several years ago in Thinking, Fast and Slow—that of the two systems that allow us to do precisely what Kahneman’s title suggests. System 1 operates rapidly, with learned and instinctive responses to everyday situations, such as slowing down or swerving to avoid a car crash, or stepping away from a snake. These reactions are quick reflexes, often entirely unconscious. System 2, which could never respond to the multitude of routine stimuli fast enough to allow us to cope or survive, instead helps us focus and reflect, sometimes allowing us to train our minds to react differently but also, importantly, to gain perspective on the issues facing us. Planning, for instance, is largely an intellectual activity in which we process information, mull it over, and try to anticipate how future conditions may affect our community and its ability to achieve stated goals. It also takes time and does not allow us to react to immediate threats, such as a bolting horse or the sound of gunfire. When we hear gunfire, we don’t contemplate what it is; we duck or run for cover.

With the respective limitations of those two systems in mind, Robert Meyer, a professor of marketing, and Howard Kunreuther, a professor of decision sciences and public policy, both at the Wharton School of the University of Pennsylvania, outline what they describe as the six biases of human beings in dealing with low-probability, high-consequence events “for which we have little stored knowledge.” In other words, we developed fast reactions to a car swerving into our path because we have acquired a good deal of “stored knowledge” from past driving experience. But how often do we experience a tornado? Even people in states like Kansas or Oklahoma, who may hear about such events often enough, may not have enough direct experience with them to know how best to prepare. For events like Hurricane Katrina, which Gulf Coast residents may directly experience once in a lifetime, the stored knowledge is limited indeed. The will to think about such events without being prodded by prior experience is even more limited. And so disaster becomes not only a function of natural events but also of the human resistance to considering their possibility.

So, what are those six biases? First, myopia, the tendency to focus on the short term and on more likely events at the expense of more significant, long-term dangers. Second, amnesia, the willingness to tune out or ignore more distant memories that might inform our awareness of potential hazards. Third, optimism, or the will to believe that everything will turn out all right. Fourth, the inertia bias, which could also be described as our innate reluctance to disrupt the status quo. Fifth, the simplification bias, the highly understandable difficulty we face in coming to terms with more complex situations. And finally, the herding bias, otherwise known to most people as the tendency to follow the crowd, even when our reflective minds tell us the crowd may be dead wrong.

Now, to be honest, I am already engaging in the simplification bias myself by summarizing the core thesis of an entire book in one paragraph. But I am very much aware, as a writer, of what I have done, and I have done it with the explicit aim of spurring readers to explore the more detailed explanations the book offers. Even if you do not, however, there may still be a net gain in awareness simply from being exposed to the idea that such biases exist. Let me complicate matters just a little by repeating the authors’ assertion that these biases are not all bad, just as they are not all good. What matters is our awareness that these biases exist and that they are a shared legacy of our humanity. None of us can operate without them, but at the same time, our System 2 brains are designed to help us overcome the limitations they embody.

And thus, Meyer and Kunreuther urge us all to be more like ostriches, “not less.” The ostrich compensates for its physical limitations—the inability to fly—with speedy retreats from danger. Humans, with advanced intellectual skills, can do far more. In thinking about risk, the authors suggest, we can “develop policies that take into consideration our inherent cognitive limitations.”

That is, I must say, an intriguing thought—one that deserves more than a reflexive reaction. Think about it.


Jim Schwab
