Kickstarting Urban Innovations

We often hear from conservatives that the public sector is inherently inefficient, lacking the competitive pressures that drive innovation. A great deal of the evidence seems anecdotal, although it’s not hard to come by. The work of most public agencies is at least somewhat more visible than that of most corporations. People build long memories, for instance, of long lines for driver’s license renewals. Today, however, most of us renew online unless there is some reason we need to retake the test or replace a lost license quickly. How often do we remember the agencies that have created more modern customer service operations? I read today in the Chicago Tribune’s endorsement of Cook County Treasurer Maria Pappas for re-election that, over the past 20 years, she had reduced her staff from 250 to 88.5 while also digitizing more than 4.7 million pages of old records. Was anyone noticing?

Count me a skeptic on the broad theory of greater efficiency in the private sector. I’ve long been more inclined to assess efficiency and innovation on a case-by-case basis. I don’t think it is at all impossible to innovate and streamline public-sector operations, nor do I think all businesses operate efficiently. In due course, obviously, the latter may go out of business, but there are also ways for voters to hold public agencies accountable—if they are so inclined, which sometimes they are not. People are sometimes more than willing to re-elect dinosaurs for a variety of reasons, just as they sometimes continue to patronize sluggish businesses. Human decision making in both the political and economic realms can be rather fallible, a fact that the field of economics is finally coming to terms with (witness the work of Amos Tversky and Daniel Kahneman, the subjects of the Michael Lewis book The Undoing Project, or of Richard Thaler).

It is thus heartening to see someone like Gabe Klein produce a short, snappy volume, Start-Up City, that offers both case studies and principles for public-sector innovation in our cities, with an eye to building cities of tomorrow that compete with their peers in fostering both better service to their constituents and better prospects for environmentally and socially beneficial economic development.

Gabe Klein. Images courtesy of Island Press.

Klein’s book discusses the innovations he introduced first in Washington, D.C., and later in Chicago, in both cases managing the city’s transportation department. In appointing him to the first post, the mayor of Washington pulled Klein out of the private sector, where he had been involved with a bicycle company, food trucks, and Zipcar. Those enterprises exposed him to competitive entrepreneurial environments where innovation flourished, but also to the obstacles that regulators can pose to innovation. Klein flourished in the public sector as well because he firmly believed that innovation in public agencies, even those that had been somewhat stodgy and resistant to change in the past, was eminently possible. He set out to prove it through a combination of public-private partnerships, new thinking about agency missions, and a determination to find a way to make things happen. He distills the lessons from these experiences in a series of eight short chapters that are called, well, lessons.

Those lessons are not bashful or pain-free. For example, Lesson #1: Don’t Be Afraid to Screw Up and Learn. Klein’s underlying point is simple: If you spend your time indulging a fear of making mistakes—which, sometimes, in the public sector, can be very visible—you will accomplish little or nothing. This preoccupation may ensure short-term survival, but in the end, it is self-defeating (not to mention boring). It helps, of course, to have a mayor or other public chief executive who has your back. (Get that assurance before you accept the post.)

I won’t belabor the points here because I’d rather encourage you to read this book, which is not a long read but is definitely an inspiring one. But I will note that, in both cities, despite some entrenched opposition, Klein managed to install numerous prominent bicycle lanes on city streets, as well as to engineer the completion of the 606 Trail in Chicago, a project about which I have written in the past, and the downtown Riverwalk. Along the way, he discusses the marketing and branding of public projects, creative approaches to financing those projects, and effective means of engaging the private sector in meaningful partnerships. You will also learn from Klein’s smooth prose that he is an effective salesman, communicator, and presenter; small wonder that Mayors Adrian Fenty in Washington and Rahm Emanuel in Chicago would seek him out to serve in their administrations.

But I also want to make clear that Klein is not sui generis and does not claim to be. One need only look at the innovations of Janette Sadik-Khan in New York City under Mayor Michael Bloomberg, creating pedestrian walkways in the middle of Manhattan while improving traffic flow, to know that such innovation can abound in the public sector. (Sadik-Khan has her own book, Streetfight, which details her experiences in city government.) All that is required is the political will to turn the entrepreneurial spirit loose in city government. The only real obstacle is a lack of commitment and imagination.

Jim Schwab

The Ostrich Paradox

As an urban planner, I have entered the world of disasters through the twin portals of public policy and planning practice—how we frame the priorities of government and how we carry out the tasks of community planning. One thing I have learned from years of interaction with other types of professionals is that many other portals exist that provide insights into the nature and causes of disasters, how we define them, and how we prepare for and react to them. The behavioral sciences, including psychology, sociology, communications, and economics, have played a significant role in helping us understand some of these questions. They have helped me understand that what may seem like a straight line in public policy between a problem and a solution can be laden with land mines built into the evolution of the human brain. We are capable, as a species, of contemplating the long-term consequences of our behaviors, but only when we have opportunities to gain some distance between our immediate needs and the problems we are considering. Very often, however, life forces us to react quickly and with inadequate forethought, and our brains reach for more instinctive reactions learned over millennia, some of them inherited from the species from which we evolved.

And so there is the proverbial ostrich, putting its head in the sand, supposedly to avoid seeing any painful realities. The authors of a new book, The Ostrich Paradox: Why We Underprepare for Disasters, published by Wharton Digital Press, note near the outset that, despite the widely accepted stereotype of ostriches, they are “astute escape artists” who use speed to compensate for their inability to fly. They suggest humans become more like ostriches, not less, by recognizing our own limitations and consciously seeking to address them. But first, we need to know what those shortcomings are and why, because of them, humans routinely fail to anticipate and prepare for disasters.

They start by reviewing a concept of the human brain discussed at length by Daniel Kahneman in Thinking, Fast and Slow several years ago—that of the two systems that allow us to do precisely what Kahneman’s title suggested. System 1 operates more rapidly, with learned and instinctive responses to everyday situations, such as slowing down or swerving to avoid car crashes, or stepping away from snakes. These reactions are quick reflexes that are often entirely unconscious. System 2, which could never respond to the multitude of routine stimuli fast enough to allow us to cope or survive, instead helps us focus and reflect, sometimes allowing us to train our minds to react differently but also, importantly, to gain perspective on the issues facing us. Planning, for instance, is largely an intellectual activity in which we process information, mull it over, and try to anticipate how future conditions may affect our community and its ability to achieve stated goals. It also takes time and does not allow us to react to immediate threats, such as a bolting horse or the sound of gunfire. When we hear gunfire, we don’t contemplate what it is; we duck or run for cover.

With the respective limitations of those two systems in mind, Robert Meyer, a professor of marketing, and Howard Kunreuther, professor of decision sciences and public policy, both at the Wharton School of the University of Pennsylvania, outline what they describe as the six biases of human beings in dealing with low-probability, high-consequence events “for which we have little stored knowledge.” In other words, we developed fast reactions to a car swerving into our path because we have acquired a good deal of “stored knowledge” from past driving experience. But how often do we experience a tornado? Even people in states like Kansas or Oklahoma, who may hear about such events often enough, may not have enough direct experience with them to know how best to prepare. For events like Hurricane Katrina, which Gulf Coast residents may directly experience once in a lifetime, the stored knowledge is limited indeed. The will to think about such events without having been prodded to do so by prior experience is even more limited. And so, disaster becomes not only a function of natural events, but of the human resistance to considering their possibility.

So, what are those six biases? First, myopia, the tendency to focus on the short term and on more likely events at the expense of more significant, long-term dangers. Second, amnesia, the willingness to turn off or ignore more distant memories that might inform our awareness of potential hazards. Third, optimism, the will to believe that everything will turn out all right. Fourth, inertia, which could also be described as our innate reluctance to disrupt the status quo. Fifth, simplification, the highly understandable difficulty we face in coming to terms with more complex situations. And finally, herding, otherwise known to most people as the tendency to follow the crowd, even when our reflective minds tell us that the crowd may be dead wrong.

Now, to be honest, I am already engaging in a simplification bias by summarizing the core thesis of an entire book in one paragraph. But I am very much aware, as a writer, of what I have done, with the explicit aim of spurring readers to explore the more detailed explanations the book offers. Even if you do not, there may be a net gain in awareness simply from being exposed to the idea that such biases exist. Let me complicate matters just a little by repeating the authors’ assertion that these biases are not all bad, just as they are not all good. What matters is our awareness that they exist and that they are a shared legacy of our humanity. None of us can operate without them, but at the same time, our System 2 brains are designed to help us overcome the limitations they embody.

And thus, Meyer and Kunreuther urge us all to be more like ostriches, “not less.” The ostrich compensates for its physical limitations—the inability to fly—with speedy retreats from danger. Humans, with advanced intellectual skills, can do far more. In thinking about risk, the authors suggest, we can “develop policies that take into consideration our inherent cognitive limitations.”

That is, I must say, an intriguing thought—one that deserves more than a reflexive reaction. Think about it.

 

Jim Schwab