The necessity and shortcomings of our universal myopia

 Distinguished Teaching Fellow Maura O’Neill, BCEMBA 04

It’s no secret that we live in an era of information overload. How do we function amid the deluge of email, 24/7 news, and social media? By relying on a primal skill loaded with negative connotations: narrow-mindedness, says Haas Distinguished Teaching Fellow Maura O’Neill, BCEMBA 04, who explored the phenomenon in her interdisciplinary PhD dissertation in psychology, biology, and business at the University of Washington. Drawing on more than 60 years of research and theory in neurobiology and psychology, O’Neill describes how human memory contributes to what she calls “cognitive myopia.” Specifically, we rely heavily on our long-term memory to make decisions, and that’s where errors in judgment begin.

“It allows us to take shortcuts when we are bombarded with so much sensory data,” she says. By blocking information that doesn’t fit with what we already know, or think we want to know, our brains enable us to make decisions quickly and efficiently.

That’s the good news. On the downside, according to O’Neill, our narrow-mindedness can lead to bad choices based on too little information. Often, we overlook key data deliberately or unconsciously. Take, for instance, the taxi industry’s failure to recognize how smartphones could revolutionize ride-hailing services. Other times, we think about a problem too narrowly, like Uber’s belief that its treatment of workers and regulators would not impact its business reputation and growth.

“Sometimes we pay an inconsequential price for our narrow-mindedness,” says O’Neill, whose three decades in business and government include a stint as the chief innovation officer of the U.S. Agency for International Development under President Obama. “And other times these decision-making errors can lead to catastrophic results.” She cites as examples the collective failure ahead of 9/11 to conceive of terrorists turning hijacked planes into weapons and missed warnings in the years leading up to the 2008 global financial crisis.

While big data holds promise to unearth new solutions, it also carries a huge risk of aggravating our narrow-mindedness, leading to even more spectacular mistakes. O’Neill is now in the early stages of researching the problem. Michigan, for example, laid off most of its human claims representatives and used an automated system to detect fraudulent claims for state unemployment insurance. A mistake in the algorithm produced an error rate of more than 90 percent, wrongly punishing some 20,000 job seekers. In Florida, a sentencing benchmark intended to predict recidivism proved wildly inaccurate because of misguided assumptions.
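The Michigan figure echoes a well-known statistical trap, the base-rate fallacy. The sketch below uses purely hypothetical numbers (not Michigan’s actual data) to show how a fraud detector that is fairly accurate on any single claim can still be wrong about most of the claims it flags, simply because real fraud is rare:

```python
# Illustrative only: hypothetical figures, not Michigan's actual data.
# When the condition being screened for (fraud) is rare, even a decent
# detector can be wrong about most of the people it flags.

claims = 100_000
fraud_rate = 0.01           # assume 1% of claims are actually fraudulent
sensitivity = 0.95          # detector catches 95% of real fraud
false_positive_rate = 0.10  # and wrongly flags 10% of honest claims

fraudulent = claims * fraud_rate          # 1,000 claims
honest = claims - fraudulent              # 99,000 claims

true_flags = fraudulent * sensitivity           # ~950 correct flags
false_flags = honest * false_positive_rate      # ~9,900 wrongful flags

error_rate = false_flags / (true_flags + false_flags)
print(f"Share of flagged claims that are wrong: {error_rate:.0%}")  # → 91%
```

With these made-up inputs, roughly nine out of ten flagged claimants are innocent, which is why auditing an algorithm’s false-positive behavior matters as much as its headline accuracy.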

The primary problem, she says, is that statistics are used to predict future events based on past patterns, without considering other possibilities. “If we don’t get the algorithms right, we’re setting ourselves up for failure,” says O’Neill, who also serves as faculty director of the UC Berkeley Executive Leadership Program. She’ll present the latest neuroscience findings on leadership, including narrow-mindedness and how to overcome it, in the program’s Nov. 6–10, 2017, session.

To minimize the negative effects of narrow-mindedness, O’Neill outlines a deliberate process based on the latest cognitive research. But it starts with self-reflection. “Until everyone recognizes that we are all narrow-minded, we are not going to overcome it,” she says. “And the remedies to the most pressing business problems or in government or in our own personal lives are going to require new innovative solutions.”