Reason vs. belonging

We live in a society where facts don’t seem to mean much. If the president, for example, doesn’t like a report or scientific study, he simply calls it “fake news.”

He’s not alone. Many of us take similar positions, and, according to some recent books, there are reasons for that. In her article “Why Facts Don’t Change Our Minds” (The New Yorker, Feb. 27), Elizabeth Kolbert reviews three of these books. All, by the way, were written before last November’s election.

Kolbert reports studies from 1975 at Stanford University in which two groups of students were asked to distinguish real from fake suicide notes. One group was told they scored high, identifying 24 of the 25 notes correctly, while the other group was told they got only 10 right.

In fact, neither group did better than the other. But even after this ruse was revealed, the group that had been told it scored high still thought it was better than average at identifying real suicide notes, while the other group believed it was worse.

More studies followed, and researchers noted that even after the evidence “for their beliefs has been totally refuted, people fail to make appropriate revisions in those beliefs.”

Why do people act this way? Kolbert notes that in their book The Enigma of Reason, cognitive scientists Hugo Mercier and Dan Sperber argue that “humans’ biggest advantage over other species is our ability to coöperate.” And reason “developed [in human behavior] to resolve the problems posed by living in collaborative groups.”

Kolbert goes on to discuss “confirmation bias,” the tendency people have to embrace information that supports their beliefs and reject information that contradicts them.

Mercier and Sperber prefer the term “myside bias,” Kolbert writes. “Humans, they point out, aren’t randomly credulous. Presented with someone else’s argument, we’re quite adept at spotting the weaknesses. Almost invariably, the positions we’re blind about are our own.”

In another book by two other cognitive scientists, The Knowledge Illusion: Why We Never Think Alone, Steven Sloman and Philip Fernbach look at various studies that show the “illusion of explanatory depth.” This means people believe they know far more than they actually do.

We use things, such as toilets, as if we knew how they work when we really don’t. But, say Sloman and Fernbach, politics is different. “It’s one thing for me to flush a toilet without knowing how it operates,” Kolbert writes, “and another for me to favor (or oppose) an immigration ban without knowing what I’m talking about.”

“As a rule, strong feelings about issues do not emerge from deep understanding,” Sloman and Fernbach write. And here our dependence on other minds reinforces the problem. The more people we find who agree with us, the more we think we’re right.

One ray of hope is that when people are asked to explain the implications of a policy, they realize how clueless they are and moderate their views. This, they write, “may be the only form of thinking that will shatter the illusion of explanatory depth and change people’s attitudes.”

The third book is Denying to the Grave: Why We Ignore the Facts That Will Save Us by Jack and Sara Gorman. “Their concern is with those persistent beliefs which are not just demonstrably false but also potentially deadly, like the conviction that vaccines are hazardous,” Kolbert writes.

It’s helpful to know that we tend to go along with our group, even when the facts contradict what it believes. But for our own good, we must not leave reason behind.


All reviews express the opinions of the reviewer, not necessarily the views of Third Way.