[SUNDAY FEATURE] The problem with believing what we’re told

ILLUSTRATION: TOMASZ WALENTA

In an age of information overload, it’s important to find ways to overcome the brain’s difficulty in separating fact from falsehood

By Gary Marcus and Annie Duke | The Wall Street Journal

In a perfect world, to determine whether an idea or fact is true, we would patiently evaluate how it fits with our own experience, take note of the credibility of the source and be prepared to reconsider if new information emerges. In reality, we are often too busy and distracted to be that careful. And in those cases, we tend to assume that whatever we hear is true.

The psychologist Daniel Gilbert and colleagues documented this phenomenon in a set of studies in the early 1990s. Undergraduates at the University of Texas at Austin were asked to evaluate factual statements, such as details drawn from a criminal case; some were clearly marked as true and others as false. Given enough time and focus, the students were good at remembering the difference. When they were distracted, however, they were more likely to remember false things as being true—but not the other way around.

This tendency to assume truth first and ask questions later (if ever) has become a serious problem in our era of information overload, with the rise of so many sources of information that are either unreliable or intentionally misleading. Fortunately, there are ways to combat our bias, once we understand its evolutionary underpinnings.

Before our prehuman ancestors developed language, they formed beliefs mainly about things they experienced with their own senses. If an ancestor of yours saw a tree, there wasn’t much reason for them to question the tree’s existence. We tend to treat language as an extension of our senses, but it is much more open to manipulation.

The simple act of repeating a lie can make it seem like truth, as the Temple University psychologist Lynn Hasher and colleagues showed in a pioneering study published in 1977. The researchers asked 40 people to rate the truthfulness of a variety of statements, some true and some false. A number of statements were repeated in multiple rounds of the exercise over time. Test subjects became more likely to believe things as they were repeated, regardless of whether they were true or false. The third time they heard a false statement, they were just as likely to believe it as a true statement that they heard once.

We are even more easily snookered when pictures are included. Participants in a 2012 study, published in Psychonomic Bulletin & Review, were given statements about celebrities or general knowledge. When pictures were attached, people were more likely to believe the statements, including the fake ones. Show people a photo of a giraffe alongside the claim that it is the only mammal that can’t jump, for instance, and they are more likely to accept it, forgetting that other animals, such as elephants and hippos, can’t jump either.

Related research shows that even adding unimportant details to a statement can have a similar effect. And if you add vivid language, lies can spread more quickly still. A 2017 study in the Proceedings of the National Academy of Sciences, by Jay Van Bavel and colleagues at New York University, looked at about half a million social media messages. It found that moral and emotional words like “hate,” “destroy” or “blame” acted like an accelerant, increasing the chance that a message would spread by about 20% for each additional such word. The study also found that most of the sharing was done within political parties rather than across political divides, creating an echo-chamber effect.
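
To get a feel for how quickly that effect compounds, here is a rough back-of-the-envelope sketch in Python. It assumes, purely for illustration, that the roughly 20% boost applies multiplicatively per moral-emotional word; the constant and helper function below are hypothetical conveniences, not the study’s fitted statistical model.

    # Illustrative only: assumes the ~20%-per-word effect compounds
    # multiplicatively, a simplification of the study's reported finding.
    PER_WORD_BOOST = 1.2  # ~20% increase per moral-emotional word

    def relative_share_rate(num_words: int) -> float:
        # Expected sharing rate relative to a message with no
        # moral-emotional words.
        return PER_WORD_BOOST ** num_words

    for n in range(6):
        print(f"{n} moral-emotional words: {relative_share_rate(n):.2f}x baseline")

Under these assumptions, a message with five such words would spread at roughly 2.5 times the baseline rate, which is how a handful of charged words can act as an accelerant.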

Fake news tends to avoid nuance or neutral language and frequently adds layers of emotion and moralizing—all of which makes false items spread much faster than the real thing. A team at the Massachusetts Institute of Technology compared the spread of fake news and real news on Twitter in a 2018 study published in the journal Science. Looking at about 126,000 news stories shared on Twitter over the previous 11 years, they found that fake news stories were 70% more likely to be retweeted than true stories. Real news took about six times as long as fake news to reach a benchmark audience of 1,500 people.

Savvy propagandists have long exploited the tendency of the human brain to take shortcuts. But social networks make it far easier, because they feed on a further human vulnerability: our need for approval, affection and positive feedback. Combine human cognitive weakness with social networks, and you have a recipe for chaos.

The good news is that there’s increasing evidence that the needed critical-thinking skills can be taught. In a study posted in November to SSRN, an online repository for working papers, Patricia Moravec of Indiana University’s Kelley School of Business and her colleagues looked at whether they could improve people’s ability to spot fake news. When first asked to assess the believability of true and false headlines posted on social media, the 68 participants—a mix of Democrats, Republicans and independents—were more likely to believe stories that confirmed their own prior views. But a simple intervention had an effect: asking participants to rate the truthfulness of the headlines. That tiny bit of critical reflection mattered, and it even carried over to other articles that the participants hadn’t been asked to rate. The results suggest that just asking yourself, “Is what I just learned true?” could be a valuable habit.

Similar research has shown that just prompting people to consider why their beliefs might not be true leads them to think more accurately. Even young children can learn to be more critical in their assessments of what’s truthful, through curricula such as Philosophy for Children and other programs that emphasize the value of careful questioning and interactive dialogue. Ask students to ponder Plato, and they just might grow up to be more thoughtful and reflective citizens.

Rather than holding our collective breath waiting for social media companies to magically resolve the problem with yet-to-be-invented algorithms for filtering out fake news, we need to promote information literacy. Nudging people toward critical reflection is becoming ever more important as malicious actors find more potent ways to use technology and social media to exploit the frailties of the human mind. We can start by recognizing our own cognitive weaknesses and taking responsibility for overcoming them.

—Dr. Marcus is CEO and founder of Robust.AI and the co-author, most recently, of “Rebooting AI,” to be published by Pantheon in September. Ms. Duke is the author of “Thinking in Bets” and co-founder of The Alliance for Decision Education.
