Interview with Gordon Pennycook

by Zena Ryder

Gordon Pennycook

Gordon Pennycook is an Assistant Professor of Behavioural Science at the University of Regina’s Hill and Levene Schools of Business. He’s also an Associate Member of the Department of Psychology. He’s a member of the editorial board of the journal Thinking & Reasoning and a consulting editor for Judgment and Decision Making. In 2020, he was elected to the Royal Society of Canada’s College of New Scholars, Artists, and Scientists.

His research focuses on reasoning and decision-making, investigating the differences between intuitive processes (aka gut feelings) and more deliberative or analytical reasoning processes.

Pennycook has published numerous academic articles and book chapters, as well as articles in the popular press, including for the CBC and The New York Times. His research has been covered in many venues, including the BBC, The Guardian, Scientific American, and Smithsonian Magazine. He edited the book The New Reflectionism in Cognitive Science, which was published by Routledge in 2018.

Pennycook and his research team won an Ig Nobel Prize for their 2015 study, “On the Reception and Detection of Pseudo-Profound Bullshit”, which was published in the journal Judgment and Decision Making.

How did you first get interested in the issue of misinformation?

As a doctoral student, I studied reasoning, reasoning errors, and thinking styles. I started a postdoc at Yale in 2016, an American election year. There was all this hullabaloo about fake news. That enticed me to start conducting studies on people sharing misinformation on social media.

The good news is that people generally do care about the truth. Most people don’t want to share stuff that’s false. Even when two people have the same political ideology, it matters to each of them when the other person shares something false — even if it supports their shared political ideology.

If you ask people what’s important to them when they’re considering what news content to share, truth is the number one thing.

Why, then, do people share false or misleading content so often?

Often it’s because they’re relying on their intuitions or gut feelings, without pausing to ask themselves if the story is true. People find stories that are consistent with their ideology more intuitively plausible and that gut feeling prompts them to share.

Generally speaking, people who are more reflective, better at reasoning, and less likely to rely on their intuitions or gut feelings are better at this kind of thing. Even when news stories are consistent with their previously held beliefs and values, they’re better able to distinguish between true and false headlines.

What can we do about the problem of people sharing false or misleading content?

Luckily, we may not have to take on the massive task of teaching people to be better critical thinkers in order to greatly reduce the spread of misinformation. Our studies have shown that a much simpler intervention might help. Simply asking people to think about the accuracy of headlines makes them less likely to share false stuff afterwards.

There may be potential here for social media companies to institute an easy evidence-based intervention. They could prompt users to think about the accuracy of what they’re about to share and then measure how well it works.

So long as it’s evidence-based, I’m open-minded about how social media companies should tackle the problem of misinformation. At the moment, these companies might say they’re dealing with the issue, but we don’t know the details of what they’re doing, we don’t know if it’s evidence-based, and we don’t know how much it’s helping. It’s all behind closed doors.

So you don’t think countering misinformation with the truth is enough?

Although it is important to counter misinformation with facts, it’s not that simple. We have to be pragmatic about what people are likely to engage with. People are motivated to share news stories for all sorts of reasons that we need to understand. If false stories are more engaging and ‘sharable’ than true ones, for whatever reason, then just putting out facts in response to falsehoods isn’t enough; it’s not going to work.

Furthermore, if we’re always responding to misinformation with facts, we’re always going to be behind, trying to catch up. It’s hard to dislodge false beliefs that people already have. You have to get ahead of the misinformation so that people don’t see it and acquire the false belief in the first place.

Conspiracy theorists are extreme examples of people with false beliefs that are hard to dislodge. What has your research shown about conspiracy theorists?

Conspiracy theorists are notoriously resistant to altering their beliefs. Our research has uncovered one of the reasons why this might be so — conspiracy theorists hold their beliefs with much more confidence than other people hold theirs.

We ran tests where we showed subjects fuzzy images and they had to guess what the object was from a pair of choices. At the end of the series of tests, we asked subjects how many they thought they got correct.

Most people said they were guessing and had no idea, so they inferred that they were at about chance. (They actually performed slightly above chance.) But people who believed conspiracy theories were more likely to think they did well on the test, even though their performance was no different from everyone else’s.

For most people, if they believe something that nobody else believes, that’s a red flag for them. It makes them wonder whether they might be mistaken. But if someone is very confident in their beliefs, they might even take pride in the fact that nobody else believes what they do. It makes them feel smarter than other people.

Has the problem of conspiracy theories, and misinformation more generally, gotten worse over the last few years? If so, why is that?

Conspiracy theorists are an extreme example, but generally speaking, we really do not appreciate how massively different people’s information environments are. And this has huge consequences for what they believe and how they behave. I’m not talking just about social media echo chambers. The nature of the information they’re seeing is becoming more extreme.

An average American voter might start off getting most of their news from Fox News, which is itself a source of misinformation. But during the American election, many Republican viewers moved from Fox News to more extreme right-wing sources, such as One America News Network and Newsmax. Online, many turned to Breitbart. When people are in a poisonous information environment like this, it’s not surprising that they have false beliefs.

And the problem has gotten worse over the last few years. Trump was a leader with a flippant disregard for the truth. Fact checks of Trump’s statements show he made tens of thousands of false claims over the course of his presidency.

Misinformation has gone mainstream largely because of Trump.


Is there a vaccine for the infodemic?

The Misinformation Age

The next Roger W. Gale Symposium in Philosophy, Politics and Economics is on March 4 and 5.

Join us for a diverse expert panel on how we can combat misinformation while preserving free speech. Gordon Pennycook speaks on March 5 from 2:30 p.m. to 3:15 p.m.
