by Zena Ryder
Timothy Caulfield is a Canada Research Chair in Health Law and Policy, a Professor in the Faculty of Law and the School of Public Health, and Research Director of the Health Law Institute at the University of Alberta. He’s a Fellow of the Royal Society of Canada and the Canadian Academy of Health Sciences.
Caulfield has published over 350 academic articles based on his interdisciplinary research on topics such as public representations of science, research ethics, and public health policy. His writing has appeared in numerous media outlets, including The Globe and Mail, NBC News, CBC, and The Walrus. He’s the host and co-producer of the TV documentary series A User’s Guide to Cheating Death, which has been shown in over 60 countries.
Social media. In general, we’ve found that conventional sources of news have done a fairly good job of representing the pandemic situation as it evolves. Not perfectly: misinformation is still there, along with false balance, bad headlines, and a failure to communicate uncertainty. But, overall, pretty good. The problem of misinformation has largely been a social media phenomenon.
Interesting studies have shown that someone who gets their news from social media platforms is more likely to be misinformed and to believe misinformation. Of course, there’s a correlation-causation methodological limitation you have to watch for. So you have to come at this issue from different directions. But there’s a body of evidence that supports the view that social media is a big part of the problem.
Also the scope of the pandemic has created a lot of understandable fear and uncertainty. We know those factors can drive people to embrace misinformation and conspiracy theories. Science communicators and public health officials haven’t done a great job of representing scientific uncertainty. They’ve been learning, though, during this pandemic.
This pandemic also emerged at a time when public discourse was very polarized. So, all of a sudden, public health interventions became representative of ideological leanings, which is remarkable. Who would have guessed that wearing a mask would become a flag of your ideological position? The problem is that once something becomes ideological, it becomes part of someone’s personal identity, and it becomes more difficult to change people’s minds. And we’re seeing that with the anti-mask pushback. They’re using ideology to draw people in, with language like ‘choice, freedom, liberty’. That kind of language can be very persuasive. It invites people to become part of a community. And once someone is part of that community, it becomes easier for them to accept other anti-science views, such as anti-vaccination.
A silver lining of this pandemic is that there’s been a growing agreement that misinformation causes harm. And there is plenty of research showing that debunking — countering misinformation with facts — does work.
But you have to do it right. First, you have to use good, trustworthy science — especially if you can point to a scientific consensus.
Second, you can highlight the rhetorical tricks used to push misinformation: pointing out, for example, that the content relies on an anecdote rather than scientific research, misrepresents the degree of risk, or pushes an implausible conspiracy theory.
Third, you want to make sure the factual content is shareable and appealing on social media: well-written, easy to understand, with good graphics, and so on.
Fourth, I personally think that it’s important to listen, to be humble and empathetic. The empirical evidence on the impact of this is mixed. There’s some recent evidence that tone may not matter that much. But if you’re hoping to have an ongoing engagement with someone, or an audience, I suspect it’s important to be empathetic and kind to people you’re disagreeing with.
Fifth, remember that the general public is the audience, not that hardcore denier. It’s easy to get sucked into an argumentative vortex with a hardcore denier, but try not to let that happen. It’s really hard to change their minds. Your target is their followers, their audience. There’s actually only a small number of hardcore COVID deniers. It’s the ‘movable middle’ you should focus on.
The data supporting the backfire effect is actually much more equivocal than it’s sometimes been portrayed. The studies supporting the backfire effect were conducted around 2010 and the effect was found in a very specific, political context. Most studies since then have either not found the backfire effect at all, or they’ve found it to be rare, or context-specific. So the takeaway is to not let the idea of entrenching people’s views scare you away from debunking misinformation.
There hasn’t been as much research exploring the amplification effect, but the studies so far suggest that it also shouldn’t scare you away from debunking. It’s OK to repeat the misinformation when you’re correcting it, so long as you make the correct information the memorable part.
Prebunking can be a few different things. It can be warning people about misinformation, or giving people critical thinking skills so they’re better able to deal with misinformation, or flagging content on social media. Anything that attempts to deal with misinformation before someone believes it.
There’s been some recent research by Gordon Pennycook and others showing that debunking worked better than prebunking in a specific scenario. But this is such a huge phenomenon and we need to come at it from every direction. We need to do some prebunking and some live debunking. It shouldn’t be only one or the other. Everyone can engage in addressing misinformation using different tools, different styles, different social media platforms, and to varying depths and breadths. Some people may feel comfortable just giving others prebunking tools, and that’s totally fine.
It can be easy to feel pessimistic about whether battling misinformation works, because we see so much misinformation circulating out there. But just imagine what it would look like if we didn’t do this, if nobody was fighting misinformation! It would be a disaster!
I was doing the research for that book in 2014 and 2015, and I had no idea her wellness brand would get this popular. There are a lot of theories about why she’s so popular. Her supporters will say that the conventional healthcare system has treated women terribly, and Paltrow is giving them something that’s especially for them.
And, absolutely, the conventional healthcare system has faults and has treated women — and other groups — poorly, not listened to them, and so on. But those faults are not an excuse to exploit the problem and sell bunk! People should actually be more angry at Paltrow, not less, because she’s exploiting a genuine problem in order to profit from it.
We talked before about appealing to ideology, and her wellness brand really does play to people who are attracted to a certain kind of lifestyle and have a certain kind of worldview. When I first started writing about her, she wasn’t as polarizing. Most people dismissed her and thought the brand was silly. Now something like 80 per cent of people hate her, and 5 or 10 per cent really like her. That’s all she needs to have a successful business, so she plays to that market.
The next Roger W. Gale Symposium in Philosophy, Politics and Economics is on March 4 and 5.
Join us for a diverse expert panel on how we can combat misinformation while preserving free speech. Timothy Caulfield speaks on March 4 from 1:00 p.m. to 1:45 p.m.