Interview with Laura Helmuth
by Zena Ryder
Laura Helmuth has a PhD in cognitive neuroscience from the University of California, Berkeley and is the Editor in Chief of Scientific American. Before that, she was the Health and Science Editor at The Washington Post and the President of the National Association of Science Writers.
She’s also been an editor and contributing writer for National Geographic, Smithsonian, Slate, and Science magazines. Helmuth’s writing has also appeared in The New York Times, National Wildlife, Nautilus, Mother Jones, and other publications.
Do misrepresentations of how science works contribute to the problem of misinformation?
Yes. Researchers have discovered a common misleading narrative in news stories about science: The scientist had a question, they did an experiment, and they got the answer. It’s so simple and tidy.
During the pandemic, science has progressed very quickly and researchers have learned a lot in a short period of time. Much of what we thought at the beginning, based on our knowledge of other viruses, turned out not to be true. This virus is unusual in some ways.
Those mistakes have been fuel for people who say things like, “Well, Tony Fauci said in March that there’s no reason to wear a mask, so you can’t trust him on anything.”
It’s really important for journalists to show that science is an iterative, self-correcting process. Journalists should be saying, “This is what we know so far, this is what we don’t know, things continue to change, and when we get more information, we’ll fill you in.”
How can journalists improve their science communication?
There’s a lot of research in the social sciences about where misinformation comes from, how it spreads, who believes it, what’s effective in countering it. Journalists who are familiar with this research are trying to use it to inform their own practice. For example, in Scientific American, we’ve used the research on effective debunking to debunk myths. We had a story on myths about the COVID pandemic. For each one we said: here’s what’s true, here’s what’s not true, here’s why people believe it — and we explained the best practices for dealing with people who believe the myths. You don’t just tell someone they’re wrong. You need to explain where the misinformation came from and give the facts to replace the false beliefs.
Journalists also need to broadcast their trustworthiness. One way to do this is to show their process, especially if a story uses anonymous sources or is on a controversial topic. Sometimes, there'll be what's called a 'did it box'. It's a couple of paragraphs explaining who was interviewed, which interviewees didn't want to share their names because of fear of retaliation, and so on. The point is for journalists to be transparent about how they know what they know.
It’s also important to establish someone’s expertise by explicitly explaining why people should listen to this person. For example, an epidemiologist who’s been modeling infectious diseases for years is worth listening to about coronaviruses.
We’ve talked about journalists improving science communication. What about on the receiving end? Is science education getting worse?
There has actually been some improvement in science education. Fifteen or 20 years ago, there was a big push to teach creationism, under the guise of ‘intelligent design’, as an alternative to evolution. Scientists and science writers spent a lot of time presenting to school boards, defending evolution and showing why creationism is wrong and shouldn’t be taught.
The National Center for Science Education regularly polls science teachers. It used to be the case that way less than half of science teachers taught evolution as a fact. Now, it’s more like two thirds are teaching evolution without any false ‘balance’ presenting an alternative. So that’s some good news.
Once people have rejected science in one area — such as evolution — do they become more likely to reject it in another, such as COVID?
Yes, and we’re seeing some religious figures saying that the COVID vaccines are dangerous. Once someone believes that scientists are conspiring to make up evolution, then it’s easy to believe all kinds of terrible things about them, like they’re conspiring to make up this pandemic.
I think it's made a difference, at least in the short term, that Trump is out of office. Studies looking at exposure to misinformation about COVID found that the vast majority originated from the White House.
How can we restore trust in expertise? And whose responsibility is that?
Scientists are busy and it’s not part of their job to talk to laypeople. But it’s really important for them to share their knowledge, especially when it’s about something that’s in the news, or it’s something people have concerns about.
I encourage scientists and other experts to write opinion columns. They need to say what they know and what they think it means, and not be overly cautious. Otherwise, we get overconfident white men with no scientific knowledge making false claims and getting attention.
Someone’s going to occupy that space and it should be occupied by people who know what they’re talking about.
How can real science information compete with pseudoscientific misinformation?
Our goal as science writers is to make science intriguing, fun, and welcoming — to help people feel it’s for them and that you don’t have to be a specialist to understand or appreciate it. But it’s hard because people who are not constrained by facts are always at an advantage.
The idea of people believing the earth is flat used to be a joke. Nobody really believed it. But then YouTube started showing videos claiming the earth was flat and they were getting lots of clicks. Because of the way the algorithms worked, once someone watched one flat earth video, they’d be fed another one. And pretty soon, you had a lot of people believing the earth is flat.
So, part of the response to the problem of pseudoscience is to identify it, expose it, and pressure the platforms to do something about it. YouTube did de-platform the flat earth videos and changed its algorithms.
To compete with the messaging, we can use strong storytelling techniques. For example, we can tell a science discovery story, showing the steps along the way — the excitement, the disappointments, the surprises. The scientist is a character and the plot is about their quest for discovery. We can show the excitement of science in a way that’s honest and not over-simplified.
What two or three things do you think ordinary people can do to tackle misinformation?
In the April issue of Scientific American, we have a story by Kathleen Hall Jamieson, who's been studying misinformation and public attitudes towards science for decades. She has practical advice for what to do when someone you know says something false about, say, vaccines and microchips. You have an opportunity to be a small-scale science communicator. Don't mock people, don't make them feel stupid, don't shame them. Draw them out and ask them where they heard the story. Help people realize that they're being tricked. And then provide accurate information from the Food and Drug Administration to show how the vaccines were tested.
The key is to let people talk, and to acknowledge that the pandemic is confusing, that there's a lot of misinformation circulating, and that we're all trying to figure out what's true and what's not.
There have been lab models of social media interactions showing that if someone posts misinformation, it really does help to comment with a link to a site that debunks the misinformation. The person who posted it might not change their mind, but their followers are much less likely to believe and share the misinformation. Also, by sharing reliable information yourself, you help crowd out misinformation.
What other things should we do when we’re trying to change someone’s mind?
A strategy that seems to work is to frame the evidence in terms of things they care about, or connect it to things they already believe. Find common ground.
If you’re talking about climate change with someone who’s politically conservative, there’s no point talking about how climate change might harm an endangered species they don’t think is important. But you could talk about how nuclear power could make the United States more energy-independent, and fight climate change at the same time.
When talking about vaccines, what seems to work is talking about protecting others — babies too young to be vaccinated or people undergoing chemotherapy or immunocompromised people. Remind people they care about others in their community and that by getting vaccinated, they’re helping others.
Appealing to shared values is an effective way to counter misinformation.
Is there a vaccine for the infodemic?
The next Roger W. Gale Symposium in Philosophy, Politics and Economics is on March 4 and 5.
Join us for a diverse expert panel on how we can combat misinformation while preserving free speech. Laura Helmuth speaks on March 5 from 1:45 p.m. to 2:30 p.m.