by Zena Ryder
Daniel Levitin is a neuroscientist, musician, and author. He’s the James McGill Professor Emeritus of Psychology, Neuroscience, and Music at McGill University in Montreal. He’s also the Founding Dean of Arts & Humanities at the Minerva Schools at the Keck Graduate Institute in San Francisco.
He began playing music professionally when he was 16 and has performed with Mel Tormé, Sting, Rosanne Cash, and David Byrne. He released his first album, Turnaround, in 2020. He not only wrote all the songs; he also sang and played guitar, bass, and keyboards on them. One of the songs from that record was performed at the Kennedy Center in Washington, D.C. with Renée Fleming.
He’s published over 300 articles in both academic journals and the popular press, and he has written a number of popular books, including The Organized Mind: Thinking Straight in the Age of Information Overload and A Field Guide to Lies: Critical Thinking with Statistics and the Scientific Method. His most recent book, Successful Aging: A Neuroscientist Explores the Power and Potential of Our Lives, was published in 2020.
My interest in misinformation has been long-standing. I’ve been teaching students for years about the scientific method — helping them figure out what it means to know something in science, how to decide whether we know something or not, how to critically evaluate articles in academic journals, and so on.
In 1998, I started teaching an undergraduate critical thinking course at McGill. I’d ask students to turn in examples of bad graphs or bad arguments they came across. I accumulated a big box of these clippings. They were mostly from newspapers and magazines like The Globe and Mail, the National Post, Maclean’s, and Newsweek. Misinformation has been around a long time.
My book, The Organized Mind, had a little section on critical thinking, and I thought it would be fun to expand that theoretical treatment into a practical guide — a textbook for lay people. And that’s how my book, A Field Guide to Lies, came about. The content of the book comes right from my students at McGill.
Over the last 10 years, I’ve seen a distressing, democracy-threatening change in the average person’s — even an educated person’s — understanding of how to separate what’s true from what’s patently false. Belief in conspiracy theories is on the rise, as are fuzzy thinking, incomplete thinking, and belief in things that are highly improbable. There are many causes of this, but I think that one of the factors was Wikipedia.
I’ve been involved in ‘edit wars’ on Wikipedia. For example, on a page about the prefrontal cortex, I would write ‘Here’s what the research literature says’ and someone would come in and replace my edit with something nonsensical. The other person’s response in the edit comments was, ‘Who cares what that says. This is what I think.’ I kept trying to revert the page to the evidence-based version, but because they apparently had more time than I did, they ended up winning.
That’s an erosion of respect for expertise that Jimmy Wales didn’t foresee when he co-founded Wikipedia. His co-founder, Larry Sanger, has been quite strident in his criticism of it, though.
When the public doesn’t respect expertise, there are all kinds of negative effects. It leads people to listen to an MIT electrical engineer or a movie actor who says that vaccines cause autism or that climate change isn’t real, for example. Those people don’t have the right expertise to be trusted sources of information on those complex topics. If people don’t care about expertise, then anybody’s opinion can have as much influence on people’s beliefs as anyone else’s.
Part of our job with this symposium is to try to make expertise sexy again.
First, we need to be educated. Students should be taught critical thinking skills as an explicit part of K through 12 education and in university programs. It’s not something that just happens incidentally while teaching other subjects.
Second, we need a society-wide effort to improve the situation. Everybody is responsible for engaging in critical thinking. People tend to model their behaviour on those around them. Studies show that if everybody else in the neighbourhood uses public transit and puts out their recycling bins, then you’re more likely to do it too.
Tackling misinformation is a collective effort, like vaccination. Everybody — or at least some critical mass — has to do it for it to work. We all need to take responsibility for what we share and what we ‘like’.
I do. We see something that captures our interest on social media. It looks interesting, it might be true, and we want others to see it, so we share it without thinking. There’s a moment of instant gratification as we cross it off our mental to-do list. We’re all feeling like we’re behind and multi-tasking and trying to get things done. So we don’t take the extra time to figure out if what we’re sharing is true or not. I think that slowing down is often what it takes, but most of us don’t want to do that.
Humility is probably the most important single factor in critical thinking and in having more true beliefs and fewer false beliefs. A humble person considers the possibility that they don’t know something, that maybe there’s another side to an issue, that maybe there’s more to learn. Humility causes us to slow down, to ask the right questions, and to realize there are experts who know more than we do.
For every section of A Field Guide to Lies, I had experts in the domain read it. For example, although I use and teach statistics, I’m not a statistician, so I asked statisticians to review the sections on statistics.
It’s important to realize that even if you think you understand something, there’s probably somebody who understands it better.
I don’t know how you teach humility, but I suspect that encouraging curiosity from a young age is part of it.
I was a judge at a Montreal science fair and some students sheepishly presented the unexpected results of an experiment that hadn’t come out the way it was ‘supposed to’. Their teacher had admonished them for this, but they had already committed to presenting at the science fair.
Instead of being admonished, those students should have been taught to ask what factors might have led to those results — to be curious. In science, an unexpected result is exciting! Although there are lots of good teachers, there are also bad ones. We need to ask more of teachers and parents when it comes to kindling curiosity.
The next Roger W. Gale Symposium in Philosophy, Politics and Economics is on March 4 and 5.
Join us for a diverse expert panel on how we can combat misinformation while preserving free speech. Daniel Levitin speaks on March 5 from 1:00 p.m. to 1:45 p.m.