
What Would it Take to Change Your Mind?

I’ve been writing about and teaching critical thinking for more than two decades. “Form beliefs on the basis of the evidence,” was my mantra, and I taught tens of thousands of students how to do just that. Why, then, did people leave my classroom with the same preposterous beliefs as when they entered—from alternative medicine to alien abductions to Obama being a Muslim? Because I had been doing it wrong.

The problem is that everyone thinks they form their beliefs on the basis of evidence. That’s one of the issues, for example, with fake news. Whether it’s Facebook, Twitter, or just surfing Google, people read and share stories that they either want to believe or that comport with what they already believe, and then they point to those stories as evidence for their beliefs. Beliefs are used as evidence for beliefs, with fake news just providing fodder.

Teaching people to formulate beliefs on the basis of evidence may, ironically, trap them in false views of reality. Doing so increases their confidence in the truth of a belief because they think they’re believing as good critical thinkers would, but they’re actually digging themselves into a cognitive sinkhole. The more intelligent one is, the deeper the hole. As Michael Shermer famously stated, “Smarter people are better at rationalizing bad ideas.” That is, smarter people are better at making inferences and using data to support their belief, independent of the truth of that belief.

What, then, can we skeptics do? Here’s my recommendation: Instead of telling people to form beliefs on the basis of evidence, encourage them to seek out something, anything, that could potentially undermine their confidence in a particular belief. (Not something that will, but something that could. Phrased this way it’s less threatening.) This makes thinking critical.

Here’s an example of how to accomplish that: Jessica believes Obama is a Muslim. Ask her, on a scale from 1–10, how confident she is in that belief. Once she’s articulated a number, say 9, ask her what evidence she could encounter that would undermine her confidence. For example, what would it take to lower her confidence from 9 to 8, or even 6? Ask her a few questions to help her clarify her thoughts, and then invite her to seek out that evidence.

Philosophers call this process “defeasibility.” Defeasibility refers to whether a belief is revisable. For example, since Muslims don’t drink alcohol, perhaps a picture of Obama drinking beer would lower her confidence from 9 to 8, or maybe videos from the last eight years of Obama praying at Saint John’s Church in DC would be more effective, lowering her confidence to a 6. Or maybe these wouldn’t budge her confidence. Maybe she’d have well-rehearsed, uncritical responses to these challenges.

This is exactly what happened in my Science and Pseudoscience class at Portland State University. A student insisted Obama was a Muslim. When I displayed a series of pictures of Obama drinking beer on the projector, he instantly and emphatically responded, “Those pictures are photoshopped!” I asked him, on a scale of 1–10, how sure he was. He responded 9.9. I then asked him if he’d like to write an extra-credit paper detailing how the claim that the pictures were photoshopped could be false.

This strategy is effective because asking the question, “What evidence would it take to change your mind?” creates openings in someone’s belief where they challenge themselves to reflect on whether their confidence in that belief is justified. You’re not telling them anything. You’re simply asking questions. And every time you ask, it’s another opportunity for people to reevaluate and revise their beliefs. Every claim can be treated this way: as an opportunity to habituate people to seek disconfirming evidence.

If we don’t place defeasibility front and center, we’re jeopardizing people’s epistemic situation by unwittingly helping them artificially inflate the confidence they place in their beliefs. We’re undermining humility, because they’re convincing themselves they’re responsible believers and thus that their beliefs are more likely to be true. That’s the pedagogical solution. It’s the easy part.


This article appeared in Skeptic magazine 22.1 (2017)


The more difficult part is publicly saying, “I don’t know” when we’re asked a question and don’t know the answer. And more difficult still, admitting “I was wrong” when we make a mistake. These are skills worth practicing.

Critical thinking begins with the assumption that our beliefs could be in error, and if they are, that we will revise them accordingly. This is what it means to be humble. Contributing to a culture where humility is the norm begins with us. We can’t expect people to become critical thinkers until we admit our own beliefs or reasoning processes are sometimes wrong, and that there are some questions, particularly in our specialties, that we don’t know how to answer. Doing so should help people become better critical thinkers, far more than 1000 repetitions of “form beliefs on the basis of evidence” ever could.

About the Author

Peter Boghossian is an Assistant Professor of Philosophy at Portland State University and an affiliated faculty member at Oregon Health & Science University in the Division of General Internal Medicine. His popular pieces can be found in Scientific American, Time, the Philosopher’s Magazine, and elsewhere. Follow Peter on Twitter @peterboghossian.


