
What Would it Take to Change Your Mind?

I’ve been writing about and teaching critical thinking for more than two decades. “Form beliefs on the basis of the evidence,” was my mantra, and I taught tens of thousands of students how to do just that. Why, then, did people leave my classroom with the same preposterous beliefs as when they entered—from alternative medicine to alien abductions to Obama being a Muslim? Because I had been doing it wrong.

The problem is that everyone thinks they form their beliefs on the basis of evidence. That’s one of the issues, for example, with fake news. Whether it’s Facebook, Twitter, or just surfing Google, people read and share stories that they want to believe or that comport with what they already believe—then they point to those stories as evidence for their beliefs. Beliefs are used as evidence for beliefs, with fake news just providing fodder.

Teaching people to formulate beliefs on the basis of evidence may, ironically, trap them in false views of reality. Doing so increases their confidence in the truth of a belief because they think they’re believing as good critical thinkers would, but they’re actually digging themselves into a cognitive sinkhole. The more intelligent one is, the deeper the hole. As Michael Shermer famously stated, “Smarter people are better at rationalizing bad ideas.” That is, smarter people are better at making inferences and using data to support their belief, independent of the truth of that belief.

What, then, can we skeptics do? Here’s my recommendation: Instead of telling people to form beliefs on the basis of evidence, encourage them to seek out something, anything, that could potentially undermine their confidence in a particular belief. (Not something that will, but something that could. Phrased this way it’s less threatening.) This makes thinking critical.

Here’s an example of how to accomplish that: Jessica believes Obama is a Muslim. Ask her, on a scale from 1–10, how confident she is in that belief. Once she’s articulated a number, say 9, ask her what evidence she could encounter that would undermine her confidence. For example, what would it take to lower her confidence from 9 to 8, or even 6? Ask her a few questions to help her clarify her thoughts, and then invite her to seek out that evidence.

Philosophers call this process “defeasibility.” Defeasibility basically refers to whether or not a belief is revisable. For example, as Muslims don’t drink alcohol, perhaps a picture of Obama drinking beer would lower her confidence from 9 to 8, or maybe videos over the last eight years of Obama praying at Saint John’s Church in DC would be more effective, lowering her confidence to a 6. Or maybe these wouldn’t budge her confidence. Maybe she’d have well-rehearsed, uncritical responses to these challenges.

This is exactly what happened in my Science and Pseudoscience class at Portland State University. A student insisted Obama was a Muslim. When I displayed a series of pictures of Obama drinking beer on the projector, he instantly and emphatically responded, “Those pictures are photoshopped!” I asked him, on a scale of 1–10, how sure he was. He responded 9.9. I then asked him if he’d like to write an extra-credit paper detailing how the claim that the pictures were photoshopped could be false.

This strategy is effective because asking the question, “What evidence would it take to change your mind?” creates openings or spaces in someone’s belief where they challenge themselves to reflect upon whether or not their confidence in that belief is justified. You’re not telling them anything. You’re simply asking questions. And every time you ask, it’s another opportunity for people to reevaluate and revise their beliefs. Every claim can be viewed as such an opportunity: a chance to habituate people to seek disconfirming evidence.

If we don’t place defeasibility front and center, we’re jeopardizing people’s epistemic situation by unwittingly helping them artificially inflate the confidence they place in their beliefs. We’re fostering less humility because they’re convincing themselves they’re responsible believers and thus that their beliefs are more likely to be true. That’s the pedagogical solution. It’s the easy part.

This article appeared in Skeptic magazine 22.1 (2017)

The more difficult part is publicly saying, “I don’t know” when we’re asked a question and don’t know the answer. And more difficult still, admitting “I was wrong” when we make a mistake. These are skills worth practicing.

Critical thinking begins with the assumption that our beliefs could be in error, and if they are, that we will revise them accordingly. This is what it means to be humble. Contributing to a culture where humility is the norm begins with us. We can’t expect people to become critical thinkers until we admit our own beliefs or reasoning processes are sometimes wrong, and that there are some questions, particularly in our specialties, that we don’t know how to answer. Doing so should help people become better critical thinkers, far more than 1000 repetitions of “form beliefs on the basis of evidence” ever could.

About the Author

Peter Boghossian is an Assistant Professor of Philosophy at Portland State University and an affiliated faculty member at Oregon Health & Science University in the Division of General Internal Medicine. His popular pieces can be found in Scientific American, Time, the Philosopher’s Magazine, and elsewhere. Follow Peter on Twitter @peterboghossian.


  1. Craig Hoyer says:

    OK. I taught math in a religious college, where I found students only slightly more willing to self-challenge, which increased my motivation to push for truth. My motivation to talk to a true believer evaporates when she seems to choose to ignore good rules of evidence, regurgitate dogma, and abandon curiosity. Challenges to a believer’s interpretation of the universe can be emotionally wrenching for the challenger, let alone the deluded. In standard “emotional rebound” mode, the believer reacts with self-justification. This does not help. Asking believers what evidence it would take to change their beliefs presupposes curiosity, an understanding of evidence in context, clarity about how the mind works, a willingness to believe not in scientists but in the scientific method, and probably a willingness to commit thought crimes against god. Believers want to fit in and be comfortable, and in my experience challenging them to skepticism fails: they see ahead to a consequential life-change if what they know is wrong, and that ain’t gonna happen none round here. Generally, believers misapprehend the need for truth.

  2. Dfg says:

    I see it differently. Skepticism = uncertainty.

    We humans don’t seem to like that.

    We need to know how things work so that we can have (at least an impression of) control over things.

    It takes quite a lot of mental strength and balance to accept uncertainty and move forward. Not everyone has that, and in many cases you really can’t do much about it.

    Take a person who is looking for more stability = certainty. Teaching him to cross-check sources and verify facts will not work in the direction you expect, because the basic hypothesis, that he is seeking the truth, is wrong: he is only seeking more stability.

    So maybe the first thing to do is to explain to people that seeking the truth is painful. It means trying to break the foundations of your mind all the time. So many times, scientists have had to throw away theories they built over their entire lifetimes, and many couldn’t.

    As an illustration, take the movie “Shutter Island.” During that movie, you build a story in your mind brick after brick, just to have it completely ruined at the end. I found the movie impressive, but it left me with a weird, very unpleasant feeling that I had a hard time explaining.

    So progressing towards truth means being trained to tolerate this uncomfortable feeling: the impression of having wasted your time, done harm (convincing others of wrong things), made bad decisions… and still move forward.

  3. Dan Walter says:

    “Skepticism is not cynicism or denial; it is the state of mind that does not agree quickly, that does not accept or take things for granted. A mind that accepts is seeking, not enlightenment or wisdom, but refuge.” J. Krishnamurti
