The Skeptics Society & Skeptic magazine


EPISODE #202

Julia Galef — The Scout Mindset: Why Some People See Things Clearly and Others Don’t

The Scout Mindset: Why Some People See Things Clearly and Others Don’t (book cover)

When it comes to what we believe, humans see what they want to see. In other words, we have what Julia Galef calls a “soldier” mindset. From tribalism and wishful thinking, to rationalizing in our personal lives and everything in between, we are driven to defend the ideas we most want to believe—and shoot down those we don’t. But if we want to get things right more often, argues Galef, we should train ourselves to have a “scout” mindset. Unlike the soldier, a scout’s goal isn’t to defend one side over the other. It’s to go out, survey the territory, and come back with as accurate a map as possible. Regardless of what they hope to be the case, above all, the scout wants to know what’s actually true.

In The Scout Mindset, Galef shows that what makes scouts better at getting things right isn’t that they’re smarter or more knowledgeable than everyone else. It’s a handful of emotional skills, habits, and ways of looking at the world—which anyone can learn. With fascinating examples ranging from how to survive being stranded in the middle of the ocean, to how Jeff Bezos avoids overconfidence, to how superforecasters outperform CIA operatives, to Reddit threads and modern partisan politics, Galef explores why our brains deceive us and what we can do to change the way we think.

Julia Galef is the host of the popular Rationally Speaking podcast, where she has interviewed thinkers such as Tyler Cowen, Sean Carroll, Phil Tetlock, and Neil deGrasse Tyson. She is an advisor to OpenAI, works with the Open Philanthropy Project, and cofounded the Center for Applied Rationality. Her 2016 TED Talk “Why You Think You’re Right—Even If You’re Wrong” has been viewed over 4 million times.

Shermer and Galef discuss:

  • mind metaphors,
  • Daniel Kahneman’s Thinking, Fast and Slow,
  • Daniel Kahneman vs. Gerd Gigerenzer: how irrational are humans?
  • What if you’re right? Shouldn’t you be a soldier in defense of the truth?
  • myths about the “team of rivals”,
  • beliefs and truths: empirical, religious, political, ideological, aesthetic, personal,
  • social media effects and company regulations?
  • BLM, #metoo, woke, gender, antiracism, etc.,
  • science denial and how to deal with it,
  • selective skeptic test,
  • the outsider test,
  • the ideological Turing test,
  • deception and self-deception,
  • conspiracy theories,
  • persuasion, influence and volition/free will,
  • how to use the principles in The Scout Mindset to structure a meeting between Arabs and Israelis.
Scout mindset vs. soldier mindset (from Shermer’s review of Galef’s book in the Wall Street Journal)

Soldiers rationalize, deny, deceive and self-deceive, and engage in motivated reasoning and wishful thinking in order to win the battle of beliefs. “We talk about our beliefs as if they’re military positions, or even fortresses, built to resist attack,” she writes. “Beliefs can be deep-rooted, well-grounded, built on fact, and backed up by arguments. They rest on solid foundations. We might hold a firm conviction or a strong opinion, be secure in our convictions or have an unshakeable faith in something.” This soldier mindset leads us to defend against people who might “poke holes” in our logic, “shoot down” our beliefs, or confront us with a “knock-down” argument, all of which may leave our beliefs “undermined”, “weakened”, or even “destroyed”, so that we become “entrenched” in them lest we “surrender” to the opposing position.

Soldiers are more likely to agree with statements like these: “Changing your mind is a sign of weakness.” “It is important to persevere in your beliefs even when evidence is brought to bear against them.” Scouts are more likely to agree with these statements: “People should take into consideration evidence that goes against their beliefs.” “It is more useful to pay attention to those who disagree with you than to pay attention to those who agree.” Scouts, Galef explains, “revise their opinions incrementally over time, which makes it easier to be open to evidence against their beliefs” and “they view errors as opportunities to hone their skill at getting things right, which makes the experience of realizing ‘I was wrong’ feel valuable, rather than just painful.” In fact, Galef suggests, let’s drop the whole “wrong” confession and instead describe the process as “updating”, a reference to Bayesian reasoning in which we revise our estimate of the probability of something being true after gaining new information about it. “An update is routine. Low-key. It’s the opposite of an overwrought confession of sin,” Galef continues. “An update makes something better or more current without implying that its previous form was a failure.”
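The “updating” Galef has in mind is essentially Bayes’ rule applied to belief. As a minimal sketch (not from the book; the function and numbers below are hypothetical), a single Bayesian update in Python looks like this: a 70 percent prior is revised downward, not abandoned, when new evidence fits the opposing view better.

# A single Bayesian update: P(H | E) = P(E | H) * P(H) / P(E)
def bayesian_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Revise the probability of a belief H after seeing evidence E."""
    p_evidence = p_evidence_if_true * prior + p_evidence_if_false * (1 - prior)
    return p_evidence_if_true * prior / p_evidence

# Hypothetical numbers: start 70% confident, then meet evidence that is
# twice as likely if the belief is false (0.4) as if it is true (0.2).
prior = 0.70
posterior = bayesian_update(prior, p_evidence_if_true=0.2, p_evidence_if_false=0.4)
print(f"Confidence revised from {prior:.0%} to {posterior:.0%}")  # about 54%

The point of the sketch is the tone Galef recommends: the belief is not “destroyed”, it simply becomes less probable, and the revision is routine rather than a confession.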

If you enjoy the podcast, please show your support by making a $5 or $10 monthly donation.

This episode was released on August 21, 2021.


SKEPTIC App

Whether at home or on the go, the SKEPTIC App is the easiest way to read your favorite articles. Within the app, users can purchase the current issue and back issues. Download the app today and get a 30-day free trial subscription.

Download the Skeptic Magazine App for iOS, available on the App Store
Download the Skeptic Magazine App for Android, available on Google Play