What is Truth, Anyway?
  • Do you believe global warming is real?
  • Do you believe in the germ theory of disease?
  • Do you believe masks work and should be mandated?
  • Do you believe Jesus was resurrected?
  • Do you believe the Holocaust happened?
  • Do you believe there are objective morals and values in life?

As a public intellectual who engages in debates and conversations on a wide range of subjects, I am often asked questions such as these. They puzzled me at first, until I realized that my interlocutors were conflating beliefs with facts.

For example, I don’t “believe in” the germ theory of disease. I accept it as factually true, and as we’ve seen in the recent pandemic, a germ like the SARS-CoV-2 virus is not something to believe or disbelieve in. It simply is, as a matter of fact, and it can cause a deadly disease like Covid-19.

Whether vaccines and masks slow its spread is also a factual question that science, at least in principle, can answer; whether they should be mandated by law is a political question, distinct from the scientific one. But asking if you “believe in” the SARS-CoV-2 virus would be like asking if you “believe” in gravity. Gravity is just a brute fact of nature. It’s not something to believe or disbelieve.

As the science fiction author Philip K. Dick famously quipped, “Reality is that which, when you stop believing in it, doesn’t go away.”

Objective Truths and Justified True Belief

What we’re after here is knowledge, which philosophers traditionally define as justified true belief. That is, we want to know what is actually true, not just what we want to believe is true. The problem is that none of us are omniscient. If there is an omniscient God, it’s not me, and it’s also not you. Or, in the secular equivalent, there is objective reality but I don’t know what it is, and neither do you.


Once we agree that there is objective truth out there to be discovered, and that none of us knows for certain what it is, we need to work together through open dialogue in communities of truth-seekers to figure it out, starting by acknowledging our shortcomings as finite, fallible beings subject to all the cognitive biases that come bundled with our reasoning capacities. The workaround for this problem is having adequate evidence to justify one’s beliefs. Here are two examples from science:

  • Dinosaurs went extinct around 65 million years ago. This is true by verification and replication of radiometric dating of volcanic ash layers above and below dinosaur fossils. Since each layer can be accurately dated, we infer that the age of a fossil falls between those two dates. Above the strata dated to 65 million years ago, there are no more dinosaurs. Ergo, we can assert with a high degree of confidence that this is an objective fact, and we can be satisfied in the truth of the proposition that dinosaurs went extinct around 65 million years ago, unless and until new data emerge.
  • Our universe came into existence at the Big Bang some 13.8 billion years ago. This is true based on a convergence of evidence from a wide range of phenomena: the cosmic microwave background, the abundance of light elements like hydrogen and helium, the distribution of galaxies and the large-scale structure of the cosmos, the redshift of most galaxies indicating that they are moving away from one another as if from a giant explosion, and the expansion of space-time itself, which produced the accelerating, expanding cosmos we see today.
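The bracketing inference in the dinosaur example can be written out explicitly. This is our illustration of the logic, not anything from the book, and the dates are placeholders:

```python
def bracket_fossil_age(layer_below_ma, layer_above_ma):
    """Infer the possible age range of a fossil found between two
    radiometrically dated volcanic ash layers. Deeper layers are older,
    so the fossil must be younger than the layer beneath it and older
    than the layer above it. Ages are in millions of years (Ma)."""
    assert layer_below_ma > layer_above_ma, "the deeper layer must be older"
    return (layer_above_ma, layer_below_ma)  # (youngest, oldest) possible age

# A fossil sandwiched between ash layers dated 66.5 Ma (below) and
# 65.2 Ma (above) must be between 65.2 and 66.5 million years old.
youngest, oldest = bracket_fossil_age(66.5, 65.2)
```

The dating of the layers does the heavy lifting; the inference about the fossil itself is just an interval bound.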

The above propositions are “true” in the sense that the evidence is so substantial that it would be unreasonable to withhold our provisional assent. At the same time, it’s not impossible, for example, that the dinosaurs went extinct recently, just after the creation of the universe some 10,000 years ago (as Young Earth Creationists assert). However, this proposition is so unlikely, so completely lacking in evidence, and so evidently grounded in religious faith, that we need not waste our time considering it any further (the debate about the age of the Earth was resolved over a century ago). 

Thus, a scientific truth is a claim for which the evidence is so substantial that it is rational to offer one’s provisional assent. Provisional is the key word here. Scientific truths are provisional and may change as the evidence changes.

The ECREE Principle, or Why Extraordinary Claims Require Extraordinary Evidence

In his 1980 television series Cosmos, in an episode on the possibility of extraterrestrial intelligence existing somewhere in the galaxy, or of aliens having visited Earth, Carl Sagan popularized a principle about proportioning one’s beliefs to the evidence when he pronounced that “extraordinary claims require extraordinary evidence.” The ECREE principle was articulated in the 18th century by the Scottish Enlightenment philosopher David Hume, who wrote in his 1748 An Enquiry Concerning Human Understanding that “a wise man proportions his belief to the evidence.”

ECREE means that an ordinary claim requires only ordinary evidence, but an extraordinary claim requires extraordinary evidence. Here’s a quotidian example. I once took a road trip from my home in Southern California to the Esalen Institute in Big Sur, California, home of all things New Age. To get there I took the 210 freeway north to the 118 freeway north to the 101 freeway north to San Luis Obispo, where I exited to Highway 1 and followed the Pacific Coast Highway north through Cambria and San Simeon until arriving at the storied home of the 1960s Human Potential Movement. Weirdly, just past Cambria, a bright light hovered over my car. Thinking it was a police helicopter, I pulled over to the side of the road, fearful that I had been busted for speeding (which I am wont to do). But it wasn’t the cops. It was the aliens, and they abducted me into their mothership and whisked me off to the Pleiades star cluster, where their home planet is located. There I met extraterrestrial beings who gave me a message to take back to Earth: we must stop global warming and nuclear proliferation…or else.


Now, which part of this story triggers your insistence on additional evidence? That’s obvious. My claim to have driven on California highways is ordinary and calls for only ordinary evidence (in this case, you can just take my word for it), but my claim to have been abducted by aliens and rocketed off to the Pleiadian home planet is extraordinary, and unless I can provide extraordinary evidence—like an instrument from the dashboard of the alien spaceship, or one of the aliens themselves—you should be skeptical.

ECREE also suggests that belief is not an either-or, on-off switch—not a discrete state of belief or disbelief, but a continuum on which you can place confidence in a belief according to the evidence: more evidence, more confidence; less evidence, less confidence. Consider the extraordinary claim that another bipedal primate called Bigfoot, or Yeti, or Sasquatch survives somewhere on Earth. That would be quite extraordinary, because after centuries of searching no such creature has ever been found.
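This continuum has a standard formalization in Bayes’ rule, which is how ECREE is often glossed (our sketch, not the author’s formalism): posterior odds equal prior odds times the likelihood ratio of the evidence, so a claim that starts with tiny prior odds needs evidence with an enormous likelihood ratio to become credible.

```python
def update_belief(prior, likelihood_ratio):
    """Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio.

    `prior` is the probability assigned to the claim before seeing the
    evidence; `likelihood_ratio` is how much more probable the evidence is
    if the claim is true than if it is false."""
    prior_odds = prior / (1.0 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1.0 + posterior_odds)

# Ordinary claim (a drive up Highway 1): a middling prior plus modest
# evidence yields near-certainty.
ordinary = update_belief(prior=0.5, likelihood_ratio=100)

# Extraordinary claim (alien abduction): the same evidence barely moves
# a minuscule prior; only extraordinary evidence would.
extraordinary = update_belief(prior=1e-9, likelihood_ratio=100)
```

The same modest evidence that makes an ordinary claim nearly certain leaves an extraordinary claim vanishingly improbable.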


Before we assent to such a claim we need extraordinary evidence, in this case a type specimen—what biologists call a holotype—in the form of an actual body. Blurry photographs, grainy videos, and stories about spooky things that happen at night when people are out camping do not constitute extraordinary evidence—they are barely even ordinary evidence—so it is reasonable for us to withhold our provisional assent.

Impediments to Truth and How to Overcome Them

In addition to falling far short of omniscience, humans are also saddled with numerous cognitive biases, including (to name but a few): confirmation bias, hindsight bias, myside bias, attribution bias, sunk-cost bias, status-quo bias, anchoring bias, authority bias, believability bias, consistency bias, expectation bias, and the blind-spot bias, in which people can be trained to identify all these biases in other people but can’t seem to see the log in their own eye.


Then there is the suite of logical fallacies, such as emotive words, false analogies, ad hominem attacks, hasty generalizations, either-or thinking, circular reasoning, reductio ad absurdum, the slippery slope, and after-the-fact reasoning, along with the reminders that anecdotes are not data, rumors do not equal reality, and the unexplained is not necessarily the inexplicable.

With such listicles of cognitive biases and logical fallacies identified by philosophers and psychologists, it’s a wonder we can think at all. But we can and do, through experience, education, and instruction in the art and science of thinking. What follows are some of the methods developed by philosophers and psychologists to identify and work around these impediments to the search for truth.

Practice Active Open-Mindedness. Research shows that when people are given the task of selecting the right answer to a problem by being told whether particular guesses are right or wrong, they do the following:

  • Immediately form a hypothesis and look only for examples to confirm it.
  • Do not seek evidence to disprove the hypothesis.
  • Are very slow to change the hypothesis even when it is obviously wrong.
  • If the information is too complex, adopt overly simple hypotheses or strategies for solutions.
  • If there is no solution, because the problem is a trick and “right” and “wrong” are given at random, form hypotheses about coincidental relationships they observe.

In their book Superforecasting, Philip Tetlock and Dan Gardner document how bad most people are at making predictions, and what skillsets those who are good at it employ. They begin with the results of extensive testing of people’s predictions. It’s not good. Even most so-called experts were no better than dart-tossing monkeys when their predictions were checked. When asked to make specific predictions—for example, “Will another country exit from the EU in the next two years?” and, presciently, “Will Russia annex additional Ukraine territory in the next three months?”—and their prognosticating feet were held to the empirical fire, Tetlock and Gardner found that most experts were overconfident (after all, they’re experts), encouraged by the lack of feedback on their accuracy (if no one reminds you of your misses, you’ll only remember the hits—the confirmation bias), and were victims of all the cognitive biases and illusions that plague the rest of us.


The worst forecasters were people with big ideas—grand theories about how the world works—such as left-wing pundits predicting class warfare that never came, or right-wing commentators prophesying a socialistic demise of the free-enterprise system that never happened. Failed predictions are hand-waved away: “This means nothing!” “Just you wait!” Superforecasters, by contrast, practice active open-mindedness, which Tetlock and Gardner defined quantitatively by asking experts, “Do you agree or disagree with the following statements?” Superforecasters were more likely to agree that:

  • People should take into consideration evidence that goes against their beliefs.
  • It is more useful to pay attention to those who disagree with you than to pay attention to those who agree.
  • Even major events like World War II or 9/11 could have turned out very differently.
  • Randomness is often a factor in our personal lives.

Superforecasters were more likely to disagree that:

  • Changing your mind is a sign of weakness.
  • Intuition is the best guide in making decisions.
  • It is important to persevere in your beliefs even when evidence is brought to bear against them.
  • Everything happens for a reason.
  • There are no accidents or coincidences. 

The psychologist Gordon Pennycook and his colleagues developed their own instrument for measuring active open-mindedness, in which people are asked whether they agree or disagree with the following statements, where the more open-minded answer is indicated in parentheses:

  • Beliefs should always be revised in response to new information or evidence. (agree)
  • People should always take into consideration evidence that goes against their beliefs. (agree)
  • I believe that loyalty to one’s ideals and principles is more important than “open-mindedness.” (disagree)
  • No one can talk me out of something I know is right. (disagree)
  • Certain beliefs are just too important to abandon no matter how good a case can be made against them. (disagree)
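
Scoring such an instrument is simple arithmetic: count the responses that match the keyed open-minded answer. A minimal sketch (item wordings abbreviated; the published scale uses graded agreement rather than a binary count):

```python
# Keyed open-minded answers for the five items above (wordings abbreviated).
AOT_KEY = {
    "revise beliefs on new evidence": "agree",
    "consider evidence against your beliefs": "agree",
    "loyalty to ideals over open-mindedness": "disagree",
    "no one can talk me out of what I know": "disagree",
    "some beliefs too important to abandon": "disagree",
}

def aot_score(responses):
    """Count how many responses match the keyed open-minded answer.

    `responses` maps item -> "agree" or "disagree"; higher scores
    indicate more active open-mindedness."""
    return sum(1 for item, answer in responses.items()
               if AOT_KEY.get(item) == answer)
```

A respondent who gives the keyed answer to all five items scores 5; one who gives none scores 0.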

Active open-mindedness is a cogent tool of reason for assessing the truth value of any claim or idea. It is one of a suite of rational skills that must be cultivated through education and practice.


Protect and Defend the Constitution of Knowledge

Objective facts in support of provisional truths about the world are determined by tried-and-true methods developed over the centuries since the Scientific Revolution and the Enlightenment, in what are sometimes called rationality communities—scholars, scientists, and researchers who collect data, form and test hypotheses, present their findings to colleagues at conferences, publish their papers in peer-reviewed journals and books, and reinforce the norms of truth-telling among their colleagues, their students, and themselves. In his book The Constitution of Knowledge, the journalist and civil rights activist Jonathan Rauch outlines and defends the epistemic operating system of Enlightenment liberalism—the social rules for attaining reliable knowledge when people cannot agree on what is true. Although these communities differ in the details of what, exactly, should be done to determine justified true belief, Rauch suggests several features held in common that constitute the constitution of knowledge:

  • Fallibilism. The understanding that we might be wrong.
  • Objectivity. A commitment to the proposition that there is a reality and we can know it through reason and empiricism.
  • Disconfirmation. Challenging or testing any and all claims through peer review and replication (science), editing and fact-checking (journalism), adversarial lawyers (the law), and red-team review (business).
  • Accountability. We should all be held accountable for our mistakes.
  • Pluralism. An insistence on viewpoint diversity.

The most important norm of all is the freedom to critique or challenge any and all ideas. Why?

  • We might be completely right but still learn something new in hearing what someone else has to say.
  • We might be partially right and partially wrong, and by listening to other viewpoints we might stand corrected and refine and improve our beliefs. 
  • We might be completely wrong, so hearing criticism or counterpoint gives us the opportunity to change our minds and improve our thinking. 
  • By listening to the opinions of others we have the opportunity to develop stronger arguments and build better facts for our positions. 
  • My freedom to speak and dissent is inextricably tied to your freedom to speak and dissent. If I censor you, why shouldn’t you censor me? If you silence me, why shouldn’t I silence you? 

If you disagree with me, it is the norms and customs of free speech and open dialogue that allow you to do so. From those open dialogues, debates, and disputations, in time the truth emerges.

Excerpt from Truth: What It Is, How to Find It, and Why It Still Matters, Johns Hopkins University Press. January 27, 2026
