The Skeptics Society & Skeptic magazine


Browse by Author

Read posts by:
Michael Shermer

Dr. Michael Shermer is the Publisher of Skeptic magazine, a monthly columnist for Scientific American, an Adjunct Professor at Claremont Graduate University and Chapman University, and the author of The Believing Brain, Why People Believe Weird Things, Why Darwin Matters, The Mind of the Market, How We Believe, and The Science of Good and Evil. His new book is The Moral Arc: How Science and Reason Lead Humanity Toward Truth, Justice, and Freedom. Read Michael’s other posts on this blog.

Metaphors & Mindsets
How to ‘Update’ Beliefs

Posted on Apr. 24, 2021 by | Comments (6)

The word “metaphor” derives from the Greek “meta” (“change of place, order, condition, or nature”) and “phor” (“to carry or transfer”), and in common usage means to carry you from one idea that is difficult to understand to another that is easier to grasp. Many subjects in science are notoriously incomprehensible, so the use of metaphors is essential. The Newtonian “mechanical universe” metaphor, for example, carries us from the difficult idea of gravity and its spooky action at a distance to the more understandable “clockwork universe” of gears and wheels. Enlightenment thinkers, in turn, employed this metaphor to explain the workings of everything from the human body, with its levers and pulleys (joints, tendons, muscles), to political systems (with the king as the sun and his subjects as planets circling about him), and even economies — François Quesnay modeled the French economy after the human body, in which money flows through a nation like blood flows through a body and ruinous government policies are like diseases that impede economic health, leading him to propose laissez faire as a government policy.

The workings of the human mind are especially enigmatic, so scientists have invoked metaphors such as hydraulic mechanisms, electrical wires, logic circuits, computer networks, software programs, and information workspaces. Nobel laureate psychologist Daniel Kahneman famously invoked the metaphor of thinking as a dual system: fast and intuitive versus slow and rational. In The Scout Mindset Julia Galef, cofounder of the Center for Applied Rationality and host of the popular podcast Rationally Speaking, quite effectively uses the metaphor of mindsets — that of scouts and soldiers. The soldier mindset leads us to defend our beliefs against outside threats, seek out and find evidence to support our beliefs and ignore or rationalize away counterevidence, and resist admitting we’re wrong as that feels like defeat. The scout mindset, by contrast, seeks to discover what is true through evidence and reasoning toward conclusions that lead to a more accurate map of reality, “the motivation to see things as they are, not as you wish they were,” Galef explains.

The norms of reasoning between these two mindsets are strikingly different and, Galef argues, explain how thinking goes right and wrong. Soldiers rationalize, deny, deceive and self-deceive, and engage in motivated reasoning and wishful thinking in order to win the battle of beliefs. “We talk about our beliefs as if they’re military positions, or even fortresses, built to resist attack,” she writes. “Beliefs can be deep-rooted, well-grounded, built on fact, and backed up by arguments. They rest on solid foundations. We might hold a firm conviction or a strong opinion, be secure in our convictions or have an unshakeable faith in something.” This soldier mindset leads us to defend against people who might “poke holes” in our logic, “shoot down” our beliefs, or confront us with a “knock-down” argument, lest our beliefs be “undermined,” “weakened,” or even “destroyed,” so we become “entrenched” in them rather than “surrender” to the opposing position.

If you’re right, of course, this can be effective. But the problem is that none of us are omniscient, and almost all reasoning and decision making happens under uncertainty, so the soldier mindset can easily lead to error. In seeking truth — that is, an accurate map of reality regardless of which belief is right — scouts engage in more open-minded discovery, objectivity, and intellectual honesty in which “I was wrong” and “I change my mind” become virtues instead of vices.

An update makes something better or more current without implying that its previous form was a failure.

Soldiers are more likely to agree with statements like these: “Changing your mind is a sign of weakness.” “It is important to persevere in your beliefs even when evidence is brought to bear against them.” Scouts are more likely to agree with these statements: “People should take into consideration evidence that goes against their beliefs.” “It is more useful to pay attention to those who disagree with you than to pay attention to those who agree.” Scouts, Galef explains, “revise their opinions incrementally over time, which makes it easier to be open to evidence against their beliefs” and “they view errors as opportunities to hone their skill at getting things right, which makes the experience of realizing ‘I was wrong’ feel valuable, rather than just painful.” In fact, Galef suggests, let’s drop the whole “wrong” confession and instead describe the process as “updating”, a reference to Bayesian reasoning in which we revise our estimations of the probability of something being true after gaining new information about it. “An update is routine. Low-key. It’s the opposite of an overwrought confession of sin,” Galef continues. “An update makes something better or more current without implying that its previous form was a failure.”
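The Bayesian reasoning Galef alludes to can be made concrete in a few lines. The function and numbers below are illustrative assumptions, not taken from The Scout Mindset; they simply show how a prior probability is revised into a posterior when new evidence arrives.

```python
# A minimal sketch of Bayesian "updating": revising the probability of a
# hypothesis H after evidence E, via Bayes' rule:
#   P(H|E) = P(E|H) * P(H) / P(E)
# All numbers here are illustrative, not from Galef's book.

def update(prior, likelihood, likelihood_if_false):
    """Return the posterior P(H|E), given the prior P(H),
    the likelihood P(E|H), and P(E|not H)."""
    evidence = likelihood * prior + likelihood_if_false * (1 - prior)
    return likelihood * prior / evidence

# Start out 80% confident in a belief, then encounter evidence that is
# twice as likely if the belief is false as if it is true.
belief = 0.80
belief = update(belief, likelihood=0.3, likelihood_if_false=0.6)
print(round(belief, 3))  # 0.667
```

Note that the confidence drops from 0.80 to about 0.67 rather than collapsing to zero: an update is routine and incremental, which is exactly the contrast Galef draws with the all-or-nothing confession of being “wrong.”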

Galef’s scout mindset metaphor deserves its place in the pantheon of cognitive metaphors, not only because it explains how people reason (or fail to reason) and why we so often get things wrong, but also because it explains why “we change our minds less often than we should but more often than we could.” Yes, she acknowledges, we “sometimes hide the truth from ourselves,” but more important, she argues, are “the times we succeed in not fooling ourselves, and what we can learn from those successes.”

The Scout Mindset: Why Some People See Things Clearly and Others Don’t (book cover)

Galef has first-hand experience running workshops that teach people the tools of rationality, like probability, logic, and especially the avoidance of cognitive biases, only to discover that this wasn’t enough. Learning about cognitive biases no more improves your judgment than reading about exercise improves your fitness. Knowing how to reason is important, but cultivating an attitude of truth-seeking through the scout mindset is like developing a lifestyle of exercise that leads to fitness. Without the attitude of curiosity and truth-seeking undergirding the knowledge and tools of science and reason, mental fitness will elude us. The Scout Mindset cultivates those virtues in an engaging and enlightening account from which we can all benefit. END


The Skeptic’s Chaplain
Richard Dawkins as a Fountainhead of Skepticism

Posted on Mar. 26, 2021 by | Comments (2)
The following essay was commissioned by Oxford University Press to be included in a volume entitled Richard Dawkins. How a Scientist Changed the Way We Think: Reflections by Scientists, Writers, and Philosophers, edited by Alan Grafen and Mark Ridley (biologists and former graduate students of Dawkins) and published in 2006 to mark the 30th anniversary of the publication in 1976 of Dawkins’ influential book, The Selfish Gene. The volume includes essays by such notable scientists, writers, and philosophers as Steven Pinker, Daniel Dennett, Helena Cronin, Marian Stamp Dawkins, Ullica Segerstrale, David Deutsch, Michael Ruse, Patrick Bateson, Martin Daly and Margo Wilson, Randolph Nesse, David Barash, and many others.

Over the weekend of August 12–14, 2001, I participated in an event entitled “Humanity 3000,” whose mission was to bring together “prominent thinkers from around the world in a multidisciplinary framework to ponder issues that are most likely to have a significant impact on the long-term future of humanity.” Sponsored by the Foundation for the Future—a nonprofit think tank in Seattle founded by aerospace engineer and benefactor Walter P. Kistler—the event defined long-term as a millennium. We were tasked with prognosticating what the world will be like in the year 3000.

Yeah, sure. As Yogi Berra said, “It’s tough to make predictions, especially about the future.” If such a workshop had been held in 1950, would anyone have anticipated the World Wide Web? If we cannot prognosticate fifty years into the future, what chance do we have of saying anything significant about a date 20 times more distant? And please note the date of this conference—needless to say, not one of us realized that we were a month away from the event that would redefine the modern world with a date that will live in infamy. It was a fool’s invitation, which I accepted with relish. Who could resist sitting around a room talking about the most interesting questions of our time, and possibly future times, with a bunch of really smart and interesting people? To name but a few with whom I shared beliefs and beer: science writer Ronald Bailey, environmentalist Connie Barlow, twin research expert Thomas Bouchard, brain specialist William Calvin, educational psychologist Arthur Jensen, mathematician and critic Norman Levitt, memory expert Elizabeth Loftus, evolutionary biologist Edward O. Wilson, and many others, all highly regarded in their fields, well published, often controversial, and always relevant.

Also in attendance, there to receive the $100,000 Kistler Prize “for original work that investigates the social implications of genetics,” was the Oxford University evolutionary biologist Richard Dawkins. (Ed Wilson was the previous year’s winner and was there to co-present the award, along with Walter Kistler, to Richard.) Dawkins was awarded a gold medal and a check for his work “that redirected the focus of the ‘levels of selection’ debate away from the individual animal as the unit of evolution to the genes, and what he has called their extended phenotypes.” Simultaneously, the award description continues, Dawkins “applied a Darwinian view to culture through the concept of memes as replicators of culture.” Finally, “Dr. Dawkins’ contribution to a new understanding of the relationship between the human genome and society is that both the gene and the meme are replicators that mutate and compete in parallel and interacting struggles for their own propagation.” The prize ceremony was followed by a brilliant acceptance speech by Richard, who never fails to deliver in his role as a public intellectual (the number one public intellectual in England, according to Prospect magazine) and spokesperson for the public understanding of science.

This is not what most impressed me about Richard, however, since any professional would be expected to shine in a public forum, especially with a six-figure motivator hanging around his neck. It was during the two full days of round-table discussions, breakout sessions, fishbowl debates, and (most interestingly) coffee-break chats that Richard stood out head-and-shoulders above this august crowd. Despite his reputation as a massive egotist, Richard is, in fact, somewhat shy and quiet, a man who listens carefully, thinks through what he wants to say, and then says it with an economy of words that is a model for any would-be opinion editorialist. In one session, for example, we were to debate “Conscious Evolution—Fantasy or Fact?” After about 20 minutes of discussion of a topic none of us had carefully defined, Richard spoke up:

I wanted to listen around the table to see if I could make out what conscious evolution is. I still haven’t. It seems to be a mix-up of two, or three, or four very different things. There is the evolution of consciousness; there is what Julian Huxley would have called “consciousness of evolution” or, the way he put it was “man is evolution become conscious of itself.” But entirely separate from that…forget about consciousness and just talk about deliberate control of evolution, and then we bifurcate again into two entirely different kinds of evolution. That is genetic evolution and cultural evolution. I am not going to utter the “m” word [memes]; everybody else keeps saying it and then looking at me, and I am going to duck out of that. I used not to think this, but I am increasingly thinking that nothing but confusion arises from confounding genetic evolution with cultural evolution, unless you are very careful about what you are doing and don’t talk as though they are somehow just different aspects of the same phenomenon. Or, if they are different aspects of the same phenomenon, then let’s hear a good case for regarding them as such.

The first response was from the futurist Michael Marien, who said: “I would like to start back at the point where Richard Dawkins honestly said he had never heard the term conscious evolution. Sometimes a statement of ignorance can be very illuminating.” Indeed it can be, and Richard’s candid comments throughout the weekend illuminated the conference like no one else’s.

Outside of additional specifics of what Richard said, here is my overall impression of that weekend in Seattle, an observation with broader implications for Dawkins’ impact on science and culture: a discussion would ensue over some issue, such as “the factors most critical to the long-term future of humanity,” and most of us panelists would jump in with our opinions, banter some particular theme back and forth for a while, then leap to another topic, hammer that into the faux-mahogany table, and so forth round and round. Richard would sit there listening, processing our long-winded verbosities, select his moment to lean forward and make a short observational remark or inductive inference, then sit back and collect more data. It was from what happened after Richard spoke that I came to realize this is a man on a different plane, above even these stellar minds. The conversation changed, bifurcated in a new direction, with references to its source. “You know, Richard has a point…” “I’d like to comment on Richard’s observation…” “Going back to what Professor Dawkins said…” And so on. Richard Dawkins changed the conversation. He has been changing the conversation ever since 1976, when his book The Selfish Gene changed the way we look at ourselves and our world.

* * *

Humans are a hierarchical social primate species who, despite centuries of democratic rule, still long to sort themselves into pecking orders within families, schools, peer groups, social clubs, corporations, and societies. We can’t help it. It is in our nature, courtesy of natural selection operating in the social sphere. As an intellectual social movement with which I am intimately involved, skepticism is subject to the same hierarchical social forces. As such, we scientists and skeptics look up to and model ourselves after our alpha leaders. In my own intellectual development there have been several who served me well in this capacity, including Carl Sagan, Stephen Jay Gould, and Richard Dawkins. They are, in fact, candles in the darkness of our demon-haunted world (in Carl’s apt phrase from his skeptical manifesto). Lamentably, we lost Carl and Steve too early. How I long for one more poetic narrative on our pale blue dot in the vast cosmos, or one more elegant essay on life’s complexity and history’s contingency.

But thank the fates and his hearty DNA that we still have Richard, who stands as a beacon of scientific skepticism and a hero to skeptics around the world. Dawkins’ work has touched the skeptical movement in three areas of common concern: pseudoscience, creationism, and religion.

Dawkins’ primary work on pseudoscience is Unweaving the Rainbow, a collection of essays centered on “Science, Delusion and the Appetite for Wonder” (the book’s subtitle). Here we see almost no limits to the breadth of Richard’s interests, as he skeptically analyzes astrology, coincidences, conjurors, eyewitness accounts, fairies, flying saucers, the Gaia hypothesis, gambling fallacies, hallucinations, horoscopes, illusion, imagination, intuition, miracles, mysticism, paranormalism, post-modernism, psychic phenomena, reincarnation, Scientology, superstition, telepathy, and even The X-Files. Richard’s analysis of these and other delusions is not debunking as such, but more positively directed toward helping us better grasp what science is by looking at what science isn’t; and how we can recognize good science by seeing bad science for what it is. Deeper still is the take-home message embodied in the book’s title from Keats, “who believed that Newton had destroyed all the poetry of the rainbow by reducing it to the prismatic colours. Keats could hardly have been more wrong.” In his stead Richard offers us this insight: “I believe that an orderly universe, one indifferent to human preoccupations, in which everything has an explanation even if we still have a long way to go before we find it, is a more beautiful, more wonderful place than a universe tricked out with capricious, ad hoc magic.”

Creationism is a form of pseudoscience, and the connection here is an obvious one for an evolutionary biologist who holds the job title of “Professor of the Public Understanding of Science.” In America, at least, there is no more public misunderstanding of science than creationism, and Richard has written broadly and deeply on the subject, and he minces no words (and it is with creationists especially that Richard does not “suffer fools gladly,” as he has been accused). After the May, 2005, hearings in Kansas on the proposed introduction of “Intelligent Design” into the public school science curriculum, Dawkins fired off an opinion editorial in The Times (London) on May 21, entitled “Creationism: God’s Gift to the Ignorant,” that included this poignant observation that cut through yards of creationist verbiage with clarity and wit:

The standard methodology of creationists is to find some phenomenon in nature that Darwinism cannot readily explain. Darwin said: “If it could be demonstrated that any complex organ existed which could not possibly have been formed by numerous, successive, slight modifications, my theory would absolutely break down.” Creationists mine ignorance and uncertainty in order to abuse his challenge. “Bet you can’t tell me how the elbow joint of the lesser spotted weasel frog evolved by slow gradual degrees?” If the scientist fails to give an immediate and comprehensive answer, a default conclusion is drawn: “Right, then, the alternative theory, ‘intelligent design’, wins by default.”

At least three of Richard’s books—The Blind Watchmaker, Climbing Mount Improbable, and River Out of Eden—are direct challenges to creationists’ arguments, although presented not as straight debunking works, but as science advancing treatises on evolutionary theory. And Richard’s latest book, The Ancestor’s Tale, is one long answer to creationists’ demand to “show me just one transitional fossil.” Dawkins traces innumerable transitional forms, or what he calls “concestors”—the “point of rendezvous” of the last common ancestor shared by a set of species—from Homo sapiens back four billion years to the origin of heredity and the emergence of evolution. No one concestor proves that evolution happened, but together they reveal a majestic story of process over time. Richard is the Geoffrey Chaucer of life’s history, and our most articulate public defender of evolution.

Creationism, of course, is nothing more than thinly disguised religion masquerading as science, in order to attempt an end-run around the U.S. Constitution’s First Amendment prohibition on government establishment of religion. Richard’s views on religion, particularly when it intersects with science, are so public and controversial that they have even inspired a book by the Oxford University professor of historical theology, Alister McGrath, Dawkins’ God (Blackwell, 2004), a book I reviewed in Science (8 April 2005, 205–206). According to Dawkins the connection between science and religion runs like this: Before Darwin, the default explanation for the apparent design found in nature was a top-down designer, God. The 18th-century English theologian, William Paley, formulated this into the infamous watchmaker argument: If one stumbled upon a watch on a heath, one would not assume it had always been there, as one might with a stone. A watch implies a watchmaker. Design implies a designer. Darwin provided a scientific explanation of design from the bottom up: natural selection. Since then, arguably no one has done more to make the case for bottom-up design than Dawkins, particularly in The Blind Watchmaker, a direct challenge to Paley. But if design comes naturally from the bottom up and not supernaturally from the top down, what place, then, for God?

Although most scientists avoid the question altogether, or take a conciliatory stance along the lines of Stephen Jay Gould’s non-overlapping magisteria (NOMA), Dawkins unequivocally states in The Blind Watchmaker: “Darwin made it possible to be an intellectually fulfilled atheist.” And in River Out of Eden: “The universe we observe has precisely the properties we should expect if there is, at bottom, no design, no purpose, no evil and no good, nothing but blind pitiless indifference.”

Herein lies the crux of the issue, and Dawkins brooks no theological obfuscation. For example, after debunking all the quasi-scientific and pseudoscientific arguments allegedly proving God’s existence, scientists are told by theologians like Alister McGrath that we should come to know God through faith. But what does that mean, exactly? In The Selfish Gene, Dawkins wrote that faith “means blind trust, in the absence of evidence, even in the teeth of evidence.” This, says McGrath, “bears little relation to any religious (or any other) sense of the word.” In its stead McGrath presents the definition of faith by the Anglican theologian W. H. Griffith-Thomas: “It commences with the conviction of the mind based on adequate evidence; it continues in the confidence of the heart or emotions based on conviction, and it is crowned in the consent of the will, by means of which the conviction and confidence are expressed in conduct.” Such a definition—which McGrath describes as “typical of any Christian writer”—is what Dawkins, in reference to French postmodernists, calls “continental obscurantism.” Most of it describes the psychology of belief. The only clause of relevance to a scientist is “adequate evidence,” which raises the follow-up question, “Is there?” Dawkins’ answer is an unequivocal “No.”

* * *

Does a scientific and evolutionary worldview such as that proffered by Richard Dawkins obviate a sense of spirituality? I think not. If we define spirituality as a sense of awe and wonder about the grandeur of life and the cosmos, then science has much to offer. As proof, I shall close with a final story about Richard, and a moment we shared inside the dome of the 100-inch telescope atop Mt. Wilson in Southern California. It was in this very dome, on October 6, 1923, that Edwin Hubble first realized that the fuzzy patches he was observing were not “nebulae” within the Milky Way galaxy, but separate galaxies, and that the universe is bigger than anyone imagined, a lot bigger. Hubble subsequently discovered through this same telescope that the light from those galaxies is all red-shifted—stretched toward the red end of the electromagnetic spectrum because its sources are receding from us—meaning that the galaxies are all moving away from one another, the result of a spectacular explosion that marked the birth of the universe. It was the first empirical data indicating that the universe had a beginning, and thus was not eternal. What could be more awe-inspiring—more numinous, magical, spiritual—than this cosmic visage of deep time and deep space?
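The inference from Hubble’s observations to a cosmic beginning can be sketched with Hubble’s law, v = H0 × d: recession velocity grows linearly with distance, so running the expansion backward gives a finite age of roughly 1/H0. The value of H0 below is a modern approximation, not a figure from this essay.

```python
# Hedged sketch of Hubble's law and the "Hubble time" age estimate.
# H0 is an approximate modern value, assumed for illustration.

H0 = 70.0  # km/s per megaparsec

def recession_velocity(distance_mpc):
    """Velocity (km/s) at which a galaxy recedes, by v = H0 * d."""
    return H0 * distance_mpc

# A galaxy 100 megaparsecs away recedes at about 7000 km/s.
print(recession_velocity(100))  # 7000.0

# Crude age of the universe: time = distance / velocity = 1 / H0,
# converted from (Mpc * s) / km into years.
KM_PER_MPC = 3.086e19
SECONDS_PER_YEAR = 3.156e7
hubble_time_years = KM_PER_MPC / H0 / SECONDS_PER_YEAR
print(f"{hubble_time_years:.2e}")  # roughly 1.4e10 years
```

This back-of-the-envelope figure of about 14 billion years is why an expanding universe implies a beginning rather than an eternal cosmos.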

Since I live in Altadena, on the edge of a cliff in the foothills of the San Gabriel mountains atop which Mt. Wilson rests, I have had many occasions to make the trek to the telescopes. In November 2004, I arranged a visit to the observatory for Richard, who was in town on a book tour for The Ancestor’s Tale. As we were standing beneath the magnificent dome housing the 100-inch telescope, and reflecting on how marvelous, even miraculous, this scientific vision of the cosmos and our place in it all seemed, Richard turned to me and said, “All of this makes me so proud of our species that it almost brings me to tears.”

I can only echo the same sentiment about the works and words of Richard Dawkins. END


Afrocentric Pseudoscience & Pseudohistory

Posted on Feb. 11, 2021 by | Comments Off on Afrocentric Pseudoscience & Pseudohistory
There is a lot of high-quality, constructive Afrocentric scholarship. As in most fields, however, there are fringe groups and extraordinary claims that grab our attention because of their extremism and, occasionally, their absurdity. Since it is our job at Skeptic magazine to track these groups and claims, we bring them to our readers’ attention. This is not to imply that all or most African-American scientists and historians believe such claims. The recent surge of these beliefs, however, especially when supported by such recognizable names as Louis Farrakhan, is alarming. Here are just a few quotes emblematic of this extreme. This compilation by the editors appeared in Skeptic magazine 2.4 (1994).

On Blacks, Whites, and Melanin

“We’re not talking about superiority and inferiority, but we’re talking about the important factor of melanin. It allows us [blacks, who have more melanin and are thus the superior “sun people” over the inferior white “ice people”] to negotiate the vibrations of the universe and to deal with the ultraviolet rays of the sun. There’s a mix of DNA, RNA, and there’s a not-too-understood question of melanin, the organized molecule, in the beginning.” —Leonard Jeffries, Time, 2/14/93

Who Created Civilization?

“For the first two or three thousand years of civilization, there was not a civilized white man on the earth. Civilization was founded and developed by the swarthy races of Mesopotamia, Syria, and Egypt. It was southern colored peoples everywhere, in China, in Central America, in India, Mesopotomia, Syria, Egypt and Crete who gave the northern white peoples civilization.” —John Jackson, Ethiopia and the Origin of Civilization

The Holocaust

“Because I said the Holocaust of black people was a hundred times worse than the Holocaust of Jews they were angry with me for even comparing this, and it is because you feel your life is so much more sacred than the lives of the Gentiles, or the lives of the Asians, Arabs, and Africans.” —Louis Farrakhan, Interview with Barbara Walters, 20/20


“The white man is so wicked and filthy that God calls him the scum of the planet Earth.” —Elijah Muhammad, quoted by Barbara Walters on 20/20


“Jews are sucking the blood of blacks. No matter who sits in the seat at the White House, Jews control that seat. Jews control the media, especially NBC, ABC, and CBS. The Pope is a no-good cracker. Somebody needs to raise that dress up and see what’s really under there.” —Khallid Muhammad, speech at Kean College

“I’m going to be a pitbull, that’s the way I’m going to be against the Jews.” —Khallid Muhammad, speech at Howard University

Drugs, Guns, AIDS Conspiracy

“Black males are specifically programmed for self-destruction by this society. Hundreds of thousands of U.S. military troops are called on to wage urban warfare on our people … An avalanche of cheap heroin was unleashed into our communities to lull our people to sleep… African Americans are beginning to realize that the real enemy is not the brother standing across the street, but the white man in the top floor of the downtown high rise.” —Malcolm Carson, “The White Conspiracy,” The Hilltop, Howard University campus newspaper

“The crack epidemic did not start until after Louis Farrakhan became the voice of the poor.” [Regarding guns]: “Somebody is inspiring the black people to kill off each other. We believe it’s the government of the United States of America.” —Louis Farrakhan, Washington Post, 1990


“The fact of the matter is that there was no crack prior to 1985 when Farrakhan spoke in Madison Square Garden in New York and nearly 50,000 people came out to hear him. The fact of the matter is that in the 60s when blacks began moving in the Civil Rights movement a purer form of Heroin was introduced into the black community. I believe with all my heart that there is a dirty hand here somewhere and if the government is not responsible then help me clean it up.” —Louis Farrakhan, Interview with Barbara Walters, 20/20

“The spread of international AIDS was an attempt by the U.S. government to decimate the population of central Africa.” —Louis Farrakhan, African-American Summit speech, New Orleans, 1989

“Do you know where the AIDS virus was developed? Right outside of Washington. It is my feeling that the U.S. government is deliberately spreading AIDS.” —Louis Farrakhan, Interview with Barbara Walters, 20/20

About the Author

Dr. Michael Shermer is Publisher of Skeptic magazine, Director of the Skeptics Society, and Assistant Professor of History of Science at Occidental College. He has published Teach Your Child Science and co-authored Teach Your Child Math and Mathemagics with Dr. Arthur Benjamin. Dr. Shermer has just finished two book manuscripts, Heretic-Scientist, a biography of Alfred Russel Wallace, and The Chaos of History, on the application of chaos theory to human history. He has also written numerous cycling books based on a ten year professional career as an ultra-marathon cyclist and competitor in the transcontinental Race Across America.


Replicating Milgram
A Study on Why People Commit Evil Deeds

Posted on Oct. 02, 2020 by | Comments (6)

In recent discussions of the “replication crisis” in science, in which a large percentage of famous psychology experiments have failed to replicate, it has also been suggested that some classic psychology experiments could not be conducted today due to ethical or practical concerns, the most notable being Stanley Milgram’s famous shock experiments. In fact, in 2010, Dr. Michael Shermer, working with Chris Hansen and Dateline NBC producers, replicated a number of classic psychology experiments, including Milgram’s. What follows is a summary of that research, from Chapter 9, “Moral Regress,” in Dr. Shermer’s book The Moral Arc, along with the two-part episode of the Dateline NBC show, “What Were You Thinking?”

In 2010, I worked on a Dateline NBC two-hour television special in which we replicated a number of now classic psychology experiments, including Yale University professor Stanley Milgram’s famous shock experiments from the early 1960s on the nature of evil. Here we provide links to the two-part television segment of our replication of Milgram.

In public talks in which I screen these videos I am occasionally asked how we got this replication passed by an Institutional Review Board (an IRB), which is required for scientific research, inasmuch as such experiments could never be conducted today. We didn’t. This was for network television, not an academic laboratory, so the equivalent of an IRB was review by NBC’s legal department, which approved it. This seems to surprise — even shock — many academics, until I remind them of what people do to one another on reality television programs in which subjects are stranded on remote islands or thick jungles and left to fend for themselves — sometimes naked and afraid — in various contrivances that resemble a Hobbesian world of a war of all against all.

Watch Replicating Milgram on Dateline NBC’s special “What Were You Thinking?” Part 1 & 2 (playlist)
Shock and Awe in a Yale Lab

In July 1961, shortly after the war crimes trial of Adolf Eichmann began in Jerusalem, psychologist Stanley Milgram devised a set of experiments, the aim of which was to better understand the psychology behind obedience to authority. Eichmann had been one of the chief orchestrators of the Final Solution but, like his fellow Nazis at the Nuremberg trials, his defense was that he was innocent by virtue of the fact that he was only following orders. Befehl ist Befehl — orders are orders — is now known as the Nuremberg defense, and it’s an excuse that seems particularly feeble in a case like Eichmann’s. “My boss told me to kill millions of people so — hey — what could I do?” is not a credible defense. But, Milgram wondered, was Eichmann unique in his willingness to comply with orders, no matter how atrocious? And just how far would ordinary people be willing to go?

A contestant for our faux television reality show “What a Pain!” talks to our actors playing the learner and the authority to be obeyed.

Obviously Milgram could not have his experimental subjects gas or shoot people, so he chose electric shock as a legal, nonlethal substitute. Looking for subjects to participate in what was billed as a “study of memory,” Milgram advertised on the Yale campus and in the surrounding New Haven community. He said he wanted “factory workers, city employees, laborers, barbers, businessmen, clerks, construction workers, sales people, telephone workers,” not just the usual guinea pigs of the psychology lab, i.e., undergraduates participating for extra credit or beer money. Milgram assigned his subjects to the role of “teacher” in what was purported to be research on the effects of punishment on learning. The protocol called for the subject to read a list of paired words to the “learner” (who was, in reality, a shill working for Milgram), then present the first word of each pair again, upon which the learner was to recall the second word. Each time the learner was incorrect, the teacher was to deliver an electric shock from a box with toggle switches in 15-volt increments that ranged from 15 volts to 450 volts and featured such labels as Slight Shock, Moderate Shock, Strong Shock, Very Strong Shock, Intense Shock, Extreme Intensity Shock, and DANGER: Severe Shock, XXXX.1 Contrary to the predictions of the 40 psychiatrists Milgram surveyed before the experiment, who estimated that only one percent of subjects would go all the way to the end, 65 percent of subjects completed the experiment, flipping that final toggle switch to deliver a shocking 450 volts, a phenomenon the social psychologist Philip Zimbardo characterizes as “the pornography of power.”2

Who was most likely to go the distance in maximal shock delivery? Surprisingly — and counterintuitively — gender, age, occupation, and personality characteristics mattered little. Similar levels of punishment were delivered by the young and the old, by males and females, and by blue-collar and white-collar workers alike. What mattered most was physical proximity and group pressure. The closer the learner was to the teacher, the lower on the ladder of shocks the teacher climbed. And when Milgram added confederates to encourage the teacher to administer ever more powerful shocks, most teachers complied; when the confederates themselves rebelled against the authority figure’s instructions, the teachers likewise declined to obey. Nevertheless, 100 percent of Milgram’s subjects delivered at least a “strong shock” of 135 volts.3

Shock and Awe in a New York Studio

In our 2010 replication in a New York City studio, we tested six subjects who believed they were auditioning for a new reality show called “What a Pain!” We followed Milgram’s protocols and had our subjects read a list of paired words to a “learner” (an actor named Tyler), then present the first word of each pair again. When Tyler gave a prearranged incorrect answer, our subjects were instructed by an authority figure (an actor named Jeremy) to deliver an electric shock from a box modeled after Milgram’s contraption, which, like ours, was just a hollow box that delivered no shocks at all.

Milgram characterized his experiments as testing “obedience to authority,” and most interpretations over the decades have focused on subjects’ unquestioning adherence to an authority’s commands. What I saw, however, was great reluctance and disquietude in all of our subjects nearly every step of the way. Our first subject, Emily, quit the moment she was told the protocol. “This isn’t really my thing,” she said with nervous laughter. When our second subject, Julie, got to 75 volts (having flipped five switches) she heard Tyler groan. “I don’t think I want to keep doing this,” she said.

Jeremy pressed the case: “Please continue.”

“No, I’m sorry,” Julie protested. “I don’t think I want to.”

“It’s absolutely imperative that you continue,” Jeremy insisted.

“It’s imperative that I continue?” Julie replied in defiance. “I think that — I’m like, I’m okay with it. I think I’m good.”

“You really have no other choice,” Jeremy said in a firm voice. “I need you to continue until the end of the test.”

Julie stood her ground: “No. I’m sorry. I can just see where this is going, and I just — I don’t — I think I’m good. I think I’m good to go. I think I’m going to leave now.”

At that point the show’s host Chris Hansen entered the room to debrief her and introduce her to Tyler, and then Chris asked Julie what was going through her mind. “I didn’t want to hurt Tyler,” she said. “And then I just wanted to get out. And I’m mad that I let it even go five [toggle switches]. I’m sorry, Tyler.”

Our contestant/subject Julie refuses to continue shocking the learner once he begins to cry out in pain.

Our third subject, Lateefah, started off enthusiastically enough, but as she made her way up the row of toggle switches, her facial expressions and body language made it clear that she was uncomfortable; she squirmed, gritted her teeth, and shook her fists with each toggled shock. At 120 volts she turned to look at Jeremy, seemingly seeking an out. “Please continue,” he authoritatively instructed. At 165 volts, when Tyler screamed “Ah! Ah! Get me out of here! I refuse to go on! Let me out!” Lateefah pleaded with Jeremy. “Oh my gosh. I’m getting all…like…I can’t…”; nevertheless Jeremy pushed her politely, but firmly, to continue. At 180 volts, with Tyler screaming in agony, Lateefah couldn’t take it anymore. She turned to Jeremy: “I know I’m not the one feeling the pain, but I hear him screaming and asking to get out, and it’s almost like my instinct and gut is like, ‘Stop,’ because you’re hurting somebody and you don’t even know why you’re hurting them outside of the fact that it’s for a TV show.” Jeremy icily commanded her to “please continue.” As Lateefah reluctantly turned to the shock box, she silently mouthed, “Oh my God.” At this point, as in Milgram’s experiment, we instructed Tyler to go silent. No more screams. Nothing. As Lateefah moved into the 300-volt range it was obvious that she was greatly distressed, so Chris stepped in to stop the experiment, asking her if she was getting upset. “Yeah, my heart’s beating really fast.” Chris then asked, “What was it about Jeremy that convinced you that you should keep going here?” Lateefah gave us this glance into moral reasoning about the power of authority: “I didn’t know what was going to happen to me if I stopped. He just — he had no emotion. I was afraid of him.”

Our fourth subject, a man named Aranit, unflinchingly cruised through the first set of toggle switches, pausing at 180 volts to apologize to Tyler after his audible protests of pain: “I’m going to hurt you and I’m really sorry.” After a few more rungs up the shock ladder, accompanied by more agonizing pleas by Tyler to stop the proceedings, Aranit encouraged him, saying, “Come on. You can do this. We are almost through.” Later, the punishments were peppered with positive affirmations. “Good.” “Okay.” After completing the experiment Chris asked, “Did it bother you to shock him?” Aranit admitted, “Oh, yeah, it did. Actually, it did. And especially when he wasn’t answering anymore.”

Our subject Aranit (left) continues shocking the learner upon the encouragement of our “authority” figure Jeremy (right).

Two other subjects in our replication, a man and a woman, went all the way to 450 volts, giving us a final tally of five out of six who administered shocks, and three who went all the way to the end of maximal electrical evil. All of the subjects were debriefed and assured that no shocks had actually been delivered, and after lots of laughs and hugs and apologies, everyone departed none the worse for wear.

Active Agents or Mindless Zombies?

What are we to make of these results? In the 1960s — the heyday of the Nurture Assumption4 — it was taken to mean that human behavior is almost infinitely malleable, and Milgram’s data seemed to confirm the idea that degenerate acts are primarily the result of degenerate environments (Nazi Germany being, perhaps, the ultimate example). In other words, evil is a matter of bad barrels, not bad apples.

Milgram’s interpretation of his data included what he called the “agentic state,” which is “the condition a person is in when he sees himself as an agent for carrying out another person’s wishes” and therefore no longer sees himself as responsible for his actions. “Once this critical shift of viewpoint has occurred in the person, all of the essential features of obedience follow.” Subjects who are told that they are playing a role in an experiment are stuck in a no-man’s land somewhere between authority figure, in the form of a white lab-coated scientist, and stooge, in the form of a defenseless learner in another room. They undergo a mental shift from being moral agents who make their own decisions (the autonomous state) to the ambiguous and susceptible state of being an intermediary in a hierarchy and therefore prone to unqualified obedience (the agentic state).

Milgram believed that almost anyone put into this agentic state could be pulled into evil one step at a time — in this case 15 volts at a time — until they were so far down the path there was no turning back. “What is surprising is how far ordinary individuals will go in complying with the experimenter’s instructions,” Milgram recalled. “It is psychologically easy to ignore responsibility when one is only an intermediate link in a chain of evil action but is far from the final consequences of the action.” This combination of a step-wise path, plus a self-assured authority figure that keeps the pressure on at every step, is the double whammy that makes evil of this nature so insidious. Milgram broke the process down into two stages: “First, there is a set of ‘binding factors’ that lock the subject into the situation. They include such factors as politeness on his part, his desire to uphold his initial promise of aid to the experimenter, and the awkwardness of withdrawal. Second, a number of adjustments in the subject’s thinking occur that undermine his resolve to break with the authority. The adjustments help the subject maintain his relationship with the experimenter, while at the same time reducing the strain brought about by the experimental conflict.”5

Put yourself into the mind of one of these subjects — either in Milgram’s experiment or in our NBC replication. It’s an experiment conducted at the prestigious Yale University — or at a studio set up by a major television network. It’s being supervised by an established institution — a national university or a national network. It’s for science — or it’s for television. It’s being run by a white-lab-coated scientist — or by a television director. The authorities overseeing the experiment are either university professors or network executives. An agent — someone carrying out someone else’s wishes under such conditions — would feel in no position to object. And why should she? It’s for a good cause, after all — the advancement of science, or the development of a new and interesting television series.

Dr. Shermer explains for the television audience why people commit evil acts.

Out of context, if you ask people — even experts, as Milgram did — how many people would go all the way to 450 volts, they lowball the estimate by a considerable degree, as Milgram’s psychiatrists did. As Milgram later reflected: “I am forever astonished that when lecturing on the obedience experiments in colleges across the country, I faced young men who were aghast at the behavior of experimental subjects and proclaimed they would never behave in such a way, but who, in a matter of months, were brought into the military and performed without compunction actions that made shocking the victim seem pallid.”6

In the sociobiological and evolutionary psychology revolutions of the 1980s and 1990s, the interpretation of Milgram’s results shifted toward the nature/biological end of the spectrum from its previous emphasis on nurture/environment. The interpretation softened somewhat as the multidimensional nature of human behavior was taken into account. As it is with most human action, moral behavior is inextricably complex and includes an array of causal factors, obedience to authority being just one among many. The shock experiments didn’t actually reveal just how primed all of us are to inflict violence for the flimsiest of excuses; that is, it isn’t a simple case of bad apples looking for a bad barrel in order to cut loose. Rather the experiments demonstrate that all of us have conflicting moral tendencies that lie deep within.

Our moral nature includes a propensity to be sympathetic, kind, and good to our fellow kith and kin, as well as an inclination to be xenophobic, cruel, and evil to tribal Others. And the dials for all of these can be adjusted up and down depending on a wide range of conditions and circumstances, perceptions and states of mind, all interacting in a complex suite of variables that are difficult to tease apart. In point of fact, most of the 65 percent of Milgram’s subjects who went all the way to 450 volts did so with great anxiety, as did the subjects in our NBC replication. And it’s good to remember that 35 percent of Milgram’s subjects were exemplars of disobedience to authority — they quit in defiance of what the authority figure told them to do. In fact, in a 2008 partial replication by the social psychologist Jerry Burger, in which he ran the voltage box only up to 150 volts (the point at which the “learner” in Milgram’s original experiment began to cry out in pain), twice as many subjects refused to obey the authority figure. Assuming these subjects were not already familiar with the experimental protocols, the findings are an additional indicator of moral progress from the 1960s to the 2000s caused, I would argue, by that ever-expanding moral sphere and our collective capacity to take the perspective of another, in this case the to-be-shocked learner.7

Alpinists of Evil

Milgram’s model comes dangerously close to suggesting that subjects are really just puppets devoid of free will, which effectively lets Nazi bureaucrats off the hook as mere agentic automatons in an extermination engine run by the great paper-pushing administrator, Adolf Eichmann (whose actions as an unremarkable man in a morally bankrupt and conformist environment were famously described by Hannah Arendt as “the banality of evil”). The obvious problem with this model is that there can be no moral accountability if an individual is truly nothing more than a mindless zombie whose every action is controlled by some nefarious mastermind. Reading the transcript of Eichmann’s trial is mind numbing (it goes on for thousands of pages), as he obfuscates his real role while shifting the blame entirely to his overseers, as in this statement:

What I said to myself was this: The Head of State has ordered it, and those exercising judicial authority over me are now transmitting it. I escaped into other areas and looked for a cover for myself which gave me some peace of mind at least, and so in this way I was able to shift — no, that is not the right term — to attach this whole thing one hundred percent to those in judicial authority who happened to be my superiors, to the head of State — since they gave the orders. So, deep down, I did not consider myself responsible and I felt free of guilt. I was greatly relieved that I had nothing to do with the actual physical extermination.8

The last statement might possibly be true — given how many battle-hardened SS soldiers were initially sickened at the sight of a killing action — but the rest is pure spin-doctored malarkey, and Arendt allowed herself to be taken in by it more than reason would allow, as the historian David Cesarani shows in his revealing biography Becoming Eichmann and as recounted in Margarethe von Trotta’s moving film Hannah Arendt.9 The evidence of Eichmann’s real role in the Holocaust was plain for all to see at the time, as dramatically re-enacted in Robert Young’s 2010 biopic entitled simply Eichmann, based on the transcripts of the interrogation of and confession by Eichmann just before his trial, conducted by the young Israeli police officer Avner Less, whose father was murdered in Auschwitz.10 Time and again, throughout hundreds of recorded hours, Less queries Eichmann about transports of Jews and gypsies sent to their death, all followed by denials and lapses of memory. Less then presses the point by showing Eichmann copies of transport documents with his signature at the bottom, leading Eichmann to say in an exasperated voice, “What’s your point?”

The point is that there is a mountain of evidence proving that Eichmann, like all the rest of the Nazi leadership, was not simply following orders. As Eichmann himself boasted when he wasn’t on trial: “When I reached the conclusion that it was necessary to do to the Jews what we did, I worked with the fanaticism a man can expect from himself. No doubt they considered me the right man in the right place…. I always acted 100 per cent, and in the giving of orders I certainly was not lukewarm.” As the Holocaust historian Daniel Jonah Goldhagen asks rhetorically, “Are these the words of a bureaucrat mindlessly, unreflectively doing his job about which he has no particular view?”11

The historian Yaacov Lozowick characterized the motives in his book Hitler’s Bureaucrats, in which he invokes a mountain-climbing metaphor: “Just as a man does not reach the peak of Mount Everest by accident, so Eichmann and his ilk did not come to murder Jews by accident or in a fit of absent-mindedness, nor by blindly obeying orders or by being small cogs in a big machine. They worked hard, thought hard, took the lead over many years. They were the alpinists of evil.”12

About the Author

Dr. Michael Shermer is the Founding Publisher of Skeptic magazine, the host of the Science Salon Podcast, and a Presidential Fellow at Chapman University where he teaches Skepticism 101. For 18 years he was a monthly columnist for Scientific American. He is the author of New York Times bestsellers Why People Believe Weird Things and The Believing Brain, Why Darwin Matters, The Science of Good and Evil, The Moral Arc, and Heavens on Earth. His new book is Giving the Devil His Due: Reflections of a Scientific Humanist.

  1. Milgram, Stanley. 1974. Obedience to Authority: An Experimental View. New York: Harper & Row.
  2. Interview with Phil Zimbardo conducted by the author on March 26, 2007.
  3. Milgram, 1974.
  4. Harris, Judith Rich. 1998. The Nurture Assumption: Why Children Turn Out the Way They Do. New York: Free Press.
  5. Milgram, 1974.
  6. Ibid.
  7. Burger, Jerry. 2009. “Replicating Milgram: Would People Still Obey Today?” American Psychologist, 64, 1–11.
  8. The Trial of Adolf Eichmann, Session 95, July 13, 1961.
  9. Cesarani, David. 2006. Becoming Eichmann: Rethinking the Life, Crimes, and Trial of a “Desk Murderer”. New York: Da Capo Press. Von Trotta, Margarethe (Director). 2012. Hannah Arendt. Zeitgeist Films. See also: Lipstadt, Deborah E. 2011. The Eichmann Trial. New York: Schocken.
  10. Young, Robert. 2010. Eichmann. Regent Releasing, Here! Films. October.
  11. Quoted in: Goldhagen, Daniel Jonah. 2009. Worse Than War: Genocide, Eliminationism, and the Ongoing Assault on Humanity. New York: PublicAffairs, 158.
  12. Lozowick, Yaacov. 2003. Hitler’s Bureaucrats: The Nazi Security Police and the Banality of Evil. New York: Continuum, 279.

Suffrage & Success
Celebrating the Centennial of Women’s Right to Vote

Posted on Aug. 18, 2020 by | Comments (2)

Today, August 18, marks the 100th anniversary of the adoption of the 19th Amendment to the Constitution of the United States, guaranteeing women the right to vote. We honor that momentous event with an excerpt adapted from the chapter on women’s rights in Dr. Michael Shermer’s 2015 book The Moral Arc: How Science and Reason Lead Humanity Toward Truth, Justice, and Freedom (New York: Henry Holt).

Read the essay below, or listen to it being read by the author, Michael Shermer:

On August 18, 1920, the 19th Amendment of the United States Constitution was ratified, legally securing the franchise to women. It was the culmination of a 72-year battle that began when Elizabeth Cady Stanton and Lucretia Mott organized the 1848 Seneca Falls conference, after attending the World Anti-slavery Convention in London in 1840 — a meeting at which they had come to participate as delegates, but at which they were not allowed to speak and were made to sit like obedient children in a curtained-off area. This did not sit well with Stanton and Mott. Conventions were held throughout the 1850s but were interrupted by the American Civil War, after which the 15th Amendment secured the franchise in 1870 — not for women, of course, but for black men (though they were gradually disenfranchised by poll taxes, legal loopholes, literacy tests, threats, and intimidation). This didn’t sit well either and only served to energize the likes of Matilda Joslyn Gage, Susan B. Anthony, Ida B. Wells, Carrie Chapman Catt, Doris Stevens, and countless others who campaigned unremittingly against the political slavery of women.

Things began to heat up when the great American suffragist Alice Paul (arrestingly portrayed by Hilary Swank in the 2004 film Iron Jawed Angels) returned from a lengthy sojourn in England. She had learned much during her time there through her active participation in the British suffrage movement and from the more radical and militant British suffragists, including the courageous political activist Emmeline Pankhurst, characterized as “the very edge of that weapon of willpower by which British women freed themselves from being classed with children and idiots in the matter of exercising the franchise.”1

Upon her death Pankhurst was heralded by the New York Times as “the most remarkable political and social agitator of the early part of the twentieth century and the supreme protagonist of the campaign for the electoral enfranchisement of women”;2 years later, Time magazine named her one of the 100 most important people of the century. Thus, when Alice Paul returned from abroad she was ready for action, though the more conservative members of the women’s movement weren’t quite ready for Alice. Nevertheless, in order to attract attention to the cause she and Lucy Burns organized the largest parade ever held in Washington. On March 3, 1913 (strategically timed for the day before President Wilson’s inauguration), 26 floats, 10 bands, and 8,000 women marched, led by the stunning Inez Milholland wearing a flowing white cape and riding a white horse. (See Figure 1 above.) Upwards of 100,000 spectators watched the parade but the mostly male crowd became increasingly unruly and the women were spat upon, taunted, harassed, and attacked while the police stood by. Afraid of an all-out riot, the War Department called in the cavalry to contain the escalating violence and chaos.3

It was a gift. A scandal ensued due to the rough treatment of the women and suddenly, “the issue of suffrage — long thought dead by many politicians — was vividly alive in front page headlines in newspapers across the country.… Paul had accomplished her goal — to make woman suffrage a major political issue.”5

In 1917 women began peacefully picketing outside the White House but, once again, they were met with harassment and violence. These Silent Sentinels (as they were called) stood day and night (except Sundays) with their banners for two and a half years but, after the U.S. entered the war, patience ran thin, as it was seen as improper to picket a wartime president. The picketers were charged with obstructing traffic and were thrown — often quite literally thrown — into prison cells where they were treated like criminals rather than political protesters, and were kept in appalling conditions. Many of the women went on a hunger strike, including Alice Paul, who was viciously force-fed in order to keep her from becoming a martyr for the cause.

Word of the brutality in the workhouse was leaked to the press and the public became increasingly incensed at the protesters’ horrific treatment. During what became known as the Night of Terror, 40 prison guards went on a rampage and the women were “grabbed, dragged, beaten, kicked, and choked”; Lucy Burns had her wrists cuffed and chained above her head to the cell door; another woman was taken to the men’s section and told “they could do what they pleased with her”; another woman was knocked unconscious, still another had a heart attack.6 These outrages were a grave tactical error. “With public pressure mounting as a result of press coverage, the government felt the need to act.… Arrests didn’t stop these protesters; neither did jail terms, psychopathic wards, force-feeding, or violent attacks. Their next decision was simply to let them out.”7

At long last, in 1920, the 19th Amendment (originally drafted by Susan B. Anthony and Elizabeth Cady Stanton in 1878) was passed — by a single vote — thanks to 24-year-old Harry T. Burn, a Tennessee legislator who had originally intended to vote against his state ratifying the amendment (which needed ratification by 36 of the 48 states to pass), but changed his mind because of a note from his mother.

Dear Son:

Hurrah, and vote for suffrage! Don’t keep them in doubt. I notice some of the speeches against. They were bitter. I have been watching to see how you stood, but have not noticed anything yet.

Don’t forget to be a good boy and help Mrs. Catt put the “rat” in ratification.

Your Mother.8

In the end, then, suffrage for women came down to the vote of one man, influenced by his mom. It was rumored that, “the anti-suffragists were so angry at his decision that they chased him from the chamber, forced him to climb out a window of the Capitol and inch along a ledge to safety.”9 Thus suffrage arrived in the U.S., kicking and screaming.

It was a right that women in a number of other countries had already won years before, but one that others would have to wait for. Figure 2 (below) tracks the moral progress of women’s suffrage, while Figure 3 (below) tracks the gaps between when all men versus all women were granted the franchise, from Switzerland’s 123-year gap between 1848 and 1971, to Denmark’s 0-year gap in 1915. By comparison, the 50-year gap in the United States between 1870 and 1920 lies mid-way in this history.

Women’s Right to Vote Over Time

Figure 2: Women’s Right to Vote Over Time. The stair-step progress of women’s suffrage is tracked over time from 1900 to 2010, showing two big bursts, the first after World War I and the second after World War II. Tellingly, the expected date for the sovereign nation of Vatican City to grant women the right to vote is “never.”10

The Gap Between the Franchise for Men and Women

Figure 3: The Gap Between the Franchise for Men and Women. The spasmodic nature of moral progress is reflected in the shrinking time in years between the dates that men’s suffrage and women’s suffrage was legalized, from 123 years for Switzerland to 0 years for Denmark. Such change is contingent on many social and political variables that differ from country to country.

Carving Women’s Rights: A Personal Story

The trend over the past several centuries has been to grant women the same rights and privileges as those of men. Political, economic, and social advances, enabled by scientific, technological, and medical discoveries and inventions have increasingly provided women not only greater amounts of reproductive autonomy and control, but have also driven an expansion of their rights and opportunities in all areas of life, leading to healthier and happier societies across the globe. As with the other rights revolutions there is much progress that remains to be realized, but the momentum now is such that the expansion of women’s rights should continue unabated into the future.

In these ways — the rational justification for including women as full rights-bearing persons no less deserving than men, the interchangeability of women’s perspectives with those of men, the scientific understanding of the nature of human sexuality and reproduction, and the continuous thinking that enables us to see and comprehend the difference between a woman’s and a fetus’s rights — science and reason have led humanity closer to truth, justice, and freedom.

As an example of how far we’ve come in just the last two generations (and how oppressed women were as recently as the early 20th century), I close with the story of two women — mother and daughter — both named Christine Roselyn Mutchler. The mother was born in Germany and passed through Ellis Island in 1893 with her parents, who then moved to Alhambra, California. Mother Christine married her husband Frederick and gave birth to baby Christine in 1910 (and a second daughter three years later), but their lives were shattered shortly after that when Fred told his wife he was going out for a loaf of bread and never returned. Abandoned by her husband, left with no money or food to care for herself and her two small children, mother Christine was forced to return to her father’s home.

Unknown to her at the time, Fred had wandered off into the county jail with delusions that his father-in-law was after him. After being examined by a physician he was sent to a mental hospital for over a year. During this time, with his delusions in remission, Fred wrote heartbreaking letters to his wife asking about her and the children, but Christine’s father kept the letters from her and she continued to believe that she had been abandoned. In time she found work as a housemaid for a friend of a successful motion picture executive named John C. Epping, whose wife had recently died. Desperate for a daughter and enamored of three-year-old Christine, Epping talked Christine’s father into forcing her to allow him to adopt the child. Young, poor, scared, and intimidated by her father, Christine reluctantly agreed to the adoption, although a series of articles in The Los Angeles Times shows that a probation officer on the case opposed the adoption, declaring “she believed Epping saw possibilities of a future Mary Pickford in the little girl, and that the child should have a home in some private family where home life and education would be the principal features.”11 Based on the false information provided by Christine’s father that Fred had abandoned them, the judge granted the adoption.

Epping promptly changed the name of his newly adopted daughter to Frances Dorothy Epping, addressed her by her middle name, and (unbelievably) told her she was born in Providence, Rhode Island, and that his deceased wife was her true mother. Now age four, Christine/Dorothy apparently did not accept the fictional story and rebelled — or perhaps Epping changed his mind about raising a daughter as a single dad — because he shuffled her through a series of surrogate parents, including sisters at the Ramona Convent in Alhambra and caretakers at the Marlborough Preparatory School in Los Angeles, before shipping her back east for a year to live with his sister in the Catskills, and then on to Germany, where she lived with Epping’s relations. During that period Dorothy discovered that she had a talent for the arts, in particular sculpture.

She then returned to Los Angeles and finished her secondary education, after which she was reunited with her original family and told the truth about the adoption. She went on to college at the Otis Art Institute in Los Angeles, the Corcoran School of Art in Washington, D.C., and the prestigious Academy of Fine Arts in Munich, Germany in the 1930s under the tutelage of Joseph Wackerle, who at that time was the Third Reich Culture Senator and received praise from both Goebbels and Hitler. (She later recalled being stunned by the hypnotic pull Hitler had on the audience at one of his speeches she attended.) In the meantime, Dorothy’s real mother, Christine, was instructed by her father to divorce her husband Fred, after which she met and married a vegetable cart vendor in Los Angeles, left her father’s oppressive rule, and began to rebuild her life and new family. But the tragedy of being forced to give up her first-born child haunted her for the rest of her life. As the world changed and Christine saw how women became more empowered in the second half of the 20th century, she continually asked herself why she didn’t speak up and oppose the adoption.

Meanwhile, as Dorothy came of age she soon discovered that family law and the adoption courts were not the only worlds ruled by men. Her chosen profession of sculpture was heavily male-dominated, so to be taken seriously she began using a truncated version of her first name Frances — Franc — which gained her entrée into the German academy and subsequent galleries and museums (even now one can find references to “his” work). She later recalled that when the professors at the Academy of Fine Arts in Munich found out “Franc” was a woman, she had to listen to lectures from the hallways because only men were allowed inside. From the early 1930s through her death in 1983 — by which time it was acceptable for women to shape clay, wood, and stone with their hands — Franc Epping’s work was shown in numerous exhibits throughout the United States, including the prestigious Whitney Museum of American Art in New York City. One of her works, “The Man with a Hat,” even appeared in an episode of the original series of Star Trek. I know because I own that piece, along with many other sculptures of hers, which I inherited from my mother.

Franc Epping’s work, “The Man with a Hat,” appeared in an episode of the original series of Star Trek (The Original Series, Season 1, Ep. 24, “A Taste of Armageddon,” at 17 minutes, 25 seconds). That sculpture, along with many other Epping sculptures, was inherited by Michael Shermer from his mother. Franc Epping was the author’s aunt.

You see, Franc Epping was my aunt, her real mother Christine was my grandmother, and I am proud to be related to such a resilient and determined woman.12 Aunt Franc’s sculptures portray strong women with muscular features in empowering poses — allegories for what women for generations have had to rise to in order to gain the recognition and equality that is rightfully theirs. This book was written in the inspiring presence of those carved stones.

Figure 4: Sculptor Franc Epping at work. Born Christine Roselyn Mutchler and given the adopted name Frances Dorothy Epping, she began using the masculinized version of her adopted name — Franc — in order to be taken seriously in the male-dominated world of sculpture.

Figure 5: Among Franc Epping’s many sculptures are strong women with muscular features in empowering poses.13

About the Author

Dr. Michael Shermer is the Founding Publisher of Skeptic magazine, the host of the Science Salon Podcast, and a Presidential Fellow at Chapman University where he teaches Skepticism 101. For 18 years he was a monthly columnist for Scientific American. He is the author of New York Times bestsellers Why People Believe Weird Things and The Believing Brain, Why Darwin Matters, The Science of Good and Evil, The Moral Arc, and Heavens on Earth. His new book is Giving the Devil His Due: Reflections of a Scientific Humanist.

  1. Purvis, June. 2002. Emmeline Pankhurst: A Biography. London: Routledge. 354.
  2. Ibid., 354.
  3. Stevens, Doris. 1995 (originally published 1920). Jailed for Freedom: American Women Win the Vote. Edited by Carol O’Hare. Troutdale: New Sage Press. 18–19.
  4. Source: Library of Congress. George Grantham Bain Collection. Original caption reads: Inez Milholland Boissevain, wearing white cape, seated on white horse at the National American Woman Suffrage Association parade, March 3, 1913, Washington, D.C. LC-DIG-ppmsc-00031 (digital file from original photograph) LC-USZ62-77359
  5. Ibid., 19.
  6. Adams, Katherine H. and Michael L. Keene. 2007. Alice Paul and the American Suffrage Campaign. Urbana: University of Illinois Press. 206–208.
  7. Ibid., 211.
  9. Ibid.
  10. The Wikipedia entry for “Women’s Suffrage” has a complete list of every country and the year each legalized the franchise for women.
  11. “4-Sided Battle in Court for Child.” 1914. Los Angeles Times, October 31.
  12. Most of this story has been carefully documented by Ann Marie Batesole, a private detective and my cousin — our grandmother was Christine, Aunt Franc’s mother.
  13. Source: Author’s collection.

Fat Man & Little Boy

Posted on Aug. 07, 2020 by | Comments (38)

On the 75th anniversary of nuclear weapons, Dr. Michael Shermer presents a moral case for their use in ending WWII and the deterrence of Great Power wars since, and a call to eventually eliminate them. This essay was excerpted, in part, from Michael Shermer’s book, The Moral Arc, in the chapter on war.

Read the essay below, or listen to it being read by the author, Michael Shermer:

On August 6, 1945 the Little Boy gun-type uranium-235 bomb exploded with an energy equivalent of 16–18 kilotons of TNT, flattening 69 percent of Hiroshima’s buildings and killing an estimated 80,000 people and injuring another 70,000.

Three quarters of a century ago this summer, nuclear weapons altered our civilization forever. On July 16 the Trinity plutonium bomb detonated with the energy equivalent of 22 kilotons (22,000 metric tons) of TNT, sending a mushroom cloud 39,000 feet into the atmosphere. The explosion left a crater 76 meters wide filled with radioactive glass called trinitite (melted quartz-grained sand). It could be heard as far away as El Paso, Texas. On August 6 the Little Boy gun-type uranium-235 bomb exploded with an energy equivalent of 16–18 kilotons of TNT, flattening 69 percent of Hiroshima’s buildings, killing an estimated 80,000 people, and injuring another 70,000. On August 9 the Fat Man plutonium implosion-type bomb with the energy equivalence of 19–23 kilotons of TNT leveled around 44 percent of Nagasaki, killing an estimated 35,000 to 40,000 people and severely wounding another 60,000.1

The aftermath of Little Boy

On August 9, 1945 the Fat Man plutonium implosion-type bomb with the energy equivalence of 19–23 kilotons of TNT leveled around 44 percent of Nagasaki, killing an estimated 35,000 to 40,000 people and severely wounding another 60,000.

Before and aftermath of Nagasaki

Memorandum from Major General Leslie Groves to Army Chief of Staff George Marshall.

Click image to view larger PDF. Had the Japanese military hardliners had their way and continued the war into the fall, Groves had three more bombs readied for September and another three for October. Here he informs Army Chief of Staff George Marshall that the next bomb will be ready to drop after August 24. Emperor Hirohito capitulated on August 15, thereby saving millions of his citizens’ lives.

As documented in the memo above, dated August 10, 1945, if the Japanese had not surrendered, the head of the Manhattan Project, Major General Leslie R. Groves, had another Fat Man-type plutonium implosion bomb ready to go after August 24, one that would likely have killed another 50,000 to 100,000 people.2 And had the Japanese military hardliners had their way and continued the war into the fall, Groves had three more bombs readied for September and another three for October. President Harry Truman was not exaggerating when he threatened Japan with “a rain of ruin from the air, the like of which has never been seen on this Earth.” Truman did agonize about dropping more nukes on Japan, troubled as he was by the thought of more innocents and noncombatants being killed. He wrested that decision away from the military. (Note Groves’ handwritten addendum to his memo that “It is not to be released on Japan without express authority from the President.” U.S. presidents have had sole authority to use nuclear weapons ever since.) However, further bombings proved unnecessary. On August 15 Emperor Hirohito, against the wishes of some of Japan’s military leaders, announced on the radio that Japan would capitulate. On September 2 Japan signed the surrender documents in Tokyo Bay, ending the Second World War.3

On this 75th anniversary of the summer of the bomb I want to make the case that the use of nuclear weapons was necessary to end the war, and that their continued existence has acted as a deterrent against another Great Power war — but that we must eliminate them entirely for the long-term existence of our civilization and possibly our species.

Since 1945 a cadre of critics has proffered the claim that the atomic bombs were unnecessary to bring about the end of World War II (or, at least, that the Fat Man Nagasaki bomb was superfluous), and thus that this act was immoral, illegal, or even a crime against humanity. Robert Oppenheimer and other physicists who worked on the Manhattan Project, such as Leo Szilard, expressed reservations. “The physicists have known sin,” Oppenheimer opined. He went to Truman and confessed, “Mr. President, I feel I have blood on my hands,” to which the President recalled, “I told him the blood was on my hands — to let me worry about that.” Truman promptly dismissed Oppenheimer and told Secretary of State Dean Acheson, “I don’t want to see that son-of-a-bitch in this office ever again.”4

In 1946 the Federal Council of Churches issued a statement declaring, “As American Christians, we are deeply penitent for the irresponsible use already made of the atomic bomb. We are agreed that, whatever be one’s judgment of the war in principle, the surprise bombings of Hiroshima and Nagasaki are morally indefensible.”5 In 1967 the linguist and contrarian politico Noam Chomsky called the two bombings “the most unspeakable crimes in history.”6

More recently, in his history of genocide titled Worse Than War, the historian Daniel Goldhagen opens his analysis by calling the U.S. President Harry Truman “a mass murderer” because in ordering the use of atomic weapons he “chose to snuff out the lives of approximately 300,000 men, women and children.” Goldhagen opines that “it is hard to understand how any rightthinking person could fail to call slaughtering unthreatening Japanese mass murder.”7 Goldhagen defines “genocide” broadly enough to equate it with “mass murder” (without ever defining what, exactly, that means). In morally equating Harry Truman with Adolf Hitler, Joseph Stalin, Mao Zedong, and Pol Pot, Goldhagen allows himself to be constrained by the categorical thinking that prevents one from discerning the different kinds, levels, and motives for large-scale military violence. By this reasoning, nearly every act that kills a large number of people could be considered genocidal because there are only two categories — mass murder and non-mass murder.

By contrast, continuous thinking allows us to distinguish the differences between types of mass killings (some scholars define genocide as one-sided killing by armed people of unarmed people), their context (during a state war, civil war, ethnic cleansing), motivations (termination of hostilities or extermination of a people), and quantities (hundreds to millions) along a sliding scale. In 1946 the Polish jurist Raphael Lemkin created the term genocide and defined it as “a conspiracy to exterminate national, religious or racial groups.”8 That same year the U.N. General Assembly defined genocide as “a denial of the right of existence of entire human groups.”9 More recently, in 1994 the highly respected philosopher Steven Katz defined genocide as “the actualization of the intent, however successfully carried out, to murder in its totality any national, ethnic, racial, religious, political, social, gender or economic group.”10

By these definitions, the dropping of Fat Man and Little Boy was not an act of genocide. The difference between Truman and the others is in the context and motivation of the act. In their genocidal actions against targeted people, Hitler, Stalin, Mao, and Pol Pot had as their objective the total elimination of a group. The killing would only stop when every last pursued person was exterminated (or if the perpetrators were stopped or defeated). Truman’s goal in dropping the bombs was to end the war with Japan (which it did), not to eliminate the Japanese people (which it didn’t). That the U.S. provided considerable financial, personnel, and material support to help rebuild Japan into a world economic power puts the lie to the eliminationist accusation.11

The author’s father, Richard Shermer, in 1945, serving aboard the USS Wren.

More broadly, if we ground morality in the survival and flourishing of sentient beings,12 then by that measure not only did Fat Man and Little Boy end the war and stop the killing, they saved lives — very probably millions of lives, both Japanese and American. My father Richard Shermer was possibly one such survivor. During the Second World War he served aboard the USS Wren (DD-568), a Fletcher-class destroyer assigned to protect aircraft carriers and other large capital ships from Japanese submarines and from Kamikaze planes on what was called antiaircraft radar picket watch. His ship was attacked several times but sustained no major damage. The Wren was part of the larger fleet that was working its way toward Japan, escorting the carriers whose planes were bombarding the Japanese homeland in preparation for the planned invasion. My father told me that everyone on board dreaded that day because they had heard of the horrific carnage resulting from the invasion of just two tiny islands held by the Japanese — Iwo Jima and Okinawa. If that was any indication of what was to come with a full-scale invasion, the contemplation of it was almost too much to bear.13

The USS Wren, a Navy destroyer deployed to protect aircraft carriers from suicidal Kamikaze pilots while their planes bombarded the Japanese homeland in preparation for the invasion that never came, thanks to Fat Man and Little Boy.

USS Lexington
USS Wren fantail
USS Wren front
USS Wren

Click an image above to enlarge it. Four photos taken by Richard Shermer on board the USS Wren, pictured fore and aft, accompanying the aircraft carrier USS Lexington, and arriving in Tokyo Bay in late August, 1945 in preparation for the surrender ceremony on September 2, marking the end of the Second World War.

During the invasion of Iwo Jima there were approximately 26,000 American casualties that included 6,821 dead in the 36-day battle. How fiercely did the Japanese defend that little volcanic rock 700 miles from Japan? Of the 22,060 Japanese soldiers assigned to fight to the bitter end, only 216 survived.14 The subsequent battle for Okinawa, only 340 miles from the Japanese mainland, was fought even more ferociously, resulting in a staggering body count of 240,931 dead, including 77,166 Japanese soldiers, 14,009 American soldiers, plus an additional 149,193 Japanese civilians living on the island who either died fighting or committed suicide rather than let themselves be captured.15 With an estimated 2.3 million Japanese soldiers and 28 million Japanese civilian militia prepared to defend their island nation to the death,16 it was clear to all what an invasion of the Japanese mainland would entail.

It is from these cold hard facts that Truman’s advisors estimated that between 250,000 and one million American lives would be lost in an invasion of Japan.17 General Douglas MacArthur estimated that there could be a 22:1 ratio of Japanese to American deaths, which translates to a minimum death toll of 5.5 million Japanese.18 By comparison, cold though it may sound, the body count from both atomic bombs — about 200,000–300,000 total (Hiroshima: 90,000–166,000 deaths, Nagasaki: 60,000–80,000 deaths19) — was a bargain.

In any case, if Truman hadn’t ordered the bombs dropped, General Curtis LeMay and his fleet of B-29 bombers would have continued pummeling Tokyo and other Japanese cities into rubble. When asked to predict when the war would end based on his bombing program, LeMay said September 1, because that was when there would be nothing left of Japan to bomb. The death toll from conventional bombing would have been just as high as that produced by the two atomic bombs, if not higher. Previous mass bombing raids had produced Hiroshima-level death rates, and it is likely that more than just two cities would have been destroyed before the Japanese surrendered. Compare, for example, Little Boy’s energy equivalent of 16,000–19,000 tons of TNT to the U.S. Strategic Bombing Survey estimate that this was the equivalent of 220 B-29s carrying 1,200 tons of incendiary bombs, 400 tons of high-explosive bombs, and 500 tons of anti-personnel fragmentation bombs, with an equivalent number of casualties.20 In fact, on the night of March 9–10, 1945, 279 B-29s dropped 1,665 tons of bombs on Tokyo, leveling 15.8 square miles of the city, killing 88,000 people, injuring another 41,000, and leaving another million homeless.21

On the night of March 9–10, 1945, 279 B-29s dropped 1,665 tons of bombs on Tokyo, leveling 15.8 square miles of the city, killing 88,000 people, injuring another 41,000, and leaving another million homeless. This is the result.

These facts also help refute the claim that the alternative scenario of dropping an atomic bomb on an uninhabited island or bay to demonstrate its destructive force would have worked to convince the Japanese to surrender. Given that they refused to capitulate even after numerous cities were obliterated by conventional bombs and Hiroshima was erased from the map by an atomic bomb, it seems unlikely this more benign strategy would have worked.22

On balance, then, dropping the atomic bombs was the least destructive of the options on the table. Although we wouldn’t want to call it a moral act, it was, in the context of the time, the least immoral act by the criterion of lives saved. That said, we should also recognize that the several hundred thousand killed is still a colossal loss of life. The fact that the invisible killer of radiation continued its effects long after the bombings should dissuade us from ever using such weapons again. Along that sliding scale of evil, in the context of one of the worst wars in human history that included the singularly destructive Holocaust of six million murdered, it was not, pace Chomsky, the most unspeakable crime in history — not even close — but it was an event in the annals of humanity never to be forgotten and, hopefully, never to be repeated.

When I was an undergraduate at Pepperdine University in 1974, the father of the hydrogen bomb — Edward Teller — spoke at our campus in conjunction with the awarding of an honorary doctorate. His message was that deterrence works. At the time I remember thinking — like so many politicos were saying — “yeah, but a single slip-up is all it takes.” Popular films such as Fail Safe and Dr. Strangelove reinforced the point. But the blunder never came (and the close calls were kept secret for decades). In the game theoretic strategy of Mutual Assured Destruction (MAD), deterrence works because neither side has anything to gain by initiating a first strike against the other. The retaliatory capability of both is such that a first strike would most likely lead to the utter annihilation of both countries (along with much of the rest of the world). “It’s not mad!” proclaimed Secretary of Defense Robert S. McNamara. “Mutual Assured Destruction is the foundation of deterrence. Nuclear weapons have no military utility whatsoever, excepting only to deter one’s opponent from their use. Which means you should never, never, never initiate their use against a nuclear-equipped opponent. If you do, it’s suicide.”23

The logic of deterrence was first articulated in 1946 by the American military strategist Bernard Brodie in his appropriately titled book The Absolute Weapon, in which he noted the break in history that atomic weapons brought with their development: “Thus far the chief purpose of our military establishment has been to win wars. From now on, its chief purpose must be to avert them. It can have almost no other purpose.”24 As Dr. Strangelove explained in Stanley Kubrick’s classic Cold War film: “Deterrence is the art of producing in the mind of the enemy the fear to attack.” Said enemy, of course, must know that you have at the ready such destructive devices, and that is why “The whole point of a doomsday machine is lost if you keep it a secret!”25

Dr. Strangelove was a black comedy that parodied MAD by showing what can happen when things go terribly wrong, in this case when General Jack D. Ripper becomes unhinged at the thought of “Communist infiltration, Communist indoctrination, Communist subversion, and the international Communist conspiracy to sap and impurify all of our precious bodily fluids” and orders a nuclear first strike against the Soviet Union. Given this unfortunate incident and knowing that the Russkis know about it and will therefore retaliate, General “Buck” Turgidson pleads with the president to go all out and launch a full first strike. “Mr. President, I’m not saying we wouldn’t get our hair mussed, but I do say no more than ten to twenty million killed, tops, uh, depending on the breaks.”26

This isn’t far off real projected casualties (Kubrick was a student of Cold War strategy), as in 1957 Strategic Air Command (SAC) estimated that between 360 and 525 million casualties would be inflicted in the first week of a nuclear exchange with the Soviet bloc.27 In 1968 Secretary of Defense Robert McNamara gave these figures for MAD to work: “In the case of the Soviet Union, I would judge that a capability on our part to destroy, say, one-fifth to one-fourth of their population and one-half of her industrial capacity would serve as an effective deterrent.” With a population at the time of about 128 million, this translates to 25–32 million dead.28 A 1979 report from the Office of Technology Assessment for the U.S. Congress, entitled The Effects of Nuclear War, estimated that 155 to 165 million Americans would die in an all-out Soviet first strike (unless people made use of existing shelters near their homes, reducing fatalities to 110–120 million). The population of the U.S. at the time was 225 million, so the estimated percent that would be killed ranged from 49 percent to 73 percent. Staggering.

Deterrence has worked so far — no nuclear weapon has been detonated in a conflict of any kind in 75 years — but it would be foolish to think of deterrence as a permanent solution.29 As long ago as 1795, in an essay titled Perpetual Peace, Immanuel Kant worked out what such deterrence ultimately leads to: “A war, therefore, which might cause the destruction of both parties at once … would permit the conclusion of a perpetual peace only upon the vast burial-ground of the human species.” (Kant’s book title came from an innkeeper’s30 sign featuring a cemetery — not the type of perpetual peace most of us strive for.) Deterrence acts as only a temporary solution to the Hobbesian temptation to strike first (also called the security dilemma in which a nation arming in defense triggers other nations to also arm in defense), allowing both Leviathans to go about their business in relative peace, settling for small proxy wars, which themselves have been in decline for decades.31

In the long run we need to work toward a world free of nuclear weapons. The risk of an accident or of a deranged Dr. Strangelove-type character triggering a nuclear exchange is too high for a MAD deterrence strategy to be a permanent solution to the security dilemma it was invented to solve. Authors such as Richard Rhodes in his nuclear tetralogy (The Making of the Atomic Bomb, Dark Sun, Arsenals of Folly, and The Twilight of the Bombs32), and Eric Schlosser in Command and Control,33 leave readers with vertigo knowing how many close calls there have been. To name but a few: the jettisoning of a Mark IV atomic bomb in British Columbia in 1950; the crash of a B-52 carrying two Mark 39 nuclear bombs in North Carolina; the Cuban Missile Crisis; the Able Archer 83 Exercise in Western Europe that the Soviets misread as the buildup to a nuclear strike against them; the Titan II Missile explosion in Damascus, Arkansas that narrowly avoided wiping the entire city off the map; and Stanislav Petrov’s decision not to trigger a retaliatory strike against the U.S. based on reports from the Soviet early warning satellite system of incoming ballistic missiles. It is not for nothing that Petrov is known as “the man who saved the world.”34

Thus, in the long run we must get to Nuclear Zero, but in the short run there are so many hurdles that few think we are anywhere near such a lofty goal. In two episodes of my Science Salon podcast Fred Kaplan, the national security journalist and author of several books on nuclear weapons, and William J. Perry, Secretary of Defense under President Clinton and a staunch advocate for eliminating nuclear weapons, both told me that they did not think this could happen any time soon, even while their books outline how it could be done.35 In The Moral Arc I summarized the consensus by experts on the most important steps to take to reduce the risk of nuclear weapons and to work toward a world free of them, including: (1) enact a “no first use” policy; (2) take all weapons off “launch on warning”; (3) increase the warning and decision times for launching a retaliatory strike; (4) remove from the President the sole authority to launch nuclear weapons; (5) uphold non-proliferation agreements; (6) widen the taboo from using nuclear weapons to owning them; (7) increase economic interdependence; (8) expand democratic governance; (9) reduce spending on nuclear weapons; and (10) continue the disarmament of existing nuclear weapons. To that end, it is encouraging to see the decline in the total number of nuclear warheads to around 16,000 from the peak of around 70,000 in 1986, as visualized in the figure below.36

Click image to enlarge. The decline in the total number of nuclear warheads to around 16,000 from the peak of around 70,000 in 1986.

I should note that some security scholars, along with many political theorists and leaders, think that the path to peace is more deterrence through more and better nuclear weapons. President Trump, for example, insists on renovating our aged nuclear weapons systems to the tune of $1.2 trillion between 2017 and 2046, an upgrade program37 he inherited from President Obama. And despite winning the Nobel Peace Prize for working toward nuclear nonproliferation, Obama nevertheless backed off from initiating a “no first use” policy under pressure from our NATO allies, who were worried that Russian saber rattling and border expansion might be encouraged if an escalation from conventional to nuclear weapons was no longer on the defense table.38

Similarly, the late political scientist Kenneth Waltz thought that allowing Iran to go nuclear would bring stability to the Middle East because “in no other region of the world does a lone, unchecked nuclear state exist. It is Israel’s nuclear arsenal, not Iran’s desire for one, that has contributed most to the current crisis. Power, after all, begs to be balanced.”39 Except for when it doesn’t, as in the post-1991 period after the collapse of the Soviet Union and the unipolar dominance of the United States. No other medium-size power rose to fill the vacuum, no rising power started wars of conquest to consolidate more power, and the only other candidate, China, has remained war-free for almost four decades. Given Iran’s outlier status in the international system and its avowed promise to wipe Israel “off the map,” anyone who would join a Fair-Play-for-Nuclear-Iran-Committee has lost their moral compass.

This all just shows how difficult it is going to be to get to a world without nukes. Nevertheless, we have to try. One more statistic is sobering in this regard, as noted by the anti-nuclear scientist and activist David Barash: The U.S. has a triad of nuclear weapons: land (missiles), air (bombers) and sea (submarines). A single Trident sub carries 20 nuclear-tipped missiles, each one of which has eight independently targetable warheads of about 465 kilotons, or about 30 times the destructive power of Little Boy. So, one sub packs the equivalent of 4,800 Hiroshimas (20 x 8 x 30), and we have 18 Trident submarines, or the equivalent of 86,400 Hiroshimas!40 In the words of President Obama during a briefing about our nuclear capability: “Let’s stipulate that this is all insane.”41

The use of nuclear weapons for both ending wars and deterring them is a 20th century phenomenon that can be phased out for the new century. As the political scientist Christopher Fettweis notes in his book Dangerous Times?, despite the popularity of such intuitive notions as the “balance of power” — based on a small number of non-generalizable cases from the past that are in any case no longer applicable to the present — so-called “clashes of civilization” like the world wars of the 20th century are extremely unlikely to happen in the highly interdependent world of the 21st century. In fact, Fettweis shows, never in history has such a high percentage of the world’s population lived in peace. Conflicts of all forms have been steadily dropping since the early 1990s, and even terrorism can bring states together in international cooperation to combat a common enemy.42

The abolition of nuclear weapons is a complex and difficult puzzle that has been studied extensively by scholars and scientists for over half a century. The problems and permutations of getting from here to there are legion, and there is no single sure-fire pathway to zero. Nevertheless, it is a soluble problem, and humans are nothing if not innovative problem solvers.43 I do not believe that the deterrence trap is one from which we can never extricate ourselves, and the remaining threats should direct us to work toward Nuclear Zero sooner rather than later. In the meantime, minimum deterrence is the best we can hope for given the complexities of international relations, but given enough time, as Shakespeare poetically observed…

Time’s glory is to calm contending kings,
To unmask falsehood and bring truth to light,
To stamp the seal of time in aged things,
To wake the morn and sentinel the night, …
To slay the tiger that doth live by slaughter, …
To cheer the ploughman with increased crops,
And waste huge stones with little water-drops.44

About the Author

Dr. Michael Shermer is the Founding Publisher of Skeptic magazine, the host of the Science Salon Podcast, and a Presidential Fellow at Chapman University where he teaches Skepticism 101. For 18 years he was a monthly columnist for Scientific American. He is the author of New York Times bestsellers Why People Believe Weird Things and The Believing Brain, Why Darwin Matters, The Science of Good and Evil, The Moral Arc, and Heavens on Earth. His new book is Giving the Devil His Due: Reflections of a Scientific Humanist.

  1. Rhodes, Richard. 1986. The Making of the Atomic Bomb. New York: Simon & Schuster.
  2. The Atomic Bomb and the End of World War II, A Collection of Primary Sources. National Security Archive Electronic Briefing Book No. 162. George Washington University. See also: “The Third Shot.”
  3. DeNooyer, Rushmore. 2015. The Bomb. PBS documentary.
  4. Bird, Kai and Martin J. Sherwin. 2007. American Prometheus: The Triumph and Tragedy of J. Robert Oppenheimer. New York: Knopf, 332.
  5. Quoted in: Marty, Martin E. 1996. Modern American Religion, Vol 3: Under God, Indivisible, 1941–1960. Chicago: University of Chicago Press, 117.
  6. Chomsky, Noam. 1967. “The Responsibility of Intellectuals.” The New York Review of Books, 8(3).
  7. Goldhagen, Daniel Jonah. 2009. Worse Than War: Genocide, Eliminationism, and the Ongoing Assault on Humanity. New York: PublicAffairs, 1, 6.
  8. Lemkin, Raphael. 1946. “Genocide.” American Scholar, 15(2), 227–230.
  9. United Nations General Assembly Resolution 96(1): “The Crime of Genocide.”
  10. Katz, Steven T. 1994. The Holocaust in Historical Perspective, Vol. 1. New York: Oxford University Press.
  11. Kugler, Tadeusz, Kyung Kook Kang, Jacek Kugler, Marina Arbetman-Rabinowitz, and John Thomas. 2013. “Demographic and Economic Consequences of Conflict.” International Studies Quarterly, March, 57(1), 1–12.
  12. Shermer, Michael. 2015. The Moral Arc: How Science and Reason Lead Humanity to Truth, Justice, and Freedom. New York: Henry Holt, 11.
  13. In 2002 I attended the reunion of the Wren crew in my father’s stead and confirmed his memories.
  14. Toland, John. 1970. The Rising Sun: The Decline and Fall of the Japanese Empire 1936–1945. New York: Random House, 731.
  15. “The Cornerstone of Peace — Number of Names Inscribed.” Kyushu-Okinawa Summit 2000: Okinawa G8 Summit Host Preparation Council, 2000. See also: Pike, John. 2010. “Battle of Okinawa.”; Manchester, William. 1987. “The Bloodiest Battle of All.” The New York Times, June 14.
  16. Giangreco, Dennis M. 2009. Hell to Pay: Operation Downfall and the Invasion of Japan 1945–1947. Annapolis, MD: Naval Institute Press, 121–124.
  17. Giangreco, Dennis M. 1998. “Transcript of ‘Operation Downfall [U.S. Invasion of Japan]: US Plans and Japanese Counter-Measures. Beyond Bushido: Recent Work in Japanese Military History. See also: Maddox, Robert James. 1995. “The Biggest Decision: Why We Had to Drop the Atomic Bomb.” American Heritage, 46(3).
  18. Skates, John Ray. 2000. The Invasion of Japan: Alternative to the Bomb. University of South Carolina Press, 79.
  19. Putnam, Frank W. 1998. “The Atomic Bomb Casualty Commission in Retrospect.” Proceedings of the National Academy of Sciences, May 12, 95(10), 5426–5431.
  20. D’Olier, Franklin (Ed.) 1946. United States Strategic Bombing Survey, Summary Report (Pacific War). Washington DC: United States Government Printing Office.
  21. Rhodes, 1986, op cit., 599.
  22. Ibid.
  23. Quoted in: Cold War: MAD 1960–1972. 1998. BBC Two Documentary. Transcript: Film:
  24. Brodie, Bernard. 1946. The Absolute Weapon: Atomic Power and World Order. New York: Harcourt Brace, 79.
  25. Kubrick, Stanley. 1964. Dr. Strangelove or: How I Learned to Stop Worrying and Love the Bomb. Columbia Pictures.
  26. Ibid.
  27. Brown, Anthony Cave (Ed.). 1978. Dropshot: The American Plan for World War III Against Russia in 1957. New York: Dial Press; Richelson, Jeffrey. 1986. “Population Targeting and US Strategic Doctrine.” In Desmond Ball and Jeffrey Richelson (Eds.). Strategic Nuclear Targeting. Ithaca, NY: Cornell University Press, 234–249.
  28. McNamara, Robert S. 1969. “Report Before the Senate Armed Services Committee on the Fiscal year 1969-73 Defense Program, and 1969 Defense Budget, January 22, 1969.” Washington, DC: Government Printing Office, 11.
  29. For a scholarly analysis of and an alternative view to deterrence see: Kugler, Jacek. 1984. “Terror Without Deterrence: Reassessing the Role of Nuclear Weapons.” Journal of Conflict Resolution, 28(3), September, 470–506.
  30. Kant, Immanuel. 1795. “Perpetual Peace: A Philosophical Sketch.” In Perpetual Peace and Other Essays. Indianapolis: Hackett, I, 6.
  31. Pinker, Steven. 2011. The Better Angels of Our Nature: Why Violence Has Declined. New York: Penguin.
  32. Rhodes, Richard. 2010. Twilight of the Bombs: Recent Challenges, New Dangers, and the Prospects of a World Without Nuclear Weapons. New York: Knopf.
  33. Schlosser, Eric. 2013. Command and Control: Nuclear Weapons, the Damascus Accident, and the Illusion of Safety. New York: Penguin.
  34. See the documentary film of that title. Trailer:
  35. Science Salon podcast episode # 107 with Fred Kaplan and Science Salon podcast episode # 127 with William J. Perry were based on their new books: Kaplan, Fred. 2020. The Bomb: Presidents, Generals, and the Secret History of Nuclear War. New York: Simon & Schuster; Perry, William J. and Tom Z. Collina. 2020. The Button: The New Nuclear Arms Race and Presidential Power from Truman to Trump. BenBella Books.
  36. For a striking visual demonstration of every one of the 2,053 nuclear weapon explosions between 1945 and 1998 by the Japanese artist Isao Hashimoto, starting with the Trinity test in New Mexico, where in the world they happened and whom they were sponsored by, see:
  37. 2018. U.S. Nuclear Modernization Programs report. Arms Control Association. August.
  38. The Nobel Prize committee’s statement on President Obama’s award. See also: Sonne, Paul, Gordon Lubold, and Carol E. Lee. 2016. “‘No First Use’ Nuclear Policy Proposal Assailed by U.S. Cabinet Officials, Allies.” Wall Street Journal, August 12.
  39. Waltz, Kenneth N. 2012. “Why Iran Should Get the Bomb: Nuclear Balancing Would Mean Stability.” Foreign Affairs, July/August.
  40. Barash, David. 2018. “Deterrence and its Discontents.” Skeptic, Vol. 23, No. 2.
  41. Quoted in Kaplan, op cit., 244.
  42. Fettweis, Christopher. 2010. Dangerous Times? The International Politics of Great Power Peace. Georgetown University Press.
  43. Lipton, Judith and David Barash. 2019. Strength Through Peace: How Demilitarization Led to Peace and Happiness in Costa Rica, and What the Rest of the World Can Learn From a Tiny, Tropical Nation. Oxford University Press.
  44. Shakespeare, William. 1594. The Rape of Lucrece. Available at:

Why People Believe Conspiracy Theories

Posted on Jul. 17, 2020 by | Comments (6)

What is a conspiracy, and how does it differ from a conspiracy theory? Michael Shermer explains who believes conspiracy theories and why they believe them in the following essay, derived from Lecture 1 of his 12-lecture Audible Original course titled “Conspiracies and Conspiracy Theories: What We Should Believe and Why.”

On Friday, March 15, 2019, a 28-year-old Australian man wielding five firearms stormed two mosques in Christchurch, New Zealand, and opened fire, killing 50 people and wounding dozens more. It was the worst mass public shooting in the history of that country, prompting Prime Minister Jacinda Ardern to reflect: “While the nation grapples with a form of grief and anger that we have not experienced before, we are seeking answers.”

One answer may be found in the shooter’s rambling 74-page manifesto titled The Great Replacement, apparently inspired by a book of the same title by the French author Renaud Camus. The Great Replacement is a right-wing conspiracy theory claiming that white Christian Europeans are being systematically replaced by people of non-European descent, most notably from North Africa, Sub-Saharan Africa, and the Arab Middle East, through immigration and higher birth rates.

The New Zealand killer’s name is Brenton Harrison Tarrant and his manifesto is filled with white supremacist tropes focused on this conspiracy theory, starting with his opening sentence “It’s the birthrates” repeated three times. “If there is one thing I want you to remember from these writings, it’s that the birthrates must change,” Tarrant insists. “Even if we were to deport all Non-Europeans from our lands tomorrow, the European people would still be spiraling into decay and eventual death.” Tarrant then cites the replacement fertility level of 2.06 births per woman, complaining that “not a single Western country, not a single white nation,” reaches this level. The result, he concludes, is “white genocide.”

This is classic 19th century blood-and-soil romanticism, and the self-described “Ethno-nationalist” Tarrant writes that he went on this murderous spree “to ensure the existence of our people and a future for white children, whilst preserving and exulting nature and the natural order.” His screed goes on and on like this, culminating in a photo collage of attractive white people and well-armed militia men.

It is reminiscent of the “Unite the Right” event in Charlottesville, Virginia, in August of 2017, when white supremacists shouted slogans like “blood and soil” and “Jews will not replace us.” Given that there are only about 15 million Jews in the world, that Judaism mounts no missionary effort at conversion, and that birthrates among Jewish families are among the lowest in the world, why would any group worry about being “replaced” by them? They’re not being replaced. The marchers were echoing the conspiracy theory that Jews control the media, politics, banking and finance, and even the world economy.

In his manifesto Tarrant references the number 14, or the fourteen-word slogan originally coined by the white supremacist David Lane while in federal prison for his role in the 1984 murder of the Jewish radio talk show host Alan Berg. Here are the 14 words:

“We must secure the existence of our people and a future for white children.”

The number is sometimes rendered as 14/88, with the 8s representing the eighth letter of the alphabet—H—and 88 or HH standing for Heil Hitler. Lane, in fact, was inspired by Adolf Hitler’s conspiracy-theory-laden book Mein Kampf, in which the Nazi leader rants:

What we must fight for is to safeguard the existence and reproduction of our race and our people, the sustenance of our children and the purity of our blood, the freedom and independence of the fatherland, so that our people may mature for the fulfillment of the mission allotted it by the creator of the universe.

Hitler goes on to identify the enemy of his mission—the Jews—which reflects another conspiracy theory called the “stab in the back,” popular in Germany in the 1920s and 1930s. According to this theory, the only reason the Germans lost World War I was that they were stabbed in the back by the “November Criminals” (the Armistice was signed on November 11, 1918), who the Nazis insisted were Jews, Marxists, and Bolsheviks.

And this “stab in the back” conspiracy theory itself derives from an earlier and larger conspiracy theory involving The Protocols of the Learned Elders of Zion, a hoaxed document purporting to be the proceedings of a secret meeting of Jews plotting global domination. A number of prominent people at the time believed the Protocols hoax, including the American industrialist Henry Ford, who published his own conspiratorial tract titled The International Jew: The World’s Foremost Problem. He later recanted and withdrew the book from circulation when he found out the conspiracy theory was a fake.

What all this shows is the power of conspiratorial belief to motivate people to act, including murderous action, from killing dozens in New Zealand to murdering millions in the Holocaust.

Conspiracy theories are as countless as they are confusing. I once met a politician who told me that he believes the fluoridation of water is the greatest scam ever perpetrated on the public. I have been confronted by 9/11 “truthers” who have insisted the al-Qaeda attack was actually an “inside job” by the Bush administration. Others have regaled me for hours with their breathless tales of who really killed JFK, RFK, MLK Jr., Jimmy Hoffa, or Princess Diana, along with the nefarious goings-on of the Federal Reserve, the New World Order, the Trilateral Commission, the Council on Foreign Relations, the Committee of 300, the Knights Templar, the Freemasons, the Illuminati, the Bilderberg Group, the Rothschilds, the Rockefellers, and the Zionist Occupation Government (ZOG) that secretly runs the United States. It would take Madison Square Garden to hold all the conspiracists plotting world domination.

What is a conspiracy, and how does it differ from a conspiracy theory?

I define a conspiracy as two or more people plotting or acting in secret to gain an advantage or to harm others immorally or illegally. I distinguish a conspiracy from a conspiracy theory, which I define as a structured belief about a conspiracy, whether it is real or not. A conspiracy theorist, or conspiracist, is someone who holds a conspiracy theory about a possible conspiracy, again whether or not it is real.

Although the terms “conspiracy theory,” “conspiracy theorist,” and “conspiracist” do sometimes carry pejorative connotations meant to disparage someone or their beliefs—as in “that’s just a crazy conspiracy theory” or “he’s one of those nutty conspiracists”—the terms in fact have a rich history not meant to disparage.

Who believes in such conspiracies? Surveys by the political scientists and conspiracy researchers Joseph Uscinski and Joseph Parent show that conspiracists “cut across gender, age, race, income, political affiliation, educational level, and occupational status.” For example, both liberals and conservatives believe in conspiracies at roughly the same level, although each thinks different secret cabals are at work, with liberals more likely to suspect that media sources and political parties are pawns of rich capitalists and corporations, while conservatives are more likely to believe that academics and liberal elites control these same institutions.

There are other factors at work as well. Race, for example, is not a predictor of overall conspiracism, but it does partially determine which conspiracy theories are likely to be embraced. African Americans, for example, are more likely to believe that the federal government invented AIDS to kill Blacks and that the CIA planted crack cocaine in inner city neighborhoods to ruin them. By contrast, white Americans are more likely to suspect the Feds are conspiring to abolish the Second Amendment and convert the nation into a socialist commune.

Education appears to attenuate conspiracy thinking: 42 percent of those without a high school diploma score high in conspiratorial predispositions, compared with 22 percent of those with postgraduate degrees. Nevertheless, the fact that one in five Americans with postgraduate degrees believes in conspiracies tells us something else is going on here.

In my 2011 book The Believing Brain I suggested that two cognitive processes are at work in conspiracy thinking: (1) patternicity, or the tendency to find meaningful patterns in both meaningful and meaningless noise, and (2) agenticity, or the tendency to infuse patterns with meaning, intention, and agency. I will explore these concepts in more depth in another lecture, but the idea is that the pattern-detection filters of conspiracists are wide open, thereby letting in any and all patterns as real with little or no screening of potential false patterns.

Conspiracy theorists connect the dots of random events into meaningful patterns, and then infuse those patterns with intentional agency, and believe that these intentional agents control the world, sometimes invisibly from the top down, instead of the bottom-up causal randomness that determines much of what happens in our world.

To these factors we can add three cognitive biases that often distort events and evidence to fit our preconceived conspiratorial conceptions. For example, the confirmation bias is the tendency to seek and find confirming evidence in support of already existing beliefs, and to ignore or reinterpret disconfirming evidence. Once you have decided that a conspiracy theory is true, your brain sets out to find evidence to support it and filter out evidence that doesn’t.

Another is the hindsight bias, in which we tailor after-the-fact explanations to what has already happened. Once an event has occurred, we look back and reconstruct how it happened, why it had to happen that way and not some other way, and why we should have seen it coming all along, the very essence of conspiracism.

Then there’s cognitive dissonance, the mental tension created when someone holds two conflicting thoughts simultaneously—as when conspiracy theories about the end of the world don’t come true. Instead of admitting their mistake, believers double down on their belief and rationalize away the failure, all in an attempt to reduce the dissonance.

Anxiety, alienation, and feelings of rejection are also factors in conspiratorial cognition. For example, in 2017 Princeton University researchers had subjects write a brief description of themselves that they then shared with two other people in their small group, telling them that they would be judged by the other group members. The subjects who were told that they were rejected were more inclined to believe in conspiracy-related scenarios.

And it’s not just private anxieties. Cultural anxiety may also lead to conspiracy thinking. A 2018 survey of over 3,000 Americans, for example, found that those who reported feeling that American values are eroding were more likely to agree with conspiratorial statements, such as “many major events have behind them the actions of a small group of influential people.”

Feeling in control or powerful in your environment reduces anxiety, but the opposite—concern about what may be out of your control—can increase anxiety and conspiratorial paranoia about things that could go wrong. In a 2015 study conducted in the Netherlands, for example, researchers divided subjects into three groups:

  1. those primed to feel powerless and out of control,
  2. those primed to feel in control and powerful, and
  3. a control group not primed for anything.

The subjects were then told about a construction project undergoing problems that could be related to a conspiracy by the city council to steal money from the project’s budget. Subjects primed to feel powerless and out of control were more likely to believe the conspiracy theory. Researchers have also found that conspiratorial speculation runs higher after natural disasters like earthquakes, or when people fear that they may lose their job.

There is another reason why people believe in conspiracy theories that researchers have largely neglected: a lot of them are true. Enough conspiracies are real that it pays to be constructively paranoid because sometimes “they” really are out to get us.

If we take the Oxford English Dictionary’s definition of a conspiracy theory as “a belief that some covert but influential agency (typically political in motivation and oppressive in intent) is responsible for an unexplained event,” then even a cursory review of history reveals that conspiracies have dramatically influenced the course of history and may still be found at work in modern societies. Consider some examples.

Julius Caesar was stabbed to death by a conspiracy of Roman senators on the Ides of March in 44 B.C.E.

The Gunpowder Plot of 1605 saw a group of provincial English Catholics attempt to assassinate King James I by blowing up the House of Lords during the State Opening of Parliament. The plot was discovered and thwarted days before, with the conspiracists caught, tried, convicted, hanged, drawn, and quartered.

In 1776 an elite group of soldiers was assigned to be George Washington’s bodyguards, some of whom were plotting to assassinate the future first president of the United States at the behest of the governor of New York and the mayor of New York City. The plot was foiled only because of the plotters’ inability to keep a secret.

Abraham Lincoln was assassinated by a conspiracy of Southerners angered by the outcome of the Civil War, which itself was instigated by a Southern cabal to illegally secede from the United States—arguably the biggest conspiracy in U.S. history.

World War I exploded after a Serbian separatist secret society called the Black Hand conspired to assassinate the Austrian archduke Franz Ferdinand, leading to an arms race that erupted in the guns of August and the start of a conflict that resulted in the deaths of millions.

The Japanese sneak attack on Pearl Harbor was, by definition, a conspiracy that the U.S. military and intelligence agencies failed to detect, leading to conspiracy theories that President Roosevelt let it happen on purpose to drag America into war.

The obsessively paranoid Joseph Stalin wasn’t conspiratorially-minded enough to realize that Hitler was plotting to break their nonaggression pact and invade the Soviet Union, despite warnings from the British government to that effect. The consequence was the deaths of tens of millions of soldiers and civilians.

In the 1950s, with his now-infamous Congressional hearings, the conspiratorially minded Senator Joseph McCarthy launched a witch hunt to ferret out what he claimed was a Communist conspiracy to destroy America.

In the 1960s, Operation Northwoods was a document produced under the Kennedy administration that proposed a number of “false flag” operations that might be carried out in order to justify military intervention in Cuba. Among the proposals were such ideas as staging a fake attack on the U.S. military base at Guantanamo Bay, employing a fake Russian MiG aircraft to buzz a real U.S. civilian airliner, faking an attack on a U.S. ship to make it look like Cubans did it, and developing “a Communist Cuban terror campaign in Miami.” None of these crazy ideas were implemented, but that members of Kennedy’s administration considered them—even in the context of a meeting with people just spitballing ideas willy nilly—reveals the lengths to which even high ranking people in the government are willing to conspire against others to get their way.

In the 1970s, Watergate stands out as a conspiracy of dunces, and the Pentagon Papers revealed the extent to which the Kennedy, Johnson, and Nixon administrations conspired to escalate the Vietnam War without Congressional knowledge, much less approval. And we now know that Kennedy conspired to have Fidel Castro assassinated, Johnson conspired to cover up that fact when he took office, and Nixon secretly recorded conversations in the Oval Office that revealed his distinctive view of presidential power—a view he later summarized in an interview with David Frost as follows: “Well, when the president does it, that means that it is not illegal.”

In the 1980s, the Iran-Contra arms-for-hostages scandal was a conspiracy that embodied what conspiracists since World War I had been concerned about—the usurpation of power by conspirators who were legally elected to their positions instead of hijacking government agencies through a coup, which was common in centuries past.

In the 1990s, government overreach against Randy Weaver and his family in Ruby Ridge, Idaho, and against David Koresh and the Branch Davidians in Waco, Texas, understandably led to the rise of the conspiratorially-minded militia movement that culminated in Timothy McVeigh’s bombing of a federal building in Oklahoma City.

In the 2000s, the George W. Bush administration concocted a conspiracy theory that Iraq was developing weapons of mass destruction as a justification for invading that country, which proved false when inspectors failed to find any WMDs. And Wikileaks revealed the extent to which the NSA and other governmental agencies conspired to spy on Americans and foreign leaders on the heels of 9/11. As Buffalo Springfield cautioned in their 1966 hit song For What It’s Worth, “There’s something happening here. What it is ain’t exactly clear.”

In the 2010s, as if the run-up to the 2016 presidential election wasn’t crazy enough, in the middle of it, and continuing after Trump’s victory, there emerged a bizarre conspiracy theory at Trump rallies where some of his supporters held signs reading “Q” and “QAnon.” It apparently began with an internet user called “Q Clearance Patriot,” or “Q,” who posted on internet message boards like 4chan and 8chan the conspiracy theory that inside the “deep state” there is an “anonymous” source working against the Trump administration. “I can hint and point but cannot give too many highly classified data points,” the Q conspiracist wrote, adding: “These are crumbs and you cannot imagine the full and complete picture.” That complete picture apparently includes such operatives as Hillary Clinton, Barack Obama, George Soros, and various Hollywood celebrities, all alleged to be involved in a global sex trade and pedophile ring.

The “Qincidences” (spelled with a Q) include the recurrence of certain numbers, such as 17 (Q is the 17th letter) and 4, 10, and 20, corresponding to DJT, or Donald J. Trump. And since “there are no coincidences” in the mind of the conspiracist, such numerology led to the absurd 2016 “Qonspiracy theory” (also spelled with a Q) of “Pizzagate.” Promulgators of this theory asserted—without any evidence and beyond belief—that Hillary Clinton was directing a pedophile ring out of a pizza parlor. As absurd as this sounds, the Pizzagate conspiracy theory led a young man to shoot up a restaurant with an AR-15-style rifle, claiming he intended to break up the perceived perversion. It was fortunate no one was hurt in the incident, but it revealed the power of conspiratorial paranoia.

As we approach the 2020s a new type of conspiracism has been identified by the political scientists Russell Muirhead and Nancy L. Rosenblum in their 2019 book, A Lot of People are Saying: The New Conspiracism and the Assault on Democracy. Classic conspiracy theories are grounded in arguments and evidence, whereas more recent conspiracy theories are simply asserted, usually without facts to support them. This new conspiracism is captured in the book’s title, ripped from the 2016 presidential election and Donald Trump’s recurrent phrase “a lot of people are saying,” which was typically followed by no evidence whatsoever for the assertion. As Muirhead and Rosenblum explain the new conspiracism:

There is no punctilious demand for proofs, no exhausting amassing of evidence, no dots revealed to form a pattern, no close examination of the operators plotting in the shadows. The new conspiracism dispenses with the burden of explanation. Instead, we have innuendo and verbal gesture: “A lot of people are saying …” Or we have bare assertion: “Rigged!” … This is conspiracy theory without the theory.

How then does such conspiracism spread and catch hold? Repetition. In the age of social media, what counts is not evidence so much as retweets, reposts, and likes. And by no means is the new conspiracism the product only of President Trump; politicians—not to mention economists, scholars, pundits, and ideologues of all stripes—have been making evidence-free assertions for generations, although admittedly without the audience of 60 million Twitter followers that the current conspiracist-in-chief commands.

More importantly, Trump’s conspiratorial assertions would go nowhere without a receptive audience, so the blame for the nefarious effects of the new conspiracism has to be spread much wider, to encompass all of social media, alternative media, and even some mainstream media, which have stepped up their sensationalistic headlines in an effort to recapture the advertising dollars they’ve been losing since the rise of the Internet.

Individuals act on their beliefs, and when those beliefs contain conspiracy theories about nefarious goings-on, those acts can turn deadly. That is exactly what happened at the Tree of Life synagogue in Pittsburgh on October 27, 2018, when an assailant armed with guns and one of the oldest conspiracy theories about the Jews running the world, murdered eleven congregants before his capture. “I just want to kill Jews,” he proclaimed.

Consuming content on the online social network Gab, the conspiracist grew paranoid about the Hebrew Immigrant Aid Society (HIAS), which the Tree of Life synagogue helped support. On Gab the conspiracist read that HIAS provided aid to the migrant caravans moving north from Central America toward the United States’ southern border. “HIAS likes to bring invaders in that kill our people,” the assassin posted on Gab just before he committed the mass murder, adding: “I can’t sit by and watch my people get slaughtered. Screw your optics, I’m going in.”

This brings us back to the mass murder in New Zealand with which we began this lecture. These are just two of countless conspiracy theories with real-world consequences, mostly bad. Ideas matter. Beliefs matter. Conspiracy theories matter. And they are not confined to the fringes of pop culture or the dark web, but instead penetrate all areas of public and private life, often directing the lives of people and the course of history.

So as we analyze examples like these in the lectures ahead, I hope you’ll reach the same conclusion that I’ve reached: The subject of this course—conspiracies and conspiracy theories—could well be one of the most important subjects any of us can study.



The Moral Arc: How Thinking Like a Scientist Makes the World More Moral

Posted on Jul. 03, 2020 by | Comments Off on The Moral Arc: How Thinking Like a Scientist Makes the World More Moral

In this, the final lecture of his Chapman University Skepticism 101 course, Dr. Michael Shermer pulls back to take a bigger picture look at what science and reason have done for humanity in the realm of moral progress. That is, applying the methods of science and principles of reason since the Scientific Revolution in the 17th century has solved not only problems in the physical and biological/medical fields, but in social and moral realms as well. How should we structure societies so that more people flourish in more places more of the time? Science can answer that question, and it has for centuries. Learning how to think like a scientist can make the world a better place, as Dr. Shermer explains in this lecture based on his 2015 book, The Moral Arc.

Shermer’s Chapman University course, Skepticism 101: How to Think Like a Scientist, covers a wide range of topics, from critical thinking, reasoning, rationality, free speech, cognitive biases and how thinking goes wrong, and the scientific methods, to actual claims and whether or not there is any truth to them, e.g., ESP, ETIs, UFOs, astrology, channelling, the Bermuda Triangle, psychics, evolution, creationism, Holocaust denial, and especially conspiracy theories and how to think about them.


Watch the entire 15-lecture Chapman University Skepticism 101 series for free!

Learn how to think like a scientist! Click the button below to browse through the entire course lecture series 1 through 15, and watch all lectures that interest you, for free!

Watch all 15 lectures for free


What is Truth, Anyway?

Posted on Jun. 26, 2020 by | Comments Off on What is Truth, Anyway?

In this lecture Dr. Michael Shermer addresses one of the deepest questions of all: what is truth? How do we know what is true, untrue, or uncertain? Given that none of us are omniscient, all claims to knowledge carry a certain level of uncertainty. Given that fact, how can we determine what is true? Included: subjective/internal vs. objective/external truths, Hume’s theory of causality, correlation and causation, the principle of proportionality (or why extraordinary claims require extraordinary evidence), how to think about miracles and the resurrection, mysterian mysteries, post-truth, rational irrationalities, the man who saved the world, Bayesian reasoning, and why love depends on evidence.

Skepticism 101: How to Think Like a Scientist covers a wide range of topics, from critical thinking, reasoning, rationality, cognitive biases and how thinking goes wrong, and the scientific methods, to actual claims and whether or not there is any truth to them, e.g., ESP, ETIs, UFOs, astrology, channelling, psychics, creationism, Holocaust denial, and especially conspiracy theories and how to think about them.

If you missed Dr. Shermer’s previous Skepticism 101 lectures watch them now.


The Truth About Post-Truth Truthiness

Posted on Jun. 25, 2020 by | Comments (3)

Is post-truth the political subordination of reality? Is truth itself any more under threat today than in the past? Have the populists & postmodernists won the day? In response to Dr. Lee McIntyre’s essay, Dr. Michael Shermer asserts that people are not nearly as gullible as some believe.

Words embody ideas, and their changing usage and meaning are tracked by lexicographers in dictionaries, which therein become barometers of cultural trends. In 2006, for example, the American Dialect Society and Merriam-Webster’s both chose as their word of the year the neologism “truthiness”, introduced by the comedian Stephen Colbert on the premiere episode of his satirical mock news show The Colbert Report (on which I appeared twice1), meaning “the truth we want to exist.”2 It was a prescient comedic bit as a decade later three examples of truthiness entered our lexicon.

After Donald Trump’s Presidential inauguration on January 20, 2017, his special counselor Kellyanne Conway concocted the term “alternative facts” during a Meet the Press interview while defending White House Press Secretary Sean Spicer’s inaccurate statement about the size of the crowd that day. “Our press secretary, Sean Spicer, gave alternative facts to that [the inaugural crowd size], but the point remains that….” at which time NBC correspondent Chuck Todd cut her off: “Wait a minute. Alternative facts? … Alternative facts are not facts. They’re falsehoods.”3 German linguists deemed it the “un-word of the year” (Unwort des Jahres) for 2017. Later that year the related term “fake news” became common parlance, its usage leaping 365 percent and landing it on the “word of the year shortlist” of Collins Dictionary, which defined it as “false, often sensational, information disseminated under the guise of news reporting.”4

Such words (or un-words) are often invoked as evidence that we are living in a “post-truth” era brought on by Donald Trump (according to liberals) or by postmodernism (according to conservatives). Are we living in a post-truth world of truthiness, fake news, and alternative facts? Have the populists and postmodernists won the day? Is all the political, economic, and social progress we have achieved over the past several centuries in reversal—the abolition of slavery and torture, the decline of homicide, crime, and violence, the cessation of the European Great Powers wars, and the expansion of the moral sphere to include civil rights, women’s rights, children’s rights, worker’s rights, and gay rights for more people in more places more of the time? Are we lurching backwards to the Middle Ages when bigots lighted faggots to torch women as witches?

Skeptic 24.3 (cover)

No. The Fall 2019 cover story of Skeptic by the Harvard psychologist Steven Pinker, “Why We Are Not Living in a Post-Truth Era,” explains why, starting with this question: Is the statement “We are living in a post-truth era”…true? If it is, then it isn’t! That is, if you argue that the statement is true then you are making an argument, which means you are committed to determining whether the statement is true or false, which means we have not passed into a post-truth world. Similarly, is the statement “humans are irrational” rational? If it is, then it can’t be because, as Pinker asks rhetorically, “If humans were truly irrational, who specified the benchmark of rationality against which humans don’t measure up?”5 As Pinker reflected in his 2018 book Enlightenment Now: The Case for Reason, Science, Humanism, and Progress, “Mendacity, truth-shading, conspiracy theories, extraordinary popular delusions, and the madness of crowds are as old as our species, but so is the conviction that some ideas are right and others are wrong.”6

In this issue of Skeptic the philosopher Lee McIntyre, author of the book Post-Truth,7 challenges Pinker, starting with a definition of post-truth as the “political subordination of reality,” which he describes as “a tactic in the authoritarian toolbox.” McIntyre’s definition is much narrower than the way Pinker and I use the term, confining it as he does to political propaganda, which he says “is not meant to convince you, but to show you who’s boss.” The message, he says, referencing Jason Stanley’s book How Propaganda Works8, is “I am so powerful that I can dominate your reality, and there is nothing you can do about it.” To reinforce the political nature of post-truth, McIntyre also invokes Timothy Snyder’s observation in his 2017 book On Tyranny that “post-truth is pre-fascism,”9 along with Hannah Arendt’s observation that “the ideal subject of totalitarian rule is not the convinced Nazi or the convinced communist, but people for whom the distinction between fact and fiction (i.e., the reality of experience) and the distinction between true and false (i.e., the standards of thought) no longer exist.”10

Post-truth as political propaganda is certainly one use (or misuse) of truth that neither Pinker nor I discount, but McIntyre then accuses Pinker (and others) of merely knocking down one or more of four post-truth straw men: (1) that truth doesn’t matter, (2) that no one really cares about truth anymore, (3) that no one can find the truth, and (4) that if we were actually living in a post-truth era, we should just give up. Instead, to steel-man the problem McIntyre asserts that “the claim that we live in a post-truth era is properly based on the idea that truth today is under threat.”

Is it? There certainly are people who, pace Hannah Arendt, cannot seem to distinguish between fact and fiction, true and false, and this shortcoming can lead not only to fascism or communism, but also to Holocaust denial, evolution denial, climate denial, vaccine denial, GMO denial, and more. But is it really that people cannot discern reality, or is it that they are motivated to spin the facts to support some other agenda? Holocaust deniers are anti-Semites. Evolution deniers are religious fundamentalists. Climate deniers mistrust big government. Vaccine deniers distrust big Pharma. GMO deniers detest Monsanto. It isn’t the facts that are under dispute, but the underlying motives. Consider an interview reprinted in McIntyre’s book, which he presents as a type specimen of post-truth, in which CNN’s Alisyn Camerota engages the former Republican Speaker of the House Newt Gingrich on crime rates. The exchange is revealing:11

Camerota: Violent crime is down. The economy is ticking up.

Gingrich: It is not down in the biggest cities.

Camerota: Violent crime, murder rate is down. It is down.

Gingrich: Then how come it’s up in Chicago and up in Baltimore and up in Washington?

Camerota: There are pockets where certainly we are not tackling murder.

Gingrich: Your national capital, your third biggest city…

Camerota: But violent crime across the country is down.

Gingrich: The average American, I will bet you this morning, does not think crime is down, does not think they are safer.

Camerota: But it is. We are safer and it is down.

Gingrich: No, that’s just your view.

Camerota: It is a fact. These are the national FBI facts.

Gingrich: But what I said is also a fact. … The current view is that liberals have a whole set of statistics that theoretically may be right, but it’s not where human beings are. People are frightened.

Camerota: But what you’re saying is, but hold on Mr. Speaker because you’re saying liberals use these numbers, they use this sort of magic math. These are the FBI statistics. They’re not a liberal organization. They’re a crime-fighting organization.

Gingrich: No, but what I said is equally true. People feel more threatened.

Camerota: Feel it, yes. They feel it, but the facts don’t support it.

Gingrich: As a political candidate, I’ll go with how people feel and let you go with the theoreticians.

As I read it, this isn’t an example of the post-truth equivalent of, as McIntyre describes it, a “chilling exchange in the basement of the Ministry of Love in the pages of George Orwell’s dystopian novel 1984.” Camerota and Gingrich are simply talking about two different matters: crime rates and people’s perceptions about crime rates. The difference represents a cognitive illusion due to the availability bias, in which one assesses a problem based on the most immediate and salient available example, usually from the evening news that features individual crimes, especially homicides. Camerota is a journalist focusing on the long-term decline of crime, whereas Gingrich is a politician trying to garner support by appealing to people’s fears about crime, citing the equally true statistics about recent upticks in crime in a handful of U.S. cities, most notably Chicago, Baltimore, and Washington D.C., which Camerota acknowledges. Both facts are true, so this is not an example of recent post-truthiness but of good old-fashioned spin-doctoring, which has been around at least since the 1940s, when George Orwell noted: “Political language—and with variations this is true of all political parties, from Conservatives to Anarchists—is designed to make lies sound truthful and murder respectable, and to give an appearance of solidity to pure wind.”12 The problem can be traced back even further, as when Edmund Burke commented on the language surrounding the French Revolution:

The whole compass of the language is tried to find sinonimies [synonyms] and circumlocutions for massacres and murder. Things are never called by their common names. Massacre is sometimes called agitation, sometimes effervescence, sometimes excess; sometimes too continued an exercise of a revolutionary power.13

McIntyre says that the purpose of post-truth “is political, not epistemological,” but is not the former subsumed in the latter? It’s all epistemological, inasmuch as everything turns on what constitutes reliable knowledge, and that is a very old problem indeed.

Even the concept of post-truth is not new. The Oxford Dictionaries has tracked its use back to 1992, the year we founded Skeptic magazine, the early years of which were devoted to the “science wars,” which were fought over the nature of truth and whether or not science was the royal road to it. Many thought not, coming to believe that there is no objective reality to be discovered and no belief, idea, hypothesis, or theory that is closer to the truth than any other. In his 1996 Skeptic article “More Higher Superstitions,” Norman Levitt (coauthor of the book Higher Superstition14) describes the problem in language that could have been written in 2019:

Science studies…overlaps what is nowadays called cultural studies, a tendency that has effaced traditional scholarship in a number of areas, and it has absorbed many of the radically relativistic attitudes that predominate in postmodern cultural anthropology. The central doctrine of science studies is that science is “socially constructed” in a way that disallows traditional notions of scientific validity and objectivity. On this view, scientific theories are merely narratives peculiar to this culture and this point in its history. Their chief function is to create stories about the world consonant with dominant social and political values. Thus, they are no more “true,” or even more reliable, than the myths, legends, and just-so stories of other cultures. All are equally culture- specific.15

Post-truth claims were just as prominent in the 1990s as they are now, and no less criticized, even parodied. Recall that this was the decade of the wildly popular television series The X-Files, a conspiracy-laden mosh pit of aliens and UFOs, monsters and demons, mutants and shape-shifters, urban legends and government cover-ups, and all manner of paranormal piffle. So trendy was the show that The Simpsons caricatured it with an episode titled “The Springfield Files,” in which Homer has a close encounter of the third kind after downing ten bottles of beer. X-Files stars Gillian Anderson and David Duchovny (Scully and Mulder) guest star as investigators of the alien abduction, and Leonard Nimoy, host of the 1970’s more-or-less nonfiction version of The X-Files called In Search of…, voiced the introduction, announcing: “The following tale of alien encounters is true. And by true, I mean false. It’s all lies. But they’re entertaining lies, and in the end isn’t that the real truth? The answer is no.”

So post-truthiness is not new, but the availability bias dialed up to eleven through social media led the Oxford Dictionaries to name “post-truth” as its word of the year in 2016 after it documented a 2000 percent spike in usage over the previous year, characterizing it as “relating to or denoting circumstances in which objective facts are less influential in shaping public opinion than appeals to emotion and personal belief.” As the dictionaries’ editor Casper Grathwohl noted: “We first saw the frequency really spike this year in June with buzz over the Brexit vote and again in July when Donald Trump secured the Republican presidential nomination. Given that usage of the term hasn’t shown any signs of slowing down, I wouldn’t be surprised if post-truth becomes one of the defining words of our time.”16 Even the veteran CBS anchorman and 60 Minutes correspondent Scott Pelley succumbed to the temptation to think we’re living in a post-truth era. On the final day of 2019, reflecting on the message of his new book Truth Worth Telling, he summed up what has happened to truth in the decade of the 2010s:

This is the thing that worries me the most about our beloved country. We have gone from the information age to the disinformation age. I think our viewers and our readers now have a responsibility that they’ve never had before, and that is that they have to be careful about how they choose their information diet. This is going to be a problem for the rest of our history, in particular for democracies.17

Not only is post-truthiness not new, but the response to challenges to objective knowledge is as robust today as in the past, if not more so, having moved far beyond the pages of niche magazines like Skeptic and Scientific American, which defend science, reason, empiricism, and fact checking; the defense of truth is now routinely mounted in national news magazines and newspapers. Despite President Trump’s constant reference to the “failing New York Times” in his Twitter feed,18 for example, the circulation of the Grey Lady has skyrocketed since Trump was elected. In the 4th quarter of 2018 alone, the New York Times added 265,000 digital subscriptions, turned a profit of $55.2 million, and saw its newsroom staff grow to 1,600 people, the largest number in its 167-year history.19

Today, as dictionaries track the upswing in post-truth language, and as political pundits pronounce the end of truth and with it the Republic (if you can keep it), the Internet of ideas has responded with tools to combat the illiberalism of unreason: real-time fact-checking. As politicians engaged in the old-time art of spin-doctoring the truth in speeches, fact-checkers at sites such as PolitiFact tallied their errors and lies, cheekily ranking statements as True, Mostly True, Half True, Mostly False, and Pants on Fire. As PolitiFact’s editor Angie Holan explained: “journalists regularly tell me their media organizations have started highlighting fact-checking in their reporting because so many people click on fact-checking stories after a debate or high-profile news event.”20

Finally, the idea that post-truthiness could invade the brains of gullible citizens is gainsaid by new research by cognitive scientists that demonstrates that people are not nearly as gullible as we’ve been led to believe. A 2020 book by the cognitive scientist Hugo Mercier, Not Born Yesterday, presents a mountain of evidence “against the idea that humans are gullible, that they are ‘wired not to seek truth’ and ‘overly deferential to authority’, and that they ‘cower before uniform opinion’,” quoting Jason Brennan in his book Against Democracy. In fact, Mercier reveals through both laboratory research and real-world examples that “far from being gullible, we are endowed with a suite of cognitive mechanisms that evaluate what we hear or read.” And far from defaulting to believing everything we hear, Mercier notes that “by default we veer on the side of being resistant to new ideas. In the absence of the right cues, we reject messages that don’t fit with our preconceived views or preexisting plans. To persuade us otherwise takes long-established, carefully maintained trust, clearly demonstrated expertise, and sound arguments.”21

Mercier begins by showing why evolution could not have created animals so gullible as to be routinely exploited by others, as that would ultimately lead to reproductive failure and the extinction of extreme gullibility. The balance in communication between belief and skepticism led to an evolutionary arms race between deception and deception detection, along with cognitive mechanisms “that help us decide how much weight to put on what we hear or read.” Mass persuasion, for example, is extremely difficult to pull off, and most attempts at it fail miserably, because when scaled up from two-person communication to large audiences, trust cues do not scale up accordingly. Most preachers, prophets, and demagogues fail, but because of the availability bias we only remember the biggest names in the genre, such as Jesus and Hitler. But even these examples fail upon further inspection. In his own time Jesus was a disappointment at starting a new religion (which might not have been his mission in any case), and even the apostle Paul barely got Christianity rolling. It wasn’t until the 4th century that Christianity began to number in the millions, which sounds impressive until we consider the power of compound interest, in which small but steady growth can yield an enormous figure given enough time. Invest $1 at a constant yearly interest rate of 1 percent in the year 0 and reinvest the dividends, and by the year 2020 the investment would be worth over half a billion dollars. Mercier cites statistics compiled by the sociologist of religion Rodney Stark, who estimates Christianity’s growth rate at 3.5 percent per year over the centuries. If each Christian saves only a few souls in a lifetime, the religion could easily number in the tens of millions within a few centuries, and over two billion by today. Perhaps this is why the dismal conversion rate of, for example, Mormon missionaries on their two-year missions is not a concern for the church: even a small success rate compounds if the process goes on long enough (coupled with high birth rates, of course, which most religions encourage).
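The compound-growth arithmetic is easy to check. A minimal Python sketch (the 1,000-believer starting population around 40 CE is an illustrative assumption for the example, not a figure from Stark or Mercier):

```python
def compound(principal, rate, years):
    """Value of `principal` growing at `rate` per year, compounded annually."""
    return principal * (1 + rate) ** years

# $1 invested in year 0 at 1% per year, dividends reinvested, to the year 2020:
dollar = compound(1, 0.01, 2020)        # roughly 5.4e8: over half a billion

# Stark's ~3.5% yearly growth, from an assumed 1,000 believers around 40 CE,
# compounded over the 260 years to roughly 300 CE:
christians = compound(1_000, 0.035, 260)   # several million by the 4th century
```

The point of the exercise is that exponential growth needs no mass persuasion at all; a modest, steady conversion rate sustained for centuries is sufficient.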

As for Hitler, Mercier presents compelling evidence revealing that most Germans did not accept Nazi ideology, nor most of the planks in the regime platform, and even the anti-Semitism so famously on display in Hitler’s writings and speeches was only effective on Germans who were already anti-Semitic. The euthanasia of the handicapped in the 1930s was resisted by most Germans and got so much bad press that the Nazis made the program secret and issued orders to never speak of it, a policy carried through the Final Solution and the Holocaust, which was shrouded in secrecy and mostly carried out in Poland, far from the prying eyes of German citizens. Hitler’s anti-communism appealed to right-leaning Germans but was rejected among industrial workers. By 1942, most citizens did not believe the declarations of victory issued by the propaganda minister Joseph Goebbels, instead relying on secreted BBC reports of how the war was really going for Germany (not well). As the Nazi intelligence agency Sicherheitsdienst (SD) reported: “Our propaganda encounters rejection everywhere among the population because it is regarded as wrong and lying.”22

To the commonly asked question “How could so many highly educated, intelligent, and cultured Germans become Nazis?” the answer is: “Most didn’t.” The entire regime—not unlike the Soviet Union and North Korea—was held aloft on pluralistic ignorance, in which individual members of a group don’t believe something but believe that most others in the group believe it. When no one speaks up—or is prevented from speaking up through state-sponsored censorship or imprisonment—it produces a “spiral of silence” that can transmogrify into witch hunts, purges, pogroms, and repressive political regimes. This is why totalitarian and theocratic regimes restrict speech, press, trade, and travel, and why the route to breaking the bonds of such repressive governments and ideologies is free speech, free press, free trade, and accurate and trustworthy information.
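The mechanics of pluralistic ignorance can be made concrete with a toy simulation (all the parameters here are illustrative assumptions, not measured values): most agents privately reject the ideology, but dissenters rarely dare to speak, so the public record of speech wildly overstates support.

```python
import random

def spiral_of_silence(n_agents=1000, private_support=0.2, fear=0.9, seed=42):
    """Toy model: each agent privately supports the regime with probability
    `private_support`; dissenters stay silent with probability `fear`.
    Returns (true support, support as inferred from public speech alone)."""
    rng = random.Random(seed)
    supports = [rng.random() < private_support for _ in range(n_agents)]
    spoken = []
    for s in supports:
        if s:
            spoken.append(True)            # supporters speak freely
        elif rng.random() > fear:
            spoken.append(False)           # a rare dissenter speaks up
    true_support = sum(supports) / n_agents
    perceived = sum(spoken) / len(spoken)  # what a bystander overhears
    return true_support, perceived
```

With these assumed numbers, roughly 20 percent of agents privately support the regime, yet a bystander who judges only from who speaks would infer a large majority, which is exactly the wedge that free speech and a free press remove.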

In a wide-ranging conversation for my Science Salon podcast, I asked Mercier directly, “are we living in a post-truth era?” His answer was clear:

In many ways it’s better than it’s ever been, in that people are more informed than they used to be, and because of that they tend to be more consistent in their points of view. Fake news, for example, is a very marginalized phenomenon. Only a few percent of Twitter or Facebook users actually saw or spread fake news, and it doesn’t appear to affect those who see it. But everyone has heard of fake news, so on the whole I think the information environment is improving, slowly and perhaps not as much as we would like it to be, but I think things are better than they used to be. People still want accurate opinions and they care about the truth. Even people who support Trump—studies show when you show them that something about Trump is fake news they accept that, even while maintaining their support for Trump.

In one of my final Scientific American columns I coined my own neologism in the Colbert tradition: Factiness, or the quality of something seeming to be factual when it is not.23 But how do we know when something is factual and not factiness? We employ science and reason! There is progress in science and culture, and some ideas really are better than others. The post-Enlightenment ideal that beliefs should be tested in the laboratory and marketplace of ideas with the goal of generating objective and disinterested knowledge may seem Sisyphean, in that we are always in danger of backsliding into truthiness and factiness in which propaganda, superstition, and self-serving sophistry can slow our progress in pushing the boulder of knowledge up the mountain of ignorance, but that is precisely what we’ve been doing for millennia.

Per aspera ad astra—with difficulty to the stars. END

About the Author

Dr. Michael Shermer is the Publisher of Skeptic magazine, the host of the Science Salon Podcast, a regular contributor to, and Presidential Fellow at Chapman University. His latest book is Giving the Devil His Due: Reflections of a Scientific Humanist. As a public intellectual he regularly contributes Opinion Editorials, book reviews, and essays to The Wall Street Journal, The Los Angeles Times, Science, Nature, and other publications. He appeared on such shows as The Colbert Report, 20/20, Dateline, Charlie Rose, and Larry King Live (but, proudly, never Jerry Springer!). He was a monthly columnist for Scientific American. His two TED talks, seen by millions, were voted in the top 100 of the more than 1000 TED talks. He holds a Ph.D. from Claremont Graduate University in the history of science.

  1. On August 21, 2007, and on July 11, 2011.
  2. Colbert, Stephen. 2005. “The Word—Truthiness.” The Colbert Report,
  3. Interview with Kellyanne Conway. January 22, 2017. NBC Meet the Press.
  4. Definition of “fake news.” Collins Dictionary.
  5. Pinker, Steven. 2019. “Why We Are Not Living in a Post-Truth Era.” Skeptic, Vol. 24, No. 3.
  6. Pinker, Steven. 2018. Enlightenment Now: The Case for Reason, Science, Humanism, and Progress. New York: Viking, 375.
  7. McIntyre, Lee. 2018. Post-Truth. Cambridge: MIT Press.
  8. Stanley, Jason. 2015. How Propaganda Works. Princeton, NJ: Princeton University Press.
  9. Snyder, Timothy. 2017. On Tyranny: Twenty Lessons from the Twentieth Century. New York: Tim Duggan Books/Penguin Random House, 71.
  10. Arendt, Hannah. 1951/1994. The Origins of Totalitarianism. New York: Harcourt, 474.
  11. Quoted in: McIntyre, ibid., 3–4.
  12. Orwell, George. 1946. “Politics and the English Language.” Horizon, April.
  13. Burke, Edmund. 1790/1967. Reflections on the Revolution in France. London: J.M. Dent & Sons.
  14. Gross, Paul and Norman Levitt. 1996. Higher Superstition: The Academic Left and its Quarrels with Science. Baltimore: The Johns Hopkins University Press.
  15. Levitt, Norman. 1996. “More Higher Superstitions: Knowledge, Knowingness, and Reality.” Skeptic, Vol. 4, No. 4, 79.
  16. Editors. 2016. “‘Post-truth’ declared word of the year by Oxford Dictionaries.” BBC News. November 16.
  17. Pelley, Scott. 2019. Interview. Reliable Sources. CNN. December 31.
  18. For one example among hundreds see his tweet of October 3, 2018 at 4:53 a.m.: “The Failing New York Times did something I have never seen done before…”
  19. Associated Press. 2019. “New York Times subscriber numbers are skyrocketing in the Trump age.” MarketWatch. February 6.
  20. Quoted in: Glaisyer, Tom. 2016. “Cranking up the Truth-O-Meter: Giving a Boost to Truth in Politics.” Democracy Fund. January 13.
  21. Mercier, Hugo. 2020. Not Born Yesterday: The Science of Who We Trust and What We Believe. Princeton, NJ: Princeton University Press, 257, 270–271. Quotes from Brennan are in: Brennan, Jason. 2019. Against Democracy. Princeton, NJ: Princeton University Press, 8.
  22. Kershaw, Ian. 1983. “How Effective was Nazi Propaganda?” In D. Welch (Ed.), Nazi Propaganda: The Power and the Limitations (180–205). London: Croom Helm, 199.
  23. Shermer, Michael. 2018. “Factiness: Are we living in a post-truth world?” Scientific American, March.

Is Freedom of Speech Harmful for College Students?

Posted on Jun. 19, 2020 by | Comments Off on Is Freedom of Speech Harmful for College Students?

In this lecture, Dr. Michael Shermer addresses the growing crisis of free speech on college campuses and in the culture at large. The crisis was brought home to him by the title lecture, which he was tasked to deliver to students at California State University, Fullerton, after a campus paroxysm erupted over “Taco Tuesday,” in which students accused other students of “cultural appropriation” for eating Mexican food as non-Mexicans, a charge that is absurd on its face to anyone who has been to Southern California, where Mexican cuisine is among the most popular dining options. From there Shermer reviews the history of free speech, the difference between government censorship and private censorship, the causes of the current crisis, and what we can do about it.

Skepticism 101: How to Think Like a Scientist covers a wide range of topics, from critical thinking, reasoning, rationality, cognitive biases and how thinking goes wrong, and the scientific methods, to actual claims and whether or not there is any truth to them, e.g., ESP, ETIs, UFOs, astrology, channelling, psychics, creationism, Holocaust denial, and especially conspiracy theories and how to think about them.

If you missed Dr. Shermer’s previous Skepticism 101 lectures watch them now.


What are Science & Skepticism?

Posted on Jun. 12, 2020 by | Comments Off on What are Science & Skepticism?

This lecture, traditionally the first in the series for the Skepticism 101 course, is based on the first couple of chapters from Dr. Michael Shermer’s first book, Why People Believe Weird Things, presenting a description of skepticism and science and how they work, along with a discussion of the difference between science and pseudoscience, and some very practical applications of how to test claims and evaluate evidence. The image for this lecture is the original oil painting for the first cover of Why People Believe Weird Things, commissioned by the publisher and painted by the artist Lawrence Berzon.

Skepticism 101: How to Think Like a Scientist covers a wide range of topics, from critical thinking, reasoning, rationality, cognitive biases and how thinking goes wrong, and the scientific methods, to actual claims and whether or not there is any truth to them, e.g., ESP, ETIs, UFOs, astrology, channelling, psychics, creationism, Holocaust denial, and especially conspiracy theories and how to think about them.

The audio is out of sync with the video in “What is a Skeptic?” Here’s the link to view it. If you missed Dr. Shermer’s previous Skepticism 101 lectures watch them now.


Evolution & Creationism, Part 2: Who says evolution never happened, why do they say it, and what do they claim?

Posted on Jun. 05, 2020 by | Comments Off on Evolution & Creationism, Part 2: Who says evolution never happened, why do they say it, and what do they claim?

Dr. Michael Shermer continues the discussion of evolution and creationism, focusing on the history of the creationism movement and the four stages it has gone through: (1) Banning the teaching of evolution, (2) Demanding equal time for Genesis and Darwin, (3) Demanding equal time for creation-science and evolution-science, and (4) Intelligent Design theory. Shermer provides the legal, cultural, and political context for how and why creationism evolved over the 150 years since Darwin published On the Origin of Species in 1859, thereby providing a naturalistic account of life, ultimately displacing the creationist supernatural account. Finally, Shermer reviews the best arguments made by creationists and why they’re wrong.



Wicked Games
Lance Armstrong, Forgiveness and Redemption, and a Game Theory of Doping

Posted on May. 31, 2020 by | Comments (7)

Part 2 of the documentary film “Lance” airs tonight on ESPN and served as a catalyst for this article, which employs game theory to understand why athletes dope even when they don’t want to, along with thoughts on forgiveness and redemption. The article is a follow-up to and extension of Dr. Shermer’s article in the April 2008 issue of Scientific American.

All images within are screenshots from Marina Zenovich’s 3-hour-and-22-minute film and are courtesy of ESPN, which kindly provided a press screener. Zenovich also produced Robin Williams: Come Inside My Mind (2018), Water & Power: A California Heist (2017), Richard Pryor: Omit the Logic (2013), and Roman Polanski: Odd Man Out (2012).

Toward the end of Marina Zenovich’s riveting documentary film on Lance Armstrong, titled simply Lance and broadcast on ESPN May 24 and May 31, 2020, the former 7-time Tour de France champion grouses about the apparent ethical hypocrisy by which TdF champions like the German Jan Ullrich (1997), the Italian Marco Pantani (1998), and himself (1999–2005) were utterly disgraced and had their lives ruined because of their doping, whereas cyclists such as the German Erik Zabel, the Italian Ivan Basso, and the American George Hincapie are “idolized, glorified, given jobs, invited to races, put on TV” even though they’re “no different from us” inasmuch as they doped as well. Of Pantani, whom Armstrong famously battled up many a mountainous climb, Lance scowls that “they disgrace Marco Pantani, they destroy him in the press, they kick him out of the sport, and he’s dead. He’s fucking dead!” Ditto Ullrich. “They disgrace, they destroy, and they fucking ruin Jan Ullrich’s life. Why? … That’s why I went. Because that’s fucking bullshit.”

Lance Armstrong

This invective, in fact, comes on the heels of the most touching moment of the nearly 3.5-hour film, in which a lachrymose Armstrong loses his tough-guy composure when asked why he spontaneously flew to Europe to support his former rival in his time of need. Ullrich’s life was unraveling after a series of incidents involving drugs, alcohol, and violence, and Armstrong’s emotional fracture in recalling it is so out of character from his public image that it may give even his most cynical critics pause. His answer? “I love him.”

Lance Armstrong

That a quintessentially straight jock from Texas would admit on camera that he loves another man surely humanizes someone who has otherwise for decades been a picture of leather-neck toughness, a “badass motherfucker” as his former teammate Floyd Landis describes him. If he were a 1950s test pilot he’d be a steely-eyed missileman staring down the sound barrier. If he were a boxer he’d be Jake LaMotta mercilessly pounding opponents into the canvas. If he were a basketball player he’d be Michael Jordan, entering each sporting contest like it was a matter of life and death, which it was for Lance. “I like to win,” he told filmmaker Alex Gibney, “but more than anything, I can’t stand this idea of losing. Because to me, losing means death.”

That a film like this is so widely viewed, coming on the heels of the most-watched show in ESPN history (on Michael Jordan’s final season with the Chicago Bulls), is one answer to Lance’s puzzlement about the asymmetrical treatment he has received. It was over two decades ago (1999) that Lance won his first Tour de France and was invited to Bill Clinton’s White House. To put this into further perspective, Armstrong won his 7th and last Tour de France two years before Steve Jobs introduced the iPhone in 2007. That seems a lifetime ago, and yet we’re still talking about Lance Armstrong. Why?

Lance Armstrong and David Letterman

To answer the question I think we must distinguish Lance’s doping, which was, in fact, a logical outcome of a corrupt system that forced most top cyclists at the time to choose between cheating and quitting, from his mendacity and intimidation in enforcing Omerta, which included threats, lawsuits, libelous public statements, and alleged backroom deals that harmed anyone who threatened to break the silence. As well, to invoke the title of Armstrong’s bestselling memoir, it was never about the bike and always about Lance. As the idiom suggests, those who reach great heights have further to fall. In short, the continued interest in the rise and fall of Lance Armstrong has more to do with the human condition and what it reveals about our species and the wicked games we play.

The Dope on Doping

Doping has long been a part of cycling. From the 1940s through the 1980s stimulants and painkillers were ubiquitous. As the 5-time Tour winner Jacques Anquetil snorted, “You can’t ride the Tour de France on mineral water.” And when challenged to elaborate, he quipped: “Everyone in cycling dopes himself. Those who claim they don’t are liars.” With that as the norm, doping regulations were virtually nonexistent until the British champion Tom Simpson keeled over dead on the climb up the legendary Mont Ventoux in the 1967 Tour. An autopsy revealed a pharmacopoeia of drugs in his body and a vial of amphetamines in his jersey pocket. But even after that tragedy and the implementation of incipient testing, the dopers were always ahead of the doping controls. When I was competing in the 3,000-mile nonstop transcontinental bicycle Race Across America in the 1980s, blood doping (popular, and legal until after the 1984 Olympics) was a quantum leap over earlier techniques, but I begged off because injecting a bag of your own or someone else’s blood to boost the amount of oxygen-carrying red blood cells in your system seemed medically risky. Lance’s teammate Tyler Hamilton, in his 2012 book The Secret Race, recounts horror stories about injecting bags of spoiled blood and the illness that followed.

This risk was averted in the early 1990s, before Lance entered the sport professionally, with the introduction of genetically engineered recombinant erythropoietin — r-EPO. Natural EPO is a hormone released by the kidneys into the bloodstream, which carries it to receptors in bone marrow, stimulating it to pump out more red blood cells. Chronic kidney disease and chemotherapy can cause anemia, and so the development of the EPO substitute r-EPO in the late 1980s was a savior for chronically anemic patients … and oxygen-hungry endurance athletes. Taking r-EPO is just as effective as getting a blood transfusion, but instead of messing around with bags of blood hanging from hotel room picture hooks and poking long needles into uncooperative veins, cyclists could now store tiny ampoules of r-EPO on ice in a thermos bottle or hotel minifridge, then simply inject the hormone under the skin, boosting the rider’s hematocrit (HCT), or the percentage of red blood cells in the total volume of blood. The normal range of HCT is in the mid-40s. Endurance training can boost it naturally into the high 40s or low 50s. Multi-week stage races like the Tour de France cause HCT to steadily decrease. EPO can push those levels into the high 50s and even the 60s and keep them there. Bjarne Riis, the winner of the 1996 Tour de France, was nicknamed “Mr. 60 Percent,” and in 2007 he confessed that EPO was behind his moniker. After a test was developed in 2000 that could detect EPO, dopers shifted to micro-dosing it intravenously and/or returning to blood doping under tight supervision.


How big a difference does EPO make? In Zenovich’s film Armstrong’s U.S. Postal teammate Jonathan Vaughters makes a back-of-the-envelope calculation that the drug enhances performance by about 10 percent. In a 100-hour race in which the first and last place riders are separated by two hours, or two percent, this is a game changer. The infamous sports physiologist and convicted doping doctor Michele Ferrari, who for years worked exclusively with Armstrong and the U.S. Postal team, quantified the effect for me more specifically when I interviewed him for a 2008 article in Scientific American on doping in sports: “If the volume of [red blood cells] increases by 10 percent, performance improves by approximately 5 percent. This means a gain of about 1.5 seconds per kilometer for a cyclist pedaling at 50 kilometers per hour in a time trial, or about eight seconds per kilometer for a cyclist climbing at 10 kph on a 10 percent ascent.” Thus, a cyclist who boosts his hematocrit by 10 percent can lop off 75 seconds in a 50-kilometer time trial, which is typically decided by a few seconds, or 80 seconds per climb on any of the numerous 10-kilometer 10 percent mountain passes the riders negotiate in the Pyrenees and Alps, often decided by a few tens of seconds. This advantage is not one that athletes can afford to give away to their competitors.
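Ferrari’s per-kilometer figures can be turned into the same quick arithmetic (a back-of-the-envelope sketch, using only the numbers quoted in the paragraph above):

```python
# Back-of-the-envelope check of Michele Ferrari's figures quoted above:
# a ~10% boost in red blood cells means roughly 1.5 s/km saved in a flat
# 50 km/h time trial, and about 8 s/km saved on a 10% climb at 10 km/h.

TT_SECONDS_SAVED_PER_KM = 1.5   # time trial at 50 km/h
CLIMB_SECONDS_SAVED_PER_KM = 8  # 10% gradient climb at 10 km/h

def seconds_saved(distance_km: float, rate_s_per_km: float) -> float:
    """Total time saved over a segment of the given length."""
    return distance_km * rate_s_per_km

tt_gain = seconds_saved(50, TT_SECONDS_SAVED_PER_KM)        # 50 km time trial
climb_gain = seconds_saved(10, CLIMB_SECONDS_SAVED_PER_KM)  # 10 km mountain pass

print(f"50 km time trial: {tt_gain:.0f} s saved")  # 75 s
print(f"10 km climb: {climb_gain:.0f} s saved")    # 80 s
```

Against time trials decided by a few seconds and climbs decided by tens of seconds, those margins are decisive.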

EPO forced cyclists into choosing between doping and quitting the sport, as the three-time Tour winner Greg LeMond discovered in 1991. After logging victories in 1986, 1989 and 1990, LeMond set his sights on equaling or bettering the record of five Tours de France achieved by only three cyclists before him — Jacques Anquetil, Bernard Hinault, and Eddy Merckx. “I was the fittest I had ever been, my split times in spring training rides were the fastest of my career, and I had assembled a great team around me,” LeMond told me. “But something was different in the 1991 Tour. There were riders from previous years who couldn’t stay on my wheel who were now dropping me on even modest climbs.” The following year was worse, as LeMond refused to dope and would not allow his teammates to either. The result: “our team’s performance was abysmal” and “I couldn’t even finish the race.”

Greg LeMond

Greg’s hunch is backed by data. Average speeds of the winners of the Tour de France spiked upward beginning in 1991. To control for yearly variance caused by course changes and weather, I averaged the speeds over 14-year periods going backward and forward in time from 1991, then compared those to the peak Armstrong era and after. The averages are plotted on the graph below. In the period 1991–2004 the winners’ average speed jumped 9 percent over the corresponding speed in the period 1977–1990, an increase that cannot be accounted for by improvements in equipment, nutrition, or training. Lance’s final victory in 2005 is the fastest Tour ever recorded, at 25.9 mph. The extensive disqualification of dopers in 2007 brought the average speed down to 24.2 mph. It has hovered around there ever since, bouncing between 24.5 mph and 25.1 mph through 2019, with an average speed between 2008 and 2019 of 24.9 mph. The spike in 2017 may be a statistical anomaly or the product of varying race conditions, but it is interesting to note that the winner, Chris Froome, later that year tested positive in the Tour of Spain for salbutamol, an asthma medication that opens up the medium and large airways in the lungs. Although Froome was ultimately cleared by the UCI, it is a curious thing that some professional cyclists seem to come down with asthma around the time of the three grand tours.
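The averaging method can be sketched as follows. The speed lists here are hypothetical placeholders, not the actual Tour de France results; only the method (comparing mean winning speeds across fixed multi-year windows) is taken from the text.

```python
# Sketch of the period-averaging comparison described above. The numbers
# below are hypothetical placeholders, NOT the real winners' speeds.

def mean(values):
    return sum(values) / len(values)

def percent_change(baseline: float, later: float) -> float:
    """Percentage increase of `later` over `baseline`."""
    return (later - baseline) / baseline * 100

pre_epo_mph = [22.5, 23.1, 22.8, 23.0]  # stands in for the 1977-1990 window
epo_era_mph = [24.6, 25.2, 24.9, 25.4]  # stands in for the 1991-2004 window

jump = percent_change(mean(pre_epo_mph), mean(epo_era_mph))
print(f"winners' average speed rose {jump:.1f}%")  # ~9-10% for these numbers
```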

[Graph: average speed of Tour de France winners, in miles per hour, 1977–2019]
Join the Club or Go Home and Get a Real Job

In his bestselling book Game of Shadows, San Francisco Chronicle investigative reporter Lance Williams, who broke the BALCO doping scandal in baseball, made this observation: “Athletes have a huge incentive to dope. There are tremendous benefits to using the drugs, and there is only a small chance that you will get caught. So depending on your sport and where you are in your career, the risk is often worth it. If you make the team, you’ll be a millionaire; if you don’t, you’ll probably go back to driving a delivery truck.” Armstrong’s teammate for many years, Tyler Hamilton, confirmed to Zenovich that the logic applied to the sport of cycling as well: “It was either join the club or go home, finish school, and get a real job.”

Tyler Hamilton

Once it becomes known that the top competitors in a sport are doping, the rule breaking cascades down through the ranks until an entire sport is corrupted. Based on his numerous interviews with athletes, coaches, trainers, drug dealers and drug testers, Williams estimates that between 50 and 80 percent of all professional baseball players and track and field athletes were doping. Given that reality, Williams told me, “There is the conviction that everyone they are competing against is cheating already.” By way of example, Williams noted that Charlie “the Chemist” Francis, coach of Ben Johnson, the sprinter and (briefly) 1988 Olympic gold medalist in the 100-meter run who was busted for doping and stripped of his medals, told him that the doping was “completely self-defensive.” How so? “It was cheat or lose.”

Armstrong’s U.S. Postal teammate Frankie Andreu, a domestique riding in support of the team leader in the mid 1990s, told me: “For years I had no trouble doing my job to help the team leader. Then, around 1996, the speeds of the races shifted dramatically upward. Something happened, and it wasn’t just training.” Andreu resisted doping as long as he could, but by 1999 he was unable to do his job: “It became apparent to me that enough of the peloton was on the juice that I had to do something.” He began injecting himself with r-EPO two to three times a week. The boost was exactly what he needed “to dig a little deeper, to hang on to the group a little longer, to go maybe 31.5 miles per hour instead of 30 mph.” That seemingly small difference is larger than it appears: it can decide whether you stay in the peloton or get dropped, and once you’re dropped and lose the drafting benefits of riding in a large pack, it can decide whether you stay in the race or take a flight home. This is where the game theory matrix of incentives kicks in.

The Game Theoretic Logic of Doping

Game theory is the study of how players in a game choose strategies that will maximize their return in anticipation of the strategies chosen by the other players. The “games” for which the theory was invented are not just gambling games such as poker or sporting contests in which tactical decisions play a major role; they also include serious life matters in which people make economic choices, military decisions, and even nuclear diplomatic strategies like Mutual Assured Destruction, in which neither nation (the US and USSR during the Cold War) has an incentive to launch a nuclear first strike because the other guy will retaliate in kind, leaving both countries decimated. What these “games” have in common is that each player’s “moves” are analyzed according to the range of options open to the other players.

Prisoner’s Dilemma

The game of prisoner’s dilemma is the classic example: You and your partner are arrested for a crime, and the two of you are held incommunicado in separate prison cells. Even if neither of you wants to confess or rat out the other, the D.A. can change your incentive through the following matrix of options (depicted visually in the table):

  • If you both remain silent, you each get a year (top left).
  • If the other guy confesses and you do not, you get three years and he goes free (top right).
  • If you confess but the other guy doesn’t, you go free and he gets three years in jail (bottom left).
  • If you both confess, you each get two years (bottom right).

With these possible outcomes the logical choice is to defect from the advance agreement and betray your partner. Why? Consider the choices from the first prisoner’s point of view. The only thing the first prisoner cannot control about the outcome is the second prisoner’s choice. If the second prisoner remains silent, then the first prisoner earns the “temptation payoff” (no jail time) by confessing but gets a year in jail (the “high payoff”) by remaining silent. The better outcome in this case is for the first prisoner to confess. But if the second prisoner confesses, then once again the first prisoner is better off confessing (the “low payoff,” or two years in jail) than remaining silent (the “sucker payoff,” or three years in jail). Because the circumstances from the second prisoner’s point of view are entirely symmetrical to the ones described for the first, each prisoner is better off confessing no matter what the other prisoner decides to do.
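The dominance argument can be checked mechanically (a minimal sketch; the sentences, in years, come from the bullet list above):

```python
# The prisoner's dilemma matrix described above, with jail sentences in
# years given as (my_sentence, partner_sentence) for each pair of choices.

SENTENCES = {
    ("silent", "silent"):   (1, 1),  # both cooperate: a year each
    ("silent", "confess"):  (3, 0),  # I'm the sucker: three years
    ("confess", "silent"):  (0, 3),  # temptation payoff: I go free
    ("confess", "confess"): (2, 2),  # mutual defection: two years each
}

def best_response(partner_choice: str) -> str:
    """My sentence-minimizing choice given what my partner does."""
    return min(("silent", "confess"),
               key=lambda my: SENTENCES[(my, partner_choice)][0])

# Whatever the partner does, confessing minimizes my own jail time:
for partner in ("silent", "confess"):
    print(partner, "->", best_response(partner))  # both print "confess"
```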

The prisoner’s dilemma game has been played in many experimental conditions, revealing that when subjects play the game just once or for a fixed number of rounds without being allowed to communicate with the other prisoner, defection by confessing is the common strategy. But when subjects play the game for an unknown number of rounds, the most common strategy is tit-for-tat: each begins cooperating with the prior agreement by remaining silent, then mimics whatever the other player does. Even more mutual cooperation can emerge if the players are allowed to communicate and establish mutual trust. But once defection by confessing builds momentum, it continues throughout the game and cheating becomes the norm.
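The tit-for-tat strategy described above is easy to simulate (a toy sketch; the strategy names and the five-round horizon are illustrative, with “C” for cooperating by remaining silent and “D” for defecting by confessing):

```python
# Minimal iterated prisoner's dilemma: tit-for-tat cooperates on the first
# round, then simply copies whatever the opponent did last round.

def tit_for_tat(opponent_history):
    return "C" if not opponent_history else opponent_history[-1]

def always_defect(opponent_history):
    return "D"

def play(strategy_a, strategy_b, rounds=5):
    hist_a, hist_b = [], []
    for _ in range(rounds):
        a = strategy_a(hist_b)  # each player sees the *other's* history
        b = strategy_b(hist_a)
        hist_a.append(a)
        hist_b.append(b)
    return hist_a, hist_b

# Against a chronic defector, tit-for-tat cooperates once, then retaliates:
moves, _ = play(tit_for_tat, always_defect)
print(moves)  # ['C', 'D', 'D', 'D', 'D']
```

Against another tit-for-tat player, cooperation holds for the whole game, which is the mutual-trust outcome the experiments describe.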

“It was either join the club or go home, finish school, and get a real job.” —Tyler Hamilton

In cycling, as in baseball and other sports, the contestants compete according to a set of rules, which clearly prohibit the use of performance-enhancing drugs. But because the drugs are so effective and many of them are so difficult to detect, and because the payoffs for success are so great, the incentive to use banned substances is tempting. Once a few elite athletes defect from the rules by doping to gain an advantage, their rule-abiding competitors must defect as well. But because doping is against the rules, a code of silence — Omerta — prevents any open communication about how to flip the matrix incentives and return to abiding by the rules.

Nash Equilibrium and the Level Playing Field

In game theory, if no player has anything to gain by unilaterally changing strategies, the game is said to be in a Nash equilibrium, named for the mathematician John Forbes Nash, Jr. (portrayed by Russell Crowe in the film A Beautiful Mind), who won the Nobel Prize in economics for his pioneering research in game theory. When everyone in a system violates the rules, or if everyone just thinks that everyone else is violating the rules (even if they are not all doing so), cheating can become a Nash equilibrium, which turns it from a moral violation into a rational choice. As the title of an article analyzing average Tour speeds put it: “Doping: A Necessity, Not a Sin.” But if everyone outside the system thinks that the rules are enforced (even while those inside the system know better), fans respond accordingly with moralistic punishment for the cheaters.

Just do the right thing: sack Lance

I have yet to see anyone inside or outside the sport explain it this way, which in a manner of speaking at least partially absolves the athletes while shifting some of the moral culpability to the regulatory bodies of the sport. That is, the governing bodies of a sport must change the payoff values of the expected outcomes identified in the game matrix. First, when other players are playing by the rules, the payoff for doing likewise must be greater than the payoff for cheating. Second, and perhaps more important, even when other players are cheating, the payoff for playing fair must be greater than the payoff for cheating. Players must not feel like suckers for following the rules.
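The point about changing payoff values can be illustrated with a toy doping game. All payoff numbers below are made up for illustration; only the structure (dope/clean choices whose payoffs regulators can reshape) comes from the argument above.

```python
# Toy two-rider doping game. Payoffs are (my_payoff, rival_payoff) in
# arbitrary units, and all numbers are hypothetical. Under the "corrupt"
# payoffs doping dominates; raising the cost of cheating flips the
# dominant choice, and with it the equilibrium, to riding clean.

def dominant_choice(payoffs):
    """Return a choice that is a best response to every rival choice, if any."""
    choices = {mine for mine, _ in payoffs}
    for mine in choices:
        if all(payoffs[(mine, rival)][0] >= payoffs[(other, rival)][0]
               for rival in choices for other in choices):
            return mine
    return None

corrupt = {  # before reform: cheating pays, clean riders are suckers
    ("clean", "clean"): (3, 3),
    ("clean", "dope"):  (0, 5),
    ("dope",  "clean"): (5, 0),
    ("dope",  "dope"):  (1, 1),
}
reformed = {  # after raising detection odds and penalties
    ("clean", "clean"): (3, 3),
    ("clean", "dope"):  (2, -2),
    ("dope",  "clean"): (-2, 2),
    ("dope",  "dope"):  (-3, -3),
}

print(dominant_choice(corrupt))   # dope
print(dominant_choice(reformed))  # clean
```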

In a Nash equilibrium of mass doping, was the playing field level? That is, if everyone was doping, can we at least conclude that the best cyclist won all those Tours de France (not just Lance, but Pantani in 1997, Ullrich in 1998, and the others before and since)? We will never know for sure, of course, and while it is unlikely that 100 percent of cyclists were doping, it is telling that none of Lance’s competitors who were in a position to win but were bested by him are claiming victory or demanding restitution (and every one of the podium finishers in all seven of Lance’s TdF victories was eventually busted for doping and/or later confessed to it). In any case, anyone who would join a Fair Play for Other Dopers Committee would find it difficult to gain much sympathy among ethicists.

Anyone who would join a Fair Play for Other Dopers Committee would find it difficult to gain much sympathy among ethicists.

Some have argued that Lance’s extensive resources, which enabled him to hire the best sports physiologist in the world for exclusive preparation, gave him an edge over his competitors. In that vein, an unintentionally humorous moment in Zenovich’s film comes in the segment on Operación Puerto, in which Spanish police raided the lab of a sports physiologist named Eufemiano Fuentes, whose secret code for the drugs and blood bags of athletes consisted of their initials or, in the case of Jan Ullrich, his first name. But from the time that Lance said he started doping in 1993 through his first Tour victory in 1999, he didn’t have the extensive resources that victory and fame subsequently brought him. And surely the president of the Union Cycliste Internationale (the UCI, the sport’s governing body), Hein Verbruggen, carries some moral accountability: he not only could not have failed to notice the corruption, he actively participated in covering it up in the name of saving his sport after the catastrophic 1998 Tour exposed the massive doping scandal already underway while Lance was still struggling to come back from cancer and chemotherapy.

Eufemiano Fuentes

It is not unreasonable to argue that the playing field wasn’t level, but it isn’t now and never was level, drugs or no drugs. Genetically gifted riders with a fire in the belly to win and the discipline to train 500 miles a week accrue not only superior fitness but greater resources in the form of more sponsorship dollars, faster support riders, smarter coaches and managers, better training facilities, food, and other creature comforts. For example, the well-capitalized British Team Sky would rent rooms on Mount Teide in Tenerife in the Canary Islands for the entire year so that their team members could train at altitude, a legal method of increasing oxygen-carrying red blood cells. The real victims of doping are not the other top riders whom Lance beat (all of whom doped), but the athletes who DNF’d, finished near the bottom of the leaderboard, or gave up their dream and went home to get a real job. For that you can blame systemic corruption of the system and the logical deterioration of norms of fairness that follows from it, more than any single cyclist, no matter how much he capitalized on it.

Lance Armstrong
Breaking the Chain

At the end of my Scientific American article I suggested that in order to reform cycling and encourage cyclists to play by the rules, the expected values of the doping game had to be changed. For example, building and enforcing a much stricter drug-testing regimen would dramatically increase the likelihood of getting caught, thereby tilting the matrix incentive toward riding clean; for those who want to risk getting caught, increasing the penalty for doping from a temporary to a lifetime ban on competing would presumably nudge the motivation toward fair play even further.
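The expected-value logic behind these reforms can be sketched numerically. The benefit, detection probability, and penalty figures below are hypothetical, chosen only to show how the sign of the expectation flips as testing and penalties tighten.

```python
# Sketch of the expected-value argument above: doping pays off only while
# the chance of getting caught, and the penalty when caught, stay small.
# All numbers are hypothetical units, not real measurements.

def expected_doping_payoff(benefit: float, p_caught: float, penalty: float) -> float:
    """Expected gain from doping relative to riding clean."""
    return (1 - p_caught) * benefit - p_caught * penalty

# Lax regime: rare tests, temporary bans -> positive expectation, doping "pays"
print(expected_doping_payoff(benefit=100, p_caught=0.1, penalty=200))

# Strict regime: frequent tests, lifetime ban -> negative expectation
print(expected_doping_payoff(benefit=100, p_caught=0.5, penalty=1000))
```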

I also suggested granting immunity to athletes for past cheating if they come clean about how doping programs worked; increasing the number of competitors tested, both in competition and out of competition; disqualifying all team members from an event if any member of the team tests positive for doping, thereby shifting the taboo on doping from an external governing body to the internal workings of the team and its members; and compelling any convicted athlete to return all salary paid and prize monies earned to the team sponsors, further strengthening the incentive for athletes to enforce their own antidoping rules.

Since 2008 anti-doping controls have improved dramatically, with some of these measures implemented along with others, most notably a “biological passport,” in which an athlete’s fitness indicators, such as HCT, power output, and VO2 max (the maximum rate of oxygen consumption), are constantly monitored, so that any spike in improvement beyond what training can account for is an indicator of possible doping. In his 2019 book, One-Way Ticket, Jonathan Vaughters explains in game-theory language that the biological passport “wasn’t about catching people. It was always about dissuading them. It’s about limiting them. It’s about keeping things fair.” Fairness is what athletes want more than anything, and Vaughters is concerned that the moralizing impulse to “find evildoers and burn them at the stake” is “what the world wants from anti-doping” but is counterproductive to the deeper fairness issue — the level playing field in which other factors like talent, training, nutrition, and drive determine outcomes. Now a team manager and cycling influencer, Vaughters has been vocal and public about his teams riding clean, thereby shifting the norms of the sport from “everyone’s doing it” to “some aren’t doing it” in hopes of arriving at a new norm of “no one’s doing it.”
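The passport’s spike-detection idea can be caricatured in a few lines. This is a toy z-score flag, not the actual passport algorithm (which uses a far more sophisticated adaptive statistical model), and the readings and threshold are illustrative.

```python
# Toy version of the biological-passport idea described above: flag any
# new reading that jumps beyond what the rider's own baseline variation
# can explain. Readings and threshold are hypothetical.
from statistics import mean, stdev

def is_suspicious(baseline, new_value, z_threshold=3.0):
    """Flag a reading more than z_threshold standard deviations above baseline."""
    mu, sigma = mean(baseline), stdev(baseline)
    return (new_value - mu) / sigma > z_threshold

hct_history = [43.1, 42.7, 43.5, 42.9, 43.2]  # hypothetical hematocrit (%)
print(is_suspicious(hct_history, 43.4))  # False: within normal variation
print(is_suspicious(hct_history, 49.0))  # True: spike beyond training effects
```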

Jonathan Vaughters

Forgiveness and Redemption

At the end of Vaughters’ memoir he admits, “We all doped. It’s inexcusable and it’s a fact.” It should be clear by now that Lance’s downfall has less to do with doping and more to do with the people whose lives he harmed in his years of denial, defense, and destruction of others. As Vaughters put it: “The bullying was the reason Lance paid a higher price than the rest of us.” One of those who feels bullied is Betsy Andreu, wife of Lance’s longtime teammate Frankie Andreu. A deeply moral person who is the very embodiment of Kantian deontological (rule-bound) ethics, Betsy does not suffer cheats gladly (including her husband, whom she upbraided when she discovered he doped just to compete). She explained to me in no uncertain terms exactly what the core issue with Lance is and what he has to do to redeem himself.

First, she said, the other dopers, such as Ullrich, Basso, and Pantani, did not try to destroy other people for simply telling the truth about what was going on. Doping causes harm to others in the sport who don’t dope, but attacking, suing, libeling, and curtailing the income of those attempting to expose the doping is another level of harm altogether. Second, she continued, there is the matter of apologies. “I’ve learned there are 3 components to being sorry,” she outlined:

  1. You acknowledge the wrong you did to the person you wronged.
  2. You apologize for it.
  3. You make amends. How? You ask the person what they need from you to show you they mean they’re sorry.

Betsy Andreu

Restorative vs. Retributive Justice

In criminal justice scholarship what Betsy Andreu is proposing is called restorative justice, in which the perpetrator apologizes for the offense, attempts to set to rights the wrong done, and if possible initiates or restores good relations with the victim. Restorative justice is contrasted with retributive justice, in which wrongdoers get their just deserts. Think Old Testament eye-for-an-eye (Moses) vs. New Testament turn-the-other-cheek (Jesus), Malcolm X vs. Martin Luther King, Jr., Rambo vs. Gandhi. Redemption begins with an acknowledgment on the part of the wrongdoer, who must take some level of responsibility for the offense, and builds from there to include the victims’ losses and a plan for restoration. As I outlined in my chapter on the subject in The Moral Arc, retributive justice is focused on what offenders deserve whereas restorative justice is concerned with what victims need; retributive justice is about what was done wrong whereas restorative justice is about making it right; retributive justice is offender oriented whereas restorative justice is victim oriented.

I don’t know everyone who feels they should be on Lance’s restorative justice list. And, clearly, there are possible legal and financial consequences of going down that path that I do not know about. But in noting the many cancer victims and their families Lance inspired and materially helped through his charitable generosity — highlighted in the film and praised as unassailably real by ESPN senior writer Bonnie Ford, who was otherwise a harsh critic — it is evident that Armstrong is capable of being a person who can make a positive difference in the world. Will he?

Lance Armstrong

This could be Lance’s greatest challenge, harder perhaps even than overcoming cancer, inasmuch as personality and temperament are relatively stable throughout the lifespan. I don’t know if he has the character to do so across the board, but he has made amends with some people, and I will note that many a person with far graver personal failings has turned his or her life around so dramatically as to be almost unrecognizable. A type specimen might be the heavyweight boxing champion George Foreman, who by his own account was, pace Landis’ description of Armstrong, a badass motherfucker … until he was humbled by Muhammad Ali in the Rumble in the Jungle. Foreman willed himself into becoming one of the most likeable and inspirational figures of the late 20th century, recapturing his heavyweight title along with the admiration of millions. Others have made similar transformations. Can Lance do the same? The only person standing in the way is Lance himself. We shall see. END

About the Author

Michael Shermer is the Publisher of Skeptic magazine, host of the Science Salon podcast, and a Presidential Fellow at Chapman University. He is the author of a dozen books including the New York Times bestsellers Why People Believe Weird Things, The Believing Brain, and The Moral Arc. His latest book is Giving the Devil His Due. He is also a co-founder of the 3,000-mile nonstop transcontinental bicycle Race Across America (RAAM), which he competed in 5 times and was the Race Director for a decade.


Evolution & Creationism, Part 1

Posted on May. 29, 2020 by | Comments Off on Evolution & Creationism, Part 1

Dr. Michael Shermer takes viewers to the Galápagos Islands to retrace Darwin’s footsteps (literally — in 2006 Shermer and historian of science Frank Sulloway hiked and camped all over the first island Darwin visited) and show that, in fact, Darwin did not discover natural selection when he was there in September of 1835. He worked out his theory when he returned home, and Shermer shows exactly how Darwin did that, along with the story of the theory’s co-discoverer, Alfred Russel Wallace. Then Shermer outlines what, exactly, the theory of evolution explains, how it displaced the creationist model as the explanation for design in nature (wings, eyes, etc. as functional adaptations), and why so many people today still misunderstand the theory and how that sustained the creationist model.


About the photograph above

Charles Darwin described what he called the “craterized district” on San Cristóbal, Galápagos Islands, thusly:

The entire surface of this part of the island seems to have been permeated, like a sieve, by the subterranean vapours: here and there the lava, whilst soft, has been blown into great bubbles; and in other parts, the tops of caverns similarly have fallen in, leaving circular pits with steep sides. From the regular form of the many craters, they gave to the country an artificial appearance, which vividly reminded me of those parts of Staffordshire, where the great iron-foundries are most numerous.

The photograph was taken on 21 June 2004 by Dr. Frank Sulloway. Darwin hiked this area in September 1835.


Copyright © 1992–2021. All rights reserved. | P.O. Box 338 | Altadena, CA, 91001 | 1-626-794-3119. The Skeptics Society is a non-profit, member-supported 501(c)(3) organization (ID # 95-4550781) whose mission is to promote science & reason. As an Amazon Associate, we earn from qualifying purchases. Privacy Policy.