
Suffrage & Success
Celebrating the Centennial of Women’s Right to Vote

Posted on Aug. 18, 2020

Today, August 18, marks the 100th anniversary of the adoption of the 19th Amendment to the Constitution of the United States, guaranteeing women the right to vote. We honor that momentous event with an excerpt adapted from the chapter on women’s rights in Dr. Michael Shermer’s 2015 book The Moral Arc: How Science and Reason Lead Humanity Toward Truth, Justice, and Freedom (New York: Henry Holt).


On August 18, 1920, the 19th Amendment to the United States Constitution was ratified, legally securing the franchise for women. It was the culmination of a 72-year battle that began when Elizabeth Cady Stanton and Lucretia Mott organized the 1848 Seneca Falls Convention, after attending the World Anti-Slavery Convention in London in 1840 — a meeting at which they had come to participate as delegates, but at which they were not allowed to speak and were made to sit like obedient children in a curtained-off area. This did not sit well with Stanton and Mott. Conventions were held throughout the 1850s but were interrupted by the American Civil War, in whose aftermath the 15th Amendment secured the franchise in 1870 — not for women, of course, but for black men (though they were gradually disenfranchised by poll taxes, legal loopholes, literacy tests, threats and intimidation). This didn’t sit well either and only served to energize the likes of Matilda Joslyn Gage, Susan B. Anthony, Ida B. Wells, Carrie Chapman Catt, Doris Stevens, and countless others who campaigned unremittingly against the political slavery of women.

Things began to heat up when the great American suffragist Alice Paul (arrestingly portrayed by Hilary Swank in the 2004 film Iron Jawed Angels) returned from a lengthy sojourn in England. She had learned much during her time there through her active participation in the British suffrage movement and from the more radical and militant British suffragists, including the courageous political activist Emmeline Pankhurst, characterized as “the very edge of that weapon of willpower by which British women freed themselves from being classed with children and idiots in the matter of exercising the franchise.”1

Upon her death Pankhurst was heralded by the New York Times as “the most remarkable political and social agitator of the early part of the twentieth century and the supreme protagonist of the campaign for the electoral enfranchisement of women”;2 years later, Time magazine voted her one of the 100 most important people of the century. Thus, when Alice Paul returned from abroad she was ready for action, though the more conservative members of the women’s movement weren’t quite ready for Alice. Nevertheless, in order to attract attention to the cause she and Lucy Burns organized the largest parade ever held in Washington. On March 3, 1913 (strategically timed for the day before President Wilson’s inauguration), 26 floats, 10 bands, and 8,000 women marched, led by the stunning Inez Milholland wearing a flowing white cape and riding a white horse. (See Figure 1 above.) Upwards of 100,000 spectators watched the parade but the mostly male crowd became increasingly unruly and the women were spat upon, taunted, harassed and attacked while the police stood by. Afraid of an all-out riot, the War Department called in the cavalry to contain the escalating violence and chaos.3

It was a gift. A scandal ensued due to the rough treatment of the women and suddenly, “the issue of suffrage — long thought dead by many politicians — was vividly alive in front page headlines in newspapers across the country.… Paul had accomplished her goal — to make woman suffrage a major political issue.”5

In 1917 women began peacefully picketing outside the White House but, once again, they were met with harassment and violence. These Silent Sentinels (as they were called) stood day and night (except Sundays) with their banners for two and a half years but, after the U.S. entered the war, patience ran thin, as picketing a wartime president was seen as improper. The picketers were charged with obstructing traffic and were thrown — often quite literally thrown — into prison cells where they were treated like criminals, rather than political protesters, and were kept in appalling conditions. Many of the women went on a hunger strike, including Alice Paul, who was viciously force-fed in order to keep her from becoming a martyr for the cause.

Word of the brutality in the workhouse was leaked to the press and the public became increasingly incensed at the protesters’ horrific treatment. During what became known as the Night of Terror, 40 prison guards went on a rampage and the women were “grabbed, dragged, beaten, kicked, and choked”; Lucy Burns had her wrists cuffed and chained above her head to the cell door; another woman was taken to the men’s section and told “they could do what they pleased with her”; another woman was knocked unconscious, still another had a heart attack.6 These outrages were a grave tactical error. “With public pressure mounting as a result of press coverage, the government felt the need to act.… Arrests didn’t stop these protesters; neither did jail terms, psychopathic wards, force-feeding, or violent attacks. Their next decision was simply to let them out.”7

At long last, in 1920, the 19th Amendment (originally drafted by Susan B. Anthony and Elizabeth Cady Stanton in 1878) was ratified — by a single vote — thanks to 24-year-old Harry T. Burn, a Tennessee legislator who had originally intended to vote against his state ratifying the amendment (which needed the approval of 36 of the 48 states), but changed his mind because of a note from his mother.

Dear Son:

Hurrah, and vote for suffrage! Don’t keep them in doubt. I notice some of the speeches against. They were bitter. I have been watching to see how you stood, but have not noticed anything yet.

Don’t forget to be a good boy and help Mrs. Catt put the “rat” in ratification.

Your Mother.8

In the end, then, suffrage for women came down to the vote of one man, influenced by his mom. It was rumored that “the anti-suffragists were so angry at his decision that they chased him from the chamber, forced him to climb out a window of the Capitol and inch along a ledge to safety.”9 Thus suffrage arrived in the U.S., kicking and screaming.

It was a right that women in a number of other countries had already won years before, but one that others would have to wait for. Figure 2 (below) tracks the moral progress of women’s suffrage, while Figure 3 (below) tracks the gaps between when all men versus all women were granted the franchise, from Switzerland’s 123-year gap between 1848 and 1971 to Denmark’s 0-year gap in 1915. By comparison, the 50-year gap in the United States between 1870 and 1920 lies midway in this history.

Figure 2: Women’s Right to Vote Over Time. The stair-step progress of women’s suffrage is tracked over time from 1900 to 2010, showing two big bursts, the first after World War I and the second after World War II. Tellingly, the expected date for the sovereign nation of Vatican City to grant women the right to vote is “never.”10

Figure 3: The Gap Between the Franchise for Men and Women. The spasmodic nature of moral progress is reflected in the shrinking time in years between the dates that men’s suffrage and women’s suffrage were legalized, from 123 years for Switzerland to 0 years for Denmark. Such change is contingent on many social and political variables that differ from country to country.

Carving Women’s Rights: A Personal Story

The trend over the past several centuries has been to grant women the same rights and privileges as those of men. Political, economic, and social advances, enabled by scientific, technological, and medical discoveries and inventions, have increasingly provided women greater reproductive autonomy and control, and have also driven an expansion of their rights and opportunities in all areas of life, leading to healthier and happier societies across the globe. As with the other rights revolutions there is much progress that remains to be realized, but the momentum now is such that the expansion of women’s rights should continue unabated into the future.

In these ways — the rational justification for including women as full rights-bearing persons no less deserving than men, the interchangeability of women’s perspectives with those of men, the scientific understanding of the nature of human sexuality and reproduction, and the continuous thinking that enables us to see and comprehend the difference between a woman’s and a fetus’s rights — science and reason have led humanity closer to truth, justice, and freedom.

As an example of how far we’ve come in just the last two generations (and how oppressed women were as recently as the early 20th century), I close with the story of two women — mother and daughter — both named Christine Roselyn Mutchler. The mother was born in Germany and passed through Ellis Island in 1893 with her parents, who then moved to Alhambra, California. Mother Christine married Frederick and gave birth to baby Christine in 1910 (and a second daughter three years later), but their lives were shattered shortly thereafter when Fred told his wife he was going out for a loaf of bread and never returned. Abandoned by her husband, left with no money or food to care for herself and her two small children, mother Christine was forced to return to her father’s home.

Unknown to her at the time, Fred had wound up in the county jail, suffering delusions that his father-in-law was after him. After being examined by a physician, he was sent to a mental hospital for over a year. During this time, with his delusions in remission, Fred wrote heartbreaking letters to his wife asking about her and the children, but Christine’s father kept the letters from her and she continued to believe that she had been abandoned. In time she found work as a housemaid for a friend of a successful motion picture executive named John C. Epping, whose wife had recently died. Desperate for a daughter and enamored of three-year-old Christine, Epping talked Christine’s father into forcing her to allow him to adopt the child. Young, poor, scared, and intimidated by her father, Christine reluctantly agreed to the adoption, although a series of articles in The Los Angeles Times shows that a probation officer on the case opposed the adoption, declaring “she believed Epping saw possibilities of a future Mary Pickford in the little girl, and that the child should have a home in some private family where home life and education would be the principal features.”11 Based on the false information provided by Christine’s father that Fred had abandoned them, the judge granted the adoption.

Epping promptly changed the name of his newly adopted daughter to Frances Dorothy Epping, addressed her by her middle name, and (unbelievably) told her she was born in Providence, Rhode Island, and that his deceased wife was her true mother. Now age four, Christine/Dorothy apparently did not accept the fictional story and rebelled — or perhaps Epping changed his mind about raising a daughter as a single dad — because he shuffled her around through a series of surrogate parents, including sisters at the Ramona Convent in Alhambra and caretakers at the Marlborough Preparatory School in Los Angeles, before shipping her back east for a year to live with his sister in the Catskills, and then on to Germany, where she lived with Epping’s relations. During that period Dorothy discovered that she had a talent for the arts, in particular sculpture.

She then returned to Los Angeles and finished her secondary education, after which she was reunited with her original family and told the truth about the adoption. She went on to college at the Otis Art Institute in Los Angeles, the Corcoran School of Art in Washington, D.C., and the prestigious Academy of Fine Arts in Munich, Germany, studying in the 1930s under the tutelage of Joseph Wackerle, who at that time was the Third Reich Culture Senator and received praise from both Goebbels and Hitler. (She later recalled being stunned by the hypnotic pull Hitler had on the audience at one of his speeches she attended.) In the meantime, Dorothy’s real mother, Christine, was instructed by her father to divorce her husband Fred, after which she met and married a vegetable cart vendor in Los Angeles, left her father’s oppressive rule, and began to rebuild her life and new family. But the tragedy of being forced to give up her first-born child haunted her the rest of her life. As the world changed and Christine saw how women became more empowered in the second half of the 20th century, she continually asked herself why she didn’t speak up and oppose the adoption.

Meanwhile, as Dorothy came of age she soon discovered that family law and the adoption courts were not the only worlds ruled by men. Her chosen profession of sculpture was a heavily male-dominated one, so to be taken seriously she began using a truncated version of her first name Frances — Franc — and that gained her entrée into the German academy and subsequent galleries and museums (even now one can find references to “his” work). She later recalled that when the professors at the Academy of Fine Arts in Munich found out “Franc” was a woman, she had to listen to lectures from the hallways because only men were allowed inside. From the early 1930s through her death in 1983 — by which time it was acceptable for women to shape clay, wood, and stone with their hands — Franc Epping’s work was shown in numerous exhibits throughout the United States, including the prestigious Whitney Museum of American Art in New York City. One of her works, “The Man with a Hat,” even appeared in an episode of the original series of Star Trek. I know because I own that piece, along with many other sculptures of hers, which I inherited from my mother.

Franc Epping’s work, “The Man with a Hat,” appeared in an episode of the original series of Star Trek (Season 1, Ep. 24, “A Taste of Armageddon,” at 17 minutes, 25 seconds). That sculpture, along with many other Epping sculptures, was inherited by Michael Shermer from his mother. Franc Epping was the author’s aunt.

You see, Franc Epping was my aunt, her real mother Christine was my grandmother, and I am proud to be related to such a resilient and determined woman.12 Aunt Franc’s sculptures portray strong women with muscular features in empowering poses — allegories for what women for generations have had to rise to in order to gain the recognition and equality that are rightfully theirs. This book was written in the inspiring presence of those carved stones. END


Figure 4: Sculptor Franc Epping, born Christine Roselyn Mutchler and given the adopted name Frances Dorothy Epping, started using the masculinized version of her adopted name — Franc — in order to be taken seriously in the male-dominated world of sculpture.


Figure 5: Among Franc Epping’s many sculptures are strong women with muscular features in empowering poses.13

About the Author

Dr. Michael Shermer is the Founding Publisher of Skeptic magazine, the host of the Science Salon Podcast, and a Presidential Fellow at Chapman University where he teaches Skepticism 101. For 18 years he was a monthly columnist for Scientific American. He is the author of New York Times bestsellers Why People Believe Weird Things and The Believing Brain, Why Darwin Matters, The Science of Good and Evil, The Moral Arc, and Heavens on Earth. His new book is Giving the Devil His Due: Reflections of a Scientific Humanist.

References
  1. Purvis, June. 2002. Emmeline Pankhurst: A Biography. London: Routledge. 354.
  2. Ibid., 354.
  3. Stevens, Doris. Edited by Carol O’Hare. Originally published 1920; revised and edited 1995. Jailed for Freedom: American Women Win the Vote. Troutdale: New Sage Press. 18–19.
  4. Source: Library of Congress. George Grantham Bain Collection. Original caption reads: Inez Milholland Boissevain, wearing white cape, seated on white horse at the National American Woman Suffrage Association parade, March 3, 1913, Washington, D.C. LC-DIG-ppmsc-00031 (digital file from original photograph) LC-USZ62-77359 http://www.loc.gov/pictures/item/97510669/
  5. Ibid., 19.
  6. Adams, Katherine H. and Michael L. Keene. 2007. Alice Paul and the American Suffrage Campaign. Urbana: University of Illinois Press. 206–208.
  7. Ibid., 211.
  8. http://www.tennessee.gov/tsla/exhibits/suffrage/beginning.htm
  9. Ibid.
  10. The Wikipedia entry for “Women’s Suffrage” has a complete list of every country and when they legalized the franchise for women: https://en.wikipedia.org/wiki/Women%27s_suffrage
  11. “4-Sided Battle in Court for Child.” 1914. Los Angeles Times, October 31.
  12. Most of this story has been carefully documented by Ann Marie Batesole, a private detective and my cousin — our grandmother was Christine, Aunt Franc’s mother.
  13. Source: Author’s collection.

Fat Man & Little Boy

Posted on Aug. 07, 2020

On the 75th anniversary of the first use of nuclear weapons, Dr. Michael Shermer presents a moral case for their use in ending WWII and for their deterrence of Great Power wars since, along with a call to eventually eliminate them. This essay was excerpted, in part, from Michael Shermer’s book The Moral Arc, in the chapter on war.


On August 6, 1945 the Little Boy gun-type uranium-235 bomb exploded with an energy equivalent of 16–18 kilotons of TNT, flattening 69 percent of Hiroshima’s buildings and killing an estimated 80,000 people and injuring another 70,000. (Source: https://commons.wikimedia.org/wiki/File:Atomic_cloud_over_Hiroshima.jpg)

Three quarters of a century ago this summer, nuclear weapons altered our civilization forever. On July 16 the Trinity plutonium bomb detonated with the energy equivalent of 22 kilotons (22,000 metric tons) of TNT, sending a mushroom cloud 39,000 feet into the atmosphere. The explosion left a crater 76 meters wide filled with radioactive glass called trinitite (melted quartz sand). It could be heard as far away as El Paso, Texas. On August 6 the Little Boy gun-type uranium-235 bomb exploded with an energy equivalent of 16–18 kilotons of TNT, flattening 69 percent of Hiroshima’s buildings and killing an estimated 80,000 people and injuring another 70,000. On August 9 the Fat Man plutonium implosion-type bomb with the energy equivalence of 19–23 kilotons of TNT leveled around 44 percent of Nagasaki, killing an estimated 35,000 to 40,000 people and severely wounding another 60,000.1

The aftermath of Little Boy. (Source: https://commons.wikimedia.org/wiki/File:Hiroshima_aftermath.jpg)

On August 9, 1945 the Fat Man plutonium implosion-type bomb with the energy equivalence of 19–23 kilotons of TNT leveled around 44 percent of Nagasaki, killing an estimated 35,000 to 40,000 people and severely wounding another 60,000. (Source: https://commons.wikimedia.org/wiki/File:Nagasakibomb.jpg)

Nagasaki before and after the bombing. (Source: https://commons.wikimedia.org/wiki/File:Nagasaki_1945_-_Before_and_after_(adjusted).jpg)

Memorandum from Major General Leslie Groves to Army Chief of Staff George Marshall. Had Japanese military hardliners had their way and continued the war into the fall, Groves had three more bombs readied for September and another three for October. Here he informs Marshall that the next bomb would be ready to drop after August 24. Emperor Hirohito capitulated on August 15, thereby saving millions of his citizens’ lives. (Source: https://commons.wikimedia.org/wiki/File:Memorandum_from_Major_General_Leslie_Groves_to_Army_Chief_of_Staff_George_Marshall.jpg)

As documented in the memo above, dated August 10, 1945, if the Japanese had not surrendered, the head of the Manhattan Project, Major General Leslie R. Groves, had another Fat Man-type plutonium implosion bomb ready to go after August 24 that would likely have killed another 50,000 to 100,000 people.2 And had Japanese military hardliners had their way and continued the war into the fall, Groves had three more bombs readied for September and another three for October. President Harry Truman was not exaggerating when he threatened Japan with “a rain of ruin from the air, the like of which has never been seen on this Earth.” Truman did agonize about dropping more nukes on Japan, troubled as he was by the thought of more innocents and noncombatants being killed. He wrested that decision away from the military. (Note Groves’ handwritten addendum to his memo that “It is not to be released on Japan without express authority from the President.” U.S. presidents have had sole authority to use nuclear weapons ever since.) However, further bombings proved unnecessary. On August 15 Emperor Hirohito, against the wishes of some of Japan’s military leaders, announced on the radio that Japan would capitulate. On September 2 Japanese officials signed the surrender documents in Tokyo Bay, ending the Second World War.3

On this 75th anniversary of the summer of the bomb I want to make the case that the use of nuclear weapons was necessary to end the war, that their continued existence has acted as a deterrent against another Great Power war — but that we must eliminate them entirely for the long-term survival of our civilization and possibly our species.


Since 1945 a cadre of critics has proffered the claim that the atomic bombs were unnecessary to bring about the end of World War II (or, at least, that the Fat Man Nagasaki bomb was superfluous), and thus that this act was immoral, illegal, or even a crime against humanity. Robert Oppenheimer and other physicists who worked on the Manhattan Project, such as Leo Szilard, expressed reservations. “The physicists have known sin,” Oppenheimer opined. He went to Truman and confessed, “Mr. President, I feel I have blood on my hands,” to which the President recalled, “I told him the blood was on my hands — to let me worry about that.” Truman promptly dismissed Oppenheimer and told Secretary of State Dean Acheson, “I don’t want to see that son-of-a-bitch in this office ever again.”4

In 1946 the Federal Council of Churches issued a statement declaring, “As American Christians, we are deeply penitent for the irresponsible use already made of the atomic bomb. We are agreed that, whatever be one’s judgment of the war in principle, the surprise bombings of Hiroshima and Nagasaki are morally indefensible.”5 In 1967 the linguist and contrarian politico Noam Chomsky called the two bombings “the most unspeakable crimes in history.”6

More recently, in his history of genocide titled Worse Than War, the historian Daniel Goldhagen opens his analysis by calling the U.S. President Harry Truman “a mass murderer” because in ordering the use of atomic weapons he “chose to snuff out the lives of approximately 300,000 men, women and children.” Goldhagen opines that “it is hard to understand how any right-thinking person could fail to call slaughtering unthreatening Japanese mass murder.”7 Goldhagen defines “genocide” broadly enough to equate it with “mass murder” (without ever defining what, exactly, that means). In morally equating Harry Truman with Adolf Hitler, Joseph Stalin, Mao Zedong, and Pol Pot, Goldhagen allows himself to be constrained by the categorical thinking that prevents one from discerning the different kinds, levels, and motives for large-scale military violence. By this reasoning, nearly every act that kills a large number of people could be considered genocidal because there are only two categories — mass murder and non-mass murder.

By contrast, continuous thinking allows us to distinguish the differences between types of mass killings (some scholars define genocide as one-sided killing by armed people of unarmed people), their context (during a state war, civil war, ethnic cleansing), motivations (termination of hostilities or extermination of a people), and quantities (hundreds to millions) along a sliding scale. The Polish jurist Raphael Lemkin, who coined the term genocide, defined it in 1946 as “a conspiracy to exterminate national, religious or racial groups.”8 That same year the U.N. General Assembly defined genocide as “a denial of the right of existence of entire human groups.”9 More recently, in 1994 the highly respected philosopher Steven Katz defined genocide as “the actualization of the intent, however successfully carried out, to murder in its totality any national, ethnic, racial, religious, political, social, gender or economic group.”10

By these definitions, dropping Fat Man and Little Boy was not an act of genocide. The difference between Truman and the others is in the context and motivation of the act. In their genocidal actions against targeted people, Hitler, Stalin, Mao, and Pol Pot had as their objective the total elimination of a group. The killing would only stop when every last pursued person was exterminated (or if the perpetrators were stopped or defeated). Truman’s goal in dropping the bombs was to end the war with Japan (which they did), not to eliminate the Japanese people (which they didn’t). That the U.S. provided considerable financial, personnel, and material support to help rebuild Japan into a world economic power puts the lie to the eliminationist accusation.11


The author’s father, Richard Shermer, in 1945, serving aboard the USS Wren.

More broadly, if we ground morality in the survival and flourishing of sentient beings,12 then by that measure not only did Fat Man and Little Boy end the war and stop the killing, they saved lives — very probably millions of lives, both Japanese and American. My father Richard Shermer was possibly one such survivor. During the Second World War he served aboard the USS Wren (DD-568), a Fletcher-class destroyer assigned to protect aircraft carriers and other large capital ships from Japanese submarines and from Kamikaze planes on what was called antiaircraft radar picket watch. His ship was attacked several times but sustained no major damage. The Wren was part of the larger fleet that was working its way toward Japan, escorting the carriers whose planes were bombarding the Japanese homeland in preparation for the planned invasion. My father told me that everyone onboard dreaded that day because they had heard of the horrific carnage resulting from the invasion of just two tiny islands held by the Japanese — Iwo Jima and Okinawa. If that was any indication of what was to come with a full-scale invasion, the contemplation of it was almost too much to bear.13

The USS Wren, a Navy destroyer deployed to protect aircraft carriers from suicidal Kamikaze pilots while the carriers’ planes bombarded the Japanese homeland in preparation for the invasion that never came, thanks to Fat Man and Little Boy. (Source: https://en.wikipedia.org/wiki/USS_Wren_(DD-568)#/media/File:USS_Wren_(DD-568)_underway,_circa_in_the_mid-1950s_(NH_107257).jpg)

Four photos taken by Richard Shermer on board the USS Wren (pictured fore and aft), accompanying the aircraft carrier USS Lexington, and arriving in Tokyo Bay in late August 1945 in preparation for the surrender ceremony on September 2, marking the end of the Second World War.

During the invasion of Iwo Jima there were approximately 26,000 American casualties that included 6,821 dead in the 36-day battle. How fiercely did the Japanese defend that little volcanic rock 700 miles from Japan? Of the 22,060 Japanese soldiers assigned to fight to the bitter end, only 216 survived.14 The subsequent battle for Okinawa, only 340 miles from the Japanese mainland, was fought even more ferociously, resulting in a staggering body count of 240,931 dead, including 77,166 Japanese soldiers, 14,009 American soldiers, plus an additional 149,193 Japanese civilians living on the island who either died fighting or committed suicide rather than let themselves be captured.15 With an estimated 2.3 million Japanese soldiers and 28 million Japanese civilian militia prepared to defend their island nation to the death,16 it was clear to all what an invasion of the Japanese mainland would entail.

It is from these cold hard facts that Truman’s advisors estimated that between 250,000 and one million American lives would be lost in an invasion of Japan.17 General Douglas MacArthur estimated that there could be a 22:1 ratio of Japanese to American deaths, which translates to a minimum death toll of 5.5 million Japanese.18 By comparison, cold though it may sound, the body count from both atomic bombs — about 200,000–300,000 total (Hiroshima: 90,000–166,000 deaths, Nagasaki: 60,000–80,000 deaths19) — was a bargain.
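To spell out the arithmetic behind that figure (a quick check, on the assumption that MacArthur’s ratio is being applied to the low end of the advisors’ American casualty estimate, which the text implies but does not state):

$$22 \times 250{,}000 = 5{,}500{,}000 \approx 5.5 \text{ million Japanese deaths}$$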

In any case, if Truman hadn’t ordered the bombs dropped, General Curtis LeMay and his fleet of B-29 bombers would have continued pummeling Tokyo and other Japanese cities into rubble. When asked to predict when the war would end based on his bombing program, LeMay said September 1, because that was when there would be nothing left of Japan to bomb. The death toll from conventional bombing would have been just as high as that produced by the two atomic bombs, if not higher. Previous mass bombing raids had produced Hiroshima-level death rates, and it is likely that more than just two cities would have been destroyed before the Japanese surrendered. Compare, for example, Little Boy’s energy equivalent of 16,000–19,000 tons of TNT to the U.S. Strategic Bombing Survey estimate that this was the equivalent of 220 B-29s carrying 1,200 tons of incendiary bombs, 400 tons of high-explosive bombs, and 500 tons of anti-personnel fragmentation bombs, with an equivalent number of casualties.20 In fact, on the night of March 9–10, 1945, 279 B-29s dropped 1,665 tons of bombs on Tokyo, leveling 15.8 square miles of the city, killing 88,000 people, injuring another 41,000, and leaving another million homeless.21

On the night of March 9–10, 1945, 279 B-29s dropped 1,665 tons of bombs on Tokyo, leveling 15.8 square miles of the city, killing 88,000 people, injuring another 41,000, and leaving another million homeless. This is the result. (Source: https://commons.wikimedia.org/wiki/File:Tokyo_1945-3-10-1.jpg)

These facts also help refute the claim that the alternative scenario of dropping an atomic bomb on an uninhabited island or bay to demonstrate its destructive force would have worked to convince the Japanese to surrender. Given that they refused to capitulate even after numerous cities were obliterated by conventional bombs and Hiroshima was erased from the map by an atomic bomb, it seems unlikely this more benign strategy would have worked.22

On balance, then, dropping the atomic bombs was the least destructive of the options on the table. Although we wouldn’t want to call it a moral act, it was, in the context of the time, the least immoral act by the criterion of lives saved. That said, we should also recognize that the several hundred thousand killed is still a colossal loss of life. The fact that the invisible killer of radiation continued its effects long after the bombings should dissuade us from ever using such weapons again. Along that sliding scale of evil, in the context of one of the worst wars in human history that included the singularly destructive Holocaust of six million murdered, it was not, pace Chomsky, the most unspeakable crime in history — not even close — but it was an event in the annals of humanity never to be forgotten and, hopefully, never to be repeated.


When I was an undergraduate at Pepperdine University in 1974, the father of the hydrogen bomb — Edward Teller — spoke at our campus in conjunction with the awarding of an honorary doctorate. His message was that deterrence works. At the time I remember thinking — like so many politicos were saying — “yeah, but a single slip-up is all it takes.” Popular films such as Fail Safe and Dr. Strangelove reinforced the point. But the blunder never came (and the close calls were kept secret for decades). In the game-theoretic strategy of Mutual Assured Destruction (MAD), deterrence works because neither side has anything to gain by initiating a first strike against the other. The retaliatory capability of both is such that a first strike would most likely lead to the utter annihilation of both countries (along with much of the rest of the world). “It’s not mad!” proclaimed Secretary of Defense Robert S. McNamara. “Mutual Assured Destruction is the foundation of deterrence. Nuclear weapons have no military utility whatsoever, excepting only to deter one’s opponent from their use. Which means you should never, never, never initiate their use against a nuclear-equipped opponent. If you do, it’s suicide.”23

The logic of deterrence was first articulated in 1946 by the American military strategist Bernard Brodie in his appropriately titled book The Absolute Weapon, in which he noted the break in history that atomic weapons brought with their development: “Thus far the chief purpose of our military establishment has been to win wars. From now on, its chief purpose must be to avert them. It can have almost no other purpose.”24 As Dr. Strangelove explained in Stanley Kubrick’s classic Cold War film: “Deterrence is the art of producing in the mind of the enemy the fear to attack.” Said enemy, of course, must know that you have at the ready such destructive devices, and that is why “The whole point of a doomsday machine is lost if you keep it a secret!”25

Dr. Strangelove was a black comedy that parodied MAD by showing what can happen when things go terribly wrong, in this case when General Jack D. Ripper becomes unhinged at the thought of “Communist infiltration, Communist indoctrination, Communist subversion, and the international Communist conspiracy to sap and impurify all of our precious bodily fluids” and orders a nuclear first strike against the Soviet Union. Given this unfortunate incident and knowing that the Russkis know about it and will therefore retaliate, General “Buck” Turgidson pleads with the president to go all out and launch a full first strike. “Mr. President, I’m not saying we wouldn’t get our hair mussed, but I do say no more than ten to twenty million killed, tops, uh, depending on the breaks.”26

This isn’t far off real projected casualties (Kubrick was a student of Cold War strategy), as in 1957 Strategic Air Command (SAC) estimated that between 360 and 525 million casualties would be inflicted in the first week of a nuclear exchange with the Soviet bloc.27 In 1968 Secretary of Defense Robert McNamara gave these figures for MAD to work: “In the case of the Soviet Union, I would judge that a capability on our part to destroy, say, one-fifth to one-fourth of their population and one-half of her industrial capacity would serve as an effective deterrent.” Given a population at the time of about 128 million, this translates to 25–32 million dead.28 A 1979 report from the Office of Technology Assessment for the U.S. Congress, entitled The Effects of Nuclear War, estimated that 155 to 165 million Americans would die in an all-out Soviet first strike (unless people made use of existing shelters near their homes, reducing fatalities to 110–120 million). The population of the U.S. at the time was 225 million, so the estimated percent that would be killed ranged from 49 percent to 73 percent. Staggering.
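The percentages follow directly from those estimates (a quick check of the arithmetic, with the shelter scenario supplying the low end and the no-shelter scenario the high end):

$$\frac{110}{225} \approx 49\%, \qquad \frac{165}{225} \approx 73\%$$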

Deterrence has worked so far — no nuclear weapon has been detonated in a conflict of any kind in 75 years — but it would be foolish to think of deterrence as a permanent solution.29 As long ago as 1795, in an essay titled Perpetual Peace, Immanuel Kant worked out what such deterrence ultimately leads to: “A war, therefore, which might cause the destruction of both parties at once … would permit the conclusion of a perpetual peace only upon the vast burial-ground of the human species.” (Kant’s book title came from an innkeeper’s30 sign featuring a cemetery — not the type of perpetual peace most of us strive for.) Deterrence acts as only a temporary solution to the Hobbesian temptation to strike first (also called the security dilemma, in which a nation arming in defense triggers other nations to also arm in defense), allowing both Leviathans to go about their business in relative peace, settling for small proxy wars, which themselves have been in decline for decades.31


In the long run we need to work toward a world free of nuclear weapons. The risk of an accident or of a deranged Dr. Strangelove-type character triggering a nuclear exchange is too high for a MAD deterrence strategy to be a permanent solution to the security dilemma it was invented to solve. Authors such as Richard Rhodes in his nuclear tetralogy (The Making of the Atomic Bomb, Dark Sun, Arsenals of Folly, and The Twilight of the Bombs32) and Eric Schlosser in Command and Control33 leave readers with vertigo knowing how many close calls there have been. To name but a few: the jettisoning of a Mark IV atomic bomb in British Columbia in 1950; the crash of a B-52 carrying two Mark 39 nuclear bombs in North Carolina; the Cuban Missile Crisis; the Able Archer 83 Exercise in Western Europe that the Soviets misread as the buildup to a nuclear strike against them; the Titan II Missile explosion in Damascus, Arkansas that narrowly avoided wiping the entire city off the map; and Stanislav Petrov’s decision not to trigger a retaliatory strike against the U.S. based on reports from the Soviet early warning satellite system of incoming ballistic missiles. It is not for nothing that Petrov is known as “the man who saved the world.”34

Thus, in the long run we must get to Nuclear Zero, but in the short run there are so many hurdles that few think we are anywhere near such a lofty goal. In two episodes of my Science Salon podcast, Fred Kaplan, the national security journalist and author of several books on nuclear weapons, and William J. Perry, Secretary of Defense under President Clinton and a staunch advocate for eliminating nuclear weapons, both told me that they did not think this could happen any time soon, even while their books outline how it could be done.35 In The Moral Arc I summarized the consensus by experts on the most important steps to take to reduce the risk of nuclear weapons and to work toward a world free of them, including: (1) enact a “no first use” policy; (2) take all weapons off “launch on warning”; (3) increase the warning and decision times for launching a retaliatory strike; (4) remove from the President the sole authority to launch nuclear weapons; (5) uphold non-proliferation agreements; (6) widen the taboo from using nuclear weapons to owning them; (7) increase economic interdependence; (8) expand democratic governance; (9) reduce spending on nuclear weapons; and (10) continue the disarmament of existing nuclear weapons. To that end, it is encouraging to see the decline in the total number of nuclear warheads to around 16,000 from the peak of around 70,000 in 1986, as visualized in the figure below.36

The decline in the total number of nuclear warheads to around 16,000 from the peak of around 70,000 in 1986.

I should note that some security scholars, along with many political theorists and leaders, think that the path to peace is more deterrence through more and better nuclear weapons. President Trump, for example, insists on renovating our aged nuclear weapons systems to the tune of $1.2 trillion between 2017 and 2046, an upgrade program37 he inherited from President Obama. And despite winning the Nobel Peace Prize for working toward nuclear nonproliferation, Obama nevertheless backed off from initiating a “no first use” policy under pressure from our NATO allies, who were worried that Russian saber rattling and border expansion might be encouraged if escalation from conventional to nuclear weapons were taken off the table.38

Similarly, the late political scientist Kenneth Waltz thought that allowing Iran to go nuclear would bring stability to the Middle East because “in no other region of the world does a lone, unchecked nuclear state exist. It is Israel’s nuclear arsenal, not Iran’s desire for one, that has contributed most to the current crisis. Power, after all, begs to be balanced.”39 Except for when it doesn’t, as in the post-1991 period after the collapse of the Soviet Union and the unipolar dominance of the United States. No other medium-size power rose to fill the vacuum, no rising power started wars of conquest to consolidate more power, and the only other candidate, China, has remained war-free for almost four decades. Given Iran’s outlier status in the international system and its avowed promise to “wipe Israel off the map,” anyone who would join a Fair-Play-for-Nuclear-Iran Committee has lost their moral compass.

This all just shows how difficult it is going to be to get to a world without nukes. Nevertheless, we have to try. One more statistic is sobering in this regard, as noted by the anti-nuclear scientist and activist David Barash: The U.S. has a triad of nuclear weapons: land (missiles), air (bombers) and sea (submarines). A single Trident sub carries 20 nuclear-tipped missiles, each one of which has eight independently targetable warheads of about 465 kilotons, or about 30 times the destructive power of Little Boy. So, one sub packs the equivalent of 4,800 Hiroshimas (20 x 8 x 30), and we have 18 Trident submarines, or the equivalent of 86,400 Hiroshimas!40 In the words of President Obama during a briefing about our nuclear capability: “Let’s stipulate that this is all insane.”41
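Barash’s arithmetic, spelled out as a back-of-the-envelope check (assuming Little Boy at roughly 15–16 kilotons, which is what the “about 30 times” figure implies):

$$\frac{465 \text{ kt}}{\sim 15.5 \text{ kt}} \approx 30, \qquad 20 \times 8 \times 30 = 4{,}800, \qquad 4{,}800 \times 18 = 86{,}400 \text{ Hiroshima-equivalents}$$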

The use of nuclear weapons for both ending wars and deterring them is a 20th century phenomenon that can be phased out for the new century. As the political scientist Christopher Fettweis notes in his book Dangerous Times?, despite the popularity of such intuitive notions as the “balance of power” — based on a small number of non-generalizable cases from the past that are in any case no longer applicable to the present — so-called “clashes of civilization” like the world wars of the 20th century are extremely unlikely to happen in the highly interdependent world of the 21st century. In fact, Fettweis shows, never in history has such a high percentage of the world’s population lived in peace. Conflicts of all forms have been steadily dropping since the early 1990s, and even terrorism can bring states together in international cooperation to combat a common enemy.42

The abolition of nuclear weapons is a complex and difficult puzzle that has been studied extensively by scholars and scientists for over half a century. The many problems and permutations of getting from here to there are legion, and there is no single sure-fire pathway to zero. Nevertheless, it is a soluble problem, and humans are nothing if not innovative problem solvers.43 I do not believe that the deterrence trap is one from which we can never extricate ourselves, and the remaining threats should direct us to work toward Nuclear Zero sooner rather than later. In the meantime, a nuclear minimum is the best we can hope for given the complexities of international relations, but given enough time, as Shakespeare poetically observed…

Time’s glory is to calm contending kings,
To unmask falsehood and bring truth to light,
To stamp the seal of time in aged things,
To wake the morn and sentinel the night, …
To slay the tiger that doth live by slaughter, …
To cheer the ploughman with increased crops,
And waste huge stones with little water-drops.44

About the Author

Dr. Michael Shermer is the Founding Publisher of Skeptic magazine, the host of the Science Salon Podcast, and a Presidential Fellow at Chapman University where he teaches Skepticism 101. For 18 years he was a monthly columnist for Scientific American. He is the author of New York Times bestsellers Why People Believe Weird Things and The Believing Brain, Why Darwin Matters, The Science of Good and Evil, The Moral Arc, and Heavens on Earth. His new book is Giving the Devil His Due: Reflections of a Scientific Humanist.

References
  1. Rhodes, Richard. 1986. The Making of the Atomic Bomb. New York: Simon & Schuster.
  2. The Atomic Bomb and the End of World War II, A Collection of Primary Sources. National Security Archive Electronic Briefing Book No. 162. George Washington University. https://bit.ly/2WMuDNM See also: “The Third Shot.” https://bit.ly/39eCTuW
  3. DeNooyer, Rushmore. 2015. The Bomb. PBS documentary. https://to.pbs.org/3f2yyfw
  4. Bird, Kai and Martin J. Sherwin. 2007. American Prometheus: The Triumph and Tragedy of J. Robert Oppenheimer. New York: Knopf, 332.
  5. Quoted in: Marty, Martin E. 1996. Modern American Religion, Vol 3: Under God, Indivisible, 1941–1960. Chicago: University of Chicago Press, 117.
  6. Chomsky, Noam. 1967. “The Responsibility of Intellectuals.” The New York Review of Books, 8(3).
  7. Goldhagen, Daniel Jonah. 2009. Worse Than War: Genocide, Eliminationism, and the Ongoing Assault on Humanity. New York: PublicAffairs, 1, 6.
  8. Lemkin, Raphael. 1946. “Genocide.” American Scholar, 15(2), 227–230.
  9. United Nations General Assembly Resolution 96(1): “The Crime of Genocide.”
  10. Katz, Steven T. 1994. The Holocaust in Historical Perspective, Vol. 1. New York: Oxford University Press.
  11. Kugler, Tadeusz, Kyung Kook Kang, Jacek Kugler, Marina Arbetman-Rabinowitz, and John Thomas. 2013. “Demographic and Economic Consequences of Conflict.” International Studies Quarterly, March, 57(1), 1–12.
  12. Shermer, Michael. 2015. The Moral Arc: How Science and Reason Lead Humanity to Truth, Justice, and Freedom. New York: Henry Holt, 11.
  13. In 2002 I attended the reunion of the Wren crew in my father’s stead and confirmed his memories.
  14. Toland, John. 1970. The Rising Sun: The Decline and Fall of the Japanese Empire 1936–1945. New York: Random House, 731.
  15. “The Cornerstone of Peace — Number of Names Inscribed.” Kyushu-Okinawa Summit 2000: Okinawa G8 Summit Host Preparation Council, 2000. See also: Pike, John. 2010. “Battle of Okinawa.” Globalsecurity.org; Manchester, William. 1987. “The Bloodiest Battle of All.” The New York Times, June 14.
  16. Giangreco, Dennis M. 2009. Hell to Pay: Operation Downfall and the Invasion of Japan 1945–1947. Annapolis, MD: Naval Institute Press, 121–124.
  17. Giangreco, Dennis M. 1998. “Transcript of ‘Operation Downfall [U.S. Invasion of Japan]: US Plans and Japanese Counter-Measures. Beyond Bushido: Recent Work in Japanese Military History. https://bit.ly/2ZYCLwu See also: Maddox, Robert James. 1995. “The Biggest Decision: Why We Had to Drop the Atomic Bomb.” American Heritage, 46(3).
  18. Skates, John Ray. 2000. The Invasion of Japan: Alternative to the Bomb. University of South Carolina Press, 79.
  19. Putnam, Frank W. 1998. “The Atomic Bomb Casualty Commission in Retrospect.” Proceedings of the National Academy of Sciences, May 12, 95(10), 5426–5431.
  20. D’Olier, Franklin (Ed.). 1946. United States Strategic Bombing Survey, Summary Report (Pacific War). Washington DC: United States Government Printing Office. https://bit.ly/32TsOSL
  21. Rhodes, 1986, op. cit., 599.
  22. Ibid.
  23. Quoted in: Cold War: MAD 1960–1972. 1998. BBC Two Documentary. Transcript: https://bit.ly/2EimnyA. Film: https://bit.ly/2WVLsFX
  24. Brodie, Bernard. 1946. The Absolute Weapon: Atomic Power and World Order. New York: Harcourt Brace, 79.
  25. Kubrick, Stanley. 1964. Dr. Strangelove or: How I Learned to Stop Worrying and Love the Bomb. Columbia Pictures. http://youtu.be/2yfXgu37iyI
  26. Ibid.
  27. Brown, Anthony Cave (Ed.). 1978. Dropshot: The American Plan for World War III Against Russia in 1957. New York: Dial Press; Richelson, Jeffrey. 1986. “Population Targeting and US Strategic Doctrine.” In Desmond Ball and Jeffrey Richelson (Eds.). Strategic Nuclear Targeting. Ithaca, NY: Cornell University Press, 234–249.
  28. McNamara, Robert S. 1969. “Report Before the Senate Armed Services Committee on the Fiscal year 1969-73 Defense Program, and 1969 Defense Budget, January 22, 1969.” Washington, DC: Government Printing Office, 11.
  29. For a scholarly analysis of and an alternative view to deterrence see: Kugler, Jacek. 1984. “Terror Without Deterrence: Reassessing the Role of Nuclear Weapons.” Journal of Conflict Resolution, 28(3), September, 470–506.
  30. Kant, Immanuel. 1795. “Perpetual Peace: A Philosophical Sketch.” In Perpetual Peace and Other Essays. Indianapolis: Hackett, I, 6.
  31. Pinker, Steven. 2011. The Better Angels of Our Nature: Why Violence Has Declined. New York: Penguin.
  32. Rhodes, Richard. 2010. Twilight of the Bombs: Recent Challenges, New Dangers, and the Prospects of a World Without Nuclear Weapons. New York: Knopf.
  33. Schlosser, Eric. 2013. Command and Control: Nuclear Weapons, the Damascus Accident, and the Illusion of Safety. New York: Penguin.
  34. See the documentary film of that title. Trailer: https://bit.ly/2CVRuPJ
  35. Science Salon podcast episode # 107 with Fred Kaplan and Science Salon podcast episode # 127 with William J. Perry were based on their new books: Kaplan, Fred. 2020. The Bomb: Presidents, Generals, and the Secret History of Nuclear War. New York: Simon & Schuster; Perry, William J. and Tom Z. Collina. 2020. The Button: The New Nuclear Arms Race and Presidential Power from Truman to Trump. BenBella Books.
  36. For a striking visual demonstration of every one of the 2,053 nuclear weapon explosions between 1945 and 1998 by the Japanese artist Isao Hashimoto, starting with the Trinity test in New Mexico, where in the world they happened and whom they were sponsored by, see: https://bit.ly/2D20l2o
  37. 2018. U.S. Nuclear Modernization Programs report. Arms Control Association. August. https://bit.ly/3fQv0yk
  38. The Nobel Prize committee’s statement on President Obama’s award: https://bit.ly/2WLWr4K Sonne, Paul, Gordon Lubold, and Carol E. Lee. 2016. “‘No First Use’ Nuclear Policy Proposal Assailed by U.S. Cabinet Officials, Allies.” Wall Street Journal, August 12. https://on.wsj.com/3hsKGYR
  39. Waltz, Kenneth N. 2012. “Why Iran Should Get the Bomb: Nuclear Balancing Would Mean Stability.” Foreign Affairs, July/August.
  40. Barash, David. 2018. “Deterrence and its Discontents.” Skeptic, Vol. 23, No. 2, https://bit.ly/2EikKkt
  41. Quoted in Kaplan, op. cit., 244.
  42. Fettweis, Christopher. 2010. Dangerous Times? The International Politics of Great Power Peace. Georgetown University Press.
  43. Lipton, Judith and David Barash. 2019. Strength Through Peace: How Demilitarization Led to Peace and Happiness in Costa Rica, and What the Rest of the World Can Learn From a Tiny, Tropical Nation. Oxford University Press.
  44. Shakespeare, William. 1594. The Rape of Lucrece. Available at: https://bit.ly/2ByB5k4

Why People Believe Conspiracy Theories

Posted on Jul. 17, 2020

What is a conspiracy, and how does it differ from a conspiracy theory? Michael Shermer explains who believes conspiracy theories and why they believe them in the following essay, derived from Lecture 1 of his 12-lecture Audible Original course titled “Conspiracies and Conspiracy Theories: What We Should Believe and Why.”

On Friday, March 15, 2019, a 28-year-old Australian man wielding five firearms stormed two mosques in Christchurch, New Zealand, and opened fire, killing 50 people and wounding dozens more. It was the worst mass public shooting in the history of that country, prompting Prime Minister Jacinda Ardern to reflect: “While the nation grapples with a form of grief and anger that we have not experienced before, we are seeking answers.”

One answer may be found in the shooter’s rambling 74-page manifesto titled The Great Replacement, apparently inspired by a book of the same title by the French author Renaud Camus. The Great Replacement is a right-wing conspiracy theory that claims that white Christian Europeans are being systematically replaced by people of non-European descent, most notably from North Africa, Sub-Saharan Africa, and the Arab Middle East, through immigration and higher birth rates.

The New Zealand killer’s name is Brenton Harrison Tarrant and his manifesto is filled with white supremacist tropes focused on this conspiracy theory, starting with his opening sentence “It’s the birthrates” repeated three times. “If there is one thing I want you to remember from these writings, it’s that the birthrates must change,” Tarrant insists. “Even if we were to deport all Non-Europeans from our lands tomorrow, the European people would still be spiraling into decay and eventual death.” Tarrant then cites the replacement fertility level of 2.06 births per woman, complaining that “not a single Western country, not a single white nation,” reaches this level. The result, he concludes, is “white genocide.”

This is classic 19th century blood-and-soil romanticism, and the self-described “Ethno-nationalist” Tarrant writes that he went on this murderous spree “to ensure the existence of our people and a future for white children, whilst preserving and exulting nature and the natural order.” His screed goes on and on like this, culminating in a photo collage of attractive white people and well-armed militia men.

It is reminiscent of the “Unite the Right” event in Charlottesville, Virginia, in August of 2017, when white supremacists shouted slogans like “blood and soil” and “Jews will not replace us.” Given that there are only about 15 million Jews in the world, Judaism employs no missionary effort at conversion, and birthrates among Jewish families are among the lowest in the world, why would any group worry about being “replaced” by them? They’re not being replaced, of course; the slogan echoes the conspiracy theory that Jews control the media, politics, banking and finance, and even the world economy.

In his manifesto Tarrant references the number 14, or the fourteen-word slogan originally coined by the white supremacist David Lane while in federal prison for his role in the 1984 murder of the Jewish radio talk show host Alan Berg. Here are the 14 words:

“We must secure the existence of our people and a future for white children.”

The number is sometimes rendered as 14/88, with the 8s representing the eighth letter of the alphabet — H — and 88 or HH standing for Heil Hitler. Lane, in fact, was inspired by Adolf Hitler’s conspiracy-theory-laden book Mein Kampf, in which the Nazi leader rants:

What we must fight for is to safeguard the existence and reproduction of our race and our people, the sustenance of our children and the purity of our blood, the freedom and independence of the fatherland, so that our people may mature for the fulfillment of the mission allotted it by the creator of the universe.

Hitler goes on to identify the enemy of his mission—the Jews—which reflects another conspiracy theory called the “stab in the back,” popular in Germany in the 1920s and 1930s. According to this theory, the only reason the Germans lost World War I was that they were stabbed in the back by the “November Criminals” (the Armistice was signed on November 11, 1918), whom the Nazis insisted were Jews, Marxists, and Bolsheviks.

And this “stab in the back” conspiracy theory itself derives from an earlier and larger conspiracy theory involving The Protocols of the Learned Elders of Zion, a hoaxed document purporting to be the proceedings of a secret meeting of Jews plotting global domination. A number of prominent people at the time believed the Protocols hoax, including the American industrialist Henry Ford, who published his own conspiratorial tract titled The International Jew: The World’s Foremost Problem. He later recanted and withdrew the book from circulation when he found out the conspiracy theory was a fake.

What all this shows is the power of conspiratorial belief to motivate people to act, including murderous action, from killing dozens in New Zealand to murdering millions in the Holocaust.


Conspiracy theories are as countless as they are confusing. I once met a politician who told me that he believes the fluoridation of water is the greatest scam ever perpetrated on the public. I have been confronted by 9/11 “truthers” who have insisted the al-Qaeda attack was actually an “inside job” by the Bush administration. Others have regaled me for hours with their breathless tales of who really killed JFK, RFK, MLK Jr., Jimmy Hoffa, or Princess Diana, along with the nefarious goings-on of the Federal Reserve, the New World Order, the Trilateral Commission, the Council on Foreign Relations, the Committee of 300, the Knights Templar, the Freemasons, the Illuminati, the Bilderberg Group, the Rothschilds, the Rockefellers, and the Zionist Occupation Government (ZOG) that secretly runs the United States. It would take Madison Square Garden to hold all the conspiracists plotting world domination.

What is a conspiracy, and how does it differ from a conspiracy theory?

I define a conspiracy as two or more people plotting or acting in secret to gain an advantage or to harm others immorally or illegally. I distinguish a conspiracy from a conspiracy theory, which I define as a structured belief about a conspiracy, whether it is real or not. A conspiracy theorist, or conspiracist, is someone who holds a conspiracy theory about a possible conspiracy, again whether or not it is real.

Although the terms “conspiracy theory,” “conspiracy theorist,” and “conspiracist” do sometimes carry pejorative connotations meant to disparage someone or their beliefs—as in “that’s just a crazy conspiracy theory” or “he’s one of those nutty conspiracists”—the terms themselves have a rich history and were not originally meant to disparage.

Who believes in such conspiracies? Surveys by the political scientists and conspiracy researchers Joseph Uscinski and Joseph Parent show that conspiracists “cut across gender, age, race, income, political affiliation, educational level, and occupational status.” For example, both liberals and conservatives believe in conspiracies at roughly the same level, although each thinks different secret cabals are at work, with liberals more likely to suspect that media sources and political parties are pawns of rich capitalists and corporations, while conservatives are more likely to believe that academics and liberal elites control these same institutions.

There are other factors at work as well. Race, for example, is not a predictor of overall conspiracism, but it does partially determine which conspiracy theories are likely to be embraced. African Americans are more likely to believe that the federal government invented AIDS to kill Blacks and that the CIA planted crack cocaine in inner-city neighborhoods to ruin them. By contrast, white Americans are more likely to suspect the Feds are conspiring to abolish the Second Amendment and convert the nation into a socialist commune.

Education appears to attenuate conspiracy thinking, with 42 percent of those without a high school diploma scoring high in conspiratorial predispositions compared to those with postgraduate degrees, who come in at 22 percent. Nevertheless, that one in five Americans with postgraduate degrees believe in conspiracies tells us something else is going on here.

In my 2011 book The Believing Brain I suggested that two cognitive processes are at work in conspiracy thinking: (1) patternicity, or the tendency to find meaningful patterns in both meaningful and meaningless noise, and (2) agenticity, or the tendency to infuse patterns with meaning, intention, and agency. I will explore these concepts in more depth in another lecture, but the idea is that the pattern-detection filters of conspiracists are wide open, thereby letting in any and all patterns as real with little or no screening of potential false patterns.

Conspiracy theorists connect the dots of random events into meaningful patterns, infuse those patterns with intentional agency, and then conclude that these intentional agents control the world, sometimes invisibly from the top down, rather than recognizing the bottom-up causal randomness that determines much of what happens in our world.

To these factors we can add three cognitive biases that often distort events and evidence to fit our preconceived conspiratorial conceptions. For example, the confirmation bias is the tendency to seek and find confirming evidence in support of already existing beliefs, and to ignore or reinterpret disconfirming evidence. Once you have decided that a conspiracy theory is true, your brain sets out to find evidence to support it and filter out evidence that doesn’t.

Another is the hindsight bias, in which we tailor after-the-fact explanations to what has already happened. Once an event has occurred, we look back and reconstruct how it happened, why it had to happen that way and not some other way, and why we should have seen it coming all along, the very essence of conspiracism.

Then there’s cognitive dissonance, the phenomenon of mental tension created when someone holds two conflicting thoughts simultaneously, such as what happens when conspiracy theories about the end of the world don’t come true—instead of admitting their mistake believers double down on their belief and rationalize the failures, all in an attempt to reduce dissonance.

Anxiety, alienation, and feelings of rejection are also factors in conspiratorial cognition. In 2017, for example, Princeton University researchers had subjects write a brief description of themselves that they then shared with two other people in their small group, with the understanding that the other group members would judge them. Subjects who were subsequently told that the group had rejected them were more inclined to believe in conspiracy-related scenarios.

And it’s not just private anxieties. Cultural anxiety may also lead to conspiracy thinking. A 2018 survey of over 3,000 Americans, for example, found that those who reported feeling that American values are eroding were more likely to agree with conspiratorial statements, such as “many major events have behind them the actions of a small group of influential people.”

Feeling in control or powerful in your environment reduces anxiety, but the opposite—concern about what may be out of your control—can increase anxiety and conspiratorial paranoia about things that could go wrong. In a 2015 study conducted in the Netherlands, for example, researchers divided subjects into three groups:

  1. those primed to feel powerless and out of control,
  2. those primed to feel in control and powerful, and
  3. a control group not primed for anything.

The subjects were then told about a construction project undergoing problems that could be related to a conspiracy by the city council to steal money from the project’s budget. Subjects primed to feel powerless and out of control were more likely to believe the conspiracy theory. Researchers have also found that conspiratorial speculation runs higher after natural disasters like earthquakes, or when people fear that they may lose their job.


There is another reason why people believe in conspiracy theories that researchers have largely neglected: a lot of them are true. Enough conspiracies are real that it pays to be constructively paranoid because sometimes “they” really are out to get us.

If we take the Oxford English Dictionary’s definition of a conspiracy theory as “a belief that some covert but influential agency (typically political in motivation and oppressive in intent) is responsible for an unexplained event,” then even a cursory review of history reveals that conspiracies have dramatically influenced the course of history and may still be found at work in modern societies. Consider some examples.

Julius Caesar was stabbed to death by a conspiracy of Roman senators on the Ides of March in 44 B.C.E.

The Gunpowder Plot of 1605 saw a group of provincial English Catholics attempt to assassinate King James I by blowing up the House of Lords during the State Opening of Parliament. The plot was discovered and thwarted days before, and the conspirators were caught, tried, convicted, hanged, drawn, and quartered.

In 1776 an elite group of soldiers was assigned to be George Washington’s bodyguard, some of whom were plotting to assassinate the future first president of the United States at the behest of the governor of New York and the mayor of New York City. The plot was foiled thanks only to the plotters’ inability to keep a secret.

Abraham Lincoln was assassinated by a conspiracy of Southerners angered by the outcome of the Civil War, which itself was instigated by a Southern cabal to illegally secede from the United States—arguably the biggest conspiracy in U.S. history.

World War I exploded after a Serbian separatist secret society called the Black Hand conspired to assassinate the Austrian archduke Franz Ferdinand, leading to an arms race that erupted in the guns of August and the start of a conflict that resulted in the deaths of millions.

The Japanese sneak attack on Pearl Harbor was, by definition, a conspiracy that the U.S. military and intelligence agencies failed to detect, leading to conspiracy theories that President Roosevelt let it happen on purpose to drag America into war.

The obsessively paranoid Joseph Stalin wasn’t conspiratorially minded enough to realize that Hitler was plotting to break their nonaggression pact and invade the Soviet Union, despite warnings from the British government to that effect. The consequence was the deaths of tens of millions of soldiers and civilians.

In the 1950s, with his now-infamous Congressional hearings, the conspiratorially minded Senator Joseph McCarthy launched a witch hunt to ferret out what he claimed was a Communist conspiracy to destroy America.

In the 1960s, Operation Northwoods was a document produced under the Kennedy administration that proposed a number of “false flag” operations that might be carried out to justify military intervention in Cuba. Among the proposals were staging a fake attack on the U.S. military base at Guantanamo Bay, employing a fake Russian MiG aircraft to buzz a real U.S. civilian airliner, faking an attack on a U.S. ship to make it look like Cubans did it, and developing “a Communist Cuban terror campaign in Miami.” None of these crazy ideas was implemented, but that members of Kennedy’s administration considered them—even in the context of a meeting of people just spitballing ideas willy-nilly—reveals the lengths to which even high-ranking people in the government are willing to conspire against others to get their way.

In the 1970s, Watergate stands out as a conspiracy of dunces, and the Pentagon Papers revealed the extent to which the Kennedy, Johnson, and Nixon administrations conspired to escalate the Vietnam War without Congressional knowledge, much less approval. And we now know that Kennedy conspired to have Fidel Castro assassinated, Johnson conspired to cover up that fact when he took office, and Nixon secretly recorded conversations in the Oval Office that revealed his distinctive view of presidential power—a view that he later summarized in an interview with David Frost as follows: “Well, when the president does it, that means that it is not illegal.”

In the 1980s, the Iran-Contra arms-for-hostages scandal was a conspiracy that embodied what conspiracists since World War I had worried about: the usurpation of power by conspirators who were legally elected to their positions, rather than by plotters hijacking government agencies through a coup, as was common in centuries past.

In the 1990s, government overreach against Randy Weaver and his family in Ruby Ridge, Idaho, and against David Koresh and the Branch Davidians in Waco, Texas, understandably led to the rise of the conspiratorially minded militia movement that culminated in Timothy McVeigh’s bombing of a federal building in Oklahoma City.

In the 2000s, the George W. Bush administration concocted a conspiracy theory that Iraq was developing weapons of mass destruction as a justification for invading that country, which proved false when inspectors failed to find any WMDs. And Edward Snowden’s leaks revealed the extent to which the NSA and other government agencies conspired to spy on Americans and foreign leaders in the wake of 9/11. As Buffalo Springfield cautioned in their 1966 hit song “For What It’s Worth”: “There’s something happening here. What it is ain’t exactly clear.”

In the 2010s, as if the run-up to the 2016 Presidential election wasn’t crazy enough, in the middle of it, and continuing after Trump’s victory, there emerged a bizarre conspiracy theory at Trump rallies where some of his supporters held signs reading “Q” and “QAnon.” It apparently began with an internet user called “Q Clearance Patriot,” or “Q,” who posted on internet message boards like 4chan and 8chan the conspiracy theory that inside the “deep state” there is an “anonymous” source working against the Trump administration. “I can hint and point but cannot give too many highly classified data points,” the Q conspiracist wrote, adding: “These are crumbs and you cannot imagine the full and complete picture.” That complete picture apparently includes such operatives as Hillary Clinton, Barack Obama, George Soros, and various Hollywood celebrities, all alleged to be involved in a global sex trade and pedophile ring.

The “Qincidences” (spelled with a Q) include the recurrence of certain numbers, such as 17 (Q is the 17th letter) and 4, 10, and 20, corresponding to DJT, or Donald J. Trump. And since “there are no coincidences” in the mind of the conspiracist, such numerology led to the absurd 2016 “Qonspiracy theory” (also spelled with a Q) of “Pizzagate.” Promulgators of this theory asserted—without any evidence and beyond belief—that Hillary Clinton was directing a pedophile ring out of a pizza parlor. As absurd as this sounds, the Pizzagate conspiracy theory led a young man to shoot up a restaurant with an AR-15-style rifle, claiming he intended to break up the perceived perversion. It was fortunate that no one was hurt in the incident, but it revealed the power of conspiratorial paranoia.
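The letter-position arithmetic behind these Qincidences is trivial to check; here is a minimal sketch in Python (our own illustration, not anything the conspiracists published) that verifies the numbers just cited:

    # Letter-position "code": A=1, B=2, ..., Z=26.
    def letter_positions(s):
        return [ord(c) - ord("A") + 1 for c in s.upper() if c.isalpha()]

    print(letter_positions("Q"))    # [17] -- Q is the 17th letter
    print(letter_positions("DJT"))  # [4, 10, 20] -- Donald J. Trump's initials
    print(letter_positions("HH"))   # [8, 8] -- the "88" of white-supremacist code

The point, of course, is that any short string maps to some set of numbers, so with enough strings in circulation such “coincidences” are guaranteed.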

As we approach the 2020s, a new type of conspiracism has been identified by the political scientists Russell Muirhead and Nancy L. Rosenblum in their 2019 book, A Lot of People Are Saying: The New Conspiracism and the Assault on Democracy. Classic conspiracy theories at least traffic in arguments and evidence, however dubious, whereas the new conspiracy theories are simply asserted, usually without any facts to support them. This new conspiracism is captured in the book’s title, ripped from the 2016 presidential election and Donald Trump’s recurrent phrase “a lot of people are saying,” which was typically followed by no evidence whatsoever for the assertion. As Muirhead and Rosenblum explain the new conspiracism:

There is no punctilious demand for proofs, no exhausting amassing of evidence, no dots revealed to form a pattern, no close examination of the operators plotting in the shadows. The new conspiracism dispenses with the burden of explanation. Instead, we have innuendo and verbal gesture: “A lot of people are saying …” Or we have bare assertion: “Rigged!” … This is conspiracy theory without the theory.

How then does such conspiracism spread and catch hold? Repetition. In the age of social media, what counts is not evidence so much as retweets, re-posts, and likes. And by no means is the new conspiracism the product only of President Trump, given that politicians—not to mention economists, scholars, pundits, and ideologues of all stripes—have been making evidence-free assertions for generations, although admittedly without the audience of 60 million Twitter followers that the current conspiracist-in-chief commands.

More importantly, Trump’s conspiratorial assertions would go nowhere without a receptive audience, so the blame for the nefarious effects of the new conspiracism have to be spread much wider to encompass all of social media, alternative media, and even some mainstream media, which have stepped up their sensationalistic headlines in an effort to recapture advertising dollars they’ve been losing since the rise of the Internet.


Individuals act on their beliefs, and when those beliefs contain conspiracy theories about nefarious goings-on, those acts can turn deadly. That is exactly what happened at the Tree of Life synagogue in Pittsburgh on October 27, 2018, when an assailant armed with guns and one of the oldest conspiracy theories of all, that Jews run the world, murdered eleven congregants before his capture. “I just want to kill Jews,” he proclaimed.

Consuming content on the online social network Gab, the conspiracist grew paranoid about the Hebrew Immigrant Aid Society (HIAS), which the Tree of Life synagogue helped support. On Gab the conspiracist read that HIAS provided aid to the migrant caravans moving north from Central America toward the United States’ southern border. “HIAS likes to bring invaders in that kill our people,” the killer posted on Gab just before he committed the mass murder, adding, “I can’t sit by and watch my people get slaughtered. Screw your optics, I’m going in.”

This brings us back to the mass murder in New Zealand with which we began this lecture. These are just two of countless conspiracy theories with real-world consequences, mostly bad. Ideas matter. Beliefs matter. Conspiracy theories matter. And they are not confined to the fringes of pop culture or the dark web, but instead penetrate all areas of public and private life, often directing the lives of people and the course of history.

So as we analyze examples like these in the lectures ahead, I hope you’ll reach the same conclusion that I’ve reached: The subject of this course—conspiracies and conspiracy theories— could well be one of the most important subjects any of us can study. END

About the Author

Dr. Michael Shermer is the Founding Publisher of Skeptic magazine, the host of the Science Salon Podcast, and a Presidential Fellow at Chapman University where he teaches Skepticism 101. For 18 years he was a monthly columnist for Scientific American. He is the author of New York Times bestsellers Why People Believe Weird Things and The Believing Brain, Why Darwin Matters, The Science of Good and Evil, The Moral Arc, and Heavens on Earth. His new book is Giving the Devil His Due: Reflections of a Scientific Humanist.


The Moral Arc: How Thinking Like a Scientist Makes the World More Moral

Posted on Jul. 03, 2020 by | Comments Off on The Moral Arc: How Thinking Like a Scientist Makes the World More Moral

In this, the final lecture of his Chapman University Skepticism 101 course, Dr. Michael Shermer pulls back to take a bigger-picture look at what science and reason have done for humanity in the realm of moral progress. That is, applying the methods of science and the principles of reason since the Scientific Revolution in the 17th century has solved problems not only in the physical and biological/medical fields, but in the social and moral realms as well. How should we structure societies so that more people flourish in more places more of the time? Science can answer that question, and it has for centuries. Learning how to think like a scientist can make the world a better place, as Dr. Shermer explains in this lecture based on his 2015 book, The Moral Arc.

Shermer’s Chapman University course, Skepticism 101: How to Think Like a Scientist, covers a wide range of topics, from critical thinking, reasoning, rationality, free speech, cognitive biases and how thinking goes wrong, and the scientific methods, to actual claims and whether or not there is any truth to them, e.g., ESP, ETIs, UFOs, astrology, channelling, the Bermuda Triangle, psychics, evolution, creationism, Holocaust denial, and especially conspiracy theories and how to think about them.

MISSED A PREVIOUS LECTURE?

Watch the entire 15-lecture Chapman University Skepticism 101 series for free!



What is Truth, Anyway?

Posted on Jun. 26, 2020 by | Comments Off on What is Truth, Anyway?

In this lecture Dr. Michael Shermer addresses one of the deepest questions of all: what is truth? How do we know what is true, untrue, or uncertain? Given that none of us are omniscient, all claims to knowledge carry a certain level of uncertainty. Given that fact, how can we determine what is true? Included: subjective/internal vs. objective/external truths, Hume’s theory of causality, correlation and causation, the principle of proportionality (or why extraordinary claims require extraordinary evidence), how to think about miracles and the resurrection, mysterian mysteries, post-truth, rational irrationalities, the man who saved the world, Bayesian reasoning, and why love depends on evidence.

Skepticism 101: How to Think Like a Scientist covers a wide range of topics, from critical thinking, reasoning, rationality, cognitive biases and how thinking goes wrong, and the scientific methods, to actual claims and whether or not there is any truth to them, e.g., ESP, ETIs, UFOs, astrology, channelling, psychics, creationism, Holocaust denial, and especially conspiracy theories and how to think about them.

If you missed Dr. Shermer’s previous Skepticism 101 lectures, watch them now.


The Truth About Post-Truth Truthiness

Posted on Jun. 25, 2020 by | Comments (3)

Is post-truth the political subordination of reality? Is truth itself any more under threat today than in the past? Have the populists & postmodernists won the day? In response to Dr. Lee McIntyre’s essay, Dr. Michael Shermer asserts that people are not nearly as gullible as some believe.

Words embody ideas, and their changing usage and meaning are tracked by lexicographers in dictionaries, which thereby become barometers of cultural trends. In 2006, for example, the American Dialect Society and Merriam-Webster’s both chose as their word of the year the neologism “truthiness,” introduced by the comedian Stephen Colbert on the premiere episode of his satirical mock news show The Colbert Report (on which I appeared twice1), meaning “the truth we want to exist.”2 It was a prescient comedic bit: a decade later, three examples of truthiness entered our lexicon.

After Donald Trump’s Presidential inauguration on January 20, 2017, his special counselor Kellyanne Conway concocted the term “alternative facts” during a Meet the Press interview while defending White House Press Secretary Sean Spicer’s inaccurate statement about the size of the crowd that day. “Our press secretary, Sean Spicer, gave alternative facts to that [the inaugural crowd size], but the point remains that….” at which time NBC correspondent Chuck Todd cut her off: “Wait a minute. Alternative facts? … Alternative facts are not facts. They’re falsehoods.”3 German linguists deemed it the “un-word of the year” (Unwort des Jahres) for 2017. Later that year the related term “fake news” became common parlance, leaping in usage 365 percent and landing it on the “word of the year shortlist” of Collins Dictionary, which defined it as “false, often sensational, information disseminated under the guise of news reporting.”4

Such words (or un-words) are often invoked as evidence that we are living in a “post-truth” era brought on by Donald Trump (according to liberals) or by postmodernism (according to conservatives). Are we living in a post-truth world of truthiness, fake news, and alternative facts? Have the populists and postmodernists won the day? Is all the political, economic, and social progress we have achieved over the past several centuries in reversal—the abolition of slavery and torture, the decline of homicide, crime, and violence, the cessation of the European Great Powers wars, and the expansion of the moral sphere to include civil rights, women’s rights, children’s rights, workers’ rights, and gay rights for more people in more places more of the time? Are we lurching backwards to the Middle Ages when bigots lighted faggots to torch women as witches?


No. The Fall 2019 cover story of Skeptic by the Harvard psychologist Steven Pinker, “Why We Are Not Living in a Post-Truth Era,” explains why, starting with this question: Is the statement “We are living in a post-truth era”…true? If it is, then it isn’t! That is, if you argue that the statement is true then you are making an argument, which means you are committed to determining whether the statement is true or false, which means we have not passed into a post-truth world. Similarly, is the statement “humans are irrational” rational? If it is, then it can’t be because, as Pinker asks rhetorically, “If humans were truly irrational, who specified the benchmark of rationality against which humans don’t measure up?”5 As Pinker reflected in his 2018 book Enlightenment Now: The Case for Reason, Science, Humanism, and Progress, “Mendacity, truth-shading, conspiracy theories, extraordinary popular delusions, and the madness of crowds are as old as our species, but so is the conviction that some ideas are right and others are wrong.”6

In this issue of Skeptic the philosopher Lee McIntyre, author of the book Post-Truth,7 challenges Pinker, starting with a definition of post-truth as the “political subordination of reality,” which he deems “a tactic in the authoritarian toolbox.” McIntyre’s definition is much narrower than the way Pinker and I use the term, confining it as he does to political propaganda, which he says “is not meant to convince you, but to show you who’s boss.” The message, he says, referencing Jason Stanley’s book How Propaganda Works,8 is “I am so powerful that I can dominate your reality, and there is nothing you can do about it.” To reinforce the political nature of post-truth, McIntyre also invokes Timothy Snyder’s observation in his 2017 book On Tyranny that “post-truth is pre-fascism,”9 along with Hannah Arendt’s observation that “the ideal subject of totalitarian rule is not the convinced Nazi or the convinced communist, but people for whom the distinction between fact and fiction (i.e., the reality of experience) and the distinction between true and false (i.e., the standards of thought) no longer exist.”10

Post-truth as political propaganda is certainly one use (or misuse) of truth that neither Pinker nor I discount, but McIntyre then accuses Pinker (and others) of merely knocking down one or more of four post-truth straw men: (1) that truth doesn’t matter, (2) that no one really cares about truth anymore, (3) that no one can find the truth, and (4) that if we were actually living in a post-truth era, we should just give up. Instead, to steel-man the problem, McIntyre asserts that “the claim that we live in a post-truth era is properly based on the idea that truth today is under threat.”

Is it? There certainly are people who, pace Hannah Arendt, cannot seem to distinguish between fact and fiction, true and false, and this shortcoming can lead not only to fascism or communism, but also to Holocaust denial, evolution denial, climate denial, vaccine denial, GMO denial, and more. But is it really that people cannot discern reality, or is it that they are motivated to spin the facts to support some other agenda? Holocaust deniers are anti-Semites. Evolution deniers are religious fundamentalists. Climate deniers mistrust big government. Vaccine deniers distrust Big Pharma. GMO deniers detest Monsanto. It isn’t the truth of the facts that is really in dispute; it is the underlying motive. Consider an interview reprinted in McIntyre’s book, which he presents as a type specimen of post-truth, in which CNN’s Alisyn Camerota engages the former Republican Speaker of the House Newt Gingrich on crime rates. The exchange is revealing:11

Camerota: Violent crime is down. The economy is ticking up.

Gingrich: It is not down in the biggest cities.

Camerota: Violent crime, murder rate is down. It is down.

Gingrich: Then how come it’s up in Chicago and up in Baltimore and up in Washington?

Camerota: There are pockets where certainly we are not tackling murder.

Gingrich: Your national capital, your third biggest city…

Camerota: But violent crime across the country is down.

Gingrich: The average American, I will bet you this morning, does not think crime is down, does not think they are safer.

Camerota: But it is. We are safer and it is down.

Gingrich: No, that’s just your view.

Camerota: It is a fact. These are the national FBI facts.

Gingrich: But what I said is also a fact. … The current view is that liberals have a whole set of statistics that theoretically may be right, but it’s not where human beings are. People are frightened.

Camerota: But what you’re saying is, but hold on Mr. Speaker because you’re saying liberals use these numbers, they use this sort of magic math. These are the FBI statistics. They’re not a liberal organization. They’re a crime-fighting organization.

Gingrich: No, but what I said is equally true. People feel more threatened.

Camerota: Feel it, yes. They feel it, but the facts don’t support it.

Gingrich: As a political candidate, I’ll go with how people feel and let you go with the theoreticians.

As I read it, this isn’t the post-truth equivalent of, as McIntyre describes it, a “chilling exchange in the basement of the Ministry of Love in the pages of George Orwell’s dystopian novel 1984.” Camerota and Gingrich are simply talking about two different matters: crime rates and people’s perceptions of crime rates. The difference represents a cognitive illusion due to the availability bias, in which one assesses a problem based on the most immediate and salient available example, usually from the evening news that features individual crimes, especially homicides. Camerota is a journalist focusing on the long-term decline of crime, whereas Gingrich is a politician trying to garner support by appealing to people’s fears about crime, citing the equally true statistics about recent upticks in crime in a handful of U.S. cities, most notably Chicago, Baltimore, and Washington D.C., which Camerota acknowledges. Both facts are true, so this is not an example of recent post-truthiness but of good old-fashioned spin-doctoring, which has been around at least since the 1940s, when George Orwell noted: “Political language—and with variations this is true of all political parties, from Conservatives to Anarchists—is designed to make lies sound truthful and murder respectable, and to give an appearance of solidity to pure wind.”12 The problem can be traced back even further, as when Edmund Burke commented on the language surrounding the French Revolution:

The whole compass of the language is tried to find sinonimies [synonyms] and circumlocutions for massacres and murder. Things are never called by their common names. Massacre is sometimes called agitation, sometimes effervescence, sometimes excess; sometimes too continued an exercise of a revolutionary power.13

McIntyre says that the purpose of post-truth “is political, not epistemological,” but is not the former subsumed in the latter? It’s all epistemological, inasmuch as everything turns on what constitutes reliable knowledge, and that is a very old problem indeed.

Even the concept of post-truth is not new. The Oxford Dictionaries has tracked its use back to 1992, the year we founded Skeptic magazine, the early years of which were devoted to the “science wars,” which were fought over the nature of truth and whether or not science was the royal road to it. Many thought not, coming to believe that there is no objective reality to be discovered and no belief, idea, hypothesis, or theory that is closer to the truth than any other. In his 1996 Skeptic article “More Higher Superstitions,” Norman Levitt (coauthor of the book Higher Superstition14) describes the problem in language that could have been written in 2019:

Science studies…overlaps what is nowadays called cultural studies, a tendency that has effaced traditional scholarship in a number of areas, and it has absorbed many of the radically relativistic attitudes that predominate in postmodern cultural anthropology. The central doctrine of science studies is that science is “socially constructed” in a way that disallows traditional notions of scientific validity and objectivity. On this view, scientific theories are merely narratives peculiar to this culture and this point in its history. Their chief function is to create stories about the world consonant with dominant social and political values. Thus, they are no more “true,” or even more reliable, than the myths, legends, and just-so stories of other cultures. All are equally culture-specific.15

Post-truth claims were just as prominent in the 1990s as they are now, and no less criticized, even parodied. Recall that this was the decade of the wildly popular television series The X-Files, a conspiracy-laden mosh pit of aliens and UFOs, monsters and demons, mutants and shape-shifters, urban legends and government cover-ups, and all manner of paranormal piffle. So trendy was the show that The Simpsons caricatured it with an episode titled “The Springfield Files,” in which Homer has a close encounter of the third kind after downing ten bottles of beer. X-Files stars Gillian Anderson and David Duchovny (Scully and Mulder) guest star as investigators of the alien abduction, and Leonard Nimoy, host of the 1970s more-or-less nonfiction version of The X-Files called In Search of…, voiced the introduction, announcing: “The following tale of alien encounters is true. And by true, I mean false. It’s all lies. But they’re entertaining lies, and in the end isn’t that the real truth? The answer is no.”

So post-truthiness is not new, but the availability bias dialed up to eleven through social media led the Oxford Dictionaries to name “post-truth” as its word of the year in 2016 after it documented a 2000 percent spike in usage over the previous year, characterizing it as “relating to or denoting circumstances in which objective facts are less influential in shaping public opinion than appeals to emotion and personal belief.” As the dictionaries’ editor Casper Grathwohl noted: “We first saw the frequency really spike this year in June with buzz over the Brexit vote and again in July when Donald Trump secured the Republican presidential nomination. Given that usage of the term hasn’t shown any signs of slowing down, I wouldn’t be surprised if post-truth becomes one of the defining words of our time.”16 Even the veteran CBS anchorman and 60 Minutes correspondent Scott Pelley succumbed to the temptation to think we’re living in a post-truth era. On the final day of 2019, reflecting on the message of his new book Truth Worth Telling, he summed up what has happened to truth in the decade of the 2010s:

This is the thing that worries me the most about our beloved country. We have gone from the information age to the disinformation age. I think our viewers and our readers now have a responsibility that they’ve never had before, and that is that they have to be careful about how they choose their information diet. This is going to be a problem for the rest of our history, in particular for democracies.17

Not only is post-truthiness not new, but the response to challenges to objective knowledge is as robust today as in the past, if not more so, having moved far beyond the pages of niche magazines like Skeptic and Scientific American, which defend science, reason, empiricism, and fact checking; it is now routinely addressed in national news magazines and newspapers. Despite President Trump’s constant reference to the “failing New York Times” in his Twitter feed,18 for example, the circulation of the Grey Lady has skyrocketed since Trump was elected. In the 4th quarter of 2018 alone, the New York Times added 265,000 digital subscriptions, turned a profit of $55.2 million, and saw its newsroom staff grow to 1,600 people, the largest number in its 167-year history.19

Today, as dictionaries track the upswing in post-truth language, and as political pundits pronounce the end of truth and with it the Republic (if you can keep it), the Internet of ideas has responded with a tool to combat the illiberalism of unreason: real-time fact-checking. As politicians engage in the old-time art of spin-doctoring the truth in speeches, fact-checkers at OpenSecrets.org, Snopes.com, FactCheck.org, and PolitiFact.com tally their errors and lies, the latter cheekily ranking statements as True, Mostly True, Half True, Mostly False, and Pants on Fire. As PolitiFact’s editor Angie Holan explained: “journalists regularly tell me their media organizations have started highlighting fact-checking in their reporting because so many people click on fact-checking stories after a debate or high-profile news event.”20

Finally, the idea that post-truthiness could invade the brains of gullible citizens is gainsaid by new research by cognitive scientists that demonstrates that people are not nearly as gullible as we’ve been led to believe. A 2020 book by the cognitive scientist Hugo Mercier, Not Born Yesterday, presents a mountain of evidence “against the idea that humans are gullible, that they are ‘wired not to seek truth’ and ‘overly deferential to authority’, and that they ‘cower before uniform opinion’,” quoting Jason Brennan in his book Against Democracy. In fact, Mercier reveals through both laboratory research and real-world examples that “far from being gullible, we are endowed with a suite of cognitive mechanisms that evaluate what we hear or read.” And far from defaulting to believing everything we hear, Mercier notes that “by default we veer on the side of being resistant to new ideas. In the absence of the right cues, we reject messages that don’t fit with our preconceived views or preexisting plans. To persuade us otherwise takes long-established, carefully maintained trust, clearly demonstrated expertise, and sound arguments.”21

Mercier begins by showing why evolution could not have created animals so gullible as to be routinely exploited by others: that would ultimately lead to reproductive failure and the extinction of extreme gullibility. The balance in communication between belief and skepticism led to an evolutionary arms race between deception and deception detection, along with cognitive mechanisms “that help us decide how much weight to put on what we hear or read.” Mass persuasion, for example, is extremely difficult to pull off, and most attempts at it fail miserably, because when scaled up from two-person communication to large audiences, trust cues do not scale up accordingly. Most preachers, prophets, and demagogues fail, but because of the availability bias we remember only the biggest names in the genre, such as Jesus and Hitler. But even these examples fail upon further inspection. In his own time Jesus was a disappointment at starting a new religion (which might not have been his mission in any case), and even the apostle Paul barely got Christianity rolling. It wasn’t until the 4th century that Christianity began to number in the millions, which sounds impressive until we consider the power of compound interest, in which small but steady growth can yield an enormous figure given enough time. Invest $1 at a constant yearly interest rate of 1% in the year 0 and reinvest the dividends, and by the year 2020 the investment would be worth more than $500 million. Mercier cites statistics compiled by the sociologist of religion Rodney Stark, who estimates Christianity’s growth rate at about 3.5% per year over the centuries. If each Christian saves only a few souls in a lifetime, the religion could easily amass tens of millions of adherents in a matter of a few centuries, and over two billion by today. Perhaps this is why the dismal conversion rate of, say, Mormon missionaries on their two-year missions is not a concern for the church: the success rate can be small if the process goes on long enough (coupled with high birth rates, of course, which most religions encourage).
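As a back-of-the-envelope check on that compound-growth arithmetic, here is a minimal sketch in Python; the 1% rate comes from the example above, while the starting figure of roughly 1,000 Christians in 40 C.E. is an illustrative assumption drawn from Stark’s model, paired with the 3.5% annual growth rate Mercier cites:

    # Compound growth: principal * (1 + rate) ** years.
    def compound(principal, annual_rate, years):
        return principal * (1 + annual_rate) ** years

    # $1 at a constant 1% per year, dividends reinvested, year 0 to year 2020.
    print(f"${compound(1, 0.01, 2020):,.0f}")      # ~$536,000,000

    # Assumed ~1,000 Christians in 40 C.E., growing at 3.5% per year.
    print(f"{compound(1_000, 0.035, 260):,.0f}")   # by 300 C.E.: ~7.7 million
    print(f"{compound(1_000, 0.035, 310):,.0f}")   # by 350 C.E.: ~43 million

Small, steady rates do all the work; no feat of mass persuasion is required to get from a band of followers to tens of millions of adherents in three centuries.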

As for Hitler, Mercier presents compelling evidence that most Germans did not accept Nazi ideology, nor most of the planks in the regime’s platform, and even the anti-Semitism so famously on display in Hitler’s writings and speeches was effective only on Germans who were already anti-Semitic. The euthanasia of the handicapped in the 1930s was resisted by most Germans and got so much bad press that the Nazis made the program secret and issued orders never to speak of it, a policy carried through to the Final Solution and the Holocaust, which was shrouded in secrecy and mostly carried out in Poland, far from the prying eyes of German citizens. Hitler’s anti-communism appealed to right-leaning Germans but was rejected by industrial workers. By 1942, most citizens did not believe the declarations of victory issued by the propaganda minister Joseph Goebbels, relying instead on forbidden BBC broadcasts for news of how the war was really going for Germany (not well). As the Nazi intelligence agency Sicherheitsdienst (SD) reported: “Our propaganda encounters rejection everywhere among the population because it is regarded as wrong and lying.”22

To the commonly asked question “How could so many highly educated, intelligent, and cultured Germans become Nazis?” the answer is: “Most didn’t.” The entire regime—not unlike the Soviet Union and North Korea—was held aloft by pluralistic ignorance, in which individual members of a group don’t believe something but believe that most others in the group believe it. When no one speaks up—or when people are prevented from speaking up through state-sponsored censorship or imprisonment—it produces a “spiral of silence” that can transmogrify into witch hunts, purges, pogroms, and repressive political regimes. This is why totalitarian and theocratic regimes restrict speech, press, trade, and travel, and why the route to breaking the bonds of such repressive governments and ideologies is free speech, free press, free trade, and accurate and trustworthy information.

In a wide-ranging conversation for my Science Salon podcast, I asked Mercier directly, “are we living in a post-truth era?” His answer was clear:

In many ways it’s better than it’s ever been, in that people are more informed than they used to be, and because of that they tend to be more consistent in their points of view. Fake news, for example, is a very marginalized phenomenon. Only a few percent of Twitter or Facebook users actually saw or spread fake news, and it doesn’t appear to effect those who see it. But everyone has heard of fake news, so on the whole I think the information environment is improving, slowly and perhaps not as much as we would like it to be, but I think things are better than they used to be. People still want accurate opinions and they care about the truth. Even people who support Trump—studies show when you show them that something about Trump is fake news they accept that, even while maintaining their support for Trump.

In one of my final Scientific American columns I coined my own neologism in the Colbert tradition: Factiness, or the quality of something seeming to be factual when it is not.23 But how do we know when something is factual and not factiness? We employ science and reason! There is progress in science and culture, and some ideas really are better than others. The post-Enlightenment ideal that beliefs should be tested in the laboratory and marketplace of ideas with the goal of generating objective and disinterested knowledge may seem Sisyphean, in that we are always in danger of backsliding into truthiness and factiness in which propaganda, superstition, and self-serving sophistry can slow our progress in pushing the boulder of knowledge up the mountain of ignorance, but that is precisely what we’ve been doing for millennia.

Per aspera ad astra—with difficulty to the stars. END

About the Author

Dr. Michael Shermer is the Publisher of Skeptic magazine, the host of the Science Salon Podcast, a regular contributor to Time.com, and Presidential Fellow at Chapman University. His latest book is Giving the Devil His Due: Reflections of a Scientific Humanist. As a public intellectual he regularly contributes Opinion Editorials, book reviews, and essays to The Wall Street Journal, The Los Angeles Times, Science, Nature, and other publications. He appeared on such shows as The Colbert Report, 20/20, Dateline, Charlie Rose, and Larry King Live (but, proudly, never Jerry Springer!). He was a monthly columnist for Scientific American. His two TED talks, seen by millions, were voted in the top 100 of the more than 1000 TED talks. He holds a Ph.D. from Claremont Graduate University in the history of science.

References
  1. On August 21, 2007: https://on.cc.com/36i0Mzg and on July 11, 2011: https://on.cc.com/2QBjrQ6
  2. Colbert, Stephen. 2005. “The Word—Truthiness.” The Colbert Report, https://on.cc.com/1KS28h6
  3. Interview with Kellyanne Conway. January 22, 2017. NBC Meet the Press. https://nbcnews.to/2wjC7bB
  4. Definition of “fake news.” Collins Dictionary. https://bit.ly/2V7Ff7Q
  5. Pinker, Steven. 2019. “Why We Are Not Living in a Post-Truth Era.” Skeptic, Vol. 24, No. 3.
  6. Pinker, Steven. 2018. Enlightenment Now: The Case for Reason, Science, Humanism, and Progress. New York: Viking, 375.
  7. McIntyre, Lee. 2018. Post-Truth. Cambridge: MIT Press.
  8. Stanley, Jason. 2015. How Propaganda Works. Princeton, NJ: Princeton University Press.
  9. Snyder, Timothy. 2017. On Tyranny: Twenty Lessons from the Twentieth Century. New York: Tim Duggan Books/Penguin Random House, 71.
  10. Arendt, Hannah. 1951/1994. The Origins of Totalitarianism. New York: Harcourt, 474.
  11. Quoted in: McIntyre, ibid., 3–4.
  12. Orwell, George. 1946. “Politics and the English Language.” Horizon, April. https://bit.ly/18z9Ikb
  13. Burke, Edmund. 1790/1967. Reflections on the Revolution in France. London: J.M. Dent & Sons.
  14. Gross, Paul and Norman Levitt. 1996. Higher Superstition: The Academic Left and its Quarrels with Science. Baltimore: The Johns Hopkins University Press.
  15. Levitt, Norman. 1996. “More Higher Superstitions: Knowledge, Knowingness, and Reality.” Skeptic, Vol. 4, No. 4, 79.
  16. Editors. 2016. “‘Post-truth’ declared word of the year by Oxford Dictionaries.” BBC News. November 16. https://bbc.in/2Vh36lt
  17. Pelley, Scott. 2019. Interview. Reliable Sources. CNN. December 31. https://cnn.it/36fu2qz
  18. For one example among hundreds see his tweet of October 3, 2018 at 4:53 a.m.: “The Failing New York Times did something I have never seen done before…”
  19. Associated Press. 2019. “New York Times subscriber numbers are skyrocketing in the Trump age.” MarketWatch. February 6. https://on.mktw.net/37Z19PS
  20. Quoted in: Glaisyer, Tom. 2016. “Cranking up the Truth-O-Meter: Giving a Boost to Truth in Politics.” Democracy Fund. January 13. https://bit.ly/2M8p7yK
  21. Mercier, Hugo. 2020. Not Born Yesterday: The Science of Who We Trust and What We Believe. Princeton, NJ: Princeton University Press, 257, 270–271. Quotes from Brennan are in: Brennan, Jason. 2019. Against Democracy. Princeton, NJ: Princeton University Press, 8.
  22. Kershaw, Ian. 1983. “How Effective was Nazi Propaganda?” In D. Welch (Ed.), Nazi Propaganda: The Power and the Limitations (180–205). London: Croom Helm, 199.
  23. Shermer, Michael. 2018. “Factiness: Are we living in a post-truth world?” Scientific American, March. https://bit.ly/2pHCzC5

Is Freedom of Speech Harmful for College Students?

Posted on Jun. 19, 2020 by | Comments Off on Is Freedom of Speech Harmful for College Students?

In this lecture, Dr. Michael Shermer addresses the growing crisis of free speech on campus and in the culture at large, triggered as it was by the title lecture, which he was tasked to deliver to students at California State University, Fullerton, after a campus paroxysm erupted over “Taco Tuesday,” in which students accused non-Mexican students of “cultural appropriation” for serving Mexican food. If you’ve ever been to Southern California, where Mexican cuisine is among the most popular dining options, the charge is absurd on its face. From there Shermer reviews the history of free speech, the difference between government censorship and private censorship, the causes of the current crisis, and what we can do about it.

Skepticism 101: How to Think Like a Scientist covers a wide range of topics, from critical thinking, reasoning, rationality, cognitive biases and how thinking goes wrong, and the scientific methods, to actual claims and whether or not there is any truth to them, e.g., ESP, ETIs, UFOs, astrology, channelling, psychics, creationism, Holocaust denial, and especially conspiracy theories and how to think about them.

If you missed Dr. Shermer’s previous Skepticism 101 lectures, watch them now.


What are Science & Skepticism?

Posted on Jun. 12, 2020 by | Comments Off on What are Science & Skepticism?

This lecture, traditionally the first in the series for the Skepticism 101 course, is based on the first couple of chapters from Dr. Michael Shermer’s first book, Why People Believe Weird Things, presenting a description of skepticism and science and how they work, along with a discussion of the difference between science and pseudoscience, and some very practical applications of how to test claims and evaluate evidence. The image for this lecture is the original oil painting for the first cover of Why People Believe Weird Things, commissioned by the publisher and painted by the artist Lawrence Berzon.

Skepticism 101: How to Think Like a Scientist covers a wide range of topics, from critical thinking, reasoning, rationality, cognitive biases and how thinking goes wrong, and the scientific methods, to actual claims and whether or not there is any truth to them, e.g., ESP, ETIs, UFOs, astrology, channelling, psychics, creationism, Holocaust denial, and especially conspiracy theories and how to think about them.

The audio is out of sync with the video in “What is a Skeptic?” Here’s the link to view it. If you missed Dr. Shermer’s previous Skepticism 101 lectures, watch them now.


Evolution & Creationism, Part 2: Who says evolution never happened, why do they say it, and what do they claim?

Posted on Jun. 05, 2020 by | Comments Off on Evolution & Creationism, Part 2: Who says evolution never happened, why do they say it, and what do they claim?

Dr. Michael Shermer continues the discussion of evolution and creationism, focusing on the history of the creationism movement and the four stages it has gone through: (1) Banning the teaching of evolution, (2) Demanding equal time for Genesis and Darwin, (3) Demanding equal time for creation-science and evolution-science, and (4) Intelligent Design theory. Shermer provides the legal, cultural, and political context for how and why creationism evolved over the 150 years since Darwin published On the Origin of Species in 1859, a book that provided a naturalistic account of life and ultimately displaced the creationist supernatural account. Finally, Shermer reviews the best arguments made by creationists and why they’re wrong.

Skepticism 101: How to Think Like a Scientist covers a wide range of topics, from critical thinking, reasoning, rationality, cognitive biases and how thinking goes wrong, and the scientific methods, to actual claims and whether or not there is any truth to them, e.g., ESP, ETIs, UFOs, astrology, channelling, psychics, creationism, Holocaust denial, and especially conspiracy theories and how to think about them.

If you missed Dr. Shermer’s previous Skepticism 101 lectures, watch them now.


Wicked Games
Lance Armstrong, Forgiveness and Redemption, and a Game Theory of Doping

Posted on May. 31, 2020 by | Comments (7)

Part 2 of the documentary film “Lance” airs tonight on ESPN and served as a catalyst for this article, which employs game theory to understand why athletes dope even when they don’t want to, and offers thoughts on forgiveness and redemption. The article is a follow-up to and extension of Dr. Shermer’s article in the April 2008 issue of Scientific American.

All images within are screenshots from Marina Zenovich’s 3-hour-and-22-minute film and are courtesy of ESPN, which kindly provided a press screener. Zenovich also produced Robin Williams: Come Inside My Mind (2018), Water & Power: A California Heist (2017), Richard Pryor: Omit the Logic (2013), and Roman Polanski: Odd Man Out (2012).

Toward the end of Marina Zenovich’s riveting documentary film on Lance Armstrong, titled simply Lance and broadcast on ESPN May 24 and May 31, 2020, the former 7-time Tour de France champion grouses about the apparent ethical hypocrisy whereby TdF champions like the German Jan Ullrich (1997), the Italian Marco Pantani (1998), and himself (1999–2005) were utterly disgraced and had their lives ruined because of their doping, whereas cyclists such as the German Erik Zabel, the Italian Ivan Basso, and the American George Hincapie are “idolized, glorified, given jobs, invited to races, put on TV” even though they’re “no different from us” inasmuch as they doped as well. Of Pantani, whom Armstrong famously battled up many a mountainous climb, Lance scowls that “they disgrace Marco Pantani, they destroy him in the press, they kick him out of the sport, and he’s dead. He’s fucking dead!” Ditto Ullrich. “They disgrace, they destroy, and they fucking ruin Jan Ullrich’s life. Why? … That’s why I went. Because that’s fucking bullshit.”


This invective, in fact, comes on the heels of the most touching moment of the nearly 3.5-hour film, in which a lachrymose Armstrong loses his tough-guy composure when asked why he spontaneously flew to Europe to support his former rival in his time of need. Ullrich’s life was unraveling after a series of incidents involving drugs, alcohol, and violence, and Armstrong’s emotional fracture in recalling it is so out of character from his public image that it may give even his most cynical critics pause. His answer? “I love him.”


That a quintessentially straight jock from Texas would admit on camera that he loves another man surely humanizes someone who has otherwise for decades been a picture of leather-neck toughness, a “badass motherfucker” as his former teammate Floyd Landis describes him. If he were a 1950s test pilot he’d be a steely-eyed missileman staring down the sound barrier. If he were a boxer he’d be Jake LaMotta mercilessly pounding opponents into the canvas. If he were a basketball player he’d be Michael Jordan, entering each sporting contest like it was a matter of life and death, which it was for Lance. “I like to win,” he told filmmaker Alex Gibney, “but more than anything, I can’t stand this idea of losing. Because to me, losing means death.”

That such a film as this is so widely viewed, coming as it is on the heels of the most-watched show in ESPN history, on Michael Jordan’s final season with the Chicago Bulls, is one answer to Lance’s puzzlement about the asymmetrical treatment he has received. It was over two decades ago (1999) that Lance won his first Tour de France and was invited to Bill Clinton’s White House. To put this into further perspective, Armstrong won his 7th and last Tour de France two years before Steve Jobs introduced the iPhone in 2007. That seems a lifetime ago, and yet we’re still talking about Lance Armstrong. Why?

Lance Armstrong and David Letterman

To answer the question I think we must distinguish Lance’s doping, which was in fact a logical outcome of a corrupt system that forced most top cyclists at the time to choose between cheating and quitting, from the mendacity and intimidation he used to enforce omertà: the threats, lawsuits, libelous public statements, and alleged backroom deals that harmed anyone who threatened to break the silence. As well, to invoke the title of Armstrong’s bestselling memoir, it was never about the bike and always about Lance. As the idiom suggests, those who reach great heights have further to fall. In short, the continued interest in the rise and fall of Lance Armstrong has more to do with the human condition than with cycling: what his story reveals about our species and the wicked games we play.

The Dope on Doping

Doping has long been a part of cycling. From the 1940s through the 1980s stimulants and painkillers were ubiquitous. As the 5-time Tour winner Jacques Anquetil snorted, “You can’t ride the Tour de France on mineral water.” And when challenged to elaborate, he quipped: “Everyone in cycling dopes himself. Those who claim they don’t are liars.” With that as the norm, doping regulations were virtually nonexistent until the British champion Tom Simpson keeled over dead on the climb up the legendary Mont Ventoux in the 1967 Tour. An autopsy revealed a pharmacopoeia of drugs in his body and a vial of amphetamines in his jersey pocket. But even after that tragedy and the implementation of incipient testing, the dopers were always ahead of the doping controls. When I was competing in the 3,000-mile nonstop transcontinental bicycle Race Across America in the 1980s, blood doping, a quantum leap over earlier techniques, was popular, and it remained legal until after the 1984 Olympics. I begged off because it seemed medically risky to inject a bag of your own or someone else’s blood in order to boost the amount of oxygen-carrying red blood cells in your system. Lance’s teammate Tyler Hamilton, in his 2012 book The Secret Race, recounts horror stories about injecting bags of spoiled blood and the illness that followed.

This risk was averted in the early 1990s, before Lance entered the sport professionally, with the introduction of genetically engineered recombinant erythropoietin — r-EPO. Natural EPO is a hormone released by the kidneys into the bloodstream, which carries it to receptors in bone marrow, stimulating it to pump out more red blood cells. Chronic kidney disease and chemotherapy can cause anemia, and so the development of the EPO substitute r-EPO in the late 1980s was a savior for chronically anemic patients … and oxygen-hungry endurance athletes. Taking r-EPO is just as effective as getting a blood transfusion, but instead of messing around with bags of blood hanging from hotel room picture hooks and poking long needles into uncooperative veins, cyclists could now store tiny ampoules of r-EPO on ice in a thermos bottle or hotel minifridge, then simply inject the hormone under the skin, boosting the rider’s hematocrit (HCT), or the percentage of red blood cells in the total volume of blood. The normal range of HCT is in the mid-40s. Endurance training can boost it naturally into the high 40s or low 50s. Multi-week stage races like the Tour de France cause HCT to steadily decrease. EPO can push those levels into the high 50s and even the 60s and keep them there. Bjarne Riis, the winner of the 1996 Tour de France, was nicknamed “Mr. 60 Percent,” and in 2007 he confessed that EPO was behind his moniker. After a test was developed in 2000 that could detect EPO, dopers shifted to micro-dosing it intravenously and/or returning to blood doping under tight supervision.

EPO

How big a difference does EPO make? In Zenovich’s film Armstrong’s U.S. Postal teammate Jonathan Vaughters makes a back-of-the-envelope calculation that the drug enhances performance by about 10 percent. In a 100-hour race in which the first- and last-place riders are separated by two hours, or two percent, this is a game changer. The infamous sports physiologist and convicted doping doctor Michele Ferrari, who for years worked exclusively with Armstrong and the U.S. Postal team, quantified the effect for me more specifically when I interviewed him for a 2008 article in Scientific American on doping in sports: “If the volume of [red blood cells] increases by 10 percent, performance improves by approximately 5 percent. This means a gain of about 1.5 seconds per kilometer for a cyclist pedaling at 50 kilometers per hour in a time trial, or about eight seconds per kilometer for a cyclist climbing at 10 kph on a 10 percent ascent.” Thus, a cyclist who boosts his hematocrit by 10 percent can lop off 75 seconds in a 50-kilometer time trial, which is typically decided by a few seconds, or 80 seconds per climb on any of the numerous 10-kilometer, 10 percent mountain passes the riders negotiate in the Pyrenees and Alps, often decided by a few tens of seconds. This advantage is not one that athletes can afford to give away to their competitors.
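For readers who want to check the arithmetic, here is a minimal sketch of the calculation implied by Ferrari’s figures. The per-kilometer savings come from the quote above; the function name and structure are mine, for illustration only.

```python
# Back-of-the-envelope check of the per-kilometer savings quoted above.
# The inputs are Ferrari's figures; the function itself is illustrative.

def total_seconds_saved(seconds_per_km: float, distance_km: float) -> float:
    """Total time saved over a course, given a per-kilometer saving."""
    return seconds_per_km * distance_km

# A 50 km time trial at ~50 kph, saving ~1.5 seconds per kilometer:
print(total_seconds_saved(1.5, 50))   # 75.0 seconds

# A 10 km climb at a 10 percent grade at ~10 kph, saving ~8 seconds per kilometer:
print(total_seconds_saved(8.0, 10))   # 80.0 seconds
```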

EPO forced cyclists into choosing between doping and quitting the sport, as the three-time Tour winner Greg LeMond discovered in 1991. After logging victories in 1986, 1989 and 1990, LeMond set his sights on equaling or bettering the record of five Tours de France achieved by only three cyclists before him — Jacques Anquetil, Bernard Hinault, and Eddy Merckx. “I was the fittest I had ever been, my split times in spring training rides were the fastest of my career, and I had assembled a great team around me,” LeMond told me. “But something was different in the 1991 Tour. There were riders from previous years who couldn’t stay on my wheel who were now dropping me on even modest climbs.” The following year was worse, as LeMond refused to dope and would not allow his teammates to either. The result: “our team’s performance was abysmal” and “I couldn’t even finish the race.”

Greg LeMond

Greg’s hunch is backed by data. Average speeds of the winners of the Tour de France spiked upward beginning in 1991. To control for yearly variance caused by course changes and weather, I averaged the speeds over 14-year periods going backward and forward in time from 1991, then compared those to the peak Armstrong era and after. The averages are plotted on the graph below. In the period 1991–2004 the winners’ average speed jumped 9 percent over the corresponding speed in the period 1977–1990, an increase that cannot be accounted for by improvements in equipment, nutrition, or training. Lance’s final victory in 2005 is the fastest Tour ever recorded, at 25.9 mph. The extensive disqualification of dopers in 2007 brought the average speed down to 24.2 mph. It has hovered around there ever since, bouncing between 24.5 mph and 25.1 mph through 2019, with an average speed between 2008 and 2019 of 24.9 mph. The spike in 2017 may be a statistical anomaly or the product of varying race conditions, but it is interesting to note that the winner, Chris Froome, later that year tested positive in the Tour of Spain for salbutamol, an asthma medication that opens up the medium and large airways in the lungs. Although Froome was ultimately cleared by the UCI, it is a curious thing that some professional cyclists seem to come down with asthma around the time of the three grand tours.

[Graph: average speeds of Tour de France winners, in miles per hour]
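For the statistically curious, the period-averaging just described can be sketched in a few lines of Python. The yearly speeds below are placeholder values chosen only to illustrate the method; they are not the actual Tour data.

```python
from statistics import mean

def era_average(speeds_by_year, start, end):
    """Average the winners' speeds over an inclusive span of years."""
    return mean(speeds_by_year[y] for y in range(start, end + 1) if y in speeds_by_year)

# Placeholder speeds for illustration only; these are not the real Tour data.
speeds = {y: 22.0 for y in range(1977, 1991)}        # pre-EPO window, 1977-1990
speeds.update({y: 24.0 for y in range(1991, 2005)})  # EPO-era window, 1991-2004

pre_epo = era_average(speeds, 1977, 1990)
epo_era = era_average(speeds, 1991, 2004)
print(f"increase: {100 * (epo_era / pre_epo - 1):.1f}%")  # 9.1% with these placeholders
```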
Join the Club or Go Home and Get a Real Job

In his bestselling book Game of Shadows, the San Francisco Chronicle investigative reporter Lance Williams, who broke the BALCO doping scandal in baseball, made this observation: “Athletes have a huge incentive to dope. There are tremendous benefits to using the drugs, and there is only a small chance that you will get caught. So depending on your sport and where you are in your career, the risk is often worth it. If you make the team, you’ll be a millionaire; if you don’t, you’ll probably go back to driving a delivery truck.” Armstrong’s teammate for many years, Tyler Hamilton, confirmed to Zenovich that the logic applied to the sport of cycling as well: “It was either join the club or go home, finish school, and get a real job.”

Tyler Hamilton

Once it becomes known that the top competitors in a sport are doping, the rule breaking cascades down through the ranks until an entire sport is corrupted. Based on his numerous interviews with athletes, coaches, trainers, drug dealers and drug testers, Williams estimates that between 50 and 80 percent of all professional baseball players and track and field athletes were doping. Given that reality, Williams told me, “There is the conviction that everyone they are competing against is cheating already.” By way of example, Williams noted that Charlie “the Chemist” Francis, coach of Ben Johnson, the sprinter and (briefly) 1988 Olympic gold medalist in the 100-meter run who was busted for doping and stripped of his medals, told him that the doping was “completely self-defensive.” How so? “It was cheat or lose.”

Armstrong’s U.S. Postal teammate Frankie Andreu, a domestique in support of the team leader in the mid-1990s, told me: “For years I had no trouble doing my job to help the team leader. Then, around 1996, the speeds of the races shifted dramatically upward. Something happened, and it wasn’t just training.” Andreu resisted doping as long as he could, but by 1999 he was unable to do his job: “It became apparent to me that enough of the peloton was on the juice that I had to do something.” He began injecting himself with r-EPO two to three times a week. The boost was exactly what he needed “to dig a little deeper, to hang on to the group a little longer, to go maybe 31.5 miles per hour instead of 30 mph.” That seemingly small increment is larger than it appears: it can make the difference between staying in the peloton and getting dropped, and once you are dropped, deprived of the drafting benefit of riding in a large pack, you may soon be out of the race and on a flight home. This is where the game theory matrix of incentives kicks in.

The Game Theoretic Logic of Doping

Game theory is the study of how players in a game choose strategies that will maximize their return in anticipation of the strategies chosen by the other players. The “games” for which the theory was invented are not just gambling games such as poker or sporting contests in which tactical decisions play a major role; they also include serious life matters in which people make economic choices, military decisions, and even nuclear diplomatic strategies like Mutual Assured Destruction, in which neither nation (the US and USSR during the Cold War) has an incentive to launch a nuclear first strike because the other guy will retaliate in kind, leaving both countries decimated. What these “games” have in common is that each player’s “moves” are analyzed according to the range of options open to the other players.

Prisoner’s Dilemma

The game of prisoner’s dilemma is the classic example: You and your partner are arrested for a crime, and the two of you are held incommunicado in separate prison cells. Even if neither of you wants to confess or rat out the other, the D.A. can change your incentive through the following matrix of options (depicted visually in the table):

  • If you both remain silent, you each get a year (top left).
  • If the other guy confesses and you do not, you get three years and he goes free (top right).
  • If you confess but the other guy doesn’t, you go free and he gets three years in jail (bottom left).
  • If you both confess, you each get two years (bottom right).

With these possible outcomes the logical choice is to defect from the advance agreement and betray your partner. Why? Consider the choices from the first prisoner’s point of view. The only thing the first prisoner cannot control about the outcome is the second prisoner’s choice. If the second prisoner remains silent, then the first prisoner earns the “temptation payoff” (no jail time) by confessing, but gets a year in jail (the “high payoff”) by remaining silent. The better outcome in this case is for the first prisoner to confess. But if the second prisoner confesses, then once again the first prisoner is better off confessing (the “low payoff,” or two years in jail) than remaining silent (the “sucker payoff,” or three years in jail). Because the circumstances from the second prisoner’s point of view are entirely symmetrical to the ones described for the first, each prisoner is better off confessing no matter what the other prisoner decides to do.
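To see the dominance argument at a glance, here is a minimal sketch that encodes the payoff matrix from the list above and verifies that confessing never does worse, whatever the partner does. The jail terms are the ones given above; the code structure is mine.

```python
# Prisoner's dilemma payoffs from the bullets above, in years of jail
# time (lower is better): years[(my_move, partner_move)] = my sentence.
SILENT, CONFESS = "silent", "confess"

years = {
    (SILENT, SILENT): 1,    # both silent: one year each
    (SILENT, CONFESS): 3,   # the "sucker payoff"
    (CONFESS, SILENT): 0,   # the "temptation payoff": go free
    (CONFESS, CONFESS): 2,  # both confess: two years each
}

# Whatever the partner does, confessing yields a shorter sentence:
for partner_move in (SILENT, CONFESS):
    assert years[(CONFESS, partner_move)] < years[(SILENT, partner_move)]
print("Confessing is the dominant strategy.")
```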

The prisoner’s dilemma game has been played in many experimental conditions, revealing that when subjects play the game just once or for a fixed number of rounds without being allowed to communicate with the other prisoner, defection by confessing is the common strategy. But when subjects play the game for an unknown number of rounds, the most common strategy is tit-for-tat: each begins cooperating with the prior agreement by remaining silent, then mimics whatever the other player does. Even more mutual cooperation can emerge if the players are allowed to communicate and establish mutual trust. But once defection by confessing builds momentum, it continues throughout the game and cheating becomes the norm.
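A toy simulation makes the iterated case concrete. The sketch below, with strategy names and structure of my own devising, pits tit-for-tat against itself and against an always-defect strategy, reusing the jail-term payoffs above (lower totals are better).

```python
# Minimal iterated prisoner's dilemma using the payoff matrix above.
# YEARS[(a_move, b_move)] = (a's jail years, b's jail years); lower is better.
YEARS = {("silent", "silent"): (1, 1), ("silent", "confess"): (3, 0),
         ("confess", "silent"): (0, 3), ("confess", "confess"): (2, 2)}

def tit_for_tat(opponent_history):
    # Cooperate (stay silent) first, then mirror the opponent's last move.
    return "silent" if not opponent_history else opponent_history[-1]

def always_defect(opponent_history):
    return "confess"

def play(strategy_a, strategy_b, rounds=10):
    hist_a, hist_b = [], []
    total_a = total_b = 0
    for _ in range(rounds):
        a = strategy_a(hist_b)  # each player sees only the other's past moves
        b = strategy_b(hist_a)
        ya, yb = YEARS[(a, b)]
        total_a, total_b = total_a + ya, total_b + yb
        hist_a.append(a)
        hist_b.append(b)
    return total_a, total_b

print(play(tit_for_tat, tit_for_tat))    # (10, 10): cooperation is sustained
print(play(tit_for_tat, always_defect))  # (21, 18): one defector drags both down
```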

“It was either join the club or go home, finish school, and get a real job.” —Tyler Hamilton

In cycling, as in baseball and other sports, the contestants compete according to a set of rules, which clearly prohibit the use of performance-enhancing drugs. But because the drugs are so effective and many of them are so difficult to detect, and because the payoffs for success are so great, the incentive to use banned substances is tempting. Once a few elite athletes defect from the rules by doping to gain an advantage, their rule-abiding competitors must defect as well. But because doping is against the rules, a code of silence — omertà — prevents any open communication about how to flip the matrix incentives and return to abiding by the rules.

Nash Equilibrium and the Level Playing Field

In game theory, if no player has anything to gain by unilaterally changing strategies, the game is said to be in a Nash equilibrium, named for the mathematician John Forbes Nash, Jr. (portrayed by Russell Crowe in the film A Beautiful Mind), who won the Nobel Prize in economics for his pioneering research in game theory. When everyone in a system violates the rules, or everyone merely believes that everyone else is violating the rules (even if not all of them are), cheating can become a Nash equilibrium, which turns it from a moral violation into a rational choice. As the title of one article analyzing average Tour speeds put it: “Doping: A Necessity, Not a Sin.” But if everyone outside the system thinks that the rules are enforced (even while those inside the system know better), fans respond accordingly with moralistic punishment for the cheaters.

Just do the right thing: sack Lance

I have yet to see anyone inside or outside the sport explain it this way, which, in a manner of speaking, at least partially absolves the athletes while shifting some of the moral culpability to the regulatory bodies of the sport. That is, the governing bodies of a sport must change the payoff values of the expected outcomes identified in the game matrix. First, when other players are playing by the rules, the payoff for doing likewise must be greater than the payoff for cheating. Second, and perhaps more important, even when other players are cheating, the payoff for playing fair must be greater than the payoff for cheating. Players must not feel like suckers for following the rules.
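As a sketch of what that re-engineering means in game-theoretic terms, the toy model below finds the pure-strategy Nash equilibria of a symmetric two-rider “doping game” under two payoff tables. The payoff numbers are mine and purely illustrative; the point is that strengthening detection and penalties flips the stable outcome from mutual doping to mutual clean riding.

```python
# Find pure-strategy Nash equilibria of a 2x2 game. Payoffs are arbitrary
# utilities (higher is better); the numbers below are illustrative only.

def nash_equilibria(payoff):
    """Strategy pairs where neither player gains by unilaterally deviating."""
    moves = ("clean", "dope")
    equilibria = []
    for a in moves:
        for b in moves:
            a_best = all(payoff[(a, b)][0] >= payoff[(alt, b)][0] for alt in moves)
            b_best = all(payoff[(a, b)][1] >= payoff[(a, alt)][1] for alt in moves)
            if a_best and b_best:
                equilibria.append((a, b))
    return equilibria

# Weak enforcement: cheating pays no matter what the other rider does.
weak = {("clean", "clean"): (3, 3), ("clean", "dope"): (0, 4),
        ("dope", "clean"): (4, 0), ("dope", "dope"): (1, 1)}

# Strong enforcement (likely detection, lifetime bans, forfeited winnings):
# cheating now costs more than it gains, even against a doper.
strong = {("clean", "clean"): (3, 3), ("clean", "dope"): (2, -2),
          ("dope", "clean"): (-2, 2), ("dope", "dope"): (-1, -1)}

print(nash_equilibria(weak))    # [('dope', 'dope')]
print(nash_equilibria(strong))  # [('clean', 'clean')]
```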

In a Nash equilibrium of mass doping, is the playing field level? That is, if everyone was doping, can we at least conclude that the best cyclist won all those Tours de France (not just Lance, but Ullrich in 1997, Pantani in 1998, and the others before and since)? We will never know for sure, of course, and while it is unlikely that 100 percent of cyclists were doping, it is telling that none of Lance’s competitors who were in a position to win but were bested by him are claiming victory or demanding restitution (and every one of the podium finishers in all seven of Lance’s TdF victories was eventually busted for doping and/or later confessed to it). In any case, anyone who would join a Fair Play for Other Dopers Committee would find it difficult to gain much sympathy among ethicists.


Some have argued that Lance’s extensive resources, which enabled him to hire the best sports physiologist in the world for exclusive preparation, gave him an edge over his competitors. In that vein, an unintentionally humorous moment in Zenovich’s film comes in the segment on Operación Puerto, in which Spanish police raided the lab of a sports physiologist named Eufemiano Fuentes, whose secret code for the drugs and blood bags of athletes consisted of their initials or, in the case of Jan Ullrich, his first name. But from the time Lance said he started doping in 1993 through his first Tour victory in 1999, he didn’t have the extensive resources that victory and fame subsequently brought him. And surely the president of the Union Cycliste Internationale (the UCI, the sport’s governing body), Hein Verbruggen, carries some moral accountability for the corruption he not only could not have failed to notice but actively helped cover up, in the name of saving his sport after the catastrophic 1998 Tour exposed the massive doping scandal already underway while Lance was still struggling to come back from cancer and chemotherapy.

Eufemiano Fuentes

It is not unreasonable to argue that the playing field wasn’t level, but it isn’t level now and never was, drugs or no drugs. Genetically gifted riders with a fire in the belly to win and the discipline to train 500 miles a week accrue not only superior fitness but greater resources in the form of more sponsorship dollars, faster support riders, smarter coaches and managers, better training facilities, food, and other creature comforts. For example, the well-capitalized British Team Sky would rent rooms on Mount Teide in Tenerife in the Canary Islands for the entire year so that their team members could train at altitude, a legal method of increasing oxygen-carrying red blood cells. The real victims of doping are not the other top riders whom Lance beat (all of whom doped), but the athletes who DNF’d (did not finish), finished near the bottom of the leaderboard, or gave up their dream and went home to get a real job. For that you can blame the systemic corruption of the sport and the logical deterioration of norms of fairness that follows from it, more than any single cyclist, no matter how much he capitalized on it.

Lance Armstrong
Breaking the Chain

At the end of my Scientific American article I suggested that in order to reform cycling and encourage cyclists to play by the rules, the expected values of the doping game had to be changed. For example, building and enforcing a much stricter drug-testing regimen would dramatically increase the likelihood of getting caught, thereby tilting the matrix incentive toward riding clean; for those who want to risk getting caught, increasing the penalty for doping from a temporary to a lifetime ban on competing would presumably nudge the motivation toward fair play even further.

I also suggested granting immunity to athletes for past cheating if they come clean about how doping programs worked; increasing the number of competitors tested, both in and out of competition; disqualifying all team members from an event if any member of the team tests positive for doping, thereby shifting the enforcement of the doping taboo from an external governing body to the internal workings of the team and its members; and compelling any convicted athlete to return all salary paid and prize monies earned to the team sponsors, further strengthening the incentive for athletes to enforce their own antidoping rules.

Since 2008 anti-doping controls have improved dramatically as some of these measures were implemented, along with others, most notably the “biological passport,” in which an athlete’s physiological markers, such as HCT, power output, VO2 max (the maximum rate of oxygen consumption), and others, are monitored continuously, so that any spike in improvement beyond what training can account for is an indicator of possible doping. In his 2019 book, One-Way Ticket, Jonathan Vaughters explains in game-theory language that the biological passport “wasn’t about catching people. It was always about dissuading them. It’s about limiting them. It’s about keeping things fair.” Fairness is what athletes want more than anything, and Vaughters is concerned that the moralizing impulse to “find evildoers and burn them at the stake” is “what the world wants from anti-doping” but is counterproductive to the deeper fairness issue: the level playing field in which other factors like talent, training, nutrition, and drive determine outcomes. Now a team manager and cycling influencer, Vaughters has been vocal and public about his teams riding clean, thereby shifting the norms of the sport from “everyone’s doing it” to “some aren’t doing it” in hopes of arriving at a new norm of “no one’s doing it.”
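The passport’s logic can be illustrated with a toy anomaly detector. To be clear, the actual biological passport relies on a far more sophisticated adaptive statistical model; this sketch, with names and numbers of my own invention, shows only the core idea of flagging a reading that jumps well outside an athlete’s own baseline.

```python
from statistics import mean, stdev

def flag_spikes(readings, z_threshold=3.0):
    """Indices of readings far above the athlete's own running baseline."""
    flagged = []
    for i in range(3, len(readings)):   # need a few readings to form a baseline
        baseline = readings[:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and (readings[i] - mu) / sigma > z_threshold:
            flagged.append(i)
    return flagged

hct = [44.1, 43.8, 44.5, 44.0, 43.9, 52.3]  # hypothetical hematocrit readings
print(flag_spikes(hct))                      # [5]: the jump to 52.3 stands out
```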

Jonathan Vaughters

Forgiveness and Redemption

At the end of Vaughters’ memoir he admits: “We all doped. It’s inexcusable and it’s a fact.” It should be clear by now that Lance’s downfall has less to do with doping and more to do with the people whose lives he harmed in his years of denial, defense, and destruction of others. As Vaughters put it: “The bullying was the reason Lance paid a higher price than the rest of us.” One of those who feels bullied is Betsy Andreu, wife of Lance’s teammate of many years, Frankie Andreu. A deeply moral person who is the very embodiment of Kantian deontological (rule-bound) ethics, Betsy does not suffer cheats gladly (including her husband, whom she upbraided when she discovered he doped just to compete). She explained to me in no uncertain terms exactly what the core issue with Lance is and what he has to do to redeem himself.

First, she said, the other dopers, such as Ullrich, Basso, and Pantani, did not try to destroy other people for simply telling the truth about what was going on. Doping causes harm to others in the sport who don’t dope, but attacking, suing, libeling, and curtailing the income of those attempting to expose the doping is another level of harm altogether. Second, she continued, there is the matter of apologies. “I’ve learned there are three components to being sorry,” she explained:

  1. You acknowledge the wrong you did to the person you wronged.
  2. You apologize for it.
  3. You make amends. How? You ask the person what they need from you to show them that you truly are sorry.
Betsy Andreu

Restorative vs. Retributive Justice

In criminal justice scholarship what Betsy Andreu is proposing is called restorative justice, in which the perpetrator apologizes for the offense, attempts to set right the wrong done, and, if possible, initiates or restores good relations with the victim. Restorative justice is contrasted with retributive justice, in which wrongdoers should get their just deserts. Think Old Testament eye-for-an-eye (Moses) vs. New Testament turn-the-other-cheek (Jesus), Malcolm X vs. Martin Luther King, Jr., Rambo vs. Gandhi. Redemption begins with an acknowledgment on the part of the wrongdoer, who must take some level of responsibility for the offense, and builds from there to include the victims’ losses and a plan for restoration. As I outlined in my chapter on the subject in The Moral Arc, retributive justice is focused on what offenders deserve whereas restorative justice is concerned with what victims need; retributive justice is about what was done wrong whereas restorative justice is about making it right; retributive justice is offender oriented whereas restorative justice is victim oriented.

I don’t know everyone who feels they should be on Lance’s restorative justice list. And, clearly, there are possible legal and financial consequences of going down that path that I do not know about. But noting the many cancer victims and their families whom Lance inspired and materially helped through his charitable generosity — highlighted in the film and praised as unassailably real by ESPN’s senior writer Bonnie Ford, who was otherwise a harsh critic — it is evident that Armstrong is capable of being a person who can make a positive difference in the world. Will he?

Lance Armstrong

This could be Lance’s greatest challenge, harder perhaps even than overcoming cancer, inasmuch as personality and temperament are relatively stable throughout the lifespan. I don’t know if he has the character to do so across the board, but he has made amends with some people, and I will note that many people with far graver personal failings have turned their lives around so dramatically that they’re almost unrecognizable. A type specimen might be the heavyweight boxing champion George Foreman, who by his own account was, to borrow Landis’ description of Armstrong, a badass motherfucker … until he was humbled by Muhammad Ali in the Rumble in the Jungle. Foreman willed himself into becoming one of the most likeable and inspirational figures of the late 20th century, recapturing his heavyweight title along with the admiration of millions. Others have made similar transformations. Can Lance do the same? The only person standing in the way is Lance himself. We shall see. END

About the Author

Michael Shermer is the Publisher of Skeptic magazine, host of the Science Salon podcast, and a Presidential Fellow at Chapman University. He is the author of a dozen books including the New York Times bestsellers Why People Believe Weird Things, The Believing Brain, and The Moral Arc. His latest book is Giving the Devil His Due. He is also a co-founder of the 3,000-mile nonstop transcontinental bicycle Race Across America (RAAM), which he competed in 5 times and was the Race Director for a decade.


Evolution & Creationism, Part 1

Posted on May. 29, 2020 by | Comments Off on Evolution & Creationism, Part 1

Dr. Michael Shermer takes viewers to the Galápagos Islands to retrace Darwin’s footsteps (literally — in 2006 Shermer and historian of science Frank Sulloway hiked and camped all over the first island Darwin visited) and show that, in fact, Darwin did not discover natural selection when he was there in September of 1835. He worked out his theory when he returned home, and Shermer shows exactly how Darwin did that, along with the story of the theory’s co-discoverer, Alfred Russel Wallace. Then Shermer outlines what, exactly, the theory of evolution explains, how it displaced the creationist model as the explanation for design in nature (wings, eyes, etc. as functional adaptations), why so many people today still misunderstand the theory, and how that misunderstanding sustains the creationist model.

Skepticism 101: How to Think Like a Scientist covers a wide range of topics, from critical thinking, reasoning, rationality, cognitive biases and how thinking goes wrong, and the scientific methods, to actual claims and whether or not there is any truth to them, e.g., ESP, ETIs, UFOs, astrology, channelling, psychics, creationism, Holocaust denial, and especially conspiracy theories and how to think about them.

If you missed Dr. Shermer’s previous Skepticism 101 lectures, watch them now.

About the photograph above

Charles Darwin described what he called the “craterized district” on San Cristóbal, Galápagos Islands, thus:

The entire surface of this part of the island seems to have been permeated, like a sieve, by the subterranean vapours: here and there the lava, whilst soft, has been blown into great bubbles; and in other parts, the tops of caverns similarly formed have fallen in, leaving circular pits with steep sides. From the regular form of the many craters, they gave to the country an artificial appearance, which vividly reminded me of those parts of Staffordshire, where the great iron-foundries are most numerous.

The photograph was taken on 21 June 2004 by Dr. Frank Sulloway. Darwin hiked this area in September, 1835.


Holocaust Denial

Posted on May. 22, 2020 by | Comments Off on Holocaust Denial

In this lecture on Holocaust denial, Dr. Michael Shermer applies the methods of science to history, showing how we can determine truth about the past. Many scholars in the humanities and social sciences do not consider history to be a science. Instead, they treat it as a field of competing narrative stories, none of which has a superior claim to truth than any other. But, as Dr. Shermer replies to this assertion, are we to understand that those who assert that the Holocaust never happened have equal standing to those who assert that it did? Of course not! It is here that most cultural relativists get off the relativity train, acknowledging that, in fact, we can establish certain facts about the past, no less than we can about the present.

Skepticism 101: How to Think Like a Scientist covers a wide range of topics, from critical thinking, reasoning, rationality, cognitive biases and how thinking goes wrong, and the scientific methods, to actual claims and whether or not there is any truth to them, e.g., ESP, ETIs, UFOs, astrology, channelling, psychics, creationism, Holocaust denial, and especially conspiracy theories and how to think about them.

If you missed Dr. Shermer’s previous Skepticism 101 lectures, watch them now.


Pathways to Evil, Part 2

Posted on May. 15, 2020 by | Comments Off on Pathways to Evil, Part 2

In Pathways to Evil, Part 2, Dr. Michael Shermer fleshes out the themes of Part 1 by exploring how the dials controlling our inner demons and better angels can be turned up or down depending on circumstances and conditions. Are we all good apples that bad barrels occasionally turn rotten, or do we all harbor the capacity to turn bad?

Skepticism 101: How to Think Like a Scientist covers a wide range of topics, from critical thinking, reasoning, rationality, cognitive biases and how thinking goes wrong, and the scientific methods, to actual claims and whether or not there is any truth to them, e.g., ESP, ETIs, UFOs, astrology, channelling, psychics, creationism, Holocaust denial, and especially conspiracy theories and how to think about them.

If you missed Dr. Shermer’s previous Skepticism 101 lectures, watch them now.


Pathways to Evil, Part 1

Posted on May. 08, 2020 by | Comments Off on Pathways to Evil, Part 1

In Part 1 of his Pathways to Evil lecture, Dr. Michael Shermer considers the nature of evil in his attempt to answer the question of how you can get normal, civilized, educated, and intelligent people to commit murder and even genocide. Are we basically good and made bad by evil situations, or are we basically evil and made good by civilized society?

Skepticism 101: How to Think Like a Scientist covers a wide range of topics, from critical thinking, reasoning, rationality, cognitive biases and how thinking goes wrong, and the scientific methods, to actual claims and whether or not there is any truth to them, e.g., ESP, ETIs, UFOs, astrology, channelling, psychics, creationism, Holocaust denial, and especially conspiracy theories and how to think about them.

If you missed Dr. Shermer’s previous Skepticism 101 lectures, watch them now.


How to Think About the Bermuda Triangle

Posted on May. 01, 2020 by | Comments Off on How to Think About the Bermuda Triangle

Dr. Michael Shermer examines the claims about the Bermuda Triangle using the tools of skepticism, science, and rationality to reveal that there is no mystery to explain. Selective reporting, false reporting, quote mining, anecdote chasing, and mystery mongering all conjoin to create what appears to be an unsolved mystery about the disappearance of planes and ships in this triangular region of the ocean. But when you examine each particular case, as did the U.S. Navy, the Coast Guard, and especially the insurance companies that have to pay out for such losses, it becomes clear that almost all have natural explanations, and the remaining unsolved ones lie on the bottom of the ocean, beyond our knowledge.

Skepticism 101: How to Think Like a Scientist covers a wide range of topics, from critical thinking, reasoning, rationality, cognitive biases and how thinking goes wrong, and the scientific methods, to actual claims and whether or not there is any truth to them, e.g., ESP, ETIs, UFOs, astrology, channelling, psychics, creationism, Holocaust denial, and especially conspiracy theories and how to think about them.

If you missed Dr. Shermer’s previous Skepticism 101 lectures, watch them now.


