The Skeptics Society & Skeptic magazine

How to Determine if a Doctored Photograph, Video, or Audio Recording is Real

In 2016, I wrote a piece on critical thinking and politics for Skeptic (Vol. 21 No. 4) titled “Political Obfuscation: Thinking Critically About Public Discourse.” In my research on the topic since then, I have discovered that much of the obfuscation comes from partisan tribalism that has grown ever more powerful over the last two presidential election cycles. In my new book — Political Tribalism in America: How Hyper-Partisanship Dumbs Down Democracy and How to Fix It — I describe how our partisan attachments motivate us to acquire, perceive, and evaluate political information in a biased manner, and how that results in an electorate that is more extreme, hostile, and willing to reject unfavorable democratic outcomes. The book also provides feasible strategies designed to reduce the influence of political tribalism in our lives, including instructions for plumbing the depths of political views; evaluating sources of political information; engaging in difficult political conversations; appraising political data; and assessing political arguments. In short, the book is a guide to help citizens become critical political thinkers. The following excerpt from the book, on deepfake videos, audio recordings, and photographs, captures well the problem we are facing and, hopefully, provides some solutions.

I don’t think we can sustain a democratic society if citizens can’t distinguish fact from fiction.1 — Emily Thorson

False information has been present throughout American political history.2 Yet, the emergence of new digital technologies has all but ensured that the future of false information will include a heavy dose of fabricated photographs and altered audio and video files. Unfortunately, recent research suggests that our ability to distinguish real images and files from false ones is, to say the least, underwhelming.3 This trend is a problem. For as the political scientist Emily Thorson contends in the epigraph above, our inability to discern fact from fiction may severely threaten the well-being of our body politic. However, there are some simple steps that we can take to exercise our critical political thinking skills and strengthen the constitution — with a lowercase “c” — of the United States.

Evaluating Doctored Photographs

A few weeks after the terrorist attacks of September 11, 2001, an email containing a photograph from a camera allegedly found in the rubble of the twin towers began to circulate online. The snapshot, which featured an unsuspecting tourist posing for a picture on the roof of the World Trade Center as a hijacked airliner approached in the distance, was a fake — a crude forgery created by none other than the tourist pictured, to morbidly amuse his friends.4 Nevertheless, the haunting image was passed from inbox to inbox, reminding its recipients both of the horrors borne on that awful day and of the power that altered pictures possess.

Photo manipulation is nothing new. Indeed, “The history of fakery in photography is as old as the medium itself.”5 Yet the quantity and quality of doctored images have undoubtedly increased with the passage of time.6 According to Charles Seife, author of Virtual Unreality: Just Because the Internet Told You, How Do You Know It’s True?, “Photo manipulation used to be a tricky business, requiring thousands of dollars in darkroom equipment, airbrushes, and some pretty specialized artistic talent to pull off properly.” But now, thanks to image-processing software such as Adobe Photoshop, “Anyone with a camera and a computer can attempt it, and with a little talent… can do a very credible job.”7 As a consequence, American politics has become flush with fake photographs. Consider just a few recent examples:

  • In 2002, an altered image of George W. Bush depicted the President holding a children’s book upside down while reading to a group of students at a charter school in Houston, Texas.8
  • In 2008, a widely circulated email during the Democratic presidential primary included a doctored photograph of then-Senator Barack Obama holding a telephone upside down, and the statement: “When you are faking a pose for a camera photo opportunity, at least you can get the phone turned in the right direction! And he wants to be President???”9
  • In 2012, an altered image of Mitt Romney — one of the wealthiest presidential candidates in U.S. history — appeared on Facebook. The photograph featured the Republican nominee posing with a line of children whose shirts spelled out the phrase “R-MONEY,” instead of the well-heeled candidate’s last name.10
  • In 2017, a photoshopped image of President Donald Trump with a diarrhea stain down the back of his golf pants appeared online accompanied by the claim that Mr. Trump was incontinent.11
  • Soon after Hurricane Florence hit the Carolinas in September 2018, a doctored photograph of President Trump reaching over the side of a raft to distribute a MAGA hat — rather than a helping hand — to a stranded flood victim began to make the rounds on Facebook.12
  • In 2020, a photoshopped image showing former Vice President Joe Biden groping a female journalist went viral. The image was coupled with a report that Mr. Biden had “denied any impropriety, claiming that he was merely ‘checking her sources.’”13
  • In 2021, an altered image of President Joe Biden asleep at his desk in the Oval Office behind a pile of executive orders was posted on Facebook with the message: “AMERICA IN DECLINE: This decrepit old grifter works MAYBE five hours a day. We traded in a workhorse, for someone that belonged out to pasture or sent to the glue factory a long time ago. Nothing says we threw in the towel better than this nauseating image, the ‘commander in chief’ can’t even stay awake.”14

What’s most disconcerting is the fact that fake images can have real effects. For instance, an analysis by Dario L. M. Sacchi, Franca Agnoli, and Elizabeth F. Loftus found that doctored photographs of the 1989 Tiananmen Square protest affected the way the study’s participants remembered the event.15 And a study led by Steven J. Frenda revealed that nearly half of those who were shown fake images of incidents that never occurred — such as President George W. Bush entertaining major league baseball pitcher Roger Clemens at his ranch in Crawford, Texas during Hurricane Katrina or President Obama shaking hands with Iranian President Mahmoud Ahmadinejad at the United Nations — “reported that they remembered the false event happening.”16 These results are sobering and should serve as a clarion call to handle politically tinged images with care. If not, we run the risk that fake photographs will reconstruct our old memories or even fabricate new ones.

Make Use of Reputable Fact-Checking Organizations

On August 29, 2008, the presumptive Republican presidential nominee John McCain announced Alaska Governor Sarah Palin as his vice-presidential candidate. Two days later, a photograph of Palin posing in an American flag bikini while holding a rifle began to spread online. By the following week, the image had become a topic of conversation on cable news, as a guest panelist on CNN wondered whether “people [will] say, yes, she looks good in a bikini clutching an AK-47, but is she equipped to run the country?”17

The photo was a fake. An investigation found that the image was the handiwork of a twenty-seven-year-old website editor in New York City who had simply photoshopped Palin’s head onto another woman’s body and posted the composite on Facebook. From there, the image was copied, shared, and spread like wildfire.18 But we don’t have to contribute to the conflagration. Instead, we can inquire about the authenticity of an image by checking in with the fact-checkers at Snopes or PolitiFact. All we need to do is visit their websites, enter a few key words related to the image into the search bar, hit return, and read the results. Of course, a good skeptic shouldn’t assume that those organizations are always correct or never make mistakes.

Conduct a Reverse Image Search

Of course, if we stumble upon an image that hasn’t been investigated by a reputable fact-checking organization, we will have to verify it on our own. We can do so by performing a reverse image search, a process that allows a user to search for images rather than text. One simply uploads an image, or provides a link to an image that can be found online, and the search engine will find similar images on other websites. For instance, in 2017, one website posted a story containing a picture of a judge and the headline, “Muslim Federal Judge Rules Two Items of Sharia Law Legal.” The article claimed that Judge Mahal al Alallaha-Smith issued a ruling that a Muslim man in America may “beat [his wife] in a non-life-threatening manner” and marry his first cousin because such actions are “prescribed by the Koran.”19

Yet, if one were to right-click on the image, copy its URL, paste it into a reverse image search engine, and press return, that search would yield an identical picture from CNN’s website, with one telling exception — the judge’s nameplate. In truth, the photograph came from a news report about Los Angeles Superior Court judge Halim Dhanidina titled “Being a Muslim judge in the age of Trump.” The site, which bills itself as satirical, lifted the photo from CNN and digitally manipulated the plaque, replacing Dhanidina’s name with that of an imaginary federal judge.20 Conducting a reverse image search is simple, fast, and extremely effective. As such, critical political thinkers should keep this tool at the ready, and faithfully use it to expose any fake photos that might come their way.
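Though the tools described above are point-and-click, it can help to know roughly how a reverse image search finds matches. One classic technique is perceptual hashing: reduce each image to a short fingerprint of its overall brightness pattern, then compare fingerprints. The sketch below illustrates the “average hash” idea in Python on toy grayscale pixel grids; this is an illustration only, as real engines decode and downscale actual image files and use far more sophisticated indexing.

```python
# Illustrative sketch of "average hashing," one simple perceptual-hash
# technique reverse image search engines can use to spot visually
# similar pictures. It operates on toy 2D grids of grayscale values
# (0-255), so no image-decoding library is required.

def average_hash(pixels):
    """Return a bit string: '1' where a pixel exceeds the mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p > mean else "0" for p in flat)

def hamming_distance(h1, h2):
    """Count differing bits; a small distance suggests near-duplicate images."""
    return sum(a != b for a, b in zip(h1, h2))

original  = [[10, 200], [220, 30]]
tampered  = [[10, 200], [220, 35]]   # a tiny edit to one "pixel"
unrelated = [[200, 10], [30, 220]]   # the brightness pattern inverted

h_orig, h_tamp, h_unrel = (average_hash(g) for g in (original, tampered, unrelated))
print(hamming_distance(h_orig, h_tamp))   # 0: same fingerprint despite the edit
print(hamming_distance(h_orig, h_unrel))  # 4: every bit differs
```

Because small edits leave the fingerprint nearly unchanged, an engine can match a doctored photo back to its unaltered original, which is exactly what exposes forgeries like the fake judge’s nameplate.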

Evaluating Fake Audio and Video Recordings

Unfortunately, advances in digital technology have also engendered the rise of deepfakes, manipulated audio and video files that make a person appear to say something he never said, or do something she never did. As of now, this technology is in its infancy. However, as deepfakes become more sophisticated and widespread, our ability to distinguish between a real recording and a fake one will surely be tested.21

An audio deepfake occurs when a person’s voice is “cloned” to produce synthetic audio that’s indistinguishable from the original.22 In 2016, the computer software company Adobe held its annual conference in San Diego, California. During the MAX Sneaks segment of the event, Adobe’s Kim Chambers and the American actor, comedian, and filmmaker Jordan Peele introduced Adobe VoCo, an unreleased software prototype with the ability not only to edit audio files but also to use their phonemes to generate words from scratch. During the big reveal, Adobe research scientist Zeyu Jin pasted an audio clip — featuring Peele’s friend and comedy partner Keegan-Michael Key humorously recalling his reaction to being nominated for an Emmy — into the VoCo program. The file, which had been converted into an audio waveform and a transcript, was projected onto the auditorium’s silver screen. Jin pressed play, and the audience erupted in laughter as Key said, “I jumped on my bed and I kissed my dogs and my wife, in that order.” Then, the research scientist proceeded to erase certain portions of the transcript and type out new phrases. Within seconds, VoCo altered the file and made Key say: “I kissed my wife and then my dogs,” “I kissed Jordan and my dogs,” and “I kissed Jordan three times.”23 It was both flawless and terrifying.

One shudders to think of how this technology — dubbed “Photoshop-for-voice” — could be used for more nefarious purposes. Imagine an audio file from one of Barack Obama’s audiobooks being transmogrified into a recording of the former president admitting that he was born outside of the United States, or an audio file from one of Donald Trump’s campaign rallies being used to generate a confession that he worked closely with the Russians to hack ballot boxes in the 2016 election. Synthetic audio undoubtedly has its benefits. It gave the film critic Roger Ebert his voice back after thyroid cancer had taken it away. And let’s be honest, audio deepfakes can be wildly entertaining. I mean, who wouldn’t want to listen to a gaggle of former U.S. presidents rapping “F— Tha Police” by N.W.A.?24 But this technology also has its costs, and they might just be more than our politics can afford.

Similar fears have been aroused by the rise of deepfake visuals — “videos in which one person’s face is swapped out for another, often so seamlessly that it can be difficult to tell that they have been altered.”25 As it stands, several computer software applications, such as DeepFaceLab and Zao, allow users to make altered videos and post them online. Some of the recordings are benign. For instance, one deepfake video swapped the face of actor Nicolas Cage with that of actress Amy Adams as she sang “I Will Survive” by Gloria Gaynor, while another showed actress Jennifer Lawrence speaking at the Golden Globes with the face of actor Steve Buscemi.26

Yet others are downright disturbing — such as those that have swapped the faces of famous actresses, like Gal Gadot and Scarlett Johansson, onto the bodies of pornographic movie stars without their knowledge or consent.27 What’s more, consider the impact that deepfake videos could have on domestic politics and international affairs. What would happen, law professors Robert Chesney and Danielle Citron wonder, if “a fake video of a white police officer shouting racial slurs or a Black Lives Matter activist calling for violence” went viral? Or how many recruits and acts of terror could ISIS inspire if it created “a video depicting a U.S. soldier shooting civilians or discussing a plan to…”?28 Or what about an altered video depicting “emergency officials ‘announcing’ an impending missile strike on Los Angeles or an emergent pandemic in New York City?”29 As digital forensic expert Hany Farid notes, the fact that such “nightmare scenarios… aren’t out of the question… should scare us.”30

Yet the problem with deepfake technology isn’t just its ability to present a lie as the truth, but also its capacity to provide cover for those seeking to dismiss the truth as a lie. When a public figure is accused of having said or done something inappropriate, and that allegation is supported by a genuine audio or video recording, he or she may try to cast doubt on the authenticity of that evidence by dismissing it as a deepfake. This phenomenon — which Chesney and Citron call the liar’s dividend — is already rearing its ugly head in American politics.31 In October 2016, the Washington Post released a recording of then-presidential candidate Donald Trump vulgarly bragging about groping women to an Access Hollywood correspondent on a hot mic in 2005. Although Trump publicly acknowledged the authenticity of the tape and apologized for his comments in the final days of the campaign, he later claimed that it was not his voice on the tape after all.32 Likewise, in the wake of the assault on the U.S. Capitol by hundreds of Trump supporters on January 6, 2021, President Trump delivered an address in which he promised to punish the rioters and acknowledged President-elect Joe Biden’s victory. Soon thereafter, however, a post appeared on Facebook claiming that the broadcast was fraudulent. “That’s not real,” the message exclaimed, “That’s not real guys. Something’s wrong with this video. This is a deep fake.[sic]”33 The post went viral.

Although it’s been possible to alter audio and video files for decades, doing so took time, skill, and a lot of money.34 That is no longer the case. Fortunately, numerous efforts to develop deepfake-detecting software are currently underway. Yet these programs will never be foolproof. First, a deepfake that goes viral will likely be seen by millions of people before it’s ever debunked by such software,35 for as Jonathan Swift said, “Falsehood flies, and the Truth comes limping after it.”36 Second, researchers fear that these efforts will inevitably result in a sort of arms race, in which the methods used by those attempting to identify deepfakes will simply be incorporated and circumvented by those generating them. As such, the political scientist Brian Klaas argues that, “Ultimately, the solution lies with us… If better forgers are coming, we, as citizens, need to… become better detectives.”37 The tips described below will help us begin to do just that.

Again, Make Use of Reputable Fact-Checking Organizations

In 2020, a video of then-Democratic presidential frontrunner Joe Biden lolling his tongue began to make the rounds on Facebook. The video wasn’t realistic, nor technically even a deepfake.38 Nonetheless, an investigation by PolitiFact offered definitive evidence of its fraudulent nature, including a link to the app that was used to doctor the video and a link to the original, unaltered recording. That same year, a video of President Trump appearing disoriented on the White House lawn appeared on Instagram. But PolitiFact once again proved that the twelve-second clip — which was accompanied by a message claiming that Trump was “deep into his degenerative neurological disease” — was a fake.39 In short, the fact-checkers at Snopes and PolitiFact do commendable work, and our efforts to verify suspicious recordings should begin with them.

Attempt to Verify a Recording on Your Own

However, if we stumble upon an audio or visual recording that has yet to be investigated by a reputable fact-checking group, we will have to do our best to verify it on our own.

  • First, if a recording shows a politician in an unduly negative light, we should be skeptical. It’s not a deal breaker — politicians say and do dumb things all the time. But, again, if it’s too good (or bad) to be true, it probably is.
  • Second, we should seek the source of the video. If one is lacking, that’s a good indicator that it could be misleading.
  • Third, we should paste the video’s link into tools like Amnesty International’s YouTube Dataviewer or the InVid browser extension and gather information on the recording’s origins.
  • Fourth, we should take a screenshot of the video, upload it to Google or TinEye, and conduct a reverse image search to see if it appears elsewhere online, particularly in an unadulterated form.40
  • Finally, we should make a habit of watching as many deepfake videos as possible, so we can learn how to discern a real recording from a fake one.41 The differences can be hard to place, but the more we watch deepfake videos, the more we’ll be able to detect slight movements of the mouth or the head that just don’t seem quite right.42

In Blur: How to Know What’s True in the Age of Information Overload, press critics Bill Kovach and Tom Rosenstiel argue that we’ll increasingly have to rely on ourselves, rather than the press, to evaluate information.43 We are our own editors, and we must take this responsibility seriously. As they write: “Democracy stakes everything on a continuing dialogue of informed citizens, and that dialogue rises or falls on whether the discussion is based on propaganda and deceit or facts and verification.”44 Using the simple strategies described here can improve our ability to distinguish fact from fiction, and thereby contribute to the well-being of our democratic society.

Excerpt from Political Tribalism in America: How Hyper-Partisanship Dumbs Down Democracy and How to Fix It, by Tim Redmond, published by McFarland Press (June 2022)

About the Author

Dr. Timothy J. Redmond is an award-winning educator who teaches history, government, and critical thinking at Williamsville East High School and Daemen College in Buffalo, NY. He is the author of Political Tribalism in America: How Hyper-Partisanship Dumbs Down Democracy and How to Fix It. Redmond, who received his PhD in political science from the State University of New York at Buffalo, is also a Jackson Center fellow and associate director of the Academy for Human Rights. Follow him on Twitter @tjredmo.

  2. Boller, P. F. (1996). Presidential Campaigns (pp. 9–13). New York: Oxford University Press.
  3. Köbis, N., Doležalová, B., & Soraperra, I. (2021). “Fooled Twice — People Cannot Detect Deepfakes but Think They Can.” SSRN Electronic Journal.
  6. Farid, H. (2007). Digital Doctoring: Can We Trust Photographs? Dartmouth.
  7. Seife, C. (2014). Virtual Unreality: Just Because the Internet Told You, How Do You Know It’s True? (pp. 100–101). Viking.
  15. Sacchi, D. L. M., Agnoli, F., & Loftus, E. F. (2007). “Changing History: Doctored Photographs Affect Memory for Past Public Events.” Applied Cognitive Psychology, 21(8), 1005–1022.
  16. Frenda, S. J., Knowles, E. D., Saletan, W., & Loftus, E. F. (2013). “False Memories of Fabricated Political Events.” Journal of Experimental Social Psychology, 49(2), 280–286.
  26. Ibid.
  29. Chesney, R., & Citron, D. K. (2018). “Deep Fakes: A Looming Challenge for Privacy, Democracy, and National Security.” SSRN Electronic Journal.
  31. Chesney, R., & Citron, D. K. (2018).
  43. Kovach, B., & Rosenstiel, T. (2011). Blur: How to Know What’s True in the Age of Information Overload (p. 7). Bloomsbury.
  44. Ibid., p. 197.

This article was published on August 30, 2022.
