Recognize your assumptions. Question them regularly. Don’t fall prey to mirror-imaging and related mindsets. Avoid cherry-picking to support your preferred hypothesis. Value evidence over belief.
Skeptics in diverse fields ranging from the hard sciences to intelligence analysis know these maxims well. But plenty of research has made it clear that only exceptional effort keeps us all from falling prey to the same troublesome mental traits; it’s just plain hard to move beyond mere recognition of critical thinking best practices to actually practicing them best.
This becomes even more daunting when it comes to collective decision-making. After all, companies and government offices alike suffer not only from lapses of critical thinking by individual members of the group, including its leaders, but also from various biases inherent in hierarchical structures. The same organizational processes and cultural characteristics that facilitate smooth corporate operations tend to inhibit original and contrarian thinking.
Micah Zenko’s Red Team explores a promising corrective technique: challenging established views through dedicated “red teams” that run simulations, conduct vulnerability probes, and analyze alternatives. Zenko—a Senior Fellow at the Council on Foreign Relations, a columnist for ForeignPolicy.com, and a frequent author on national security topics—suggests that adequately constructed and empowered red teams deliver fresh perspectives to decision-makers that can make the difference between success and failure.
Zenko starts the book by describing the roots of the red team concept in the Roman Catholic Church’s “Devil’s Advocacy” method for vetting the qualifications of potential saints. Tasking a specific Advocatus Diaboli to formally challenge every candidate’s credentials, it was thought, would ensure that objectivity served as a brake on rapid rushes to sainthood. A fine attention getter, that. But it raises the question: Why didn’t the process of challenging superstitions lead to the debunking of all attempts at sainthood? The skeptical reader can be forgiven for concluding that the church’s red team wasn’t nearly red enough.
And yet, even this mild brake on the system for “confirming” miracles eventually proved too annoying for the faithful. The Advocatus Diaboli, as ineffective as it had been overall, apparently cast enough doubt on miracles attributed to people (or, quite often, to their bones) that Pope John Paul II in 1983 got rid of the position altogether, while slashing the number of miracles required to qualify for sainthood. Comically, but predictably, beatifications and canonizations skyrocketed—more of each came during John Paul II’s 26 years than across almost two millennia, under all of his predecessors combined. Zenko notes, without irony, that “the integrity associated with the process and outcome was negated.”
The book’s focus moves quickly onward from how the church jettisoned even this feeble attempt at challenging established beliefs to how high-stakes decisions in the modern world benefit from red teaming. Many examples of effective red teams follow, enabled by Zenko’s extensive and creative research into various forms of red teaming within the military, intelligence community, law enforcement entities, and the private sector.
Zenko seems particularly fond of the US Army’s University of Foreign Military and Cultural Studies (UFMCS) at Fort Leavenworth, Kansas, which has come to be known informally as “Red Team University.” Its program includes instruction on thought processes, cultural empathy and semiotics, groupthink mitigation, and brainstorming techniques. As one of the first civilians to attend UFMCS, Zenko gained an unusual appreciation for the nuts and bolts of teaching people how to think like the enemy. He passes these insights on to readers via many examples and quotes, but he admits that even UFMCS officials remain uncertain about just how successful their decade’s worth of efforts have been.
Other red teams he cites as effective include the CIA’s Red Cell, created immediately after 9/11 to present outside-the-box thinking to senior US national security policy makers, and Sandia National Laboratories’ wide-ranging Information Design Assurance Red Team, which helped identify security deficiencies at the site where the federal government released its monthly jobs report. In New York City, the successes of the NYPD’s large-scale tabletop exercises warrant Zenko’s comparison of the red teaming there to Star Trek’s famous Kobayashi Maru test, wherein “winning” is not an option—but much can be learned about individual and group behavior under deliberately difficult circumstances. And the US government’s extensive red teaming of Usama Bin Ladin’s assessed presence at an unusual compound in Abbottabad, Pakistan, receives detailed treatment. Common elements among such cases of red teaming gone right include assembling the right mix of knowledgeable but skeptically inclined people, getting buy-in from bosses for both challenging conventional wisdom and applying lessons learned, and varying techniques within and between red team exercises.
Zenko is methodologically astute enough to avoid looking only at times when red teams worked well. To balance such cases, his spotlight shines on several instances of red teaming done poorly, especially the US military’s three-week Millennium Challenge 2002. This elaborate war game attempted to simulate an all-out confrontation between the US military and an adversary’s less capable but highly motivated forces. Much was at stake; planners spent two years and $250 million putting together a combined live-force and computer-simulation exercise.
Only minutes after the war game started, as the US “blue team” followed largely predictable military practices, the adversary red team’s asymmetric assault—including a swarm of explosives-laden speedboats and low-flying, radio-silent airplanes—overwhelmed US forces. To complete the exercise, senior officers had to bring sunken US ships back to the surface and start again, as if the red team’s attack had never happened. The blue team learned from the initial defeat and fared better for a few days. But later in the war game, to prevent the need for a similar “do-over,” commanders prohibited the challengers from firing at incoming troop transport planes, forced them to put the adversary’s air defense assets out in the open to be destroyed easily, and denied their request to deploy chemical weapons. With their hands thus tied behind their backs, the red teamers proved unable to effectively challenge US operational plans. A post-mortem report admitted that the end result was scripted, limiting the lessons that the stillborn war game could provide.
In many cases where circumstances suggested red teams would bring value, leaders have failed to employ them at all. On this front, the book relates the failure of the US government to seriously question the purpose of the Shifa Pharmaceutical Industries Company facility in Khartoum, Sudan—suspected of harboring links to both Usama Bin Ladin and the production of chemical weapons precursors—before the Clinton Administration blasted it with cruise missiles in 1998. Corporations are far from immune; Zenko cites a study of US publicly traded companies showing that nearly half of their major failures across 25 years could have been avoided if the companies had been more aware of potential pitfalls, which red teams might have anticipated. Resistance to red teaming in the private sector often derives from the cold, hard bottom line: particularly robust business war games or simulations can cost more than half a million dollars, usually without any discernible positive impact on company profits.
Red Team’s value will vary widely depending on the background and needs of the reader. Informative and entertaining stories fill the book’s pages, but it’s easy to see how some of those who need red teaming the most will feel frustration at the scant practical advice on optimizing the level of red teaming. Zenko offers a version of the Goldilocks solution, encouraging practitioners to stay “flexible in adoption of best practices” and to find “the undefined sweet spot between institutional capture and institutional irrelevance,” concluding with guidance to “red team just enough, but no more.” With the practice still underused across so many domains, it’s little surprise that many government and corporate leaders will prove unable to magically discover what “just enough” means for them.
On the other hand, leaders who are highly motivated to make their most important decisions better, and who have access to the proper resources, will find much here to help them get red teams rolling. The research is deep and wide, appealing to just about anyone interested in applying critical thinking techniques to help groups arrive at better decisions.
It’s hard to remain anything but skeptical, however, about the prospects for the Roman Catholic Church to benefit from red teaming. Even under a seemingly more worldly new pope, the adoption of a robust, meaningful Devil’s Advocacy looks unlikely. After all, if such a red team were to be so successful as to completely challenge the Church’s core beliefs, would the organization’s leaders really want it?
About the Author
Dr. David Priess, author of the forthcoming The President’s Book of Secrets—relating the history of the President’s Daily Brief and its use by presidents and other senior US officials across fifty years—obtained his PhD in political science from Duke University and has published articles and book reviews in various journals and magazines. He worked as an analyst and manager in the US government and the private sector and taught courses at Duke University, the George Washington University, and George Mason University. He is now Director of Analytic Services for Analytic Advantage, Inc., which offers training and consulting in critical thinking and presentation skills to government and corporate clients. He lives near Washington, D.C. and is on Twitter @davidpriess.
This article was published on December 1, 2015.