Red Team Summary and Review

by Micah Zenko

Has Red Team by Micah Zenko been sitting on your reading list? Pick up the key ideas in the book with this quick summary.

Can you imagine paying someone to plan a bomb attack on your company? Well, there are companies out there who are happy to do this for you – and no, we’re not talking about insurance fraud.

These kinds of companies are known as red teams, groups of specialists hired by organizations to plan a bomb attack or a similar catastrophe as a way to find out about any security gaps in an organization, or the weak points in a business strategy.

In this book summary, you’ll find out how red teams work, what they can do for many government organizations, how they help to keep us all safe from terrorists and why they constantly face criticism and resistance.

In this summary of Red Team by Micah Zenko, you’ll also learn

  • about the link between red teamers and method actors;
  • why your government might be making plans to kill you; and
  • how your last flight via Frankfurt may have been a lot riskier than you thought.

Red Team Key Idea #1: Organizations can use red teams to uncover hidden problems, but only if they’re willing.

As humans, we’ve got a rather peculiar blind spot for our own mistakes. This is why, for example, we get friends to proofread our essays in university. And it’s also why organizations hire red teams, groups of experts whose job it is to work out the weaknesses in a company’s strategies, structures and security measures.

Though red teams can be incredibly effective, many leaders are reluctant to accept their help. Authoritarian figures and personalities don’t enjoy being contradicted and often refuse to enlist red teams in the first place. This was the case with the head of the Federal Aviation Administration. It took the 1988 terrorist bombing of a Pan Am plane, which killed 270 people, before he decided to bring a red team into regular operations to uncover security weaknesses.

Red teams also require the right members in order to be effective. Red teamers are those who can think outside the box, and there are fewer of those people around than you might think. Psychologist Scott Eidelman demonstrated how we often fall victim to existence bias by assuming things are fine just the way they are.

An excellent red teamer does not have this bias. Instead, they have an incredible eye for detail when it comes to working out what could be done better. Red teamers must, of course, be able to think like the enemy.

CIA analyst Rodney Faraon even likens sharp red teamers to method actors, capable of immersing themselves in the mind and identity of someone else. In the red teamer’s case, that someone is the enemy.

Finally, organizations must ensure that red teams play an appropriate role in day-to-day life. Being assessed is stressful for both leaders and employees, so don’t let your red team run riot! Restrict them to where their expertise is needed, to keep staff from feeling like they’re under constant surveillance.

Red Team Key Idea #2: The US Army incorporates red teams in its strategy to varying degrees of success.

What comes to mind when you think of the US military? For many people, the instant association is with blind loyalty, absurd rituals and old men rambling on about the good old days when people stood up for their nation. This is not a flattering or particularly modern image, something the army itself came to realize.

Before the 2003 US invasion of Iraq, military planners were repeatedly warned by both experts and Iraqi expats that an invasion could result in an insurgency by the Iraqi people. Military superiors decided to stick to their guns, as it were, and ignored this crucial advice. The chaos that ensued demonstrated how traditional attitudes were leading military leaders to repeat the same avoidable mistakes. To shake things up, red teaming was institutionalized.

Unfortunately, old habits die hard. Even in recent years, military leaders have still failed to make full use of their red teams. In 2010, for example, the head of the Marine Corps decided to make red teams an integral part of the force. This angered many Marine Corps officials, who thought they were doing just fine on their own.

In 2011, a red team was deployed to support marine activity in Afghanistan. Unfortunately, the marine colonel leading the operation ignored red team findings and analysis. For example, the red team found that Afghan farmers would be better off transitioning from heavily-taxed opium crops to quinoa crops. However, the colonel insisted that wheat crops be grown, even though red team findings showed that quinoa could be grown far more effectively.

Despite their expert knowledge, the red team’s efforts were ignored, rendering the team redundant. This is, unfortunately, a common theme for red teams. And yet they have the potential to be powerful tools in several forms of security. Find out more in the next book summary.

Red Team Key Idea #3: Intelligence communities such as the CIA are in dire need of red teams.

Though we’d perhaps like to believe otherwise, the life of a spy isn’t all gadgets, glamor and car chases. The CIA is more about the best and brightest science nerds joining forces to collect intelligence on anybody or anything that may help officials make the right move.

However, even intelligence groups make mistakes. The intelligence community’s most significant output is the National Intelligence Estimate, which gathers data about specific countries or regions and reveals corresponding trends. This data is highly confidential, accessible only to a handful of influential policymakers. But despite the crucial nature of this information, the National Intelligence Estimate has featured mistaken and misleading findings for decades.

In 1949, for instance, the CIA issued an intelligence report stating with confidence that the soonest the Soviets could produce an atomic bomb would be 1950. In reality, the Soviet program was far further along: the Soviets successfully tested their first atomic bomb in August 1949.

In some ways, the CIA is like any other organization. It has a hierarchy, which often means that good intelligence slips through the cracks.

After the 1998 terrorist attacks on US embassies in Tanzania and Kenya, for example, the CIA was certain that Osama bin Laden was the perpetrator. A group of powerful US officials secretly planned a military retaliation. They decided to bomb the Al Shifa chemical plant, even though several insiders advised against this. The officials ignored the insiders’ advice, and the bombing caused a diplomatic disaster.

Contrary to CIA intelligence, the plant had no relation to bin Laden, nor did it have the capacity to produce nerve gas. In order to stop the upper rungs of a hierarchy from silencing accurate advice from less influential intelligence officers, the CIA requires an independent red team to highlight possible problems.

Red Team Key Idea #4: Red team insights about security weaknesses may help prevent further terrorist attacks.

Red teams can be of tremendous service to the public, highlighting security gaps and thus offering citizens better protection against terrorists. Of course, that only works when they’re listened to.

Long before 9/11, red teams were using their strategies to test the safety of air traffic. A 1996 operation saw a red team assess the security of Frankfurt International Airport, with frightening results: of 60 simulated attempts to smuggle a suitcase bomb onto a flight, not one was stopped. No airport staff detected what should have been a severe hazard.

One of the red teamers even succeeded in obtaining a baggage handler’s uniform and ID and posted himself along the baggage transit line. An accomplice would then drop off a bag containing bomb equipment and notify him when the bag was coming through. These 1996 findings could have made a big difference to air safety, had they been acted upon. Ultimately, nothing was done.

But there are certainly cases when a timely response to red team advice has led to great success. One concerns the vulnerability of planes to shoulder-held missiles. After an Israeli Boeing 757 was targeted by two Al-Qaeda missiles, the US Department of Homeland Security was determined to develop preventative measures. Red teams were deployed and, this time, found a far more receptive audience.

The red team posited that a shoulder-held missile attack was likely to target planes from a specific country. Such an attack would also require exhaustive but undetectable surveillance of airport takeoff and landing patterns.

The red team discovered that the most likely place to launch an attack on a plane at JFK International Airport would be from the cemeteries in the borough of Queens, which offered good vantage points for the airport runways. The Department of Homeland Security took this finding on board to eliminate vulnerabilities, making their efforts to hinder terrorists all the more effective.

Red Team Key Idea #5: Red teaming has been introduced into the private sector to assist in decision making and security.

For some reason, companies seem to think that one aspect of their business they can skimp on is security. Sure, an investment in security might reduce immediate profits. But failing to tie up loose ends can prove disastrous.

A 2008 episode of the reality show Tiger Team followed a red team as it demonstrated just how easy it is to rob a car dealership. By scoping out the office through an unsecured skylight, the team was able to find out which IT company supported the dealership.

The red team then posed as tech support from that company, accessed the dealership’s server room and deleted the ID numbers for every car. Finally, the team posed as customers and discovered that the security cameras were positioned too high to capture a person crawling on the ground. After all this, the break-in itself was a cakewalk.

A company’s online presence is often even easier to break into than its physical headquarters, though the results are often just as damaging. Discount retailer Target learned this the hard way in 2013, after hackers infiltrated its system and stole the credit card numbers of over 40 million customers.

Savvy companies avoid catastrophes like this by recruiting white-hat hackers who are paid to test the susceptibility of their IT systems to intruders. These digital red teamers typically find it very easy to hack into the company’s system. If the company is smart enough to take the advice of red team hackers on board, they’re looking at a far safer future.
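As a toy illustration of the kind of automated probing a white-hat red teamer might script (the function name, port list, and approach are illustrative assumptions, not from the book), here is a minimal TCP port check, safe to run only against hosts you are authorized to test:

```python
import socket

def probe_ports(host, ports, timeout=0.5):
    """Return the subset of `ports` that accept a TCP connection on `host`."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.settimeout(timeout)
            # connect_ex returns 0 on a successful connection
            if sock.connect_ex((host, port)) == 0:
                open_ports.append(port)
    return open_ports

# Only ever run this against systems you are authorized to test.
common_ports = [22, 80, 443, 3306]
print(probe_ports("127.0.0.1", common_ports))
```

Real red-team tooling goes far beyond this sketch, but the principle is the same: enumerate what answers, then ask whether it should.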

Red teaming has a whole range of benefits for any organization. But it’s not all positives – red teaming has its own inevitable limitations, including a changing future. Find out more in the final book summary.

Red Team Key Idea #6: Not everyone is cut out to be a red teamer.

Do you enjoy getting recognition for your efforts, receiving praise and being patted on the back by others? Then don’t join a red team.

Red teams don’t get to call the shots; they just advise the decision makers. If their advice is taken on board, it’s the decision maker who gets the credit for success, not the red team. This was true even in the capture of Osama bin Laden by American forces. Huge resources were deployed to determine whether bin Laden was hiding inside a well-secured Pakistani compound.

Three red teams examined the data and came up with 75, 60 and 40 percent probabilities, respectively, that bin Laden was inside. It was up to President Barack Obama to decide whether to take the risk despite these discordant estimates and storm the building. When the operation was a success, it was a credit to Obama, not the work of the red teams.
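The three figures above come from the summary; as a toy illustration (not the method actually used), discordant estimates like these are sometimes summarized by a simple average, with the spread between them signaling how uncertain the call really is:

```python
# Three red teams' probabilities that bin Laden was inside (from the summary).
# The averaging rule below is a common illustration, not the actual 2011 method.
estimates = [0.75, 0.60, 0.40]

mean = sum(estimates) / len(estimates)
spread = max(estimates) - min(estimates)

print(f"average estimate: {mean:.2f}")  # -> average estimate: 0.58
print(f"spread: {spread:.2f}")          # -> spread: 0.35 (high disagreement)
```

A wide spread like this is itself useful information: it tells the decision maker that the evidence supports no confident answer, leaving the final judgment call to them.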

Nevertheless, the future of red teaming is set to be interesting, particularly as human labor comes to be replaced by artificial intelligence in the coming decades. Even today, red teams make use of computer models and complex algorithms to uncover information that a rival doesn’t want to be known.

White-hat hacking specialist Raphael Mudge has developed a range of computer programs to assist red teams in probing the vulnerabilities of security systems. Armitage is one such program; it allows red team members to work from a common server and share information instantly.

Mudge is also working on a scripting language called Cortana, which makes it possible to build virtual robots to simulate the activity of red teamers within the Armitage program. These developments, among others, are sure to make red teaming incredibly effective in the future.

In Review: Red Team Book Summary

The key message in this book:

Far too many military and security mistakes have been made because of these institutions’ traditional, hierarchical decision-making structures. Red teams work to find the weak points of leadership decisions and security measures, thereby making them stronger.

Actionable advice:

Use red teams to challenge the biggest, and only the biggest, decisions you have to make. The author recommends that the White House create a temporary red team that can be gathered before any critical decision is implemented. This team will not have been invested in the whole decision process and will therefore look at the problem with more objectivity, spotting potential pitfalls.

Do the same thing for your team, corporation or government body.