You never really know what you’re going to get into in a war. A young upstart country disrupted a global powerhouse in what we now call the American War of Independence. The tragedy of 9/11 triggered a reaction from the United States that was quick and powerful. The nation decided that it would not allow terrorism to invade its borders. The result was a series of initiatives designed to bring about the end of terrorism. And it led to Stanley McChrystal being placed in the heart of Iraq, trying to combat a different kind of enemy and, ultimately, creating a new kind of operating structure. That structure became Team of Teams: New Rules of Engagement for a Complex World.
Efficiency Versus Agility
When it came to efficiency, the US forces had it down. They had hundreds of years of learning the most efficient way to do things. Frederick Taylor’s scientific management had long since taken hold of the military. Sure, it was as bureaucratic as any other large government organization, but it was as efficient as possible when operating at this scale. Of course, scale had its disadvantages, too. The larger the organization, the more difficult it is to be agile.
Al Qaeda Iraq (AQI), as McChrystal consistently refers to it in the book, was none of these things. Their network followed no discernible hierarchy. They didn’t do things efficiently at all. But they were exceedingly agile, and that seemed to allow them to keep one step ahead of the US forces. It was humiliating to endure. A vastly outmatched, ragtag force was able to hold and even gain ground. McChrystal had to find a new way of doing things that more closely aligned with the new rules of engagement. New rules forged not in a strategic planning session in some war room but at a kitchen table using improvised materials and intelligence.
Red Teams and Blue Teams
The military world – and the intelligence community – have long since learned how to work together – and how not to. Richard Hackman distills decades of experience in Collaborative Intelligence and outlines the pitfalls that await people who must defend themselves against attackers. It turns out that the structure of attack favors the attacker: it’s possible to concentrate limited resources at specific points and punch through defenses that must be spread more thinly.
Hackman has many other measures that seemed to indirectly inform McChrystal as he tried to figure out how to make his task force more effective in a foreign land – not just geography but also in the way that things worked.
While we like to believe in cause and effect, we’re collectively becoming increasingly aware of the probabilities of things happening rather than their certainties. (See The Halo Effect for more.) We’re beginning to recognize that we’re living in a world of complex systems that interact in ways that are difficult to predict. Donella Meadows in Thinking in Systems began to expose us to how systems work – and how they change. She exposed the kinds of emergence that happen in large systems. Everett Rogers explained this in terms of the law of unintended consequences and the story of how steel axe heads given to Stone-Age Australians went horribly wrong. (See The Diffusion of Innovations for more.) Even Judith Rich Harris discusses Lorenz’s “butterfly effect” in terms of how small changes can make big impacts on children in the same home. She explains that there are really No Two Alike.
In short, what McChrystal and the task force experienced wasn’t new or unique. It’s the way that we have to begin to think if we want to succeed in this VUCA world. (See Stealing Fire for more about VUCA – volatile, uncertain, complex, and ambiguous.)
Most things in our world experience change and growth in a relatively linear way. Our children may grow more inches in some years than in others, but the growth seems proportional. Rarely do we encounter the kind of rapid changes that happen when we toss a match on something recently doused with lighter fluid. Things change, but they do so at a relatively constant rate. However, in complex environments, what’s stable one moment can be very unstable the next. You’re going along fine – until you’re not.
Lorenz first wrote about chaos theory – the behavior of complex adaptive systems – when he asked, in the title of a talk, whether the flap of a butterfly’s wings in Brazil could set off a tornado in Texas. The question was rooted in Lorenz’s attempts to model weather and a chance event: he re-entered data to restart a simulation with slightly less precision – and got radically different results. He explained that even a small change in a complex system can create a chain reaction that leads to big events – though most of the time it has little or no impact at all. Little things can matter – but they don’t always.
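Lorenz’s discovery is easy to reproduce without a weather model. Here’s a minimal sketch (not from the book) using the logistic map, a classic chaotic system: two runs that start a hair’s breadth apart track each other for a while, then diverge completely.

```python
# Sensitive dependence on initial conditions, illustrated with the
# logistic map x -> r * x * (1 - x) at r = 4, where it behaves chaotically.

def trajectory(x0, steps, r=4.0):
    """Iterate the logistic map from x0 and return every value."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = trajectory(0.2, 50)
b = trajectory(0.2 + 1e-10, 50)  # a "butterfly-sized" perturbation

# Early on, the two runs are indistinguishable...
print(abs(a[5] - b[5]))    # still a vanishingly small difference
# ...but the gap roughly doubles each step, and after enough iterations
# the two trajectories bear no resemblance to each other.
print(abs(a[50] - b[50]))
```

The same tiny nudge that is invisible at step 5 dominates by step 50 – which is exactly why long-range prediction in a chaotic system fails even when the model itself is perfect.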
Herein lies the biggest problem. As humans, we are prediction machines. We’re always trying to predict the next move, the next thing that will happen – and when systems get too complex, we can no longer do that. It’s the failure of our predictive capacity that makes us laugh at jokes. We get a little spark every time we detect that we’ve made an error in our prediction. (See Inside Jokes for more.)
What makes prediction hard is both the breadth – the number of actors involved – and the velocity. When news traveled via the Pony Express, there weren’t iterations happening every second – or every millisecond. It was possible to play out scenarios and anticipate the future. Mail – and even newspapers – generally reached fewer people than someone could potentially reach now with a single retweet from a celebrity with millions of followers. And that one retweet might spawn dozens, hundreds, or thousands of retweets of its own, spreading the message even further.
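The compounding effect of re-sharing can be sketched with back-of-the-envelope arithmetic. This toy model (mine, not the book’s) assumes every sharer reaches the same fixed audience and a small fixed fraction of each audience re-shares – unrealistic, but it shows how reach grows geometrically across generations.

```python
# Toy model of cascading reach: one post, then waves of re-sharing.
# Assumptions (invented for illustration): every sharer reaches `followers`
# people, and `reshare_rate` of each audience passes the message on.

def total_reach(followers, reshare_rate, generations):
    """Estimate cumulative audience after several waves of re-sharing."""
    reach, sharers = 0, 1  # one person posts first
    for _ in range(generations):
        audience = sharers * followers
        reach += audience
        sharers = audience * reshare_rate  # the fraction who pass it on
    return int(reach)

# 1,000 followers each, 1% re-share, three generations of sharing:
print(total_reach(1000, 0.01, 3))  # 1,000 + 10,000 + 100,000 = 111,000
```

Each generation multiplies the audience tenfold in this example – the Pony Express had no equivalent mechanism.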
Efficiency and Effectiveness
Simply put, efficiency is doing things “right,” and effectiveness is doing the “right” things – where “right” depends on context. How many times do we find ourselves making things more efficient without considering whether we’re doing the right thing? If we do ask the important question, what chance do we have that we’ll know what the right thing even is? It’s frustrating and demoralizing to realize that you don’t know how to be effective because you have no idea what the right thing is.
Entrepreneur means “risk bearer.” Being the leader of a command in a time of war carries with it a greater weight. You’re literally making life or death decisions every day. If you act, you risk the lives of those under your command. If you fail to act, you may risk their lives and also the lives of those you’re there to protect. The core problem is knowing which actions to take and when to hold tight.
As an entrepreneur for a few decades now, I can tell you that I’ve never known for sure that I was right – except when I was wrong. I’ve had so many good ideas with excellent execution fall flat – because what I thought was absolutely the right thing wasn’t, in fact, right – at least as far as the market was concerned.
We absolutely must recognize that efficiency isn’t sufficient. We have to make sure that we’re doing the right things, while realizing that it may not be possible to know for sure what right even is.
Most organizations have organization charts. Most of the time, it’s the clean, direct line and hierarchy that we expect to see when we describe a large organization. Just as map makers must decide which details to leave out and which ones to include, the architects of the organizational chart must simplify the rather messy connections that every organization has.
As a child, I watched M*A*S*H, a fictional show about a mobile army surgical hospital during the Korean War. As a situation comedy, it had to be entertaining, but it also exposed something that the traditional organizational chart doesn’t: the dynamics of the relationships between the surgeons, the nurses, and the rest of the staff. More importantly, it exposed the difference between real power and position. If you wanted to get something done, you spoke to Radar. The commanders could come and go, but Radar was always there, making things happen one way or another.
Instead of the clean lines and pristine hierarchy, AQI emphasized relationships and connections to people who could get things done. When diagrammed, the structure looked like a mess – because it was. However, the mess wasn’t a byproduct of the organizational structure; it was the result of connections and relationships. We see these today in business and society through social network analysis. We can see who we’re connected to and who they’re connected to, building a web of connections that can lead everyone to someone else in startlingly few hops. (See Analyzing the Social Web for more about social network analysis.)
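Counting those hops is one of the most basic operations in social network analysis: a breadth-first search over the relationship graph. Here’s a small sketch (names and links invented for illustration, not drawn from the book):

```python
# Finding the minimum number of "hops" between two people in a social
# graph via breadth-first search. The graph below is hypothetical.
from collections import deque

graph = {
    "Ana":  ["Ben", "Cara"],
    "Ben":  ["Ana", "Dev"],
    "Cara": ["Ana", "Dev", "Eli"],
    "Dev":  ["Ben", "Cara", "Fay"],
    "Eli":  ["Cara"],
    "Fay":  ["Dev"],
}

def hops(graph, start, goal):
    """Return the minimum number of connections separating two people."""
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        person, dist = queue.popleft()
        if person == goal:
            return dist
        for friend in graph[person]:
            if friend not in seen:
                seen.add(friend)
                queue.append((friend, dist + 1))
    return None  # no path: the two people are in disconnected groups

print(hops(graph, "Ana", "Fay"))  # 3 hops: Ana -> Cara -> Dev -> Fay
```

In a real network analysis, the graph would come from observed communications or declared relationships, but the principle is the same: messy webs of connection still have surprisingly short paths through them.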
For McChrystal, the problem wasn’t that he had poor equipment or poorly trained soldiers. He had good tech and good talent. The problem was that there were gaps between the groups and these gaps created inefficiencies, redundancies, misses, and mistrust.
Navy SEALs are trained together and trained to trust one another as if their life depends upon it – because often it does. Army Green Berets and the special forces of the other branches of the military are similarly trained and teamed. The problem was that there was relatively little integration between forces within a branch of the military – and even less across branches. They simply didn’t know or trust each other – and that made it difficult to leverage the power that each group brought to the mission.
This is the core concept in Team of Teams. It’s building trust within the team and then layering in additional trust with people outside of the small insular group. McChrystal’s insight was in the way that he intentionally created these connections between different groups and then allowed his team – or team of teams – to leverage these relationships in the same way that AQI might. However, better resources, training, and efficiency weren’t removed – the informal trust was added to what already existed.
There’s a tendency in military – particularly military intelligence – circles to over-restrict the dissemination of information. Things are marked “top secret” when “secret” would do. They’re marked for internal use only when most of the information is available publicly.
It was a special opportunity to get to visit NSA headquarters, and we took it. NSA Family Day is a time when those who are working on some of the most secret operations to protect our nation can share a glimpse of what they do with their families. I wasn’t expecting that one of the briefings would explain that all the sophisticated gear sitting out on the table, which was used to gather information, was available on Amazon.com. It was a moment when I realized how much that was secret wasn’t really secret.
McChrystal knew that information was far more valuable shared – even with the risk of a leak – than kept behind locked doors. The result was a change where the entire command area was declared safe for top secret discussions. It meant that everyone could talk (more or less) freely about sensitive information. It reduced the friction and increased the sharing. Luckily, there weren’t critical breaches that would have caused this novel experiment to be torn down. Operational effectiveness increased, and with no known negative effects, it stood. The bet was that everyone could be trusted – and the benefits outweighed the potential risk of a breach.
I’ve spent a few decades in the knowledge management space, supporting and teaching people how to share knowledge and information in ways that forward organizations and people. In this, I’ve learned there is so much value to knowledge that it’s almost always worth it. (See The New Edge in Knowledge for more about this value.) Sharing your knowledge is an exercise in trust – and necessarily opens you up to the chance of betrayal – but properly managed, it’s almost always worth the trade.
Daniel Pink in Drive revived and extended Edward Deci’s work from Why We Do What We Do. In short, intrinsic motivation comes from autonomy, mastery, and purpose. In this situation, purpose was clear, mastery was as elusive as it ever is, but autonomy could be granted. McChrystal pushed decision-making authority as low as possible in the organization. He realized that, for the most part, he was going to agree with the plans of those he led – because otherwise why would he have them in his command? So, except where required, he stepped out of the way.
It changed the thinking, beliefs, and internal monologue. People felt more empowered and autonomous. They knew they’d have to accept that mistakes would be made. However, the reduction in time to action was worth the cost of forgoing verification. Instead of missing opportunities because it took too long to get approval, actions would be taken, and combatants apprehended.
McChrystal uses the analogy of leading like a gardener. Instead of dictating and controlling, his responsibility was to create the right conditions for people to flourish. By changing the structure and the thinking, McChrystal was able to move people from thinking about what they needed to do to bide their time until they were sent home to how they could make a real impact while they were deployed. Team of Teams is powerful – if you’re willing to create the conditions for success.