
Book Review-Going to Extremes: How Like Minds Unite and Divide

Have you ever wondered how a group of people could become so polarized and radical that they would do things that would normally be unthinkable?  What’s surprising is that what we often see as “crazy” levels of polarization is actually the normal reaction.  In Going to Extremes: How Like Minds Unite and Divide, Cass Sunstein explores the ways that people radicalize.  (Sunstein also co-wrote Nudge and was a coauthor of Noise.)

The Amplification Effect

One might surmise that when you put people in a group, the resulting attitudes and behaviors would become homogeneous around the median.  What we’d expect to see, then, is everyone behaving the same.  They’d have the same Ford Model T car in the same color – black.  However, both observationally and in the research, we find that this is decidedly not the case.  Instead of moving towards the middle, groups tend to move towards the extreme.  As The Tell-Tale Brain explains, art relies on exaggeration of proportions to draw us in.  We’re naturally drawn to the extreme, and it’s particularly apparent when it comes to the behavior of groups.

Richard Hackman, in Collaborative Intelligence, helps us to see that the most insular groups aren’t the best groups.  He encourages us to look for groups that have the right properties of boundedness.  They must have enough stability to know who they are but enough openness to allow new members of the group to introduce new ideas.

Sunstein explains that in extremist groups, this boundedness is taken to its extreme.  The more you isolate any group from the broader society, the greater the probability that it will go to an extreme.

Like and Trust

One of the pernicious problems we face as a society is the polarization we see around us – from the Flat Earth Society, whose members believe the Earth is flat and can’t understand why the rest of us can’t see it, to politics invading every aspect of our lives, including public health.  It’s difficult to have an intelligent conversation with people who see the world differently than you do, because they don’t like you and they don’t trust you.

Sunstein explains that people like you just a bit more when you tell them something they already know – or that aligns with their existing beliefs.  We all know people who are more than willing to tell us what we want to hear.  They’re called “yes men,” and in the language of Buy-In, they’re given the name “Bendi Wendi.”  While most of us naturally recoil from the idea, we frequently fall into their web and become entangled in their lies and their desire to build affinity (liking) and trust with us.  (Paul Ekman in Telling Lies would argue against my characterization here, because the people who are misleading us are often genuinely moved in the moment towards our point of view.)

Trust is a complicated topic that’s core to being human and to our need to be social.  Rather than addressing the topic in full here, see Trust=>Vulnerability=>Intimacy, Revisited for a comprehensive view on trust and The Righteous Mind for more on why we need to be social.  What’s important in the context of Going to Extremes is recognizing that we trust people we like.  If we simplify this, we trust people who agree with us.  In Trust: The Social Virtues and the Creation of Prosperity, Francis Fukuyama talks about how different cultures have different power bases for trust and how those power bases are changing.  Traditional centers of trust are breaking down, and social networks are becoming more powerful strongholds of trust.

One More Like and a Share

The idea that social media feeds us more of what we like makes sense.  The algorithms are designed to push more of what we like to us so that we’ll like the platform more – and so the platform can sell advertisers on the value of advertising there.  The algorithms, therefore, intentionally filter out views that may be contrary to ours.  The problem is that this is exactly the kind of isolation that Sunstein explains is a prerequisite for extreme behavior.  So, even if we’re still going to the grocery store and interacting with others, our electronic lives may be intentionally biased towards people with similar, or at least not objectionable, opinions.

This becomes evil when we recognize that the repeated portrayal of others as sub-human is a part of every genocide.  Albert Bandura in Moral Disengagement and Philip Zimbardo in The Lucifer Effect speak to what can happen when other people are considered sub-human.  Yet this is exactly what isolating groups together does: it allows them to define the outside group as sub-human.

Exit Amplification

Another group polarization effect occurs as people elect to exit the group rather than remain.  In Exit, Voice, and Loyalty, Albert Hirschman explains that when people exit, they give up their voice.  Their voice, which might have been a mediating factor, is removed from the group, pushing the group further towards the extreme.  He explains further that when exit is effectively removed because the costs of exiting are too high, a person may disengage and voluntarily withhold their voice, resigned to the belief that their voice doesn’t matter.  This hopeless resignation may end in burnout, but that’s not generally a concern for the group.  (See Extinguish Burnout for more information about burnout and overcoming it.)

Drawing a Line at Asch

In Decision Making, Irving Janis and Leon Mann spoke of the research of Solomon Asch and how, in a simple experiment, he demonstrated that you could change people’s judgments of line lengths simply by including confederates who gave the wrong answer first.  This was part of the research that followed the Second World War, when everyone was trying to figure out how genocide was possible.  Subsequent research has shown that hearing others consistently give an incorrect answer doesn’t so much trigger an internal conflict as it actually alters the influenced person’s perception.

In the context of groups, when more people coalesce around an idea that may be objectively wrong, others are pulled along into that perception.  This is the heart of Janis’ concept of groupthink: groups tend to think the same way because of the persuasive effect the group has on its own members.

Boomerang

Sometimes when you try to push people in one direction, they move in the opposite direction.  In Nudge and Decision Making, we find that a small push in one direction can produce a larger response than necessary – and that response can move people towards extremism even when we’re pushing towards the middle.  While overcorrection is at the heart of being Antifragile, it also means we have to be careful not to make things worse.  Science and Pseudoscience in Clinical Psychology explains that many of the anti-drug-abuse programs from the 1980s made things worse; DARE in particular was classified as a potentially harmful intervention.

Memes

Memes are more than just funny images with captions on the internet.  Richard Dawkins coined the term “meme” in his 1976 book The Selfish Gene.  The fundamental concept is that ideas can replicate, mutate, and evolve – just like genes do.  Sunstein’s language is focused on the replication of ideas, particularly the rapid replication that he calls “social cascades.”  Think of Malcolm Gladwell’s book The Tipping Point – at some point, movements become self-sustaining.  Sunstein points out that these movements change folks: they go from holding beliefs and perspectives of their own to relying on the beliefs and perspectives that they believe others hold.

Said differently, your trust in another person climbs high enough that you’re willing to accept their beliefs and perspectives without question – and that can be tragic.  For instance, Jim Jones convinced his followers in the Peoples Temple to drink poisoned Kool-Aid – and to serve it to their children.  Most believed Jones so blindly that they didn’t question anything – not even mass suicide.  (See The Hidden Brain, Influence, and Split-Second Persuasion for more on the tragedy.)

Can I, or Must I?

The real problem with extremists of any kind is that the question a person asks themselves switches from “Can I accept this?” to “Must I accept this?”  This subtle shift has powerful ramifications, because the burden of proof climbs much higher to reach the top of “Mount Must.”  When our minds close and we become fixated on our existing perceptions – because they’re supported by our peers – we filter contrary information with such a high degree of “must” that we can safely exclude almost everything.  Of course, we don’t apply the same filter to the information we already believe.  We cling to it and accept the much lower bar of “can” – so that we can maintain our beliefs.

The best way to keep others from going to extremes is to read and understand Going to Extremes.