Do You Double Down?

In the 1950s, there was a religious group called the Seekers who believed that aliens were going to rescue them from a coming apocalypse. On a cold night in December 1954, they gathered on the street outside their leader’s home in Oak Park, Illinois, singing Christmas carols and waiting for a spaceship to land and escort them away (the aliens had instructed them to sing carols while waiting, which is a fantastic thing to do while waiting for this sort of thing).

They waited for 20 minutes for the “spacemen,” but [SPOILER ALERT] they were ultimately disappointed. Worse yet, this was the fourth time they had been stood up by the aliens.

You would think that this group would become discouraged and that their leader, Dorothy Martin, would disappear in humiliation. Instead, she did the opposite, continuing her claims for years to come with increasing conviction.

While most of us aren’t waiting on rooftops for E.T. to rescue us, we can hold on to our viewpoints in much the same way. Even when presented with clear evidence that should at least prompt an honest reassessment, we often double down on our opinion, becoming even more determined to defend our ideas whenever we perceive a threat to the way we view a particular situation or issue. After years of invested attention to the same ideas and the same kinds of voices supporting those ideas, it becomes difficult to consider even the slightest alteration. Psychological studies have shown a tendency for people to become more entrenched when encountering opposition. Social psychologists call this the boomerang effect: when faced with the cognitive dissonance of discovering data unfavorable to their view, people become even more extreme in their initial position.**

In the card game of blackjack, a player can “double down” by doubling their bet; in return, they receive exactly one more card. I’m no expert, but I do know that you should never double down in a situation where one more card would probably put you over 21 (in which case you would “bust”). A show of confidence when the player is not actually in a good position only makes them look foolish when the cards are turned over and the dealer easily wins.
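For the curious, here is a minimal Python sketch of that intuition (not from the original post; it assumes an effectively infinite deck, counts aces as 1, ignores soft totals, and the helper name bust_probability is just illustrative), estimating how often one more card would push a given hand total past 21:

```python
# Toy model of the blackjack "double down" intuition, under the simplifying
# assumptions noted above. Ten-value cards (10, J, Q, K) are four of the
# thirteen ranks, so a ten is four times as likely as any other value.
CARD_VALUES = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 10, 10, 10]

def bust_probability(hand_total: int) -> float:
    """Fraction of possible next cards that would push the hand past 21."""
    busts = sum(1 for value in CARD_VALUES if hand_total + value > 21)
    return busts / len(CARD_VALUES)

if __name__ == "__main__":
    for total in (11, 14, 16, 19):
        print(f"Hand of {total}: ~{bust_probability(total):.0%} chance the next card busts")
```

In this toy model, a hand of 11 can never bust on the next card, while a hand of 16 busts more often than not, which is roughly why doubling down looks bold in one case and foolish in the other.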

In the arena of ideas, this doubling down becomes especially pronounced in a more volatile, “us vs. them” environment, where members maintain an unquestioned belief in the group’s actions and ideas, no matter how valid critiques of the group might be.

This is a form of self-protection and likely has something to do with our primal tendency to feel safer in groups that share our views. We may sink ourselves deeper into a particular viewpoint to insulate ourselves from relevant facts that threaten our position, simply because we fear what life might look like apart from the safety of our like-minded horde. Left unchecked, that fear is much greater than the desire for a more rational viewpoint.

How can we elevate each other to more thoughtful consideration of ideas?  Here are a few difficult but effective things we can do:

1. Be cautious if your identification with a particular group starts to have a negative influence on how you view people outside of that group.  Being part of a social group is an important part of a fulfilled life, and our social connections have the potential to help us grow mentally, emotionally, and relationally to create a better world.  But if allegiance to a group becomes more important than treating others with respect and dignity, then that group identity starts to become toxic.

2. Remain open to new information, even if it threatens an established opinion you hold.  When you see a headline for an article that appears to question your basic understanding of an issue, do you swipe it away because the mental work of questioning your own views is too exhausting?  If so, start making it a regular habit to do that hard work instead of avoiding potentially “threatening” (but valuable!) input.  Even if you are not swayed, it can serve as a reminder that no viewpoint is infallible.

3. When discussing difficult topics, take a posture of curiosity toward those who hold opinions you do not agree with.  Instead of going on the defensive, ask questions about their views, not as a way to corner them, but to understand what path led them to where they are on that particular issue.

None of this is easy to do with consistency.  But with practice, we can learn to resist the more primal impulses to automatically double down on our ideas, even in a culture that is deeply divided and increasingly committed to groupthink.

**One such study was done by researchers who infiltrated our alien-seeking friends in the story above. The resulting book from that study is When Prophecy Fails, by Leon Festinger, Henry Riecken, and Stanley Schachter.


Some articles for reference:

How to Convince Someone When Facts Fail – Article at Scientific American

Why You Think You’re Right — Even If You Are Wrong – TED Talk by writer Julia Galef

Escalation of Commitment – Wikipedia page with more in-depth info on “doubling down” when we should be facing critique honestly
