One of the most troubling aspects of this new hyper-digital era is our crumbling ability to communicate with people who disagree with us.

There is the sense that we are shouting across a void at one another — in completely different languages.

Our rebuttals are locked and loaded as we scan tweets, posts, and articles asking, “Is this person with us or against us?” Anyone who has questions about our party line is revealed as one of “them”, swiftly labelled as such, and excommunicated from our friends lists.

What is happening here?

It seems that we are splintering into groups rallied around completely different frameworks for what is in fact true, and who is at fault.

It has become crystal clear that there are tremendous injustices and complex societal challenges ahead of us. While we must absolutely stand up for what we believe in, without a serious course correction in our ability to communicate across divided lines, and to say I disagree in a smart way, we are in serious trouble.

Common Mistakes When We Communicate

In this article, we will explore three common ways our minds make us feel that we are absolutely right, regardless of whether or not we actually are, and how this contributes to the fracturing of our communities.

  • First, we will look at how we tend to believe things that confirm what we already believe.
  • Then we will consider the influence of group dynamics.
  • Finally, we will examine how even our memories are altered to support our narratives.

The simple fact is that everyone is biased to a certain degree. Our brains are wired to operate in an unfathomably complex world. In order to keep us alive, they have evolved to be reductionistic in nature. If we weren’t capable of filtering information into manageable chunks and cohesive narratives, we would be completely unproductive and in a permanent state of existential crisis.

Heuristics, or mental shortcuts, allow us to solve problems and make judgements more efficiently. They shorten decision-making time, helping us to function without writing peer-reviewed articles for every position we take. While heuristics are absolutely necessary, without significant self-awareness and humility, these mental shortcuts can be problematic. This is because, in their simplicity, they are prone to bias.

If our orienting narratives are in fact flawed and we continue to shortcut beliefs and decisions around them, we become vulnerable to misinformation.

If they hold truth but are entirely too narrow in scope, they risk oversimplifying complex issues into mere symbolism lacking substantive utility.

The Scientific Reasons We Go to Extreme Lengths to Justify Our Opinions

According to Francis Bacon, the father of the scientific method, “human understanding, once it has adopted opinions…draws everything else to support and agree with them…and therefore our first conclusion colours and brings into conformity with itself all that come after.”

Simply put, we see what we want to see. This phenomenon is called confirmation bias, and it has been extensively documented by psychologists.

We naturally consume information which bolsters our pre-existing beliefs, while avoiding information that questions or contradicts them. We do this to maintain our narratives. Our rightness. Our identity.

It can be enormously uncomfortable, even painful, to question our beliefs about reality. We will go to extreme lengths to justify ourselves rather than face cognitive dissonance. This is why, for many of us, empirical evidence can prove largely ineffective at changing our perspectives.

We bypass clear contradictions with hashtag catchphrases.

One way that we trick ourselves is by holding different kinds of information to different standards. We ask, “Can I believe this?” of information that confirms what we already believe, while asking “Must I believe this?” of evidence that contradicts it. These are two vastly different questions, requiring significantly different amounts of cognitive and emotional effort. Ultimately, our flawed positions lead to flawed actions that can seriously disrupt our own mental health or our relationships.

The Major Reason for the Decline of Critical Thinking and the Rise of Radicalization

Compounding the problem is the fact that our need for certainty is amplified in times of chaos. When our world has been turned upside down we desperately need some way — any way to compartmentalize what is happening. To bring some sense of internal order to our minds.

This is why even the most ludicrous conspiracy theories gain a foothold in times of crisis.

Additionally, the choices about which pieces of information we consume are being made for us with more and more frequency. With the rise of search engines and social media algorithms, confirmation bias has been amplified and exploited by super-powerful AI. The companies responsible are primarily motivated by one very specific currency — our attention.

Much more powerful than the systems programmed to beat us at chess, these AI overlords are optimized for attentional efficiency. They do not care whether the information they are selecting for is real or not. Useful or not. Harmful or not.

Platforms that capture our attention do so because they have hijacked our minds. Information that confirms our biases keeps us scrolling by making us feel good about ourselves. Information that outrages us takes over our limbic systems, flooding us with stress hormones, and keeping us clicking.

The algorithms have also found a way to send us down lengthy rabbit holes, which keeps us online longer. They do this by feeding us more extreme versions of what we already believe. These systems have directly contributed to the decline of critical thinking and the rise of misinformation and radicalization.
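To make that feedback loop concrete, here is a minimal sketch in Python. It is a hypothetical illustration, not any platform’s actual code: the post attributes, the weights, and the update rule are all invented for the example. The point is simply that a ranking score built from confirmation and outrage, with no term for truth or harm, produces exactly the drift described above.

    from __future__ import annotations
    from dataclasses import dataclass

    @dataclass
    class Post:
        text: str
        extremity: float          # 0.0 = neutral, 1.0 = maximally provocative (invented scale)
        confirms_user_view: bool  # does it flatter what this reader already believes?

    def engagement_score(post: Post) -> float:
        # Confirmation feels good; outrage seizes the limbic system.
        # Note what is absent: accuracy, usefulness, and harm never enter the score.
        confirmation = 0.5 if post.confirms_user_view else 0.0
        outrage = 0.5 * post.extremity
        return confirmation + outrage

    def rank_feed(posts: list[Post]) -> list[Post]:
        # The feed simply shows the highest-scoring posts first.
        return sorted(posts, key=engagement_score, reverse=True)

    def after_reading(user_extremity: float, post: Post) -> float:
        # Each provocative post consumed nudges the reader's baseline upward.
        # The "rabbit hole" is this loop run thousands of times.
        return min(1.0, user_extremity + 0.1 * post.extremity)

Run over any mix of posts, this toy ranker reliably floats confirming and enraging content to the top, and each pass through after_reading moves the reader a little further toward the extreme end of the scale.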

Why the Communities You Are In Influence How You Communicate

And don’t kid yourself.

This doesn’t just happen to “them”. These systems are designed to affect human minds across the spectrum. Whichever team you belong to, it is guaranteed that a growing number of people who vote the same way as you are being radicalized. Which leads us to a major factor: the influence of our communities.

We are deeply social creatures and the desire for human connection lies at the core of who we are. According to Maslow’s Hierarchy of Needs, the need for belonging directly follows our most basic survival needs. During times of chaos, we feel this need for belonging even more acutely.

While participating in a community of like-minded people can be one of the most rewarding aspects of our lives, it also has a shadow side.

Our ancestors existed in relatively small tribes and required a significant amount of group cohesion in order to flourish. As such, our minds evolved mechanisms to promote group consensus, which helped to facilitate interpersonal harmony.

Tribes with more in-group cohesion tended to do better than tribes without. There is a significant body of research examining the impact of this kind of group selection on the expansion of our species.

An Old Communication Mechanism that Still Has Impact Today

Anthropologists believe that one of the key ways we increase group cohesion is by having a common enemy. We require scapegoats to project our interpersonal rivalry onto, expelling them from our tribes or sacrificing them on the altars of our certainty.

And the clincher is that when we do this, we always believe we are doing the right thing. Scapegoats inherently seem guilty and deserving.

Looking back at history, or at our social media feeds, we see this mechanism playing out very clearly. If we have a collective enemy, we are less likely to fight with each other as our negative energy is directed towards “them”.

A nation is never more united than when it is at war.

If our group is rallied around one very specific philosophy and someone questions that philosophy, or brings an alternate viewpoint that challenges the narrative, our perceived “purity” demands their expulsion.

This age-old mechanism plays out in typical human fashion.

The Biases That Keep Us Vulnerable to Erroneous Beliefs

Furthering the problem of confirmation bias, our minds have evolved to more easily believe information coming from someone perceived to be within our own group. If the information is coming from an enemy camp, we have many ways of quickly disregarding it — and them.

We typically wage ideological attacks against straw-man versions of their arguments while avoiding the complexity of their best ones. Or we simply disregard them entirely by labelling them in a way that discredits their perspective.

When something happens that reveals an error in our group narrative, we adjust the narrative. We then find some “proof” for the adjusted narrative, post it online, and create enough attention around it to drive consensus and a new party line.

By blaming it all on “them” we keep our people united. Superior. And on the offensive.

Because of this in-group bias, we become more vulnerable to erroneous beliefs within our own communities and less likely to thoroughly examine alternate points of view.

It is obvious that as we form more insulated groups, we run the risk of creating echo chambers of misinformation or flawed philosophy, unable to withstand the discomfort of complexity. Or civil conversation.

The Odd Relationship Between Memory and Bias, and Its Effect on How We Communicate

The final area I want to explore is the odd relationship between memory and bias.

We typically think that our memories are video recordings, stashed away in some cognitive corridor. If we have the map to that part of the brain, we can retrieve the memory intact and unaltered.

However, this is not the case. When we retrieve long-term memories from our brains and, after using them, place them back into our memory bank, we risk altering them. We reconsolidate them. Psychologists believe that this reconsolidation process is vulnerable to the biases of our present.

Without our awareness, we change our memories to fit with our present point of view. To maintain a consistent narrative. Which feels better.

We all desire an internal logic to keep us on track. To a certain degree this is necessary. Maintaining a consistent narrative helps us handle the darkest existential questions lurking within.

It can be more uncomfortable for us to realize that we have changed our minds than to unconsciously shift how we remember things. This shifting tendency is called consistency bias and is one of the blind spots of our subjectivity.

A Key to Saying I Disagree in a Smart Way

Related to this is egocentric bias, the tendency to change how we remember the past in order to make us look good to ourselves and to others. If memory needs to be altered a little bit in order to maintain our ego structure, the reconsolidation mechanism can take care of that for us.

This process can cause us to believe that we always viewed things a certain way, relieving us of contradiction or guilt. We may even misremember someone who has ended up on another philosophical team, focusing on whatever helps us place them where we want them to be.

  • What do we do then, when our minds so easily fall victim to bias?
  • How do we become critical thinkers with a developed capacity to question our own groups?
  • What can we do about the flawed consistency of our own memories?

Like almost anything, I believe that the answer is practice.

In order to combat the negative effects of personal and collective biases, we must be willing to continually place ourselves into postures of humility and states of discomfort.

The truth is, we rally around different narratives and different values, much of which is determined by our different personalities. It’s important to remember that it’s not just the bad ones who put on blinders.

How We Build the Ability Required for Complex Thinking and Difficult Conversations

As social psychologist Jonathan Haidt puts it, “We think the other side is blind to truth, reason, science, and common sense, but in fact, everyone goes blind when talking about their sacred objects.”

Studies indicate that people vary in their level of intellectual humility, or their willingness to recognize that their beliefs may be incorrect. Someone with a higher level of intellectual humility is much more likely to consider the views of someone from another camp. Someone with lower intellectual humility is more likely to dismiss their opponents immediately.

The good news is that the ability to hold the tension required for complex thinking or difficult conversations is expandable. Studies have made it clear that certain amounts of regular stress build strength and resiliency. Sort of like lifting weights. This applies to our critical thinking as well.

As many are becoming aware, we are enmeshed in a dangerous new world of narrative warfare. Our polarization is profitable.

If we are to move through this endless shouting match and actually find a way to hear each other, we must become intellectually humble. We must become curious about our own biases. We must develop the habit of exposing ourselves to alternate perspectives.

We must stop barricading ourselves in ivory towers while hurling our best furniture down upon our enemies.

Summary

As therapy modalities and de-radicalization efforts have proven, and as the saints of social change have demonstrated: even the people most enmeshed in harmful belief structures can be restored. It typically takes people who are willing and able to reach across the divide, people who know how to say I disagree in a smart way.

We can still stand up firmly for what we believe in, but when we do, we must be clear-headed and aware of our instinctual tendencies to simplify, categorize, and cast out.

Otherwise, in our rightness, we’ll only foster hatred.
