Leveraging Our Differences

Democracy requires us to work on our biases — all of them


Friedman-Wheeler is the director of the Center for Psychology at Goucher College.

This year we will all need to decide who we want to lead our country forward. Making a decision of this sort requires a clarity of vision that is hard to attain. As humans, we all have psychological biases; we see the world not objectively as it is, but through lenses that distort our perceptions. In many cases, these are shortcuts our brains take in processing information. But they can be quite harmful.

We all have an obligation to work to overcome these biases; we cannot be informed, rational participants in our democracy without doing so.

Implicit biases have gotten increased attention in recent years, particularly as they pertain to race. Implicit racial bias can cause innumerable harms — from the injustices of being arrested at a Starbucks and being more likely to be suspended from preschool, to the spread of anti-Asian discrimination along with the coronavirus and the deaths of African Americans at the hands of law enforcement.

These types of biases, where we associate skin color with threat, are deadly. We must figure out how to rise above them. And that means all of us, not just law enforcement. These biases influence our thinking about major issues we face today. For example, experts suggest one reason universal health care seems so obvious in northern European countries is that people in Scandinavia tend to look alike. They are mostly white. White people in the United States, on the other hand, are less inclined to think of their non-white fellow citizens as deserving of the same health care they get, though many might not say so explicitly. Making these biases explicit might not cause everyone to rethink their position, but it might for some.


Clearer thinking, unclouded by psychological biases, would certainly seem desirable in making decisions about health care and other major questions facing our democracy today.

We have other biases, too, that we must check. Perhaps you've also heard of confirmation bias, a tendency to see new information through the "filter" of our already-existing beliefs. This one might seem fairly innocuous, but it unequivocally is not.

When Democrats in Congress got up, one by one, to talk about President Trump's wrongdoing, and then the Republicans said the president did nothing wrong, many were left wondering how these members could have heard the same witnesses and read the same evidence. That's the work of confirmation bias: We hold on to evidence that aligns with our beliefs and discard or discount anything we can't make fit. If you listened in as members of Congress marked up the articles of impeachment or prepared to vote on them, you know this is not a productive way to have a conversation: One person states their case, and then the next person, almost seeming not even to have heard their colleague, makes the opposite case.

Civil discourse, in this case, seems to be nonexistent. Let's take it upon ourselves to bring it back, in our own conversations, and to demand it of our leaders.

You may also have heard of cognitive dissonance, a powerful and pervasive form of biased thinking. This causes us to feel uncomfortable when our behaviors and our beliefs are inconsistent — for example, if you voted for Trump and then heard he did something you find morally reprehensible. The purely rational response might be regret for having voted for him. But that could be pretty uncomfortable, especially if you spent the last three years defending him, digging in your heels in conversations with your liberal neighbors.

And we, as humans, are not purely rational. It would, however, be preferable if our participation in our democracy were more rational.

Before this administration, I had no idea how dangerous cognitive dissonance could be. It causes half of us to defend a man who separates children from their parents. It causes others to declare our neighbors stupid or unthinking, rather than trying to understand that they, too, have thought processes — and biases. It's another reason so many of us seem to think it's okay to let our fellow humans go without access to health care: It's been that way our whole lives, so it must be okay. Otherwise, we must have been participating in an immoral system.

We must become aware of these biases and how they operate — not just in the abstract, but in our own heads. Democracy demands that we do so. Otherwise we will continue to talk (or shout) at each other and we will remain polarized and unable to make progress.

I am not suggesting we try to understand others' thought processes in order to excuse oppression, cruelty, corruption or racism. I am also not arguing there is such a thing as purely rational thinking, or that we are likely to get particularly close to that even with our best efforts.

But if we better understand our own and others' thought processes, we will have a better chance of changing minds and moving forward.

We all have these biases. It's how our brains work. Blame evolution, if you like. And then commit to making the effort, every day, to see where these biases might be influencing us and doing harm, and to try to think more clearly and rationally to mitigate those harms.