The brain is designed with blind spots, optical and psychological, and one of its cleverest tricks is to confer on its owner the comforting delusion that he or she does not have any.
Carol Tavris, Elliot Aronson – Mistakes Were Made (But Not by Me)
Every day, an overflow of thoughts dashes through our minds as we adapt as efficiently and quickly as possible to ever-changing environments. We can’t process all the streams of information around us at once, so we turn to mental shortcuts, or heuristics (judgement or problem-solving approaches that favour a good-enough solution when finding the optimal one is impossible or impractical). However, during this process, we might make erroneous decisions.
For example, one way to cope with the never-ending and conflicting stream of information that affects how we make a decision is to accept new information that confirms our previous knowledge and ignore data that does not support our initial opinion. This way of thinking simplifies our information processing so that we can make sense of what is happening around us. Unfortunately, it might lead to confirmation bias. And this is where heuristic-based thinking falls short: relying on it can lead to biases.
Etymologically, bias entered the English vocabulary through the French language (biais – “a slant, a slope, an oblique”) and via the game of bowls, where it referred to balls made with a greater weight on one side. Thus, metaphorically, bias is “a one-sided tendency of the mind”.
Some biases are related to memory. For example, consistency bias is remembering our past attitudes or behaviours as more similar to our current ones than they actually were. Hindsight bias, or “I-knew-it-all-along”, is the tendency to see past events as predictable based on our present knowledge. Rosy retrospection is remembering the past as having been better than it was. The spacing effect means we remember information better if we are exposed to it repeatedly over a long period rather than in a short one.
Other biases are related to attention (paying more attention to certain stimuli). Anchoring bias is the tendency to rely too heavily on the first piece of information when making a decision (as the saying goes, the first impression matters the most).
Wikipedia’s list of cognitive biases contains hundreds of entries, and one way to categorize them is shown in the image below:
The Cognitive Bias Codex – 180+ biases, designed by John Manoogian III Image Credit: Wikipedia
As mentioned, biased thinking is not a malfunction per se, as it allows us to adapt quickly and efficiently to our surroundings. Distributed practice, for example, is a learning method that builds upon the spacing effect with impressive results. I wrote more about this technique in Mastering a Crucial Skill for Adaptation: Learning How to Learn.
Concerns about cognitive biases arise when we take our preconceptions, values, or affiliations (political, national, religious, etc.) at face value and biased thinking spills into biased behaviour. We only have to look at our shared history to see bloody examples of tribalism, discrimination, prejudice and abuse. And in the future, biased AI will exponentially warp how we see information, with horrible implications. For more details, check AI’s Impact on the Future of Jobs.
Biases are rather slippery, as they are usually subconscious. The first step we can take is to pay attention to how we think and react. How does this action make me feel? Do I feel rage, triumph, denial? Why?
Between stimulus and response, there is a space. In that space is our power to choose our response. In our response lies our growth and our freedom.
Viktor Frankl, Austrian psychiatrist and Holocaust survivor
As Julia Gillard, former Prime Minister of Australia (you can watch a famous speech of hers here), told writer Mary Ann Sieghart in Sieghart’s book The Authority Gap:
We’ve never lived in an environment free of any stereotyping. We’ve never lived with true gender equality. So even young people who are very highly sensitized, and don’t want to be discriminating on the basis of gender or indeed on the basis of anything else, can’t shed all of that social conditioning just through an act of will. You can’t say “I’m a feminist”, and somehow all your social conditioning is gone. We need to be second-guessing what’s happening in our brains, so we don’t just give in to it.
There is a famous saying attributed to the ninth-century Buddhist monk Lin Chi: “If you meet the Buddha on the road, kill him.” A secular interpretation of this parable is that we should start questioning when we think we have found all our answers.
So, how do we ask better questions?
It’s often surprisingly easy to find bias, if you look. Who was omitted or disempowered or disadvantaged when the cultural practice was formed? Who didn’t have a voice? Who wasn’t asked their view? Who got the least share of power and the largest share of pain? How can we fill in the blind spots and reverse the bias?
Melinda French Gates – The Moment of Lift
Another way of killing our Buddha is to have someone who can objectively advise us when we give too much power to our beliefs and become less capable of adapting to contradictory information.
Our greatest hope of self-correction lies in making sure we are not operating in a hall of mirrors, in which all we see are distorted reflections of our own desires and convictions.
We need a few trusted naysayers in our lives, critics who are willing to puncture our protective bubble of self-justifications and yank us back to reality if we veer too far off. This is especially important for people in positions of power.
Carol Tavris, Elliot Aronson – Mistakes Were Made (But Not by Me)
An example of a trusted naysayer is found in an article about Katalin Karikó, the leading researcher behind the Pfizer/BioNTech Covid-19 vaccine. Dr David Langer, a neurosurgeon who has worked with Karikó, reminisces about Karikó’s ways of experimenting:
Langer thinks it was Kariko who saved him – from the kind of thinking that dooms so many scientists. Working with her, he realised that one key to real scientific understanding is to design experiments that always tell you something, even if it is something you do not want to hear. The crucial data often come from the control, he learned – the part of the experiment that involves a dummy substance for comparison.
“There’s a tendency when scientists are looking at data to try to validate their own idea,” Langer says. “The best scientists try to prove themselves wrong. Kate’s genius was a willingness to accept failure and keep trying, and her ability to answer questions people were not smart enough to ask.”
After all, as famous physicist Richard Feynman said: “The first principle is that you must not fool yourself, and you are the easiest person to fool.”
A man travels many miles to consult the wisest guru in the land. When he arrives, he asks the great man: “O wise guru, what is the secret of a happy life?” “Good judgment,” says the guru. “But, O wise guru,” says the man, “how do I achieve good judgment?” “Bad judgment,” says the guru.
Nobody is exempt from biases. But we do have hope for change if we try to spot our biases and actively work to reduce or fix them. And we will fail multiple times over. It is a remarkable feat to pause, think, ask, listen and take apart the bricks of our old egos to build ourselves afresh, anew. We need opportunities, money, support, and stamina to start this task. And above all, we need compassion for ourselves when we make mistakes. Still, even though we tend to adhere to malfunctioning heuristics, that is no justification for not trying.
Sometimes, it is the prison of our mind that we need to leave behind.
Previously published at https://www.roxanamurariu.com/how-to-reduce-biased-thinking/