
People Lie. Even When They Think They’re Telling the Truth

by Alena Belova, February 20th, 2025

Too Long; Didn't Read

Cognitive distortions are traps our own brain sets for us. This article looks at how they ruin our lives (and our work, of course), which ones are most common - confirmation bias and herd instinct among them - and, most importantly, how to deal with them.


A meeting, an important project, a tense atmosphere. Then one of the managers, with the air of an expert, declares: “We knew from the very beginning that this approach would work!” Everyone around nods in agreement, admiring his insight. But you remember that just six months ago this same person was pushing for a completely different idea! What’s going on? Is he lying? Is he a hypocrite? Most likely not. Most likely, these are the tricks of cognitive distortions - cunning traps that our own brain sets for us. And believe me, absolutely everyone falls into them, not only at work but also in everyday life.


This article is about exactly this kind of “sincere lie” that imperceptibly creeps into our thoughts, decisions, and even simple conversations with friends. Let's figure out where these cognitive “glitches” come from, how they ruin our lives (and work, of course), and most importantly - how to deal with them.



The Brain: A Genius of Saving and a Master of Deception

Our brain is, of course, a powerful thing. A supercomputer created by evolution itself! But it is designed not to search for truth, but to survive. Millions of years of evolution have taught it to save energy and process information at lightning speed. And this is, of course, cool. But the coin has a flip side: the same mechanisms that help us survive often serve up cognitive distortions - a kind of "bug" in our thinking.


Let's look at the main "energy-saving" tricks of our brain:

  1. We pinch pennies (that is, neurons): heuristics are our faithful (though not always reliable) assistants. The brain is a terrible lazybones. Why should it carefully analyze the situation every time when it can use a proven "template"? This is where heuristics come from - simple rules that allow you to make decisions quickly. It seems convenient, but they are often exactly what leads us into cognitive thickets.
  2. Emotions are color filters for reality. Our emotions are like tinted glasses: they color everything we see. Fear makes us inflate danger to the size of Godzilla, while joy, on the contrary, dulls our vigilance.
  3. “Like everyone else, so am I”: social pressure and herd instinct. We are social creatures, and it is important for us to be part of a group. So we often adjust our beliefs to the opinion of the majority, even if inside we feel that something is wrong. At work, this can manifest as blindly following the “general opinion”, which may in fact turn out to be complete nonsense.
  4. Habit is second nature (and sometimes a very harmful one). Every time we make a decision, a new neural path is worn into our brain. The more often we make the same decision, the wider and more comfortable this path becomes. Eventually we act automatically, even if the decision was wrong from the very beginning.


So, a “lie” is when a person intentionally distorts the facts. And cognitive distortions are when the brain gives us a distorted picture of reality, and we sincerely believe it. The difference, as they say, is obvious. Let's delve into the most common and insidious cognitive biases.

1. Confirmation Bias: Reality Filter


We all play favorites with our own ideas. Confirmation bias is exactly about this weakness of ours. In simple terms, it describes situations where we look for, see, and remember only what matches our beliefs. And everything that contradicts them - well, that’s just an accident or a mistake.

Here are some examples from different areas

  • In business: Imagine an entrepreneur who is confident that his new product will “take off”. He will enthusiastically collect positive reviews from friends and acquaintances, but completely ignore critical comments from potential clients. He will only see what confirms his optimistic forecast.
  • In politics: Supporters of a certain politician will happily read news and articles that praise him, and will be skeptical of any criticism of him. They will interpret ambiguous events in favor of their candidate.
  • In everyday life: People who believe in horoscopes will remember only those cases when the predictions "came true" and forget about those when they turned out to be wrong.


Sounds familiar, right? Each of us does this, at least sometimes. This does not mean that we consciously lie to ourselves. Our brain is simply designed this way: it is more comfortable when everything matches what we already know and believe.

About the brain and research

Research shows that when we receive information that confirms our beliefs, pleasure centers in our brain are activated, in particular the nucleus accumbens. This is the same area that is activated when we receive a reward or pleasant sensations. It turns out that confirmation of your rightness is literally pleasant for the brain! It is like checking a box: "I was right!" And, of course, the brain remembers this pleasant experience and strives to repeat it in the future.


Nickerson (1998) conducted a large-scale review of confirmation bias research and showed how much it affects a wide range of areas of our lives - from scientific research to everyday judgments. He collected a huge amount of data, demonstrating how people interpret ambiguous information to support their biases.


A classic example is a study in which supporters and opponents of the death penalty were asked to read the same scientific papers on the effectiveness of this punishment. And guess what? Supporters found arguments in favor of the death penalty in these papers, while opponents found arguments against it! Everyone saw what they wanted to see and became more convinced of their position. This is a clear demonstration of confirmation bias in action.


Confirmation bias can lead to serious errors in decision-making, especially in business and science. We can miss important facts, underestimate risks, or make bad decisions based on incomplete or distorted information.

2. Hindsight Bias: “I Told You So!” (or Didn’t I?)


Have you ever had this: an event has happened, and everyone around you starts shouting, “It was obvious! I knew it!” And you sit and think, “Wait, just yesterday we all had our doubts.” This is hindsight bias – a cognitive error due to which we begin to consider events that have already happened to be inevitable and predictable, even if no one could have imagined such an outcome before.

An example from the life of a startup

Imagine that a startup has launched a new app, and it suddenly “takes off.” Everyone starts telling how they “believed in success from the very beginning,” how “it was obvious,” and how “they didn’t have the slightest doubt.” But if you dig deeper and remember the initial discussions, it turns out that there were disputes, doubts, and technical problems that seemed insurmountable. It's just that now that success has already occurred, all these difficulties seem to have evaporated from memory, and in their place there is confidence that everything was predetermined.

About the brain and research

Our brain is a tricky thing. When it learns the outcome, it seems to "rewrite" the past, adjusting it to the present. It creates the illusion that we always knew how everything would end. This happens because after the outcome is known, the brain receives a "hint" that makes it easier to remember and interpret the events that preceded this outcome.


Baruch Fischhoff, back in 1975 (yes, half a century ago!), showed in his experiments how knowledge of the outcome of an event distorts our memories of how likely this outcome seemed before it happened. Participants were asked to estimate the probability of various outcomes of historical events and then told the actual outcome. After that, they were asked to remember their initial assessments. It turned out that after the outcome became known, people significantly overestimated their initial predictions, as if they “always knew it.”

3. Illusory Truth Effect: The Power of Repetition


We’ve probably all heard the phrase “repetition is the mother of learning.” And it’s true. But there’s a nuance: repetition can make even the most outright nonsense seem credible. This is called the “illusory truth effect” or “repetition effect.” The gist of it is simple: the more often we hear a statement, the more likely we are to believe it to be true, even if we initially knew it to be false.

An example from the business world

In the development world, you can often hear that “Agile is a silver bullet.” This phrase is repeated at conferences, in blogs, at meetings. And then teams, without even figuring out whether this approach is right for them, begin to blindly follow Agile because “everyone does it.” And then they wonder why the results don’t match their expectations.

About the brain and research

The thing is that repetition increases the so-called “processing fluency” of information. Simply put, the brain processes information that is already familiar faster and easier. And this ease of processing is subconsciously perceived as a sign of truth. The brain seems to say, “Since this is so easy to remember and understand, it must be true.”


Research confirms this effect. Begg, Anas, and Farinacci (1992) experimentally proved that repeating statements, even those labeled as false, increases their perceived truthfulness over time. Participants were shown a series of statements, some of which were labeled as false. After several repetitions, even the statements labeled as false began to seem more truthful.


Hasher, Goldstein, and Toppino (1977) also studied this effect and showed that the effect of repetition on the perception of truthfulness persists even after some time. That is, even if you heard a false statement a long time ago, encountering it again can make you believe it.

4. Confabulation: Filling in the Gaps of Memory

It happens that a person tells a story full of details - a story that never actually happened. He describes the events so vividly and convincingly that it is hard to suspect a catch. But this is not a lie in its purest form. This is confabulation. In simple terms, it is when the brain fills the gaps in memory with fictitious events, and the person sincerely believes in their reality.


Confabulation differs from an ordinary lie in that the person does not realize that his memories are fiction. He does not try to deceive intentionally; his memory simply fills in the missing fragments to create a complete and logical picture.

An example from life (not only for managers)

Imagine that you and your friends are discussing an old hiking trip. You remember that it was fun, but the details have been erased from your memory. And then one of your friends starts telling you how you heroically saved a lost raccoon, although there was no raccoon in sight. He sincerely believes this story because his brain “filled” the gap in his memory with a more vivid and memorable (albeit fictitious) event.

About the brain and research

Research shows that damage to the frontal lobe of the brain can increase the tendency to confabulate. The frontal lobes play a key role in controlling memory and distinguishing between real and imagined events. But confabulations can also occur in completely healthy people, especially when the memory of an event is incomplete or fragmentary.


To understand the mechanism of confabulation, it is worth turning to the Source Monitoring model developed by Johnson, Hashtroudi, and Lindsay (1993). This model explains how we determine the source of information and why memory errors occur.


Source Monitoring describes the cognitive processes that allow us to answer the question: “How do I know this?” Did I see it with my own eyes, hear it from someone, read it in a book, or just make it up? The model suggests that we do not store “labels” in memory indicating the source, but rather reconstruct it based on various characteristics of the memory itself.


What is this reconstruction based on?


The reconstruction process relies on heuristics (rules for thinking quickly) and characteristics of the memory itself, such as:

  • Sensory details: vivid visual images, sounds, smells, and other sensory sensations associated with the memory.
  • Contextual information: where and when the event occurred, what people were present.
  • Semantic content: the meaning and significance of the information.
  • Emotional coloring: the emotions associated with the memory.


For example, if a memory is filled with vivid colors, sounds, and details, we are more likely to attribute it to personal experience. If it is abstract and lacking specifics, we may assume that we heard about it from someone else or read about it.
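
As a rough illustration, you can think of this reconstruction as a scoring heuristic over those characteristics. The sketch below is just a toy in Python, not the actual Source Monitoring model: the feature names, weights, and thresholds are invented purely for demonstration.

```python
# Toy illustration of a source-monitoring heuristic: a memory with
# rich sensory and contextual detail gets attributed to direct
# experience; a sparse one to hearsay or imagination. The features,
# weights, and thresholds are invented for demonstration only.

from dataclasses import dataclass

@dataclass
class Memory:
    sensory_detail: float       # 0..1: vividness of images, sounds, smells
    contextual_detail: float    # 0..1: where/when/who
    emotional_intensity: float  # 0..1: how strongly it is felt

def attribute_source(m: Memory) -> str:
    score = (0.5 * m.sensory_detail
             + 0.3 * m.contextual_detail
             + 0.2 * m.emotional_intensity)
    if score > 0.6:
        return "personal experience"
    if score > 0.3:
        return "heard or read somewhere"
    return "probably imagined"

vivid = Memory(sensory_detail=0.9, contextual_detail=0.8, emotional_intensity=0.7)
vague = Memory(sensory_detail=0.2, contextual_detail=0.1, emotional_intensity=0.6)

print(attribute_source(vivid))  # "personal experience"
print(attribute_source(vague))  # "probably imagined"
```

Notice that a vivid dream would score high on a heuristic like this too, which is exactly how a source-monitoring error can turn into a confabulation.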


How do Source Monitoring errors lead to confabulations?

Source monitoring errors occur when we incorrectly identify the origin of a memory. These errors are the basis for confabulations. For example, a person may mistake a dream for reality or attribute someone else's idea to themselves, sincerely believing that it is their own memory. The Source Monitoring model gives us the key to understanding how and why these "fakes" arise in our memory.

5. Motivated Reasoning: When "Want" Wins Over "Can" (and Common Sense)


Do you know the feeling when you really want to believe in a certain outcome, and everything around you starts to look like confirmation of that belief? In business (and not only there), this is called motivated reasoning. The essence is simple: we adjust the facts to the desired result, ignoring or distorting everything that contradicts it. "Want" overshadows "can", and objectivity goes on vacation.

Example

Imagine a product manager who is fired up by the idea of using a new, "hyped" technology in a project. He is sure that this is the very "silver bullet" that will solve all problems and bring the company millions. Risks? Nonsense! Warnings from colleagues? They are simply jealous! As a result, the product manager begins to see only those articles, studies, and examples that confirm his belief. And all the "inconvenient" facts are either ignored or interpreted in a favorable light.


Each of us has fallen into such a trap at least once. It is important to understand that this is not just optimism or excessive self-confidence. This is a very specific cognitive process that can lead to serious mistakes and losses.

What Science Says (and Ziva Kunda)

Back in 1990, Ziva Kunda conducted a study that has become a classic, which clearly showed how our desires affect thinking. Kunda proved that when we have a strong motivation to come to a certain conclusion, we begin to use reasoning strategies that are most likely to lead us to this conclusion, even if these strategies are not entirely logical or objective.


In her experiments, Kunda used different methods to induce one or another motivation in the participants. For example, in one experiment, participants were asked to read an article about the health effects of caffeine. One group was told that the study showed that caffeine was harmful, while the other group was told that caffeine was beneficial. Participants who liked coffee were motivated to believe in the benefits of caffeine, while those who didn't like it were motivated to believe in its harm. The results showed that motivated participants didn't just ignore "inconvenient" information.


On the contrary, they were much more subtle:

  • They looked for confirmation. They spent more time reading the parts of the article that supported their desired conclusion.
  • They critically evaluated contradictions. They came up with counterarguments and explanations for why “inconvenient” facts were irrelevant or were the result of flaws in the research.
  • They used “rules” and “strategies”: They used logical tricks and dubious reasoning to justify their position. For example, they might find fault with the research methodology if the results contradicted their beliefs.


In other words, participants “adjusted” reality to their expectations, using all the cognitive tools available to them.

6. The Dunning-Kruger Effect: The Worse, the More Confident


We have all met people who, having barely mastered the basics of something, begin to position themselves as experts. Real professionals, by contrast, are often modest and doubt their abilities. This is not just a coincidence - it is a manifestation of the Dunning-Kruger effect, one of the most famous cognitive biases.


The essence of the effect is simple: people with a low level of qualification in a certain area tend to overestimate their abilities, while people with a high level of qualification, on the contrary, tend to underestimate them. It turns out to be a funny curve: the less you know, the more it seems that you know everything. And vice versa.

Real-life examples

  • A beginner programmer: He has mastered the basics of HTML and CSS and already considers himself a web development guru, ready to take on complex projects that he most likely will not cope with. He is sure that "everything is simple" and does not understand how much more needs to be learned.
  • An experienced marketer: Having carried out a successful advertising campaign, he chalks the success up to luck or coincidence, not realizing that his experience and professionalism are behind it. He tends to doubt his abilities and is afraid of new challenges.

What Science Says (and Dunning and Kruger)

David Dunning and Justin Kruger conducted a series of clever experiments that clearly demonstrated this effect. They gave participants tasks in different areas (humor, grammar, logical thinking) and asked them to evaluate their results.


The results were striking.

Not only did incompetent participants show poor results, but they also could not adequately assess their incompetence. They believed that they did much better than they actually did. Moreover, they could not assess the competence of other participants, believing that everyone was at about the same level.


Competent participants, on the contrary, underestimated their results, believing that others did as well as they did. They did not realize how much their knowledge and skills exceeded the level of most people.


The key conclusion of the study was that metacognitive skills - the ability to recognize your own knowledge and skills - are necessary for adequate self-assessment. Incompetent people simply don’t have these skills, so they can’t understand how incompetent they are.

Imposter Syndrome: The Other Side of the Coin

It’s important to note that the Dunning-Kruger effect is closely related to imposter syndrome, a phenomenon in which competent people, on the contrary, feel insecure about their abilities and are afraid of being exposed as “imposters.” They tend to attribute their successes to luck, chance, or help from others, rather than to their own knowledge and efforts.

7. Bias Blind Spot: Other People’s Mistakes Are More Visible Than Your Own


Do you know that feeling when other people’s shortcomings are so obvious, but your own somehow slip away? There’s a term in psychology that explains this: “bias blind spot.” This is when we clearly see how cognitive biases affect other people, but we don’t notice them at all in ourselves. It seems that "they" have flawed thinking, while we have crystal clear and objective thinking.

An example from office life

Imagine a project manager who constantly criticizes a colleague for his commitment to outdated development methodologies. He considers him a conservative who is holding back progress. At the same time, the project manager himself may stubbornly use familiar tools, even if more effective and modern alternatives have long appeared. He simply does not notice his own inertia, but he clearly sees it in others.

Why and about research

The fact is that we only have access to our own inner thoughts and feelings. We know why we made this or that decision and what our motives were. About the motives of other people we can only guess, based on their behavior. Therefore, it seems to us that our decisions are based on rational analysis, while the decisions of others are based on prejudices and stereotypes.

In addition, we tend to consider ourselves "above average" in many parameters, including rationality and objectivity. This also contributes to the emergence of a "blind spot."


Emily Pronin, Daniel M. Lin, and Lee Ross (2002) introduced the concept of a "bias blind spot" and conducted a series of studies that clearly showed how people tend to consider themselves less susceptible to cognitive biases than those around them.

In one of their experiments, participants were asked to rate how susceptible they and other people were to various cognitive biases. The results showed that participants consistently rated themselves as less susceptible to biases than the average person. This effect was observed even when participants were given a detailed description of these biases.

8. Attribution Error: A Mote in Someone Else's Eye


We all tend to judge others by their actions, often without thinking about what could have influenced their behavior. We see a person late for a meeting and immediately conclude that he is unreliable and unpunctual. But anything could have happened to him: a traffic jam, a broken alarm clock, an urgent call from loved ones. This is the fundamental attribution error - the tendency to explain other people's behavior by their personal qualities, ignoring external circumstances. But we usually explain our own behavior by the situation: “I was late because of traffic jams,” and not because I am disorganized.

My favorite example

Imagine a situation: you are conducting an interview, and the applicant answers the questions unclearly, looks nervous, constantly adjusts his clothes. Having succumbed to the attribution error, you can conclude that he is underqualified, uninterested in the job, or even incompetent.


But think about it: maybe he is simply uncomfortable in his new shoes, and all he can think about is the moment when he can finally take them off? Or the room is too hot? Or a million other reasons that we are not aware of. If you ignore these situational factors - and forget that the applicant did a great job on the test task, found the office, and wasn't even late - you risk drawing an erroneous conclusion about his personal qualities and missing out on a good specialist.

How does it work?

We tend to pay more attention to people than to the situation. When we see how someone acts, we automatically focus on the person himself, his appearance, the way he speaks. External circumstances seem to fade into the background. This happens because it is easier for us to explain a person's behavior by their inner qualities than to understand the complex intricacies of the situation.


A classic experiment conducted by Jones and Harris clearly demonstrates the fundamental attribution error. Subjects were asked to read essays written by other people with a predetermined position (for or against Fidel Castro). The most interesting thing is that the subjects knew that the authors were writing the essays on assignment, that is, they did not have freedom of choice. But even knowing this, the subjects were still inclined to attribute the corresponding beliefs to the authors of the essays. That is, they ignored the situational factor (the assignment to write an essay on a certain topic) and made a conclusion about the personal qualities of the authors (their real attitude towards Fidel Castro).

9. Framing: How to Present Information


Have you ever seen two advertisements for the same product? The first one says: "90% of our customers are satisfied with the result!" The second says: "Only 10% of our customers are unhappy!"


It seems to be the same thing, but it sounds completely different, right? This is framing - a way of presenting information that affects how we perceive it and what decisions we make. In simple terms, how you "wrap" information is how it will be "eaten".


Framing is like a frame for a picture. It does not change the picture itself, but it changes our perception. The same information, placed in a different context, can cause diametrically opposed reactions.

Examples

  • Politics: The phrase "Cutting spending on social programs" sounds negative, and the phrase "Redistribution of budget funds to improve efficiency" is not so scary, although the essence may be the same.
  • Medicine: A doctor may say: "The probability of survival after surgery is 90%", or: "The probability of death after surgery is 10%." Although the numbers are the same, the emotional effect will be different.

Classic Tversky and Kahneman Experiment

Amos Tversky and Daniel Kahneman conducted a number of studies that clearly showed how framing affects decision-making, especially in risky situations. One of the most famous experiments is the "Asian disease problem."


Kahneman and Tversky divided their subjects into two groups. Both were given the same beginning of a hypothetical problem: the United States is preparing for an epidemic of an unknown Asian disease that is expected to kill six hundred people.


Then both groups were given two options for further conditions and asked which one they would prefer.


The first group was given the following options:

  • If program A is adopted, two hundred lives will be saved.
  • If program B is adopted, there is a one-third chance that all six hundred people will be saved, and a two-thirds chance that no one will be saved.


The options for the second group stated:

  • If program C is adopted, four hundred sick people will die.
  • If program D is adopted, there is a one-third chance that no one will die, and a two-thirds chance that everyone who gets sick will die.


Now take a small pause and reread both scenarios. Which option would you choose in each case?


Programs A and C describe the same outcome: two hundred people will be saved, four hundred will die. The same applies to programs B and D: there is a one-third chance that everyone will be saved, and a two-thirds chance that no one will survive.


In theory, if a person prefers option A, he should also choose option C, since the consequences in both cases are exactly the same.
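
If you want to see the equivalence in numbers rather than words, here is a minimal sketch (in Python; the code is not part of the original study, just an illustration) that computes the expected number of survivors for all four programs.

```python
# Expected number of survivors out of 600 for each program in the
# "Asian disease" problem. Programs A/C and B/D are just two
# framings of the same outcomes.

TOTAL = 600

def expected_survivors(outcomes):
    """outcomes: list of (probability, survivors) pairs."""
    return sum(p * s for p, s in outcomes)

programs = {
    "A": [(1.0, 200)],              # 200 saved for sure
    "B": [(1/3, TOTAL), (2/3, 0)],  # 1/3 chance all saved, 2/3 chance none
    "C": [(1.0, TOTAL - 400)],      # 400 die for sure, i.e. 200 survive
    "D": [(1/3, TOTAL), (2/3, 0)],  # 1/3 chance nobody dies, 2/3 chance all die
}

for name, outcomes in programs.items():
    print(name, expected_survivors(outcomes))  # every program: 200.0
```

A and C are the same certain outcome, B and D are the same gamble, and all four have the same expected number of survivors; only the wording differs.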


But no. For the first group, the options were formulated in terms of the number of lives saved, so 72 percent of respondents preferred option A. For the second group, the options were formulated in terms of the number of deaths, and 78 percent chose option D.

Neuroscience and Behavioral Economics: A Look from the Inside

As we have already found out, our brain is a tricky thing and likes to play dirty tricks on us, slipping us cognitive distortions. But why does this happen? What is going on in our heads when we "lie" to ourselves? Neuroscience and behavioral economics help answer these questions.


Neuroscience studies the biological basis of our behavior, showing how the brain affects thinking, emotions, and decision-making. Behavioral economics, in turn, studies how psychological factors influence our economic decisions, combining psychology with economics. Together, they give us a powerful tool for understanding why we so often make irrational decisions and how the mechanism of "self-deception" works.

What happens in the brain when we "confirm" our beliefs?

Research using functional magnetic resonance imaging (fMRI) shows that when we encounter information that matches our beliefs, areas of the brain associated with pleasure and reward are activated. These are such "happy centers" as the nucleus accumbens and the ventral tegmental area. Simply put, the brain gets a "hit" of dopamine and experiences pleasant sensations. This creates the illusion that the decision is "correct", even if it is based on distorted information.


Conversely, when we encounter information that contradicts our beliefs, areas of the brain associated with conflict and discomfort are activated, such as the anterior cingulate cortex. This causes unpleasant feelings and makes us look for ways to avoid this cognitive dissonance. One such way is to simply ignore the "inconvenient" information or interpret it in our favor.

Dopamine: The Main "Culprit" (and Not Only)

The neurotransmitter dopamine plays a key role in the processes of learning, motivation, and reward. The release of dopamine when receiving information that confirms our beliefs not only brings pleasure but also strengthens these beliefs, making us more resistant to contradictory information. It turns out to be a vicious circle: the more we believe in something, the more pleasure we get from confirmation of this belief, and the more difficult it is for us to change our minds.

Behavioral Economics: Why Are We So Afraid of Losses?

Behavioral economics concepts such as loss aversion and the endowment effect also play an important role in the formation of cognitive biases. Loss aversion means that we worry more about losses than we enjoy equivalent gains. And the endowment effect makes us overvalue what we already own.
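
To make loss aversion concrete, here is a small sketch of the prospect-theory value function from Tversky and Kahneman's later work. The parameter values (lambda around 2.25, alpha around 0.88) are their commonly cited estimates, so treat the exact numbers as illustrative rather than universal.

```python
# Prospect-theory value function: losses loom larger than gains.
#   v(x) = x**alpha              for gains  (x >= 0)
#   v(x) = -lam * (-x)**alpha    for losses (x < 0)

ALPHA = 0.88  # diminishing sensitivity to larger amounts
LAM = 2.25    # loss-aversion coefficient: losses hurt roughly 2x more

def value(x: float) -> float:
    if x >= 0:
        return x ** ALPHA
    return -LAM * (-x) ** ALPHA

# Winning $100 feels good, but losing $100 feels much worse:
print(value(100))    # ~57.5
print(value(-100))   # ~-129.4

# A 50/50 bet to win or lose $100 therefore has negative subjective
# value, even though its expected monetary value is exactly 0:
print(0.5 * value(100) + 0.5 * value(-100))  # ~-36
```

This asymmetry is why "admitting a loss" feels so much costlier than the equivalent gain ever felt rewarding.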


These effects can reinforce motivated reasoning and other cognitive biases. For example, a product manager who has already invested a lot of time and effort into developing a new technology will be even more inclined to ignore its flaws, because admitting a mistake would be tantamount to admitting a loss of that time and effort. He will cling to any confirmation of its potential in order to avoid the unpleasant feelings associated with loss.


Neuroscience and behavioral economics help us understand that “self-deception” is not just a weakness of character, but the result of complex processes occurring in our brain. Understanding these processes is the first step to learning to make more rational and balanced decisions.

How to Fight Illusions? Instructions for Exiting the Matrix


So, we found out that our brain is a real prankster, and cognitive distortions are not just some abstruse terms, but very real things that affect our lives every day. But don't panic! These "glitches" can and should be fought. The main thing is to arm yourself with the right tools.

Awareness is the First Step to Healing

You should start with the simplest, but also the most important thing - with awareness. Recognizing that cognitive distortions exist and that you are also subject to them is already half the battle. It's like admitting that you have a bad habit: until you do this, you will not be able to get rid of it.

Metacognition: Observing Your Thinking from the Outside

Next, you need to learn to observe your thoughts. Imagine that you have an internal "commentator" that monitors what you are thinking and notices possible mistakes in time. This is metacognition - the ability to be aware of your own thought processes.


How to develop it? Here are some simple tips:

  • Mindfulness Meditation: breathing exercises, walks in nature - all this helps to develop the ability to concentrate on the present moment and not let thoughts wander into the distance.
  • Thought Journal: Write down your thoughts, decisions, and the reasons why you made them. Regular analysis of these entries will help identify patterns and notice your biases.
  • Reflection After Making Decisions: After an important decision, spend some time thinking about how you came to it, what factors you took into account, and which ones you might have missed.

Critical Thinking: Turn on the "Analyst" Mode

The next important tool is critical thinking. This is the ability to analyze information and your beliefs in terms of logic and evidence.


How to develop critical thinking?

  • Ask the Right Questions: "What evidence do I have?", "Are there other explanations?", "Could I be wrong?", "How reliable are my sources of information?".
  • Use a Scientific Approach: formulate hypotheses and test them against facts. Don't be afraid to refute your own assumptions.
  • Look for a "Devil's Advocate": find someone who will criticize your point of view and offer alternative options.

Feedback: An Outside Perspective is Invaluable

We often don't notice our "blind spots", so feedback from other people is like a mirror in which we can see ourselves from the outside.

Deconstructing Frames: Learning to See Manipulations with Information

We have already talked about how framing affects our perception. Therefore, it is important to learn to recognize how they are trying to "lead" us to a certain conclusion by manipulating the presentation of information.


How to resist?

  • Reformulate: Try to present the information in a different context or from a different point of view.
  • Analyze the Language: Pay attention to the words and expressions that are used. Often they carry a hidden meaning or emotional coloring.
  • Study Manipulation Techniques: Get to know the techniques used in advertising, politics, and the media.

Doubt and Verify: Trust, but Verify

Last but not least: doubt and verify. Do not take information on faith, especially that which coincides with your beliefs.

Conclusion

Cognitive biases are an integral part of human nature. They are caused by the way our brain works and the influence of various factors, from evolutionary to social. It is impossible to get rid of them completely, and it is not always necessary - in some situations, heuristics can be useful.


However, awareness of their existence and the use of strategies to minimize their influence are critical for making informed decisions, effective communication, and successful work. Remember: not only others "lie", but we ourselves, even when we sincerely believe that we are right. Constant work on yourself and the development of metacognitive skills are the key to a more objective perception of reality.