The art of using cognitive biases is trending quickly. Cognitive and decision science experts are everywhere, and for a good reason. Marketing directors, politicians, and everybody else who bets their success on public perception want to pick their minds about how to get their message across. Or maybe (nobody is populism-proof) about what their message is supposed to be for them to win. Appearances over substance, but who am I to judge the winners? When we get down to the technical level, the question is: how do we nudge (or hack) people's decision-making?
Cognitive biases are our weak spot, our Achilles heel. We are all biased; even the best of us, well informed about our biases, are still vulnerable (maybe a bit less so). Every bias (and there are tens of them) has been the subject of extensive study. As a common trait (consider only the name), they are all regarded as defects in our reasoning, a glitch in what is otherwise the pinnacle of Earth's biological evolution. Here I will try to broaden that negative view of cognitive biases to a degree that transforms a bug into a feature. In other words, to include, but not limit ourselves to, the popular negative notion.
Let’s start with the very nature of biases. Our mind is constantly running a model of reality in order to infer what happens next and prepare our body for it. The model is based on our past experiences and is limited by our mental and intellectual abilities. All of that happens mostly in real time, where in-depth analysis is not an option, so we use stereotypes and prejudices to make it quick, and we call it intuition. Even when we can reflect more deeply on some matter, our starting position is intuitive. The model, like any model, is imperfect and stochastic, but it’s the only way we can survive in a constantly changing and often aggressive environment. Our mind has not evolved to think but to help us survive; the thinking part is just a side effect.
First, confirmation bias: “Paying more attention to information that reinforces previously held beliefs and ignoring evidence to the contrary.” Yes, facts are facts, and we all must base our decision-making on facts. Any cherry-picking or other distortions should be avoided to the best of our abilities. The problem is that we get facts second-hand; how many of the facts you accept as true can you check first-hand? You may have some other ways to estimate the validity of what you have been presented with. Here comes the fake news phenomenon. You are browsing around and a piece of “news” pops up: a Bill Gates conspiracy claiming he kills (or is about to kill) 700,000 people through Covid-19 vaccination. Next is Hillary’s pizza-gate conspiracy, and another, and another. Can you check all the “facts” presented? If you don’t, that means you are not open-minded and you are biased (confirmation bias, that is). Do I have the time to double-check all the craziness I come across while wandering around in cyberspace? Can that technically be classified as confirmation bias? Yes, but I have only one life, so… Finally, where is the borderline between holding certain socio-political and moral stands based on my core values and my confirmation bias?
Next comes the Bandwagon Effect: “A person is more likely to go along with a belief if there are many others who hold that belief.” Other names for this are “herd mentality” and “group thinking.” It is nice to be appreciated by your family, community, and peers, and in many cases that is the only way for you to survive. Yes, in contemporary society critical thinking must be paramount, but still, we all do it one way or another. Take language: it contains the wisdom of the respective society accumulated throughout its history. In other words, it is the wisdom of the average person; in order to get into the language, something must be popular enough across time and population. Later we use that language to express our thoughts and feelings, and that shapes them in a subtle way. There is no better example of group thinking than language. Should we try to make our individuality shine in some other way? If you have a talent for non-verbal arts, you may, but for the rest of us…
Next is Anchoring Bias: “Over-relying on the first piece of information obtained and using it as the baseline for comparison.” Yes, we have to do our homework. A decision should be made after a thorough investigation, collecting and weighing all the pros and cons, and so on. Our intuition (or System 1) provides us with a quick (sometimes vital), but maybe not optimal, solution. Moreover, our intuition reflects, in shortcut form, our rational thinking (System 2). Maybe the point of Anchoring Bias is not the lack of time but the lack of will to consider more details, which is fair. Still, time is just one resource whose lack we may compensate for with intuition. In a real situation, the lack of any other resource (importance of the problem, mental capacity, etc.) could be treated in a similar way. Another point could be that our intuition has been trained by numerous detailed studies of the matter, and in certain circumstances our intuition (superficial but quick) is the only way.
Next is Ingroup Preference Bias: “People tend to divide themselves into groups, and then attribute positive attributes to their own group.” Ah, the stereotypes: national, religious, racial, class… We use them mostly on others, but on ourselves too. We think of ourselves as belonging to some stereotypical group as part of our identity. We know that is not a fair way to judge people, but life is so much easier that way. That is another example of the System 1 vs. System 2 approach: intuitive and superficial vs. objective and systematic. The argument goes in a similar way: if we had the time, the smarts, and the will, the world would be a much more tolerant place. In the movie “Up in the Air”, Ryan has a line: “I’m like my mother, I stereotype. It’s faster.”
I can go on and on, but I hope you get my point. So-called cognitive biases are more features than bugs, depending on our bias towards them.