Eight Fallacies To Avoid in Daily Life
As we live our daily lives, we make many decisions that are ruled by our subconscious minds. We can imagine ourselves as learning machines: every unique experience forms a new connection in our brain that helps us decide the next time we encounter the same situation. An excellent example is driving a car. Think of the first time you drove: it was a taxing experience, given the hundreds of decisions one needs to make when operating a vehicle. Over time, however, driving becomes seamless. Our mind’s ability to learn and make decisions almost instantaneously helps us get through day-to-day life while consuming the least possible energy.
Daniel Kahneman is an Israeli psychologist and economist notable for his work on the psychology of judgment, decision-making, and behavioral economics. He was awarded the 2002 Nobel Memorial Prize in Economic Sciences. In his book Thinking, Fast and Slow, he describes a dichotomy in the human brain consisting of two thought processes: one slow, deliberate, and conscious, the other fast and subconscious.
While the fast, subconscious mind has its advantages, it comes with certain disadvantages too. This article discusses some of these drawbacks, the cognitive biases that prevent us from thinking clearly. A natural question arises as to why we should care about these biases. There are several reasons:
- Fast decision-making can be erroneous.
- Awareness of these biases can help us correct for them.
The Survivorship Bias
Survivorship (or survival) bias is a logical error in which we generalize a theory based on the survivors of an ordeal without considering the whole population, including those who did not make it.
- We often hear stories of college dropouts such as Bill Gates, Mark Zuckerberg, or Ritesh Agarwal building large enterprises. It is a logical fallacy to attribute their success to their dropping out of college, and a further fallacy to conclude that a college education doesn’t help an individual succeed. We haven’t considered all the college dropouts and asked what percentage of them actually made it big, which turns out to be a tiny fraction.
- During World War II, the statistician Abraham Wald took survivorship bias into his calculations when considering how to minimize bomber losses to enemy fire. The Statistical Research Group (SRG) at Columbia University, of which Wald was a part, examined the damage done to aircraft that had returned from missions and recommended adding armor to the areas that showed the least damage, based on his reasoning. This contradicted the US military’s conclusion that the most-hit areas of the plane needed additional armor. Wald noted that the military only considered the aircraft that had survived their missions; any bombers that had been shot down or otherwise lost had logically also been rendered unavailable for assessment. The bullet holes in the returning aircraft, then, represented areas where a bomber could take damage and still fly well enough to return safely to base. Thus, Wald proposed that the Navy reinforce the areas where the returning aircraft were unscathed, since those were the areas that, if hit, would cause the plane to be lost. His work is considered seminal in the then-nascent discipline of operational research.
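Wald’s reasoning can be made concrete with a toy simulation. The section names and the loss rule below are illustrative assumptions, not historical data: we assume any hit to the engine or cockpit downs a plane, then tally damage only on the planes that return.

```python
import random

random.seed(42)

SECTIONS = ["engine", "cockpit", "fuselage", "wings"]
# Assumed loss rule: a hit to the engine or cockpit downs the plane.
CRITICAL = {"engine", "cockpit"}

def fly_missions(num_planes=10_000, hits_per_plane=3):
    """Return hit counts per section, observed on SURVIVING planes only."""
    survivor_hits = {s: 0 for s in SECTIONS}
    for _ in range(num_planes):
        hits = [random.choice(SECTIONS) for _ in range(hits_per_plane)]
        if any(h in CRITICAL for h in hits):
            continue  # plane was lost; its damage is never observed
        for h in hits:
            survivor_hits[h] += 1
    return survivor_hits

observed = fly_missions()
# Survivors show damage only on non-critical sections, so naively armoring
# the most-hit areas (fuselage, wings) would armor exactly the wrong places.
print(observed)
```

The surviving sample shows zero engine or cockpit damage, even though those hits occurred just as often, which is exactly the inversion Wald spotted.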
The Confirmation Bias
Confirmation bias is the father of all biases. It posits that we tend to accept facts that agree with the hypothesis we already hold and ignore data that contradicts it.
- When reviewing a candidate we like, we tend to focus only on the skills they performed well on, and vice versa for a candidate we dislike.
- When we try to change a habit, we tend to focus only on the days when we successfully followed the new practice and overlook the days when it didn’t work.
- When voting for a particular electoral party, we glorify the successes from their previous run and downplay their failures while the opposition does just the opposite.
- The dot-com bubble of the late 90s was also an example of confirmation bias: the tech industry downplayed the negative signals coming out of poorly run organizations and overplayed the internet’s impact in solving real-life problems.
The Cultural Bias
Also known as the Herd Effect or Social Proof.
The Cultural Bias posits that we believe something is right if a majority of the people around us believe it is true.
Examples (from History)
- The Holocaust was an event in which a dictator was able to convince an entire nation to wipe out a certain section of the human race.
- The Sati system was a practice in India in which a widow had to immolate herself on the funeral pyre of her deceased husband.
- The Untouchability system was (and in places still is) a practice in India in which society disallows mingling with people of certain castes.
All these practices and incidents, in isolation, seem abhorrent and yet were widely adopted by society and considered legal at some point in time in human history.
- You are walking down a road and see a mob gathered by the roadside. All of a sudden, you stop to see what the mob is looking at.
- You are running a marathon and start to feel tired. Suddenly, a small group of runners passes by, and you start to feel energized again.
- A new social media app hits the app store, and although you hate downloading yet another social media app, you still try it anyway.
- Sharing this article on LinkedIn is another example of the Social Proof or Cultural Bias that we suffer from on a day-to-day basis 😉.
One of the reasons this occurs so commonly is evolutionary. As hunter-gatherers, one of the most important traits needed to survive was to stay and hunt in groups. This meant that humans had to conform to group thinking; otherwise, they would become outcasts. With the advancement of civilization, this no longer holds true (except in very extreme scenarios, e.g., war zones, mountaineering, etc.). However, our genetic makeup hasn’t yet caught up, and we still believe in listening to the wisdom of the crowd.
The Sunk Cost Fallacy
The Sunk Cost Fallacy is one in which we are unwilling to give up on a losing effort once we have invested resources in it. The resources may be time, money, effort, etc.
- You have worked on a project for multiple years and have now hit a dead end. You are still not ready to abandon it because of the cost already involved.
- You have invested emotionally in a relationship and no longer see it going anywhere. Still, you want to hold on to it, given the effort that has gone in.
- You have invested in a stock that is going down the drain, and yet you cannot part with it because of the time, money, and losses you have already incurred.
The fallacy lies in the fact that the losses are in the past, and the future doesn’t care about what you have already lost. In general, humans cling to their past investments, get emotionally attached to losing battles, and fail to think rationally in times of difficulty.
One of the prime examples of this is the Vietnam War, which the US government fought over multiple decades and eventually lost. Successive administrations repeatedly argued that, having already invested so many lives and so much money in the war, pulling out would make the present-day government look like a failure. This is the Sunk Cost Fallacy in action.
There are multiple reasons why humans are not able to overcome the Sunk Cost Fallacy. These are as follows:
- Emotional attachment to the past
- Ego clashes when accepting defeat
- Signaling inconsistency, thus losing credibility
The Halo Effect
It is also related to Success Bias.
The Halo Effect is the human tendency to let a positive impression of a person, brand, or company in one dimension color our judgment of all the other dimensions of their personality. As humans, we tend to place successful people on a pedestal and start to worship them as gods. In reality, every one of us has strengths and weaknesses.
- We see actors and sportspeople in media campaigns endorsing products, services, etc., that they are not capable of judging.
- We believe in our leaders’ judgment in policy areas that are entirely outside their expertise.
- We tend to copy the policies adopted by large, successful companies even though they may not operate in our domain, have the same set of resources, or belong to the same era.
- We tend to hate or worship leaders like Mahatma Gandhi, Nelson Mandela, or Adolf Hitler without weighing both their positive and negative qualities.
Instead, the right way is to look at each individual, company, brand, etc., from a fresh perspective, with a uniform lens, and gather data about the field in which they are being evaluated. They will be reliable in the fields in which they are experts; for the rest, an unbiased analysis will help you make the right judgment. A simple mental experiment is the blindfold test: if the endorsement had come from a person you did not know, would you still accept the idea? If yes, then the idea, and not the vehicle of the idea, has merit and should be followed. Otherwise, the idea should be rejected.
The Information Bias
The Information Bias posits that information beyond a certain threshold is rendered meaningless. This tends to happen because we ignore the obvious facts and dig deep into questions that are not necessarily meaningful.
- Consider an investor who interacts with many entrepreneurs every single day. They have the best possible view of all the latest ideas and technologies being developed in their area of expertise. Yet they still fail to find the next big social network, e-commerce platform, or chat application.
- “Del rigor en la ciencia” (“On Exactitude in Science”) is a one-paragraph short story written in 1946 by Jorge Luis Borges about the map–territory relation, presented in the form of a literary forgery. In it, Borges imagines an empire where the science of cartography becomes so exact that only a map on the same scale as the empire itself will suffice. The catch is that the excess of information renders the whole exercise meaningless.
- Think of all the work that epidemiologists and economists did before the Covid-19 pandemic hit. Much of their work from the preceding decade was rendered useless as soon as the pandemic swept through different countries. Most of them were not looking in the right direction, or could not surface the right set of problems to the world’s governments.
The Action Bias
The action bias is our tendency to act even when there is no clear reason to believe the action will lead to any benefit. There is a popular saying that movement is not progress. Progress comes only from well-thought-out, deliberate action. Therefore, pausing for some time before deciding which direction to take is often better than acting first and figuring out the results afterwards.
Individuals in high-stress environments often suffer from this fallacy. Common examples include hospital emergency wards, fast-moving startups, the week before an examination, etc. We have all felt the urge to move fast and make a lot of decisions without thinking about their consequences or long-run impact.
- The action bias is much more pronounced in the field of investing. The famous investor Warren Buffett says that all he needs is to make 1–2 right decisions in a year, and that defines his success. Young investors often believe that the more companies they invest in, the higher their chances of success. Unfortunately, this falls flat in the face of reality.
- There is a big examination coming up next month. What do you do? You could spend 16 hours a day studying every topic until you have learned it by heart. However, if you look at the questions from the last ten years, you can easily figure out which areas or topics to study: quite often, 80% of the tough questions come from just 20% of the topics. This is also called the Pareto Principle.
- Side note: I wrote a blog post some time back on common prioritization techniques that one can use in daily life.
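The 80/20 intuition above can be sketched in a few lines of Python. The topic names and question counts here are made-up numbers, chosen only to illustrate how a skewed distribution lets a small slice of topics cover most questions:

```python
# Hypothetical spread of 100 past exam questions across ten topics.
questions_per_topic = {
    "calculus": 40, "algebra": 38, "geometry": 5, "trigonometry": 4,
    "statistics": 3, "probability": 3, "logic": 3, "sets": 2,
    "number_theory": 1, "graphs": 1,
}

total = sum(questions_per_topic.values())
# The top 2 of 10 topics = the top 20% by question frequency.
top_two = sorted(questions_per_topic.values(), reverse=True)[:2]
coverage = sum(top_two) / total
print(f"Top 20% of topics cover {coverage:.0%} of questions")
# prints: Top 20% of topics cover 78% of questions
```

With this distribution, studying just two topics covers nearly four-fifths of the questions, which is the Pareto Principle at work.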
The Quantity Bias
The quantity bias is the human tendency to believe that the quantity of things matters more than their quality. This fallacy gets exacerbated especially by data-driven metrics, where the emphasis is laid heavily on numbers rather than anecdotes. Furthermore, the advent of social networks has amplified virtue-signaling through quantitative measures: our success has become the number of re-tweets, likes, loves, and shares that each of our posts and status updates receives. On the contrary, the world is shaped heavily by a few significant events, outliers, or Black Swans, as the author Nassim Nicholas Taleb puts it in his book The Black Swan.
- While hundreds of thousands of books are published every year, only a handful end up making any sizeable impact on the consciousness of the human population. This is because most published books do not add to the pyramid of knowledge or experience of human society.
- In his research paper, The Mundanity of Excellence, Daniel F. Chambliss studied Olympic-level swimmers for more than six years to understand what leads to excellence. He concluded that excellence is not simply a function of the number of hours spent on your craft; it is closely tied to the quality of those hours. Time is precious, and you need to ask yourself, “What am I going to do today?” But more importantly, you need to ask, “How will I do it?”