What is Intuition, Anyway?
Before we proceed with Kahneman’s critique of intuition, it is important to first understand what intuition really is.
Intuition is the ability to acquire knowledge without relying on inference or reasoning. It supplies the understandings, judgments, views, and beliefs a person cannot justify or verify empirically. The right hemisphere of the brain, commonly associated with creative processes, is often said to be the seat of intuitive thinking.
How Everything Started
The path toward the critique of intuition began in 1969, when Kahneman was a professor at the Hebrew University. There, he attended a lecture by Amos Tversky, an authority in the field of decision making. What started as a discussion between the two thinkers – Tversky believed humans are good intuitive statisticians, while Kahneman did not – grew into a decades-long research partnership between the two academicians.
What “Thinking Fast and Slow” is All About
Tversky and Kahneman’s decades of research and debate form the rich contents of the book “Thinking Fast and Slow.” In it, Kahneman describes the dual-process model of human judgment. This school of thought characterizes decision making as a relationship between two interlinked yet rivaling systems, namely System 1 and System 2.
According to Kahneman, System 1 is in charge of fast, intuitive thinking, while System 2 handles slow, deliberate thought. Most of the time, System 1 acts on its own: it drives a person’s feelings, thoughts, and associations, as well as snap judgments such as whether or not he likes someone. It can never be switched off.
System 2, on the other hand, is responsible for keeping things in perspective. Unlike System 1, however, it engages only when needed. Firing up System 2 is hard, effortful work that consumes mental energy and attention, and most people dislike undergoing this laborious type of thinking.
The consequences of System 2’s slow reaction – when it fails to stop System 1’s impulsive actions – are what Kahneman showcases in his book “Thinking Fast and Slow.”
Kahneman’s Experiment
After years of debate, Kahneman finally convinced Tversky that humans are not good intuitive statisticians with the “Linda Experiment.”
In this research, Kahneman and Tversky presented undergraduate students from top American universities with a description of a fictional woman named Linda – young, single, outspoken, a philosophy major deeply concerned with issues of discrimination and social justice – and asked them to choose which of the following statements is more probable:
- Linda is a bank teller.
- Linda is a bank teller and is active in the feminist movement.
The survey results were surprising: almost 95% of the respondents chose the second statement as the more probable one.
From a statistical point of view, Kahneman emphasized that the group of bank tellers who happen to be feminists is necessarily a subset of – and therefore smaller than – the group of all bank tellers, making the second statement less probable.
Based on this research, Kahneman concluded that humans have a tendency to commit this logical error, which he called the conjunction fallacy. In essence, Kahneman believes that the brain tends to favor a plausible, representative story even when it defies the laws of probability.
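The arithmetic behind the conjunction fallacy can be sketched in a few lines. The probabilities below are made up purely for illustration; the point is that the conjunction of two events can never be more probable than either event alone.

```python
# Illustrative (invented) probabilities for the Linda problem.
p_bank_teller = 0.05            # P(Linda is a bank teller)
p_feminist_given_teller = 0.80  # P(feminist | bank teller), assumed high

# The probability of the conjunction is the product, which can
# never exceed the probability of either event on its own.
p_both = p_bank_teller * p_feminist_given_teller

# This inequality holds for ANY choice of probabilities,
# which is exactly why picking statement 2 is a fallacy.
assert p_both <= p_bank_teller
```

However vividly the feminist detail fits the story of Linda, multiplying in another condition can only shrink the probability.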
Kahneman’s View of Intuition
From his research and studies, Kahneman showcases in his book that intuition is a snakepit of biases. Even so, he believes that intuition, or System 1, works well for short-term decisions, like those surgeons and firefighters must make. He stresses, however, that it is not beneficial for expert judgments or long-term psychological assessments.
Kahneman’s view of intuition is strengthened by his many experiments, one of which asked participants to estimate probabilities. Respondents immediately fired their “mental shotguns,” estimating probabilities based on stereotypes and representativeness rather than statistics.
He adds that a human’s capacity for quick, intuitive thinking is the source of much of what he does right. Associative memory enables one to discern the unusual from the expected immediately, and then to search for a causal interpretation. This is what Kahneman calls evolved intuitive thinking.
Just as intuitive thinking is the source of much of what we do right, it is also the source of most of what we do wrong. Humans tend to fixate on losses rather than gains, because a loss feels more painful than the equivalent gain feels pleasurable. These errors are predictable; even so, System 1 – the tendency to think quickly – trumps System 2, or conscious and effortful deliberation. Humans therefore keep relying on intuitive thinking, even as it becomes more and more unreliable.
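The asymmetry between losses and gains can be sketched with the value function from Kahneman and Tversky’s prospect theory. The parameters below (an exponent of 0.88 and a loss-aversion coefficient of 2.25) are the median estimates they reported in their 1992 work; treat the exact numbers as illustrative rather than definitive.

```python
ALPHA = 0.88          # diminishing sensitivity to both gains and losses
LOSS_AVERSION = 2.25  # losses loom roughly twice as large as gains

def subjective_value(x: float) -> float:
    """Felt value of a gain (x > 0) or loss (x < 0), per prospect theory."""
    if x >= 0:
        return x ** ALPHA
    return -LOSS_AVERSION * ((-x) ** ALPHA)

gain = subjective_value(100)   # the pleasure of winning $100
loss = subjective_value(-100)  # the pain of losing $100

# The loss hurts more than twice as much as the equivalent gain pleases.
assert abs(loss) > 2 * gain
```

This is why, as the paragraph above notes, people focus on what they might lose rather than what they might gain, even when the amounts are identical.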
What To Do, Then?
Given the flaws in intuition, Kahneman suggests using base rates when making estimates. Also known as unconditional probabilities, base rates describe how frequently an outcome occurs across the relevant population as a whole. They give you a statistical starting point, which you can then adjust in light of the evidence at hand.
If you tend to lean on System 1 – your gut feeling – Kahneman suggests anchoring your judgment on the base rate first and only then adjusting for the specific properties of the individual case. This saves you from the mistake of favoring personal impressions over factual statistical information.
Because of the errors that come with intuitive thinking, Kahneman advises his readers to favor designing policies that help people make better choices. And when all else fails, it is better to leave the decision-making process to tried and tested computer algorithms.
In “Thinking Fast and Slow,” Kahneman notes that, by this definition, a rational person could even prefer being hated to being loved, as long as his preferences remain consistent. This notion of rationality, Kahneman adds, is what underpins public policy, economics, and grand strategy.
In his book, Kahneman observes that humans rarely meet the standards of rationality, even when they believe themselves reasonable. As he puts it: “Maintaining one’s vigilance against biases is a chore – but the chance to avoid a costly mistake is sometimes worth the effort.”