Nobel Prize winner Daniel Kahneman transformed the fields of economics and investing. Essentially, his revelations show that individuals and the choices they make are far more complicated – and far more fascinating – than previously thought.
“Optimism is the engine of capitalism,” Kahneman said. “Overconfidence is a curse. It’s a curse and a blessing. The people who do great things were, in retrospect, overly confident and optimistic – overly confident optimists. They take big risks because they underestimate how big the risks are.”
But if you pay attention only to the success stories, you learn the wrong lesson.
“If you look at everyone,” he said, “there are a lot of failures.”
The dangers of intuition
Intuition is a form of what Kahneman calls fast, or System 1, thinking, and we regularly base our decisions on what it tells us.
“We trust our intuition, even when it’s wrong,” he said.
But we should trust our intuition only when it is based on real expertise. And although we develop expertise through experience, experience alone is not enough.
In fact, research shows that experience increases the confidence with which people hold their views, but not necessarily the accuracy of those views. Expertise requires a particular kind of experience, gained in a context that provides regular feedback and is clearly verifiable.
“Is the world in which intuition emerges regular enough that we have the opportunity to learn its rules?” asked Kahneman.
When it comes to the financial sector, the answer is probably no.
“From the psychological analysis of what expertise is, it is very difficult to imagine that you can develop real expertise, for example in predicting the stock market,” he said. “That doesn’t work because the world isn’t regular enough for people to learn rules.”
However, that does not stop people from confidently predicting financial outcomes based on their experience.
“It’s a puzzle psychologically,” Kahneman said. “How could one learn when there is nothing to learn?”
This kind of intuition is, in effect, superstition. We should not assume we have expertise in every area where we have intuition, and we should not assume that others do either.
“If someone tells you they have a strong hunch about a financial event,” he said, “it’s safe not to believe them.”
Noise alarm
Even in verifiable fields where causal relationships are easy to identify, noise can distort outcomes.
Kahneman described a study of underwriters at a well-run insurance company. Although underwriting is not an exact science, it is a field with learnable rules in which expertise can be developed. The underwriters all read the same file and set a premium. Differences among the premiums were expected; the question was how large the deviation would be.
“What percentage would you expect?” Kahneman asked. “The number that comes to mind most often is 10%. It’s quite high and a conservative estimate.”
But when the average was calculated, the deviation was 56%.
“Which really means these underwriters are wasting their time,” he said. “How can it be that people make so much noise in their judgments and are not aware of it?”
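The article does not specify the exact metric behind the 56% figure, but a natural way to quantify this kind of noise is the average relative difference between every pair of judgments. Here is a minimal Python sketch of that idea; the premiums are hypothetical:

```python
from itertools import combinations

def noise_index(judgments):
    """Average relative difference between all pairs of judgments.

    For each pair, take |a - b| divided by the pair's mean, then
    average over all pairs. A value of 0.56 would mean two randomly
    chosen underwriters differ by 56% on average.
    """
    pairs = list(combinations(judgments, 2))
    diffs = [abs(a - b) / ((a + b) / 2) for a, b in pairs]
    return sum(diffs) / len(diffs)

# Hypothetical premiums (in dollars) set by five underwriters
# who all read the same file:
premiums = [9_500, 16_000, 12_000, 8_000, 13_500]
print(f"{noise_index(premiums):.0%}")
```

The striking part of the result is not the exact number but that most people, including the company's executives, expect something closer to 10%.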
Unfortunately, the noise problem is not limited to underwriting. Nor does it require multiple people; one is often enough. Even in more binary disciplines, results can differ given the same data and the same analyst.
“Whenever there is a verdict, there is noise and probably a lot more than you think,” Kahneman said.
For example, radiologists were given a series of x-rays to diagnose. Sometimes they were shown the same x-ray more than once.
“In a shockingly high number of cases the diagnosis is different,” he said.
The same was true for DNA and fingerprint analysts. Even in cases where there should be a foolproof answer, noise can make certainty unattainable.
“We use the word bias too often.”
While Kahneman spent much of his career studying bias, he now focuses on noise. He believes bias may be overdiagnosed, and he recommends treating noise as the default explanation for decision-making errors.
“We should consider noise as a possible explanation because noise and bias lead to different remedies,” he said.
Hindsight, optimism and loss aversion
When we make mistakes, our errors naturally tend to pull in two opposite directions.
“People are very loss-averse and very optimistic. They work against each other,” he said. “People don’t realize how bad the odds are because they’re optimistic.”
As Kahneman’s research on loss aversion has shown, we experience losses more strongly than gains.
“Our estimate is 2 to 1 in many situations,” he said.
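That roughly 2-to-1 asymmetry is formalized in the prospect-theory value function of Tversky and Kahneman (1992), where losses are scaled up by a coefficient lambda (about 2.25 in their estimates). A short sketch, with their published parameter values:

```python
def value(x, alpha=0.88, lam=2.25):
    """Subjective value of a gain or loss of size x.

    Gains are valued as x**alpha; losses are valued the same way
    but magnified by lam, so losses loom larger than gains.
    """
    if x >= 0:
        return x ** alpha
    return -lam * ((-x) ** alpha)

# A $100 loss hurts about 2.25 times as much as a $100 gain pleases:
print(value(100), value(-100))
```

Because the same exponent applies to both sides, the pain-to-pleasure ratio for equal-sized outcomes is exactly lambda, matching the "2 to 1" rule of thumb Kahneman cites.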
However, we tend to overestimate our chances of success, especially in the planning phase. And whatever the outcome, hindsight is 20/20: why things worked out, or did not, always seems obvious after the fact.
“When something happens, you immediately understand how it happened. You immediately have a story and an explanation,” he said. “You feel like you’ve learned something and you won’t make that mistake again.”
These conclusions are usually wrong. The takeaway is rarely a clear causal relationship.
“What you should learn is that you were surprised again,” Kahneman said. “You should learn that the world is more uncertain than you think.”
So what can professionals do to improve their decision-making in the world of finance and investing, where there is so much noise and bias and so little trustworthy intuition and expertise?
Kahneman suggested four simple strategies for better decision-making that can be applied to both finance and life.
1. Don’t trust people, trust algorithms
Whether it’s predicting probation and bail violations or determining who will succeed as a research analyst, algorithms are inclined to be preferable to independent human judgment.
“Algorithms beat individuals about half the time. And they match individuals about half the time,” Kahneman said. “There are very few examples of humans outperforming algorithms in predictive judgments. So if there is an opportunity to use an algorithm, people should use it. We believe that designing an algorithm is very complicated. But an algorithm is just a rule. You can simply make rules.”
And if we cannot use an algorithm, we should train people to simulate one.
“Train people in a mindset and an approach to problems that enforces consistency,” he said.
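One concrete, if simplified, illustration of "an algorithm is just a rule" is the equal-weight linear model studied by Robyn Dawes: standardize a few relevant cues and simply add them up. The cue names and candidate data below are hypothetical:

```python
def standardize(values):
    """Rescale a column to mean 0, standard deviation 1."""
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / len(values)
    sd = var ** 0.5 or 1.0  # guard against a constant column
    return [(v - mean) / sd for v in values]

def equal_weight_scores(candidates, cues):
    # Standardize each cue column so no single cue dominates,
    # then score each candidate as the plain sum of its cues.
    cols = [standardize([c[cue] for c in candidates]) for cue in cues]
    return [sum(col[i] for col in cols) for i in range(len(candidates))]

cues = ["test_score", "writing_sample", "structured_interview"]
candidates = [
    {"name": "A", "test_score": 82, "writing_sample": 7, "structured_interview": 6},
    {"name": "B", "test_score": 90, "writing_sample": 5, "structured_interview": 8},
    {"name": "C", "test_score": 70, "writing_sample": 9, "structured_interview": 7},
]
scores = equal_weight_scores(candidates, cues)
for score, name in sorted(zip(scores, (c["name"] for c in candidates)), reverse=True):
    print(name, round(score, 2))
```

The rule's virtue is exactly the consistency Kahneman asks for: given the same file, it always produces the same ranking, with zero noise.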
2. Take the big picture
Don’t look at each problem in isolation.
“The best advice we have on framing is to frame broadly,” he said. “View the decision as part of a class of decisions you are likely to have to make.”
3. Test for regret
“Regret is probably the biggest enemy of good personal finance decision making,” Kahneman said.
So assess how susceptible clients are to regret. The greater the potential for regret, the more likely they are to cancel their accounts, sell at the wrong time, and buy when prices are high. High-net-worth individuals in particular are risk averse, he said. So try to gauge how risk averse they are.
“Clients who regret often fire their advisors,” he said.
4. Get good advice
Gaining a broad perspective means cultivating curiosity and seeking out advice.
So who’s the perfect advisor? “A person who likes you and doesn’t care about your feelings,” Kahneman said.
For him, that person is Nobel Prize winner Richard H. Thaler.
“He likes me,” Kahneman said. “And he doesn’t care about my feelings at all.”
If you enjoyed this post, do not forget to subscribe.
Image courtesy of IMAGEIN