Statistics was never one of my favourite subjects. When faced with grades at the wrong end of the bell curve, I would console myself with a quote attributed to Mark Twain - "There are three kinds of lies: lies, damned lies, and statistics!"
However, over the years, I have warmed up to the subject - thanks largely to the stunning visualisations of Hans Rosling and books like Daniel Kahneman's "Thinking, Fast and Slow". This post is about the latter.
The nudge for this came about ten days ago, when a friend posted a book recommendation with a quote by Richard Feynman - "The first principle is that you must not fool yourself - and you are the easiest person to fool". The book in question was "Useful Delusions: The Power and Paradox of the Self-Deceiving Brain" by Shankar Vedantam.
The first question that came to my mind was - is this any different from the ideas described so well in Kahneman's book? I went back to it and found that he discusses not only all sorts of cognitive biases but also ways in which the intuitions of subject experts can be built into robust decision-making models.
What I like about Kahneman is that he was a practitioner who moved to academia. Unlike most ivory-tower theorists, he served in the fledgling Israeli Army in the 1950s, setting standard operating procedures (still in use) for officer selection, before moving to a university in the USA and winning the 2002 Nobel Memorial Prize in Economic Sciences for his work in psychology.
Two of my key takeaways from this book were on Subjective Confidence and on Intuitions vs. Formulas. Here are some quotes from the book on these topics, which are perhaps best described in the author's own words.
First, Subjective Confidence.
Facts that challenge...basic assumptions - and thereby threaten people's livelihood and self-esteem - are simply not absorbed. The mind does not digest them. This is particularly true of statistical studies of performance, which provide base-rate information that people generally ignore when it clashes with their personal impressions.
We know that people can maintain an unshakable faith in any proposition, however absurd, when they are sustained by a community of like-minded believers.
In other words, people who spend their time, and earn their living, studying a particular topic produce poorer predictions than dart-throwing monkeys who would have distributed their choices evenly over the options. Even in the region they knew best, experts were not significantly better than non-specialists...The reason is that a person who acquires more knowledge develops an enhanced illusion of her skill and becomes unrealistically overconfident.
...Using an analogy from Isaiah Berlin's essay on Tolstoy, "The Hedgehog and the Fox" -
Hedgehogs "know one big thing" and have a theory about the world; they account for particular events within a coherent framework, bristle with impatience towards those who do not see things their way, and are confident about their forecasts. They are also especially reluctant to admit error. For hedgehogs, a failed prediction is almost always "off only on timing", or "very nearly right". They are opinionated and clear, which is exactly what television producers like to see on their programs. Two hedgehogs on different sides of an issue, each attacking the idiotic ideas of the adversary, make a good show.
Foxes, on the other hand, are complex thinkers. They don't believe that one big thing drives the march of history. Instead, foxes recognise that reality emerges from the interactions of many different agents and forces, including blind luck, often producing large and unpredictable outcomes...Foxes are less likely to be invited to participate in television debates.
On Intuitions vs. Formulas
Paul Meehl's book, "Clinical versus Statistical Prediction: A Theoretical Analysis and a Review of the Evidence"...This 'disturbing little book' presented the results of 20 studies examining whether clinical predictions, based on the subjective impressions of trained professionals, were more accurate than statistical predictions made by combining a few scores or ratings according to a rule... The formula was more accurate than 11 of the 14 counsellors! The result was replicated in other studies on violations of parole, success in pilot training, and criminal recidivism.
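To make the idea concrete: the "statistical prediction" Meehl describes is nothing fancier than combining a few ratings according to a fixed rule. Here is a minimal sketch in Python; the trait names and the equal-weight rule are my own illustrative assumptions, not Meehl's actual variables.

```python
# A sketch of a Meehl-style statistical prediction: a fixed rule that
# combines a few ratings, instead of a professional's holistic judgement.
# The trait names below are hypothetical; the rule is a simple average.

def formula_prediction(ratings):
    """Equal-weight average of ratings on a 1-5 scale; higher = better outlook."""
    return sum(ratings.values()) / len(ratings)

candidate = {"academic_record": 4, "interview_score": 2, "aptitude_test": 5}
print(formula_prediction(candidate))  # the rule never gets tired or swayed
```

The striking point in the book is that even such a crude, mechanical rule tends to beat expert intuition, because it applies the same weights consistently every time.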
Princeton economist and wine lover Orley Ashenfelter shows us the compelling power of simple statistics to outdo world-renowned experts. Ashenfelter predicted the future value of Bordeaux wines based on information available the year they were made - he converted conventional knowledge into a statistical formula that predicts the price of wine from three features of the weather: the average temperature over the summer growing months, the amount of rain at harvest, and the total rainfall during the previous winter.
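Ashenfelter's formula is just a linear model on those three weather features. The sketch below captures its shape; the coefficients are hypothetical placeholders chosen only to reflect the signs he found (warm summers and wet winters help, rain at harvest hurts), not his published estimates.

```python
# A sketch of an Ashenfelter-style wine model: a linear combination of
# three weather features. Coefficients are illustrative assumptions,
# not the published regression estimates.

def predict_wine_quality(summer_temp_c, harvest_rain_mm, winter_rain_mm):
    # Warmer growing seasons and wetter winters raise the prediction;
    # rain at harvest lowers it.
    return (12.0
            + 0.06 * summer_temp_c
            + 0.001 * winter_rain_mm
            - 0.004 * harvest_rain_mm)

print(predict_wine_quality(summer_temp_c=17.0,
                           harvest_rain_mm=100.0,
                           winter_rain_mm=600.0))
```

The design point is the same as Meehl's: once the relevant features are identified, a transparent formula applies them consistently, which is exactly where expert tasters fall down.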
A key conclusion... to maximise predictive accuracy, final decisions should be left to formulas, especially in low-validity environments.
Does Vedantam's new book present any new ideas? I don't know yet. It does seem to me that the path-breaking ideas presented in "Thinking, Fast and Slow" may now be getting repackaged in newer books. Perhaps I am suffering from anchoring bias, but I do believe Kahneman's book is something you need on your nearest bookshelf, as a constant reminder that there is more to this world than meets the eye, and that intuitions can be trusted - but only to a certain extent.
-----------------------------------
REFERENCES & LINKS
* Cognitive Biases - https://www.verywellmind.com/what-is-a-cognitive-bias-2794963
* Animated book summary - https://youtu.be/uqXVAo7dVRU
* The hedgehog and the fox - http://assets.press.princeton.edu/chapters/s9981.pdf
* How Big Data can predict the wine of the century - https://www.forbes.com/sites/sap/2014/04/30/how-big-data-can-predict-the-wine-of-the-century/?sh=5f2c35c531a9