Thinking, Fast and Slow
Written by a Nobel Prize recipient in Economics, this reference work on the strengths and dangers of intuitive thinking is a must-read.
Author(s): Daniel Kahneman
Publisher: Allen Lane
Date of publication: 2011
Manageris opinion
Even if the expression is a cliché, we say unhesitatingly that this book is a “must-read.” First, think of the prestige you will gain when you mention in conversation that you are reading a book by a Nobel Prize winner in Economics!
But more importantly, this book will plunge you into the most enthralling mysteries of the human mind, guided by an author who, in addition to his deep scholarship and expertise, never forgets to be understandable, practical and even funny! More than once, as you turn the page, you will catch yourself thinking: “Of course, that’s it!” Indeed, Daniel Kahneman’s demonstrations reflect the everyday experiences and psychological mechanisms with which we all quietly struggle.
As he moves through the chapters, the author reviews the many wonders and woes of intuitive thought, the “hidden star” which—unbeknownst to us—often dominates our rational mind: hypersensitivity to the external environment and a string of biases (halo effect, overconfidence, etc.), a tendency to draw conclusions first and seek the underlying justification later, failure to take account of the most basic statistical principles, etc. Without becoming too sophisticated, the chapter on prospect theory—the domain that earned the author his Nobel Prize—is more detailed and will interest readers who want to delve more deeply into a theory at the forefront of current thinking on decision making.
More than 500 pages long, this book is a work of reference that takes some time to read and consider. Indeed, such a wealth of information cannot be digested in just a few hours!
See also
Channel your intuition
Our intuition is an astonishing and valuable tool that is constantly at work. However, because it is subject to many biases, it can easily lead us astray. How can we limit this risk and make the most of intuitive thinking?