Kahneman 2011 Penguin Books

From Bioblast
Publications in the MiPMap
Kahneman D (2011) Thinking, fast and slow. Penguin Books 499 pp.

» Penguin


Abstract: Daniel Kahneman, recipient of the Nobel Prize in Economic Sciences for his seminal work in psychology challenging the rational model of judgment and decision making, is one of the world's most important thinkers. His ideas have had a profound impact on many fields - including business, medicine, and politics - but until now, he has never brought together his many years of research in one book.

In Thinking, Fast and Slow, Kahneman takes us on a groundbreaking tour of the mind and explains the two systems that drive the way we think and make choices. One system is fast, intuitive, and emotional; the other is slower, more deliberative, and more logical. Kahneman exposes the extraordinary capabilities - and also the faults and biases - of fast thinking, and reveals the pervasive influence of intuitive impressions on our thoughts and behaviour. The importance of properly framing risks, the effects of cognitive biases on how we view others, the dangers of prediction, the right ways to develop skills, the pros and cons of fear and optimism, the difference between our experience and memory of events, the real components of happiness - each of these can be understood only by knowing how the two systems work together to shape our judgments and decisions.

Drawing on a lifetime's experimental experience, Kahneman reveals where we can and cannot trust our intuitions and how we can tap into the benefits of slow thinking. He offers practical and enlightening insights into how choices are made in both our professional and our personal lives - and how we can use different techniques to guard against the mental glitches that often get us into trouble. Thinking, Fast and Slow will transform the way you take decisions and experience the world.

Why is there more chance we'll believe something if it's in a bold typeface? Why are judges more likely to deny parole before lunch? Why do we assume a good-looking person will be more competent? The answer lies in the two ways we make choices: fast, intuitive thinking, and slow, rational thinking. This book reveals how our minds are tripped up by error and prejudice (even when we think we are being logical), and gives you practical techniques for slower, smarter thinking. It will enable you to make better decisions at work, at home, and in everything you do.


Bioblast editor: Gnaiger E

Some quotations

  • p3: Learning medicine consists in part of learning the language of medicine. A deeper understanding of judgments and choices also requires a richer vocabulary than is available in everyday language.
  • p4: .. the focus on error does not denigrate human intelligence, any more than the attention to diseases in medical texts denies good health. .. improve the ability to identify and understand errors of judgment and choice .. by providing a richer and more precise language to discuss them.
  • p5: Our subjective judgments were biased: we were far too willing to believe research findings based on inadequate evidence and prone to collect too few observations. .. we found that our expert colleagues, like us, greatly exaggerated the likelihood that the original result of an experiment would be successfully replicated even with a small sample. They also gave very poor advice to a fictitious graduate student about the number of observations she needed to collect. Even statisticians were not good intuitive statisticians.
  • p12: When faced with a difficult question, we often answer an easier one instead, usually without noticing the substitution.
  • p28: .. it is easier to recognize other people's mistakes than our own.
  • p29: A sentence is understood more easily if it describes what an agent (System 2) does than if it describes what something is.
  • p35: A general "law of least effort" applies to cognitive as well as physical exertion. The law asserts that if there are several ways of achieving the same goal, people will eventually gravitate to the least demanding course of action. In the economy of action, effort is a cost, and the acquisition of skill is driven by the balance of benefits and costs. Laziness is built deep into our nature.
  • p45: .. when people believe a conclusion is true, they are also very likely to believe arguments that appear to support it, even when these arguments are unsound. If System 1 is involved, the conclusion comes first and the arguments follow.
  • p62: .. familiarity is not easily distinguished from truth.
  • p67: .. the effect of repetition on liking is a profoundly important biological fact. .. The link between positive emotion and cognitive ease in System 1 has a long evolutionary history.
  • p68: .. putting the participants in a good mood before the test by having them think happy thoughts more than doubles accuracy.
  • p77: Statistical thinking derives conclusions about individual cases from properties of categories and ensembles. Unfortunately, System 1 does not have the capability for this mode of reasoning; System 2 can learn to think statistically, but few people receive the necessary training.
  • p78: When we survey the reaction to these products, let's make sure we don't focus exclusively on the average. We should consider the entire range of normal reactions.
  • p81: .. understanding a statement must begin with an attempt to believe it: you must first know what the idea would mean if it were true. Only then can you decide whether or not to unbelieve it. .. people are more likely to be influenced by empty persuasive messages, such as commercials, when they are tired and depleted.
  • p81: Contrary to the rules of philosophers of science, who advise testing hypotheses by trying to refute them, people (and scientists, quite often) seek data that are likely to be compatible with the beliefs they currently hold.
  • p84: If the observers share a bias, the aggregation of judgement will not reduce it. Allowing the observers to influence each other effectively reduces the size of the sample, and with it the precision of the group estimate.
  • p85: The principle of independent judgments (and decorrelated errors) has immediate applications for the conduct of meetings, .. before an issue is discussed, all members of the committee should be asked to write a very brief summary of their position. This procedure makes good use of the value of the diversity of knowledge and opinion in the group. The standard practice of open discussion gives too much weight to the opinions of those who speak early and assertively, causing others to line up behind them.
  • p87: It is the consistency of the information that matters for a good story, not its completeness. Indeed, you will often find that knowing little makes it easier to fit everything you know into a coherent pattern. .. WYSIATI (what you see is all there is) .. neither the quantity nor the quality of the evidence counts for much in subjective confidence. The confidence that individuals have in their beliefs depends mostly on the quality of the story they can tell about what they see, even if they see little.
  • p98: The technical definition of heuristic is a simple procedure that helps find adequate, though often imperfect, answers to difficult questions. .. eureka
  • p112: For a research psychologist, sampling variation is not a curiosity: it is a nuisance and a costly obstacle, which turns the undertaking of every research project into a gamble. .. Using a sufficiently large sample is the only way to reduce the risk. Researchers who pick too small a sample leave themselves at the mercy of sampling luck. .. even the experts paid insufficient attention to sample size.
  • p114: .. sustaining doubt is harder work than sliding into certainty. .. we are prone to exaggerate the consistency and coherence of what we see. .. it will produce a representation of reality that makes too much sense.
  • p117: The tendency to see patterns in randomness is overwhelming - certainly more impressive than a guy making a study.
  • p118: I plan to keep the results of the experiment secret until we have a sufficiently large sample. Otherwise we will face pressure to reach a conclusion prematurely.
  • p131: - availability heuristic .. The same bias contributes to the common observation that many members of a collaborative team feel they have done more than their share and also feel that the others are not adequately grateful for their individual contributions. .. You will occasionally do more than your share, but it is useful to know that you are likely to have that feeling even when each member of the team feels the same way.
  • p147: The proportion of marbles of a particular kind is called a base rate.
  • p150: Logicians and statisticians have developed competing definitions of probability, all very precise. For laypeople, however, probability (a synonym of likelihood in everyday language) is a vague notion, related to uncertainty, propensity, plausibility, and surprise. .. In all the years I spent asking questions about the probability of events, no one ever raised a hand to ask me, "Sir, what do you mean by probability?" .. Everyone acted as if they knew how to answer my questions, although we all understood that it would be unfair to ask them for an explanation of what the word means.
  • p152: .. instructing people to "think like a statistician" enhanced the use of base-rate information, while the instruction to "think like a clinician" had the opposite effect.
  • p153: .. worthless information should not be treated differently from a complete lack of information, but WYSIATI makes it very difficult to apply that principle.
  • p154: The essential keys to disciplined Bayesian reasoning can be simply summarized: * Anchor your judgement of the probability of an outcome on a plausible base rate. * Question the diagnosticity of your evidence.
  • p159: The most coherent stories are not necessarily the most probable, but they are plausible, and the notions of coherence, plausibility, and probability are easily confused by the unwary. The uncritical substitution of plausibility for probability has pernicious effects on judgments when scenarios are used as tools of forecasting. .. adding detail to scenarios makes them more persuasive, but less likely to come true.
  • p163: .. a question phrased as "how many?" makes you think of individuals, but the same question phrased as "what percentage?" does not.
  • p165: They added a cheap gift to the expensive product, and made the whole deal less attractive. Less is more in this case.
  • p168: Statistical base rates are facts about a population to which a case belongs, but they are not relevant to the individual case. Causal base rates change your view of how the individual case came to be. * Statistical base rates are generally underweighted, and sometimes neglected altogether, when specific information about the case at hand is available. * Causal base rates are treated as information about the individual case and are easily combined with other case-specific information.
  • p173: To teach students any psychology they did not know before, you must surprise them. .. when they surprised their students with a surprising statistical fact, the students managed to learn nothing at all. But when the students were surprised by individual cases .. they immediately made the generalization and inferred .. The test of learning psychology is whether your understanding of situations you encounter has changed, not whether you have learned a new fact.
  • p179: Regression to the mean was discovered and named late in the nineteenth century by Sir Francis Galton, a half cousin of Charles Darwin ..
  • p182: Our mind is strongly biased toward causal explanations and does not deal well with "mere statistics". .. regression to the mean has an explanation but does not have a cause.
  • p183: The statistician Howard Wainer has drawn up a long list of eminent researchers who have made the same mistake - confusing mere correlation with causation. Regression effects are a common source of trouble in research, and experienced scientists develop a healthy fear of the trap of unwarranted causal inference.
  • p194: Extreme predictions and a willingness to predict rare events from weak evidence are both manifestations of System 1. .. confidence is determined by the coherence of the best story you can tell from the evidence at hand.
  • p199: .. Taleb introduced the notion of a narrative fallacy to describe how flawed stories of the past shape our views of the world and our expectations for the future. Narrative fallacies arise inevitably from our continuous attempt to make sense of the world. The explanatory stories that people find compelling are simple; are concrete rather than abstract; assign a larger role to talent, stupidity, and intentions than to luck; and focus on a few striking events that happened rather than on the countless events that failed to happen.
  • p202: To think clearly about the future, we need to clean up the language that we use in labeling the beliefs we had in the past. .. A general limitation of the human mind is its imperfect ability to reconstruct past states of knowledge, or beliefs that have changed.
  • p214: .. for the large majority of individual investors, taking a shower and doing nothing would have been a better policy than implementing the ideas that came to their minds.
  • p215: In highly efficient markets, however, educated guesses are no more accurate than blind guesses.
  • p216: .. rewarding luck as if it were skill.
  • p217: We know that people can maintain an unshakable faith in any proposition, however absurd, when they are sustained by a community of like-minded believers.
  • p219: Philip Tetlock - People who know more forecast very slightly better than those who know less. But those with the most knowledge are often less reliable. The reason is that the person who acquires more knowledge develops an enhanced illusion of her skill and becomes unrealistically overconfident. .. Experts .. are dazzled by their own brilliance and hate to be wrong.
  • p221: The question is not whether these experts are well trained. It is whether their world is predictable.
  • p244: Once you understand the main conclusion, it seems it was always obvious.
  • p250: The optimism of planners and decision makers is not the only cause of overruns. Contractors of kitchen renovations and of weapon systems readily admit (though not to their clients) that they routinely make most of their profit on additions to the original plan. The failures of forecasting in these cases reflect the customers' inability to imagine how much their wishes will escalate over time. They end up paying much more than they would if they had made a realistic plan and stuck to it.
  • p255: Because optimistic bias can be both a blessing and a risk, you should be both happy and wary if you are temperamentally optimistic.
  • p263: Clinicians who were 'completely certain' of the diagnosis antemortem were wrong 40 % of the time. .. Experts who acknowledge the full extent of their ignorance may expect to be replaced by more confident competitors.
  • p390: I am my remembering self, and the experiencing self, who does my living, is like a stranger to me.
  • There is a clear contrast between the effects of income on experienced well-being and on life satisfaction. .. The easiest way to increase happiness is to control your use of time. Can you find more time to do the things you enjoy doing?
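The p84-p85 quotations on decorrelated errors can be checked with a small simulation. The sketch below is illustrative only - the group sizes, error magnitudes, and Gaussian error model are assumptions, not from the book - but it shows the mechanism: averaging many judgments cancels independent errors, while a bias shared by all observers survives aggregation untouched.

```python
# Sketch of p84: averaging judgments only reduces error when the
# judges' errors are independent; a shared bias does not average out.
# All numbers (25 judges, error SD 5, etc.) are hypothetical.
import random

random.seed(1)
TRUE_VALUE = 100.0
JUDGES, TRIALS = 25, 2_000

def mean_group_error(shared_bias: bool) -> float:
    """Average absolute error of the group-mean estimate over many trials."""
    errors = []
    for _ in range(TRIALS):
        # With shared_bias, every judge in a trial carries the same offset.
        bias = random.gauss(0, 5) if shared_bias else 0.0
        estimates = [TRUE_VALUE + bias + random.gauss(0, 5) for _ in range(JUDGES)]
        errors.append(abs(sum(estimates) / JUDGES - TRUE_VALUE))
    return sum(errors) / TRIALS

independent = mean_group_error(False)  # shrinks roughly as 1/sqrt(JUDGES)
correlated = mean_group_error(True)    # dominated by the shared bias
```

Letting the observers influence each other is, in effect, the `shared_bias` case: the group behaves like a much smaller sample.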
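The two keys to disciplined Bayesian reasoning quoted from p154 - anchor on a plausible base rate, question the diagnosticity of the evidence - can be written out as a one-function odds-form update. The base rate and likelihoods below are made-up illustrative numbers, not figures from the book.

```python
# Bayesian update in odds form: posterior odds = prior odds x likelihood ratio.
# The likelihood ratio is the "diagnosticity" of the evidence (p154).

def posterior(base_rate: float, p_e_given_h: float, p_e_given_not_h: float) -> float:
    """Posterior probability of hypothesis H after observing evidence E."""
    prior_odds = base_rate / (1 - base_rate)
    likelihood_ratio = p_e_given_h / p_e_given_not_h  # diagnosticity of E
    post_odds = prior_odds * likelihood_ratio
    return post_odds / (1 + post_odds)

# A low base rate dominates moderately diagnostic evidence (hypothetical
# values): base rate 3 %, likelihood ratio 4 -> posterior only about 11 %.
p = posterior(0.03, 0.8, 0.2)
```

This is why base-rate neglect (p147, p152) is so costly: intuition reads the likelihood ratio of 4 as near-certainty, while the arithmetic keeps the posterior close to the anchor.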
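The p179-p182 point that regression to the mean "has an explanation but does not have a cause" can also be demonstrated directly. In this minimal sketch (the additive talent-plus-luck model and unit variances are assumptions for illustration), the top scorers on a first test score closer to the population average on a second test, with no causal mechanism at work.

```python
# Regression to the mean: score = stable talent + independent luck.
# Selecting on test 1 selects partly on luck, which does not repeat.
import random

random.seed(42)
N = 10_000
talent = [random.gauss(0, 1) for _ in range(N)]
test1 = [t + random.gauss(0, 1) for t in talent]  # talent + luck on day 1
test2 = [t + random.gauss(0, 1) for t in talent]  # same talent, fresh luck

# Take the top 10 % on test 1 and compare their means on both tests.
top = sorted(range(N), key=lambda i: test1[i], reverse=True)[: N // 10]
mean1 = sum(test1[i] for i in top) / len(top)
mean2 = sum(test2[i] for i in top) / len(top)
# mean2 lands between mean1 and the population mean of 0 - pure regression.
```

Nothing "happened" to the top group between the tests; only the luck component failed to recur, which is exactly why causal stories about such drops are unwarranted.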

A brief reading list

» Oroboros 25 years - since 1992


Labels: MitoFit 2020.4, Publication efficiency, International System of Units