Thinking, Fast and Slow
System 1 and System 2
System 1
Operates automatically and quickly, with little or no effort, and no sense of voluntary control
Examples
- Detect that one object is farther than another
- Detect sadness in a voice
- Read words on billboards
- Understand simple sentences
- Drive a car on an empty road
Characteristics
- Can arise from expert intuition, trained over many hours of learning
- Example: chess master can recognize a strong move within a second, where it would take a novice minutes of System 2 thinking
- Generates feelings, intuitions
- If endorsed by System 2, intuitions turn into beliefs, and impulses turn into voluntary actions
- Can detect errors, provoking surprise
- Recruits System 2 for additional processing
- Example: veteran firefighter who entered a burning house with his crew, felt something was wrong, and called for them to get out. The house collapsed shortly after. He only later realized that his ears were unusually hot but the fire was unusually quiet, indicating the fire was in the basement
System 2
Allocates attention to the effortful mental activities that demand it, including complex computations; often associated with the subjective experience of agency, choice and concentration
Examples
- Focus attention on a particular person in a crowd
- Maintain a faster pace than is natural for you
- Monitor your behavior in a social situation
- Park in a narrow space
- Multiply large numbers
Characteristics
- Lazy: if there are several ways of achieving the same goal, people will eventually gravitate to the least demanding course of action
- Example: not taking the time to check an intuitive answer
- Limited budget of attention
- All forms of voluntary effort (cognitive, emotional, physical) seem to draw at least partly on a shared pool of mental energy
- In parallel: if you’re navigating traffic on a busy highway, it becomes far harder to solve a multiplication problem
- Serially: can experience “ego depletion” (depletion of voluntary-effort resources), which can be offset by consuming glucose (sugar)
- Examples: strong emotions during a sad film worsen physical stamina afterward; memorizing a list of digits makes subjects more likely to yield to a decadent dessert; one study found that judges were more likely to approve parole after meals and fell back on the easy default of denial when hungry and tired (although that study has not replicated)
Heuristics and Biases
Heuristics
When faced with a difficult target question, System 1 substitutes an easier heuristic question and answers that instead
Examples
- Target question: How much is saving an endangered species worth to me? Heuristic question: How much emotion do I feel when I think of dying dolphins?
- Target question: How happy are you with your life? Heuristic question: What’s my current mood?
- Target question: How should financial advisers who commit fraud be punished? Heuristic question: How much anger do I feel when I think of financial predators?
Biases
Narrow framing, failure to account for missing evidence
- Or WYSIATI (what you see is all there is): when presented with evidence, especially evidence that confirms your mental model, you do not question what evidence might be missing. System 1 builds the most coherent story it can; it does not stop to examine the quality and quantity of the information
- Causes overconfidence, because confidence depends more on the plausibility of the story the mind constructs than on the quality and quantity of the information
Confirmation bias
- We selectively pay attention to data that fit our prior beliefs, discard data that don’t, and seek out sources that tend to support what we already believe
- Ordering effect: first impressions matter; they form the “trunk of the tree” to which later impressions attach like branches, and it takes a lot of work to reorder the impressions into a new trunk. Example: work meetings often polarize around the first and most vocal people to speak; they would yield better ideas if people wrote down their opinions beforehand
- Halo effect: a positive impression of one trait spills over onto everything else you think about that person or thing (a simpler, more coherent story)
Narrative fallacy, hindsight bias
- People tend to package up a messy world into a clean-cut story, seeking cause-and-effect explanations
- Causes trump statistics
- It’s hard for us to deduce particular cases from statistics; when the results of a study surprise us, we silently exempt ourselves and the people we know
- On the other hand, we are pretty good at inferring statistics from particular cases; when we are shown videos introducing typical subjects of a study and are then told they acted as predicted, we grasp the results much better and are able to generalize
- Gives us unjustified confidence about predicting the future
- Examples
- History is presented as an inevitable march, rather than a chaotic mishmash of influences and people; if the past were so easily predictable in hindsight, then why can’t you predict the future?
- Management literature profiles the rise and fall of companies, attributing outcomes to key decisions and leadership styles; the stories are presented as inevitabilities, ignoring all the other things that didn’t happen but could have caused the company to fail
- Beware of explanations that can be applied to either outcome
Availability bias
- When trying to estimate the size of a category or the frequency of an event, you instead report how easily instances come to mind
- More vivid examples and personal experiences are more available than mere words or statistics
- Example: items covered heavily in the media take on greater importance than those that aren’t, even if the latter have more practical importance
Other statistical mistakes
- Law of small numbers: the smaller your sample size, the more likely you are to get extreme results; don’t fall for outliers (see the first sketch after this list)
- We pay more attention to the content of the story than to the reliability of the data
- Example: people not trained in statistics pay little attention to the sample size
- Regression to the mean: extreme results are partly luck, and luck does not persist, so over repeated sampling periods outliers tend to move back toward the mean (see the second sketch after this list)
- Example: athletes on the cover of Sports Illustrated seem to do much worse afterward; the popular explanation is buckling under the spotlight, but really the athlete had an outlier year and then regressed to the mean
- Anchoring: when you are exposed to a number, the number affects your estimate of an unknown quantity
- Example: when a nonprofit’s donation request suggested $400, the average donation was $143; when it suggested $5, the average was $20
- Also works in the idea space: when you hear an extreme idea like “build the wall”, you anchor to that idea so that qualifications seem like concessions
- Base-rate neglect: when applying Bayes’ rule, we neglect the prior (the statistical base rate) and take only causal base rates into account (see the worked example after this list)
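A minimal simulation sketch of the law of small numbers (my illustration, not from the book), assuming a fair coin; the sample sizes and the 20%/80% cutoffs are arbitrary choices:

```python
import random
import statistics

# Draw many samples of a fair coin (p = 0.5) at several sample sizes and
# count how often the observed share of heads is extreme (<= 20% or >= 80%).
random.seed(0)

for n in (5, 20, 100, 1000):
    means = [statistics.mean(random.choices((0, 1), k=n)) for _ in range(10_000)]
    extreme = sum(m <= 0.2 or m >= 0.8 for m in means) / len(means)
    print(f"n={n:4d}  share of extreme samples: {extreme:.3f}")
```

With n=5, roughly a third of samples look extreme; with n=100, virtually none do. The outliers reflect sample size, not the coin.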
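A sketch of regression to the mean under a toy skill-plus-luck model (my numbers, not the book’s): each athlete’s score is stable skill plus independent noise, and the top year-one performers fall back toward the mean in year two with no spotlight effect needed:

```python
import random
import statistics

# Score = stable skill + independent luck, both drawn from a standard normal.
random.seed(0)
skill = [random.gauss(0, 1) for _ in range(10_000)]
year1 = [s + random.gauss(0, 1) for s in skill]
year2 = [s + random.gauss(0, 1) for s in skill]

# The 100 best performers of year 1 (the "cover athletes").
top = sorted(range(10_000), key=lambda i: year1[i], reverse=True)[:100]
print("top-100 mean score, year 1:", round(statistics.mean(year1[i] for i in top), 2))
print("same athletes, year 2:     ", round(statistics.mean(year2[i] for i in top), 2))
```

Their year-one scores were partly luck, so with equal skill and luck variance the year-two average sits about halfway between their year-one average and the population mean.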
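A worked example for the base-rate point, using the cab problem from the book: 85% of a city’s cabs are Green and 15% are Blue, and a witness who is right 80% of the time says the cab in an accident was Blue. Bayes’ rule weighs the testimony against the prior instead of discarding it:

```python
prior_blue = 0.15      # base rate: 15% of cabs are Blue
hit = 0.80             # P(witness says "Blue" | cab is Blue)
false_alarm = 0.20     # P(witness says "Blue" | cab is Green)

# Total probability the witness says "Blue", then Bayes' rule.
p_says_blue = prior_blue * hit + (1 - prior_blue) * false_alarm
posterior = prior_blue * hit / p_says_blue
print(f"P(Blue | witness says Blue) = {posterior:.2f}")  # 0.41, not the intuitive 0.80
```

Most people answer 80%, treating the witness as the whole story; combined with the base rate, the posterior is only about 41%.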