thinking fast and slow

Last Updated April 12, 2025

book notes


I actually first stumbled upon this book years ago at a bookstore. But it came up again during my Interaction Design class in college—turns out, it’s super useful for understanding human behavior and building better user experiences. So I decided to revisit it and read it more seriously this time. Before you read my book notes though, just a quick disclaimer: I wrote them back in 2020 when I was still in high school, so take them with a grain of salt. I definitely had some skill issues back then (honestly, maybe I still do 😅).

Part I: Two Systems

Our brain works with two systems: System 1, which is fast, intuitive, and driven by emotion, and System 2, which is slower, more thoughtful, and relies on logic and reasoning.
Multitasking is only effective when the tasks are light, because System 2 has limited attention capacity.
Your pupils dilate when you're exerting mental effort or encountering something that captures your attention.
Self-control requires both attention and effort.
The “ego depletion” effect can be countered by consuming glucose.
Your System 1 often knows more about you than you consciously know about yourself.
Priming works both ways. A useful piece of advice: “Stay calm, no matter how you feel.”
Things that feel familiar are more likely to be perceived as true.
Anything that reduces cognitive strain (like easy-to-read fonts, rhyming, or clear messaging) helps improve understanding.
If you want to be trusted, use simple language. But remember—your message still needs to be clear and make sense. Simplicity without clarity is pointless.
Students scored higher on tests when the questions were harder to read, because that visual difficulty triggered System 2 and made them avoid System 1’s quick, intuitive (and often wrong) answers.
To survive, living beings tend to avoid unfamiliar stimuli.
Mood strongly influences System 1. When you're in a good mood, intuitive thinking (System 1) is strong, and analytical thinking (System 2) weakens. When you're in a bad mood, the reverse happens.
Stress-Performance curve
Note: This section needs further reading.
We perceive cause-and-effect relationships as quickly as we perceive color.
The sense of causality seems to be innate.
"We perceive the world of objects as fundamentally separate from the world of minds, which allows us to imagine bodies without souls and souls without bodies." — Paul Bloom, The Atlantic (2005)
When System 2 is busy, we're more likely to believe everything we see or hear. That’s because System 1 is gullible, and System 2 is in charge of doubt and skepticism. This is why persuasive nonsense (like ads) is more effective when we're tired or mentally drained.

Halo Effect

Our judgment is influenced by previous impressions. For example, police often question witnesses separately to avoid bias.
System 1 generates baseline evaluations—instinctive judgments evolved to help us quickly assess key situations.

Mental Shotgun

Even when you're focusing on a specific question, your System 1 continues generating thoughts and impressions beyond what System 2 has asked for.

Heuristics and Substitution

Heuristics (from the Greek heuriskein, “to find,” the same root as “eureka”) involve replacing a difficult question with an easier one. While sometimes helpful, they can lead to serious errors.
Note: This topic needs deeper reading.
The order of questions influences how people answer—even if they consciously distinguish between them. Emotionally impactful questions can shift the mood and affect all subsequent answers.
Your current mood has a huge effect on how you evaluate your own happiness.

Affect Heuristic

Your likes and dislikes shape your worldview. For example, people who favor war (hawks) and those who prefer peace (doves) view facts through different emotional filters. That said, rational thinking is still possible—reason and evidence can influence our beliefs.
System 2 often acts as an apologist for System 1’s emotions, justifying them rather than challenging them. It tends to approve more than discipline.
Apologist meaning
A person who defends or justifies a belief, idea, or behavior—often in a formal or structured way.

Part II: Heuristics and Biases

Law of Large Numbers

Averages based on large samples are stable and close to the true population value; the smaller the sample, the noisier and more extreme the results.

Artifact

An outcome that appears in research but does not represent an actual fact or reality.

Law of Small Numbers

People tend to generalize results from small samples, mistakenly assuming they reflect the larger population.
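A quick simulation (my own sketch, not from the book) makes the contrast between the two laws concrete: with a fair coin, extreme head-proportions are common in small samples and all but disappear in larger ones.

```python
import random

def extreme_share(sample_size: int, trials: int = 2_000, cutoff: float = 0.7) -> float:
    """Fraction of samples whose proportion of heads is extreme
    (>= cutoff or <= 1 - cutoff) when flipping a fair coin."""
    random.seed(0)  # fixed seed, so the demo is deterministic
    extreme = 0
    for _ in range(trials):
        heads = sum(random.random() < 0.5 for _ in range(sample_size))
        share = heads / sample_size
        if share >= cutoff or share <= 1 - cutoff:
            extreme += 1
    return extreme / trials

print(extreme_share(10))   # roughly a third of the small samples look "extreme"
print(extreme_share(200))  # virtually none of the larger samples do
```

Trusting a 10-flip sample is exactly the law of small numbers: the extreme result says more about the sample size than about the coin.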

Cognitive Biases & System Limitations

  • Absurd or extreme numbers tend to capture more attention.
  • System 1 is incapable of estimating levels of certainty.
  • Humans are prone to overestimating the consistency and coherence of what they observe.
  • We are naturally pattern-seeking creatures—this ability helped our ancestors survive by predicting seasonal changes or potential dangers (like lions). However, this tendency also makes us see patterns where none exist. Many aspects of life are actually random, even though we often resist that idea.

Anchoring Effect

A combination of:
  • System 2: deliberate adjustment
  • System 1: automatic priming
Anchoring occurs when an initial reference point (high or low) influences later judgments, such as pricing or estimation.
  • High anchors make prices seem cheap.
  • Low anchors make them seem expensive.
Even experts are vulnerable to this bias. To resist it:
  • Consider the opposite.
  • Treat all initial figures as potentially misleading.
  • Engage System 2 deliberately.

Availability Heuristic

We assess frequency or probability based on how easily examples come to mind.

Examples of Availability Heuristic in Action:

  • You might cycle less after recalling many bike accident reports.
  • You may doubt a decision if you're asked to list too many reasons to support it.
  • You might believe an event is less preventable after listing many ways to avoid it.
  • A car may seem less impressive after trying to list too many of its benefits.
The more examples you're forced to recall, the harder it gets—leading to lower self-assessment or confidence in your evaluation.

Combating Availability Bias

It’s possible but difficult. One strategy is to remove the emotional shock factor, which can help you remain more alert and critical.

Availability Cascade

A self-reinforcing cycle where media reports on a relatively minor event, causing public panic and leading to large-scale government action.
“The emotional tail wags the rational dog.” —Jonathan Haidt

Affect Heuristic

People tend to simplify decisions by aligning risks and benefits with how much they like or dislike something.

Example:

  • If someone sees a technology as “good,” they’ll downplay its risks.
  • If someone sees it as “bad,” they’ll ignore its potential benefits.

Probability Neglect

We either exaggerate or completely ignore small risks—there’s rarely a balanced response.

Representativeness Heuristic

Judgments based on how closely something resembles a stereotype, rather than actual statistical probability.

Base Rates

A foundational benchmark or reference point used to evaluate probability, often ignored when representativeness dominates.

How to Discipline Intuition

"Don’t believe everything that pops into your head."
To be useful, your beliefs must be constrained by the logic of probability.

Bayesian Statistics

Bayesian thinking is the logic of how one should revise one's beliefs when new evidence appears.
Key Principles of Bayesian Reasoning:
  1. Base rates matter, even when specific case evidence is available.
  2. Intuitive impressions of evidence are often exaggerated.

How to Apply Bayesian Thinking:

  • Anchor your probability estimates to a reasonable base rate.
  • Question the strength and validity of your interpretation of the evidence.
The more detailed a scenario, the less likely it becomes—even if it feels more plausible.
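These rules can be made concrete with Bayes’ rule, using the book’s well-known cab problem: 85% of a city’s cabs are Green and 15% are Blue (the base rate), and a witness who identifies a cab as Blue is correct 80% of the time. A minimal sketch:

```python
def posterior_blue(prior_blue: float, witness_accuracy: float) -> float:
    """P(cab is Blue | witness says Blue), by Bayes' rule."""
    says_blue_and_blue = witness_accuracy * prior_blue            # correct ID
    says_blue_and_green = (1 - witness_accuracy) * (1 - prior_blue)  # false alarm
    return says_blue_and_blue / (says_blue_and_blue + says_blue_and_green)

print(round(posterior_blue(0.15, 0.80), 2))  # 0.41, far below the intuitive 0.80
```

Intuition anchors on the witness’s 80% accuracy; the base rate drags the real probability down to about 41%.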

Cognitive Pitfalls

Fallacy of Logical Misapplication

People often fail to apply obvious and relevant logic to their reasoning.
“The most coherent story is not necessarily the most probable one. But it feels believable, and coherence, believability, and probability often blend together for the unwary.” — Daniel Kahneman

Types of Base Rates

  1. Statistical Base Rate
    A general fact about a population that applies to an individual case—but is often ignored when specific case details are known.
  2. Causal Base Rate
    A base rate that explains how a case occurred. People are more likely to incorporate this with other case-specific information.

Stereotypes

Generalizations about a group that are (at least tentatively) accepted as true for all members. Whether accurate or not, stereotypes shape how we think in categories.
tentative meaning
not certain or fixed; provisional. "a tentative conclusion"
“They won’t learn just from statistics. Let’s show them one or two representative individual cases that engage their System 1.” — Daniel Kahneman

Regression to the Mean

Random fluctuations naturally gravitate toward the average.
Examples:
  • Poor performance tends to improve.
  • Excellent performance often declines. This occurs due to luck influencing results in either direction.
regression meaning
a return to a former or less developed state.
a measure of the relation between the mean value of one variable (e.g. output) and corresponding values of other variables (e.g. time and cost).
Success = Talent + Luck
Extraordinary Success = Slightly more talent + A lot more luck
— Daniel Kahneman
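Kahneman’s formula can be demonstrated with a small simulation (made-up Gaussian numbers, my own sketch): select the top 1% of performers on day one, and their day-two average falls back toward the mean, because the luck component does not repeat.

```python
import random

random.seed(1)
N = 10_000
talent = [random.gauss(0, 1) for _ in range(N)]
day1_scores = [t + random.gauss(0, 1) for t in talent]  # success = talent + luck
day2_scores = [t + random.gauss(0, 1) for t in talent]  # same talent, fresh luck

# Select the top 1% on day one, then look at the same people on day two.
top = sorted(range(N), key=lambda i: day1_scores[i], reverse=True)[: N // 100]
day1_avg = sum(day1_scores[i] for i in top) / len(top)
day2_avg = sum(day2_scores[i] for i in top) / len(top)
print(round(day1_avg, 2), round(day2_avg, 2))  # day-two average is much lower
```

Nothing about the performers changed between the two days; only the luck was re-drawn.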

Correlation and Regression

If the correlation between two variables isn’t perfect, regression to the mean will occur.
Example:
Highly intelligent women tend to marry men who are slightly less intelligent—not due to preference, but due to statistical regression.

Flawed Intuition

High confidence in predictions can arise from non-regressive judgments based on weak evidence.

How to Make Less Biased Predictions:

  1. Start with the average performance index (the base rate).
  2. Adjust based on your impression of the case (your intuitive judgment).
  3. Estimate the correlation between your impression and actual outcomes.
  4. Moderate your prediction:
      • If the correlation is 0.30, move 30% of the way from the average toward your intuitive estimate.
      • This is called regression-based adjustment.
Even with this method, errors will still happen—but they’ll be smaller and less biased toward extremes.
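The four steps reduce to a one-line adjustment; the GPA numbers below are hypothetical, purely for illustration.

```python
def regressive_prediction(base_rate: float, intuition: float, correlation: float) -> float:
    """Move from the base rate toward the intuitive estimate only as far
    as the evidence's predictive power (the correlation) justifies."""
    return base_rate + correlation * (intuition - base_rate)

# Hypothetical case: the class average GPA is 3.0, your impression of a
# student suggests 3.8, and such impressions correlate ~0.30 with outcomes.
print(round(regressive_prediction(3.0, 3.8, 0.30), 2))  # 3.24, not 3.8
```

With zero correlation you predict the plain base rate; with perfect correlation you trust your intuition fully.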

Other Useful Definitions:

  • Venture: A risky or daring journey or action.
  • Equivalent / Analogous: Something similar or comparable in meaning or function.
  • Moderate: Not extreme, in between.

Part III: The Illusion of Overconfidence

Hindsight Bias

"The mistake seems obvious—but only in hindsight. You didn’t know it before it happened."

Outcome Bias

(A combination of hindsight bias and the halo effect): When outcomes dictate our judgments of prior events.
Example: A company’s stock price rises, and we credit the CEO for it. But it might just be luck—we have no actual evidence of the CEO’s impact.

The Illusion of Validity

Our predictions may be only marginally better than chance, yet we remain convinced of their accuracy.
Subjective confidence stems from the coherence of the story built by System 1 and System 2, not from the amount or quality of evidence. In other words, confidence often reflects how easy and consistent the information feels—not how true it is.

The Illusion of Stock-Picking Expertise

The performance of professional investors is inconsistent and largely driven by luck, not skill. There is no significant evidence of reliable expertise.
Some professionals are aware that success is mostly luck—but continue as if it isn’t. (Another case of the illusion of validity?)

Simple Formulas vs Expert Intuition

Statistical formulas often outperform expert judgments. Why?
  • Experts tend to overthink, creating overly complex models.
  • Humans are inconsistent when summarizing complex data.
  • Formulas are consistent—same inputs always yield the same outputs.
🧠 Combining formulas with intuition can improve prediction accuracy.

When Can Expert Intuition Be Trusted?

According to this paper, expert intuition is reliable when:
  1. The environment is sufficiently regular (predictable).
  2. There's extensive, consistent feedback through long-term practice.
Factors supporting the development of expert intuition:
  • Predictability of the environment.
  • Speed and frequency of feedback.
  • Opportunity for deliberate practice.
Examples:
  • It's easier to develop intuition for driving a car than a ship.
  • Psychotherapists develop good intuition for how patients react in the moment (fast, direct feedback), but not for long-term outcomes (slow or missing feedback).
Short-term predictions are often more accurate than long-term ones.
Expertise is a collection of skills, not a single ability. Experts are often overconfident because they don’t recognize the limits of their domain knowledge.

Inside vs. Outside View

  • Inside View: Your personal, detailed perspective on a situation.
  • Outside View: Reference class data (similar past cases) that helps form a baseline for more accurate prediction.

The Planning Fallacy

Occurs when plans are unrealistically close to the best-case scenario. This can be mitigated by incorporating reference class statistics (aka base rates).
🧠 Bent Flyvbjerg’s 3-Step Solution:
  1. Identify the appropriate reference class (e.g., kitchen renovations, rail projects).
  2. Collect statistical data (e.g., average cost overruns).
  3. Adjust your prediction based on specific factors of your case.

Optimism Bias

Optimism is a cognitive bias built into System 1, driven by WYSIATI (What You See Is All There Is).
We:
  • Focus on goals and plans, ignoring relevant base rates → leads to planning fallacy.
  • Focus on our own abilities, ignoring those of others.
  • Overemphasize skill and underestimate luck when explaining outcomes → leads to the illusion of control.
  • Focus on knowns and ignore unknowns → makes us overconfident.
“People tend to be overly optimistic about their relative position in any activity they can perform reasonably well.” —Daniel Kahneman

Optimism: A Double-Edged Sword

  • Negative: Makes us blind to risks and overconfident in decisions.
  • Positive: Fuels perseverance and ambition. Many successful scientists succeeded because they overestimated the value of their work.

The "Premortem" Technique (by Dr. Gary Klein)

A method to tame optimism bias without trying to eliminate it entirely.
How it works: Imagine a project has failed. Now ask: Why did it fail?
This unlocks critical thinking and risk awareness before a decision is finalized.

Other Useful Definitions

  • Sanguine: A cheerful, lively, and optimistic personality type.
  • Equity: Ownership value in a business or property after debts are paid.
  • Qualified: Having the skills, knowledge, or credentials needed for something.
  • Martyr: A person who suffers or dies for their beliefs or cause.

Part IV: Choice

When you have $100, gaining another $100 feels satisfying. But if you already have $500, gaining another $100 feels much less impactful.
→ The increase in satisfaction depends on the reference point.
People prefer certainty when it comes to gains but tend to take risks when faced with potential losses—even if the choices are statistically equivalent.
Every significant decision we make in life comes with a degree of uncertainty.

Endowment Effect

A cognitive bias where people assign more value to things merely because they own them, regardless of objective market value.

Loss Aversion

Humans are biologically wired to feel losses more intensely than equivalent gains.
Negative stimuli (e.g., war, harm) tend to stand out more than positive ones (e.g., love, peace). This is likely an evolutionary survival mechanism—our ancestors were more likely to survive by avoiding danger than by chasing rewards.
We are more strongly motivated to avoid losses than to achieve gains.
Economic theory suggests we work longer in good conditions and rest during bad ones.
Loss aversion theory suggests the opposite—for example, taxi drivers often work longer on bad days to avoid losses.
People are more relaxed in negotiations when splitting a large “pie” (both parties benefit), but become tense when the stakes involve potential losses.
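Prospect theory formalizes this asymmetry. A sketch of the commonly cited Kahneman–Tversky value function, using their median parameter estimates (diminishing sensitivity alpha ≈ 0.88, loss aversion lambda ≈ 2.25):

```python
def prospect_value(x: float, alpha: float = 0.88, lam: float = 2.25) -> float:
    """Subjective value of a gain or loss measured from a reference point:
    concave for gains, convex for losses, and losses weighted by lam > 1."""
    return x ** alpha if x >= 0 else -lam * ((-x) ** alpha)

print(round(prospect_value(100), 1))   # subjective value of a $100 gain
print(round(prospect_value(-100), 1))  # the same-size loss hurts 2.25x as much
```

The lam > 1 factor is the loss aversion described above; the alpha < 1 exponent captures why the second $100 feels smaller than the first.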
“Loss aversion is a powerful conservative force that resists significant changes to the status quo—affecting institutions and individuals alike. It provides gravitational stability in our relationships, jobs, and daily lives.”
— Daniel Kahneman
“The key task for those studying economic fairness is not to identify ideal behavior, but to locate the boundaries between acceptable actions and those that provoke outrage and punishment.”
— Daniel Kahneman

The Fourfold Pattern of Risk Attitudes

The fourfold pattern (the book’s 2x2 table, reconstructed here as a list):
  • High probability of a gain (certainty effect): risk averse, e.g. accepting an unfavorable settlement to lock in a sure gain.
  • High probability of a loss (the top-right cell): risk seeking, e.g. rejecting a favorable settlement and gambling to avoid a sure loss.
  • Low probability of a gain (possibility effect): risk seeking, e.g. buying lottery tickets.
  • Low probability of a loss: risk averse, e.g. buying insurance.
We tend to avoid risk in the domain of gains and seek risk in the domain of losses.
“Many human disasters occur in the top right cell. That’s where people facing terrible options make reckless gambles—accepting high chances of making things worse in exchange for a slim hope of avoiding large losses.
The idea of accepting a definite major loss is too painful, and the hope of escape becomes so compelling that people often fail to make rational choices to reduce the damage.”
— Daniel Kahneman

Econs vs. Humans

  • Econ: An ideal economic agent—rational, logical, and consistent.
  • Human: A real-world agent—prone to cognitive errors, emotional reasoning, and inconsistent preferences.

Disposition Effect

A behavioral bias where investors are more likely to sell stocks that have increased in value, while holding on to losing ones—because realizing a loss feels worse than missing a gain.

Mental Accounting

We categorize money into mental “accounts” and assign different subjective values depending on the context—often violating economic logic.
“People feel better about outcomes when they frame them as money saved, rather than money lost.”
“Let’s reframe the situation by changing the reference point. Imagine you didn’t already own it—what would you value it at?”
“Let’s categorize the loss under ‘general expenses’—it’ll feel less painful.”
“They ask you to check a box not to be included in a mailing list. The list would shrink if they asked you to opt in instead.”

Part V: The Two Selves

“Our memory, a function of System 1, has evolved to focus on the most intense moment of an experience (the peak) and how it ends (the ending). It neglects duration, which means it doesn't help us make choices that favor long pleasures or short pains.” — Daniel Kahneman

The Problem with Memory

Kahneman illustrates this with the example of a failed marriage.
When reflecting on it, you might mostly remember the painful divorce or the miserable final months—leading to a negative overall impression.
But this ignores the many years of happiness and love.
Even though the ending was bad, the experience as a whole wasn’t.
Sadly, we are wired to overemphasize the ending, and we use that memory to guide future decisions.
What matters most in memory:
Peak emotional moment and ending
not the length of the experience.
That’s why longer vacations can feel more enjoyable—if they include more peak moments or a pleasant ending, not just because they last longer.
“It’s no exaggeration to say that happiness is the experience of spending time with people you love, and who love you.” — Daniel Kahneman

Mixed Emotions are Real

Positive emotions (love, joy, hope, engagement, amusement, etc.) and negative emotions (anger, shame, sadness, loneliness, etc.) can coexist.
“When you’re happy because you’re in love, you might still feel joy even if you’re stuck in traffic.
But when you’re grieving, you may still feel miserable even while watching a funny movie.”
— Daniel Kahneman

How We Spend Time Matters

One of the few areas in life we can control is how we spend our time.
A great way to improve life satisfaction is to replace passive leisure (like watching TV) with more active enjoyment—such as exercising, socializing, or creative hobbies.

Can Money Buy Happiness?

The conclusion is nuanced:
  • Poverty causes suffering.
  • Wealth increases life satisfaction, but not necessarily experienced happiness.
  • Higher income tends to reduce the ability to enjoy small pleasures.
💡 A study showed that students who were primed to think about wealth smiled less while eating chocolate.

Experienced Well-Being

“The easiest way to become happier is to manage how you spend your time.
Can you give more time to activities you actually enjoy?”
“Once your income reaches a level of comfort, you can buy nicer experiences—but you also lose the ability to enjoy simpler ones.”

Thinking About Life (Evaluative Well-Being)

“He thought buying a fancy car would make him happier—but that was just a mistake in affective forecasting.”
“Buying a bigger house might not improve your long-term happiness.
That could be the result of a focusing illusion—overestimating the importance of one factor in life satisfaction.”
 