Thinking, Fast and Slow by Daniel Kahneman [Book Summary]


Our minds host a fascinating drama, a filmlike plot between two main characters, complete with twists and tensions. The two characters are the impulsive, automatic, intuitive System 1 and the thoughtful, careful, calculating System 2. As they play off against one another, their interactions shape how we reason, make decisions, and behave.

System 1 is the part of our brain that operates automatically and suddenly, often without our conscious control. You can feel this system at work when you hear a very loud and unexpected sound. What do you do at that point? You instantly and automatically shift your attention toward the sound. That is System 1.

System 1 is a legacy of our evolutionary past: being able to make such rapid actions and decisions carries a natural survival advantage.

System 2 is what we think of when we picture the part of the brain responsible for individual decision-making, reasoning, and beliefs. It handles conscious mental activities such as self-control, deliberate choices, and the intentional focusing of attention.

For example, say you’re looking for a woman in a crowd. Your mind deliberately concentrates on the task: it recalls the person’s features and anything else that would help you spot her. This focus helps you screen out distractions, and you barely notice the other people in the crowd. If you maintain this focused attention, you might spot the woman within a few minutes, whereas if you’re distracted and lose concentration, you’ll struggle to find her.

As we’ll see in the following chapters, the interplay between these two systems shapes how we behave.


Buy this book from Amazon



Chapter 1 – The lazy mind: how mental laziness leads to mistakes and affects our intelligence.


To see how these two systems work, try the famous bat-and-ball problem:

A bat and a ball cost $1.10 together. The bat costs one dollar more than the ball. How much does the ball cost?

The answer that probably came to mind, $0.10, is the work of the intuitive, automatic System 1, and it’s wrong! Take a moment and do the math now.

Can you see the mistake? The correct answer is $0.05.
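
If you want to see the arithmetic spelled out, here is a quick sketch in Python (not from the book, simply a check of the numbers):

```python
# Bat-and-ball check: if the ball costs x, the bat costs x + 1.00,
# and together they must cost 1.10, so x + (x + 1.00) = 1.10 and x = 0.05.
for ball in (0.10, 0.05):
    bat = ball + 1.00
    total = ball + bat
    print(f"ball = ${ball:.2f}, bat = ${bat:.2f}, total = ${total:.2f}")

# The intuitive answer of $0.10 gives a total of $1.20, not $1.10;
# only a ball price of $0.05 satisfies both conditions.
```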

What happened is that your impulsive System 1 took charge and answered automatically by relying on intuition. But it answered too fast.

Normally, when it faces a problem it can’t make sense of, System 1 calls on System 2 to work it out; but in the bat-and-ball problem, System 1 is fooled. It perceives the question as simpler than it really is and wrongly assumes it can handle it on its own.



What the bat-and-ball problem exposes is our innate mental laziness. When we use our brains, we tend to expend the minimum amount of energy possible on each task. This is known as the law of least effort. Because checking the answer with System 2 would use more energy, our minds avoid doing so when they assume System 1 alone can handle it.

This laziness is unfortunate, because using System 2 is an essential part of our intelligence. Research shows that practicing System 2 tasks, such as focus and self-control, leads to higher intelligence scores. The bat-and-ball problem illustrates this: our minds could have checked the answer with System 2 and avoided this common error.

By being lazy and failing to engage System 2, our minds limit the strength of our intelligence.


Chapter 2 – Autopilot: why we are not always in conscious control of our thoughts and actions.


What comes to mind when you see the word fragment “SO_P”? Probably nothing. But what if you first see the word “EAT”? Now, when you look at “SO_P” again, you would probably complete it as “SOUP.” This process is called priming.

We’re primed when exposure to a word, idea, or event brings related words and ideas to mind. If you had seen the word “SHOWER” instead of “EAT” above, you probably would have completed the fragment as “SOAP.”

Priming doesn’t just influence the way we think; it also influences the way we behave. Just as the mind is affected by hearing certain words and ideas, the body can be too. A striking illustration comes from a study in which participants primed with words associated with old age, such as “Florida” and “wrinkle,” responded by walking more slowly than usual.

Amazingly, the priming of actions and thoughts is completely unconscious; we do it without realizing it.



What priming shows, then, is that despite what many people argue, we are not always in conscious control of our actions, judgments, and choices. Instead, we are constantly being primed by particular social and cultural conditions.

For instance, research by Kathleen Vohs shows that the idea of money primes individualistic behavior. People primed with the concept of money, for example by being shown images of cash, behave more independently and are less willing to get involved with, depend on, or accept demands from others. One implication of Vohs’s research is that living in a society filled with cues that prime money could nudge our behavior away from altruism.

Priming, like other societal influences, can shape an individual’s thoughts and, in turn, their choices, judgments, and actions, and these feed back into the culture and profoundly affect the kind of society we all live in.


Chapter 3 – Snap judgments: how the mind makes fast decisions, even when it doesn’t have adequate information to make a rational choice.


Say you meet someone called Ben at a party and find him easy to talk to. Later, someone asks whether you know anyone who might want to donate to their charity. You think of Ben, even though the only thing you know about him is that he is easy to talk to.

In other words, you liked one aspect of Ben’s character, so you assumed you would like everything else about him. We often approve or disapprove of a person even when we know very little about them.

The mind’s tendency to oversimplify things without sufficient information often leads to judgment errors. This is known as exaggerated emotional coherence, better known as the halo effect: positive feelings about Ben’s approachability cause you to place a halo on Ben, even though you know very little about him.

But this isn’t the only way our minds take shortcuts when making judgments.

There is also confirmation bias, the tendency to agree with information that supports our previously held beliefs, as well as to accept whatever information is suggested to us.



This can be demonstrated by asking the question, “Is James friendly?” Studies have shown that, faced with this question and no other information, we are very likely to conclude that James is friendly, because the mind automatically confirms the suggested idea.

Both the halo effect and confirmation bias occur because our minds are eager to make quick judgments. But this often leads to mistakes, because we don’t always have enough information to judge accurately. Our minds rely on false suggestions and oversimplifications to fill the gaps in the data, leading us to potentially wrong conclusions.

Like priming, these cognitive phenomena happen without our conscious awareness and influence our choices, judgments, and actions.


Chapter 4 – Heuristics: how the mind makes use of shortcuts to make quick judgments.


We often find ourselves in situations where we need to make a quick judgment. To help us do so, our minds have developed little shortcuts that let us rapidly make sense of our surroundings. These are called heuristics.

Most of the time, these processes are very useful, but the trouble is that our minds tend to overuse them. Applying them in situations they aren’t suited for can lead to mistakes. To better understand what heuristics are and the errors they can produce, we can look at two of the many kinds: the substitution heuristic and the availability heuristic.

The substitution heuristic is where we answer an easier question than the one that was initially asked.

For example, take this question: “That woman is a candidate for sheriff. How successful will she be in office?” We automatically substitute the question we’re supposed to answer with an easier one, such as, “Does this woman look like someone who would make a good sheriff?”

This heuristic means that instead of researching the candidate’s background and policies, we merely ask ourselves the far easier question of whether she matches our mental image of a good sheriff. Unfortunately, if she doesn’t fit that image, we may reject her, even if she has years of crime-fighting experience that would make her the ideal choice.



Then there is the availability heuristic, where we overestimate the probability of something we hear often or find especially easy to remember.

For example, strokes cause far more deaths than accidents do, yet one study found that 80 percent of participants considered accidental death the more likely fate. This is because we hear much more about accidental deaths in the media and because they leave a stronger impression on us; we recall horrific accidental deaths more readily than deaths from strokes, and so we may respond inappropriately to these risks.


Chapter 5 – No head for numbers: why we find it difficult to understand statistics and make preventable errors because of it.


How can you predict whether certain things will happen?

One effective method is to keep the base rate in mind. This is a statistical baseline on which other statistics depend. For example, suppose a large taxi company runs 20 percent yellow cabs and 80 percent red cabs. That means the base rate for yellow cabs is 20 percent and the base rate for red cabs is 80 percent. If you order a cab and want to predict its color, keep these base rates in mind and you’ll make a fairly accurate guess.

So we should always keep the base rate in mind when predicting an event, but unfortunately this doesn’t happen. In fact, base-rate neglect is extremely common.

One reason we ignore the base rate is that we focus on what we expect rather than on what is most likely. Consider those cabs again: if you watched five red cabs pass in a row, you’d probably start to feel it’s quite likely the next one will be yellow, for a change. But no matter how many cabs of either color go by, the probability that the next cab is red remains about 80 percent, and if we remembered the base rate we would realize this. Instead, we tend to focus on what we expect to see, a yellow cab, and so we are likely to be wrong.
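
A quick simulation makes the point. This sketch is not from the book; it simply assumes each arriving cab is an independent draw with the 80/20 base rates from the example:

```python
import random

random.seed(0)

def next_cab():
    # Each arriving cab is red with probability 0.80, yellow with 0.20.
    return "red" if random.random() < 0.80 else "yellow"

history = [next_cab() for _ in range(100_000)]

# Look only at moments that follow a streak of five red cabs in a row.
red_after_streak = 0
streaks = 0
for i in range(5, len(history)):
    if all(cab == "red" for cab in history[i - 5:i]):
        streaks += 1
        if history[i] == "red":
            red_after_streak += 1

# Prints roughly 0.80: the next cab is not "due" to be yellow.
print(red_after_streak / streaks)
```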



Base-rate neglect is a common error tied to the broader problem of working with statistics. We also struggle to remember that everything regresses to the mean, that is, that every situation has an average state, and deviations from that average eventually drift back toward it.

For example, if a football striker who averages five goals per month scores ten in September, her coach will be delighted; but if she then goes back to scoring around five goals a month for the rest of the year, the coach will probably criticize her for not keeping up her “hot streak.” The striker wouldn’t deserve that criticism, though: she is simply regressing to the mean!
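
Here is a rough simulation of the striker example. The scoring model is made up (a long-run average of five goals per month), but it shows the regression effect:

```python
import random

random.seed(1)

def goals_in_month(mean=5, chances=100):
    # Crude model: 100 scoring chances per month, each converted with
    # probability mean/chances, so the long-run average is `mean` goals.
    return sum(random.random() < mean / chances for _ in range(chances))

seasons = [[goals_in_month() for _ in range(12)] for _ in range(10_000)]

# Collect the month that immediately follows an unusually good month (>= 10 goals).
following = [season[i + 1]
             for season in seasons
             for i in range(11)
             if season[i] >= 10]

# Prints a value close to 5: after a "hot streak" month, performance
# simply drifts back toward the striker's true average.
print(sum(following) / len(following))
```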


Chapter 6 – Past imperfect: why we remember events in hindsight rather than from experience.


Our minds don’t remember experiences in a straightforward way. We have two distinct mechanisms, known as memory selves, each of which remembers situations differently.

First, there is the experiencing self, which records how we feel in the present moment. It asks the question, “How does it feel now?”

Second, there is the remembering self, which records how the whole event played out after the fact. It asks, “How was it on the whole?”

The experiencing self gives a more accurate account of what happened, because our feelings during an experience are always the most accurate. Yet the remembering self, which is less precise because it registers memories only after the situation has ended, dominates our memory.

There are two reasons the remembering self dominates the experiencing self. The first is duration neglect: we ignore the total duration of an event in favor of a particular memory from it. The second is the peak-end rule: we overemphasize what happens at the end of an event.



For an example of the remembering self’s dominance, consider an experiment that measured people’s memories of a painful colonoscopy. Before the procedure, patients were divided into two groups: one group received long, drawn-out colonoscopies, while the other received much shorter procedures in which, however, the level of pain increased toward the end.

You’d expect the unhappiest patients to be those who endured the long procedure, since their pain lasted longer. And that is indeed what they felt at the time: when each patient was asked about the pain during the procedure, the experiencing self gave an accurate answer, and those having the longer procedures felt worse. But afterward, once the remembering self took over, those who had the shorter procedure with the more painful ending felt the worst. This experiment offers a clear demonstration of duration neglect, the peak-end rule, and our unreliable memories.
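
The following toy example uses invented minute-by-minute pain ratings rather than the study’s actual data. The remembered score is approximated with the peak-end rule, as the average of the worst moment and the final moment:

```python
# Invented pain ratings (0-10), one per minute of each procedure.
short_painful_end = [4, 5, 6, 7, 8]            # shorter, but pain peaks at the very end
long_mild_end = [4, 5, 6, 7, 8, 5, 3, 2]       # longer, with the pain tapering off

def experienced_total(pain):
    # What the experiencing self lives through: all the pain, minute by minute.
    return sum(pain)

def remembered_score(pain):
    # Peak-end approximation of the remembering self: average of peak and end.
    return (max(pain) + pain[-1]) / 2

for name, pain in [("short, painful end", short_painful_end),
                   ("long, mild end", long_mild_end)]:
    print(f"{name}: experienced total = {experienced_total(pain)}, "
          f"remembered score = {remembered_score(pain)}")

# The longer procedure involves more total pain (40 vs. 30), yet the
# remembered score is worse for the short procedure (8.0 vs. 5.0).
```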


Chapter 7 – Mind over matter: how adjusting the focus of our minds can dramatically affect our thoughts and behaviors.


Our minds use different amounts of energy depending on the task. When there is no need to mobilize attention and little energy is required, we are in a state of cognitive ease. When our minds must mobilize attention, however, they use more energy and enter a state of cognitive strain.

These shifts in the brain’s energy levels have a powerful effect on how we behave.

In a state of cognitive ease, the intuitive System 1 is in charge of our minds, and the logical, more energy-hungry System 2 is weakened. This means we are more intuitive, creative, and happier, but also more likely to make mistakes.

In a state of cognitive strain, our awareness is heightened and System 2 takes charge. System 2 is more inclined to double-check our judgments than System 1 is, so although we are far less creative, we make fewer mistakes.

You can consciously influence how much energy your mind uses in order to get into the right frame of mind for particular tasks. If you want a message to be persuasive, for example, try fostering cognitive ease.



One way to do this is through repeated exposure to information. If information is repeated, or made more memorable, it becomes more persuasive. This is because our minds have evolved to respond positively to repeated exposure to the same clear messages. When we see something familiar, we slip into a state of cognitive ease.

Cognitive strain, on the other hand, helps us succeed at things like statistical problems.

We can enter this state by exposing ourselves to information presented in a confusing way, for example in a hard-to-read typeface. Our minds perk up and raise their energy levels in an effort to understand the problem, so we are less likely to simply give up.


Chapter 8 – Taking chances: how the way probabilities are presented to us affects our judgment of risk.


The way we judge ideas and approach problems is strongly shaped by how they are presented to us. Small changes to the detail or emphasis of a statement or question can dramatically alter how we respond to it.

A perfect illustration of this can be seen in how we assess risk.

You might think that once we know the probability of a risk occurring, everyone would treat it the same way. But that isn’t the case. Even with carefully calculated probabilities, simply changing how the figure is expressed can alter how we respond to it.

For example, people judge a rare event as more likely to occur if it is expressed in terms of relative frequency rather than as a statistical probability.

In what is known as the Mr. Jones experiment, two groups of psychiatric professionals were asked whether it was safe to discharge Mr. Jones from a psychiatric hospital. The first group was told that patients like Mr. Jones had “a 10 percent chance of committing an act of violence,” while the other group was told that “of every 100 patients like Mr. Jones, 10 are estimated to commit an act of violence.” Nearly twice as many respondents in the second group refused his discharge.



Another way our attention is pulled away from what is statistically relevant is known as denominator neglect. This happens when we ignore plain statistics in favor of vivid mental images that sway our decisions.

Take these two statements: “This drug protects children from disease X but carries a 0.001 percent risk of permanent disfigurement” and “One of every 100,000 children who take this drug will be permanently disfigured.” Although the two statements say exactly the same thing, the latter calls to mind an image of a disfigured child and is far more influential, which is why it makes us less willing to give the drug.
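
A quick check confirms the two framings describe exactly the same risk (the numbers are simply those from the example):

```python
import math

probability = 0.001 / 100   # "a 0.001 percent risk", expressed as a fraction
frequency = 1 / 100_000     # "one of every 100,000 children"

# Prints True: both framings correspond to the same risk of 0.00001.
print(math.isclose(probability, frequency))
```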


Chapter 9 – Not robots: why we don’t decide based solely on rational thinking.


How do we as people make a decision?

For a long time, a powerful and influential group of economists argued that we make decisions based purely on rational argument. They claimed that each of us decides according to utility theory, which holds that when individuals make choices, they consider only the rational facts and pick the option with the best overall outcome for them, meaning the most utility.

For example, utility theory makes this kind of claim: if you prefer oranges to kiwis, you will also prefer a 10 percent chance of winning an orange to a 10 percent chance of winning a kiwi.

Seems obvious, right?

The most influential group of economists in this field was based at the Chicago School of Economics, and its most famous scholar was Milton Friedman. Using utility theory, the Chicago School argued that people in the marketplace are ultra-rational decision-makers, whom economist Richard Thaler and legal scholar Cass Sunstein later called Econs. As Econs, individuals all act the same way, valuing goods and services according to their rational needs. Econs also value their wealth rationally, weighing only how much utility it provides them.



Consider, for example, two people, John and Jenny, who each have a fortune of $5 million. According to utility theory, they have equal wealth, which means they should be equally happy with their money.

But what if we complicate things a little? Suppose their $5 million fortunes are the result of a single day at the casino, and the two had vastly different starting points: John walked in with just $1 million and quintupled his money, while Jenny walked in with $9 million and saw it shrink to $5 million. Do you still think John and Jenny are equally happy with their $5 million?

Unlikely. Clearly, then, there is more to how we value things than pure utility.

As we’ll see in the next chapter, because we don’t weigh utility as rationally as utility theory assumes, we can make strange and seemingly irrational decisions.


Chapter 10 – Gut feeling: why, rather than making decisions based purely on rational considerations, we are often swayed by emotional factors.


If utility theory doesn’t work, then what does?

One alternative is prospect theory, developed by the author.

Kahneman’s prospect theory challenges utility theory by showing that when we make choices, we don’t always act in the most rational way.

Take these two scenarios: in the first, you’re given $1,000 and must then choose between receiving a guaranteed $500 or taking a 50 percent chance of winning another $1,000. In the second, you’re given $2,000 and must choose between a guaranteed loss of $500 or a 50 percent chance of losing $1,000.

If we made purely rational decisions, we would choose the same way in both scenarios. But that isn’t what happens: in the first scenario, most people take the sure thing, while in the second, most people take the gamble.
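
A little arithmetic, using only the numbers given in the two scenarios, shows that every option leads to the same expected final amount:

```python
# Scenario 1: start with $1,000.
sure_gain = 1_000 + 500                        # guaranteed +$500
gamble_gain = 1_000 + 0.5 * 1_000 + 0.5 * 0    # 50% chance of winning another $1,000

# Scenario 2: start with $2,000.
sure_loss = 2_000 - 500                        # guaranteed -$500
gamble_loss = 2_000 - (0.5 * 1_000 + 0.5 * 0)  # 50% chance of losing $1,000

# All four options have an expected final amount of $1,500.
print(sure_gain, gamble_gain, sure_loss, gamble_loss)
```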

Prospect theory helps explain why this happens. It highlights at least two reasons why we don’t always act rationally, both of which involve loss aversion, the fact that we fear losses more than we value gains.



The first reason is that we value things relative to reference points. Starting with $1,000 or $2,000 in the two scenarios changes our willingness to gamble, because the starting point affects how we value our position. The reference point in the first scenario is $1,000 and in the second it is $2,000, so ending up with $1,500 feels like a win in the first but a painful loss in the second. Even though this reasoning is plainly irrational, we judge value as much by our starting point as by the objective value at the moment.

The second reason is the principle of diminishing sensitivity: the value we perceive may differ from the actual worth. For example, dropping from $1,000 to $900 doesn’t feel as bad as dropping from $200 to $100, even though the monetary value of the two losses is identical. Likewise, in our scenarios, the perceived value lost in falling from $1,500 to $1,000 is greater than that lost in falling from $2,000 to $1,500.
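
One rough way to put numbers on this intuition is to score each change by the logarithm of the ratio between the end and start amounts. This is only an illustrative stand-in, not Kahneman and Tversky’s actual value function:

```python
import math

def perceived_change(start, end):
    # Illustrative only: perceived gain/loss as the log of the ratio,
    # so the same dollar change feels smaller from a larger starting point.
    return math.log(end / start)

# The same $100 loss feels much worse from $200 than from $1,000.
print(perceived_change(1_000, 900))    # about -0.11
print(perceived_change(200, 100))      # about -0.69

# And the drop from $1,500 to $1,000 feels larger than the drop
# from $2,000 to $1,500, even though both are $500 losses.
print(perceived_change(2_000, 1_500))  # about -0.29
print(perceived_change(1_500, 1_000))  # about -0.41
```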


Chapter 11 – False images: why the mind builds complete pictures to explain the world, and why they lead to overconfidence and mistakes.


To make sense of situations, our minds naturally rely on cognitive coherence: we construct complete mental images to explain ideas and concepts. For example, we hold many images in our brains about the weather. We have a picture of, say, summer weather, perhaps an image of a bright, hot sun bathing us in warmth.

Besides helping us understand things, these images are also what we rely on when making decisions.

When we make choices, we refer to these images and base our assumptions and conclusions on them. For example, when deciding what clothes to wear in summer, we base our choice on our image of that season’s weather.

The problem is that we place too much confidence in these images. Even when available statistics and data contradict our mental images, we still let the images guide us. The forecaster might predict relatively cool weather this summer, yet you might still wear shorts and a T-shirt because that’s what your mental picture of summer tells you to wear. You could end up shivering outside!



In short, we are hugely overconfident in our often faulty mental images. But there are ways to overcome this overconfidence and start making better forecasts.

One way to avoid mistakes is to use reference class forecasting. Instead of making judgments based on your rather general mental images, draw on specific historical cases to make a more accurate prediction. For example, think back to a previous occasion when you went out on a cold summer day. What did you wear then?

In addition, you can devise a long-term risk policy that plans specific measures for both success and failure in forecasting. Through preparation and protection, you can rely on evidence rather than general mental images and make more accurate forecasts. In the case of our weather example, this might mean taking a sweater with you just to be safe.


Thinking, Fast and Slow by Daniel Kahneman Book Review


Thinking, Fast and Slow shows us that our minds contain two systems. The first acts instinctively and requires little effort; the second is more deliberate and demands much more of our attention. Our thoughts and actions vary depending on which of the two systems is in control of our brain at the time.


Repeat the message!

Messages become more persuasive when we are repeatedly exposed to them. This is most likely because we evolved in such a way that repeated exposure to things with no bad consequences is treated as essentially good.

Don’t be swayed by rare statistical events that are over-reported in the newspapers.

Disasters and other dramatic events are an important part of our history, but we often overestimate their statistical likelihood because of the vivid images we associate with them from the media.

When you are in a good mood, you are more creative and intuitive.

When you’re in a good mood, the alert, analytical part of your mind tends to relax. That hands control of your mind over to the faster, more intuitive thinking system, which also makes you more creative.


Buy this book from Amazon



Download Pdf


https://goodbooksummary.s3.us-east-2.amazonaws.com/Thinking%2C+Fast+and+Slow+by+Daniel+Kahneman+Book+Summary.pdf


Download Epub


https://goodbooksummary.s3.us-east-2.amazonaws.com/Thinking%2C+Fast+and+Slow+by+Daniel+Kahneman+Book+Summary.epub

