In this episode, we unpack three foundational concepts from Kahneman's "Thinking, Fast and Slow." We explore how small samples, anchoring, and the availability heuristic shape our everyday decisions, often leading us astray. With real-world examples and Dr. Hattingh's own experiences, we'll see just how subtle these mental shortcuts can be.
Chapter 1
Dr. Esther Hattingh
Welcome back to Thinking Fast and Slow. I'm Dr. Esther Hattingh, and today we're diving into some of the most quietly powerful forces shaping our decisions: those little mental shortcuts that Kahneman calls heuristics. Now, if you've been following along, you'll remember that in our last episode we talked about how System 1, our fast, intuitive mind, loves to jump to conclusions. Well, today's first topic is a perfect example of that: the law of small numbers.
Dr. Esther Hattingh
So, what is this law of small numbers? It's not a real law, actually; it's more of a running joke between Kahneman and Tversky. The idea is that our minds, especially System 1, tend to trust results from small samples far too much. We see a handful of coin flips, say four in a row, and if they all come up heads, we start thinking, "Wow, maybe this coin is special!" But statistically, small samples are just more likely to give us extreme results. If you flip a fair coin a thousand times, you'll end up very close to fifty-fifty. But with just four flips? Anything can happen.
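If you'd like to see that intuition in actual numbers, here is a minimal simulation sketch. It isn't from the book; the 75%-heads threshold, the sample sizes, and the helper name proportion_heads are just illustrative choices.

```python
import random

def proportion_heads(n_flips: int) -> float:
    """Flip a fair coin n_flips times and return the fraction of heads."""
    return sum(random.random() < 0.5 for _ in range(n_flips)) / n_flips

random.seed(42)
trials = 10_000

# How often does a sample look "extreme" (at least 75% heads)?
for n in (4, 20, 1000):
    extreme = sum(proportion_heads(n) >= 0.75 for _ in range(trials)) / trials
    print(f"{n:>5} flips: share of samples with >=75% heads ~ {extreme:.3f}")
```

Run it and the pattern jumps out: roughly a third of the four-flip samples come out lopsided, while the thousand-flip samples essentially never stray that far from fifty-fifty.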
Dr. Esther Hattingh
This isn't just a quirky thing about coins, though. It shows up everywhere. In research, for example, people, sometimes even professionals, design studies with tiny sample sizes and then get excited about dramatic findings. But those results are often just noise. This is a big reason behind what's called the replication crisis, especially in psychology and medicine. We see a flashy result in a small study, but when someone tries to repeat it with a bigger group, poof, the effect disappears.
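To make the replication point concrete, here is a rough sketch of the same idea applied to a study, assuming the "treatment" genuinely does nothing. The group sizes, the 0.5-SD cutoff, and the helper name fake_study are assumptions for illustration, not anything from the book.

```python
import random
import statistics

def fake_study(n_per_group: int) -> float:
    """Simulate a study where the treatment truly does nothing and
    return the apparent difference between group means (in SD units)."""
    control = [random.gauss(0, 1) for _ in range(n_per_group)]
    treatment = [random.gauss(0, 1) for _ in range(n_per_group)]  # same distribution
    return statistics.mean(treatment) - statistics.mean(control)

random.seed(0)
studies = 5_000

# How often does pure noise look like a "dramatic" finding (> 0.5 SD)?
for n in (10, 50, 500):
    dramatic = sum(abs(fake_study(n)) > 0.5 for _ in range(studies)) / studies
    print(f"n = {n:>3} per group: share of 'large' phantom effects ~ {dramatic:.3f}")
```

With ten people per group, noise alone produces a sizeable-looking effect in roughly a quarter of the simulated studies; with five hundred per group, those phantom effects all but vanish.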
Dr. Esther Hattingh
I'll tell you, I've fallen for this myself. Early in my career, I ran a student survey with just a small group, maybe twenty or so. The results were all over the place, but I was so eager to find patterns that I started drawing big conclusions. Later, with a larger sample, those patterns just melted away. It was a humbling lesson in how our minds crave stories, even when the data is just random noise. So, if you ever find yourself wowed by a dramatic statistic from a tiny group, just pause and ask: is this really meaningful, or is it just the law of small numbers at work?
Chapter 2
Dr. Esther Hattingh
Now, let's shift gears a bit, though honestly, it's all connected. Another way our minds get tripped up is through something called the anchoring effect. This is one of Kahneman and Tversky's most famous discoveries. Basically, if you're exposed to a number, even a totally random one, it can sway your estimate of something completely unrelated. For example, if I ask, "Was Gandhi older or younger than 144 when he died?" (obviously, he wasn't 144!), just hearing that big number makes people guess higher than if I'd asked, "Was he older or younger than 35?"
Dr. Esther Hattingh
It's wild, right? And it's not just trivia. This anchor-and-adjust heuristic pops up everywhere. Think about financial decisions: how much to save for retirement, or what salary to ask for in a new job. If you start with a high anchor, you'll probably end up negotiating for more. Even in everyday life, like shopping, those "suggested retail prices" are anchors. They set a reference point, and we adjust from there, but never quite enough.
Dr. Esther Hattingh
I actually used this in my own work, designing donation forms for a university fundraising campaign. We found that if we suggested higher donation amounts as starting points, people tended to give more, even though they could choose any amount they wanted. It's not manipulation, exactly, but it's a reminder of how easily our minds latch onto the first number we see. So next time you're filling out a form or haggling over a price, just notice what anchors are being set for you. Sometimes, just being aware of the anchor can help you adjust a bit more thoughtfully.
Chapter 3
Dr. Esther Hattingh
And speaking of what's top of mind, let's talk about the availability heuristic. This one is all about how easily something comes to mind, and how that shapes our sense of how common or likely it is. If you can quickly think of examples, you'll probably assume it happens a lot, even if that's not true.
Dr. Esther Hattingh
The classic example is household chores. If you ask couples to estimate what percentage of the chores each of them does, their answers almost always add up to more than 100%. Why? Because it's so much easier to remember the times you did the dishes than the times your partner did. Or think about media coverage of rare diseases. If a disease gets a lot of attention, people start to overestimate how common or dangerous it is. Same with spiders: everyone knows about black widows and brown recluses, so we overestimate the risk, even though most spiders are harmless.
Dr. Esther Hattingh
I remember a few years ago, there was a big international news story here in South Africa, something dramatic, I won't get into the details. The next semester, my students were suddenly much more anxious about that particular risk, even though statistically, nothing had changed. It was just more available in their minds. That's the power of availability: it shapes our fears, our priorities, even our arguments about who does the laundry.
Dr. Esther Hattingh
So, as we wrap up, just remember: our minds are brilliant, but they're also full of shortcuts that can lead us astray. Whether it's trusting small samples, getting anchored by random numbers, or letting vivid memories shape our sense of reality, these heuristics are always at work. Next time, we'll dig even deeper into how we can spot these patterns and maybe, just maybe, outsmart them. Until then, keep thinking, fast and slow.
About the podcast
Daniel Kahneman's theory
Kahneman begins by explaining the purpose of his book: to provide people with a richer vocabulary for discussing and identifying errors in judgment. He briefly traces his professional interest in the psychology of judgment and decision-making, illustrated with examples of human intuition's successes and failures. Lastly, Kahneman offers a broad overview of Thinking, Fast and Slow, starting with the functions of two complementary "systems" of cognition and describing the heuristics, or rules of thumb, these systems depend on. In the "Origins" section of the introduction, Kahneman discusses his research and his late thought partner, Amos Tversky, at length. Tversky's contributions were central to Kahneman's work and success.