Thinking Fast and Slow


Heuristics and Human Error

This episode delves into Daniel Kahneman’s exploration of cognitive shortcuts in decision-making, focusing on how media, intuition, and logic interact to shape our judgments. Through the lenses of the availability and representativeness heuristics, as well as expert debates on risk, we’ll uncover why our minds so often steer us wrong—even when we think we know better.

Chapter 1

The Influence of Media and the Availability Heuristic

Dr. Esther Hattingh

Welcome back to Thinking Fast and Slow. I’m Dr. Esther Hattingh, and today we’re diving into a topic that, honestly, I see play out all the time—how the media shapes our sense of risk, and why our minds are so easily tricked by what’s vivid or dramatic. If you’ve listened to our last episode, you’ll remember we talked about the availability heuristic—how our judgments are swayed by what comes easily to mind. Well, today we’re going to dig even deeper into that, especially how media coverage can make rare events feel much more common than they actually are.

Dr. Esther Hattingh

Kahneman points out that the news is, by its very nature, biased toward novelty and poignancy. So, you’re far more likely to hear about a plane crash or a violent crime than, say, someone quietly managing their diabetes. And because those stories are so vivid, they stick in our minds. The result? We end up overestimating the likelihood of dramatic, rare events, and underestimating the risks of things that are actually much more common but less sensational. I mean, how often do you see headlines about someone dying from asthma compared to, say, a shark attack? Yet, statistically, asthma is far more dangerous.

Dr. Esther Hattingh

This is what Kahneman, borrowing a term from Timur Kuran and Cass Sunstein, calls an “availability cascade.” It’s like a feedback loop—media coverage makes an event feel more common, which makes people talk about it more, which leads to even more coverage. Before you know it, public perception and even policy decisions are being driven by these amplified fears. I remember, actually, a few years ago when I was working on a university safety curriculum. There was this headline about a very rare but dramatic incident on a campus overseas. Suddenly, everyone on my team was convinced we needed to overhaul our entire safety plan to address this one risk. We spent weeks on it, only to realize later that we’d neglected far more likely, but less flashy, dangers—like, you know, basic fire safety or student health emergencies. It’s a classic case of the availability heuristic in action, and it’s something I still have to watch out for in my own work.

Dr. Esther Hattingh

So, as we move forward, keep in mind how easily our sense of risk can be hijacked by what’s most memorable, not what’s most probable. And, honestly, it’s not just us as individuals—this shapes entire public debates and policies. But let’s shift gears a bit, because there’s another mental shortcut that trips us up, especially when we’re trying to judge people or situations based on stereotypes. That’s the representativeness heuristic, and it’s got its own set of pitfalls.

Chapter 2

Representativeness Heuristic and the Case of Tom W

Dr. Esther Hattingh

Now, the representativeness heuristic—this one’s a bit sneaky. It’s our tendency to judge the likelihood of something based on how much it seems to fit a certain stereotype, rather than looking at the actual statistics. Kahneman and Tversky’s Tom W experiment is a perfect example. They gave people a short description of Tom—he’s introverted, likes science fiction, and is, well, a bit of a nerd, if I can put it that way. Then they asked, what’s Tom’s field of study? Most people guessed computer science, because, you know, it just fits the stereotype. But here’s the catch: computer science students are actually a tiny fraction of all graduate students. Statistically, it’s much more likely Tom is in a more common field, like humanities or social sciences.

Dr. Esther Hattingh

This is where we fall into the trap of ignoring base rates—the actual probabilities. We get so caught up in the story, in how “representative” Tom seems, that we forget to ask, “Wait, how many computer science students are there, really?” I see this all the time in education, too. We make assumptions about students based on a few traits, and forget to look at the bigger picture. And I’ll admit, I’ve done it myself—jumped to conclusions because someone “seemed” like they’d fit a certain mold.
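
[Editor’s note: to make the base-rate logic concrete, here is a minimal Python sketch. The base rates and “fit” probabilities are invented for illustration; they are not Kahneman and Tversky’s actual figures. The sketch shows how Bayes’ rule combines a base rate with how well a description fits, and why a small base rate can outweigh a strong stereotype.]

```python
# Illustrative only: hypothetical numbers, not data from the Tom W study.
# Even if Tom's description fits a computer science student four times
# better, the low base rate can make humanities the likelier field.

base_rate = {"computer science": 0.03, "humanities": 0.20}  # assumed share of grad students
fit = {"computer science": 0.80, "humanities": 0.20}        # assumed P(description | field)

# Bayes' rule (unnormalized): P(field | description) is proportional
# to P(description | field) * P(field)
posterior = {field: fit[field] * base_rate[field] for field in base_rate}
total = sum(posterior.values())

for field, p in posterior.items():
    print(f"{field}: {p / total:.0%}")
# computer science: 38%
# humanities: 62%
```

Run it and the stereotype-friendly answer loses: the base rate quietly dominates the vivid description.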

Dr. Esther Hattingh

So, what can we do about it? Well, one strategy I’ve found helpful, both for myself and for learners, is to pause and ask, “What’s the base rate here?” In other words, what are the actual odds, regardless of how much the story fits? It’s not always easy—our brains love a good narrative—but just being aware of this bias can help us make better decisions. And, honestly, sometimes it’s about slowing down and letting System 2 thinking kick in, like we discussed in earlier episodes. If you’re an educator, you can build this into your lessons—ask students to look up the real numbers before making a judgment, or have them challenge their own assumptions. It’s a small step, but it can make a big difference.

Dr. Esther Hattingh

Alright, so we’ve seen how both the availability and representativeness heuristics can lead us astray. But there’s one more mental shortcut that really throws people for a loop, and that’s the conjunction fallacy. And, oh, the Linda problem—this one’s a classic.

Chapter 3

The Linda Problem and the Conjunction Fallacy

Dr. Esther Hattingh

Let’s talk about Linda. If you’ve ever taken a psychology class, you’ve probably heard this one. Kahneman and Tversky described Linda as a thirty-one-year-old, single, outspoken, and very bright woman who majored in philosophy and was deeply concerned with social justice. Then they asked: is it more likely that Linda is a bank teller, or that she’s a bank teller and active in the feminist movement? And, overwhelmingly, people chose the second option. It just feels more plausible, right? But here’s the thing—it’s actually less probable. That’s the conjunction fallacy in action.

Dr. Esther Hattingh

The mistake here is conflating plausibility with probability. Just because “bank teller and feminist” sounds like a better story, it can’t be more likely than “bank teller” alone. If there are 80 bank tellers in Linda’s city, there can’t be more than 80 feminist bank tellers. It’s a simple numbers game, but our intuition just doesn’t want to accept it. I’ve seen this play out in my own classes—when I poll students, most of them fall for the conjunction fallacy, even after we’ve talked about probability. It’s only when I show them a visual—like a Venn diagram or a simple chart—that the logic really clicks. Suddenly, they see that the subset can’t be bigger than the whole.
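
[Editor’s note: if a visual helps, so can a tiny simulation. This Python sketch uses invented percentages, purely for illustration; the point is the inequality itself, which holds no matter what rates you plug in.]

```python
# Toy simulation with invented rates; the lesson is the inequality,
# not the specific numbers: P(A and B) can never exceed P(A).
import random

random.seed(0)
tellers = 0
feminist_tellers = 0

for _ in range(100_000):
    is_teller = random.random() < 0.01     # assumed: 1% are bank tellers
    is_feminist = random.random() < 0.30   # assumed: 30% are active feminists
    if is_teller:
        tellers += 1
        if is_feminist:
            feminist_tellers += 1

print(f"bank tellers:          {tellers}")
print(f"feminist bank tellers: {feminist_tellers}")
assert feminist_tellers <= tellers  # the conjunction is always a subset
```

The assertion at the end can never fail, because the conjunction counts a strict subset of the single category; that is the whole of the conjunction rule.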

Dr. Esther Hattingh

Now, this isn’t just an academic exercise. There’s a real debate about how we should handle these kinds of errors, especially when it comes to public policy and risk. Kahneman brings up Cass Sunstein and Paul Slovic—two experts with almost opposite views. Sunstein says, look, experts should stick to the data and not get swayed by public emotion. Slovic, on the other hand, argues that public fears are a kind of suffering in themselves, and we should take them seriously, even if they’re not always rational. Kahneman, ever the diplomat, doesn’t pick a side, but it’s a fascinating debate. Should we trust the experts, or should we listen to the public’s concerns, even when they’re shaped by these mental shortcuts?

Dr. Esther Hattingh

So, as we wrap up, I want you to think about how these heuristics—the availability heuristic, representativeness, and the conjunction fallacy—shape not just our personal decisions, but our collective ones, too. Next time you see a dramatic headline or hear a story that just “feels right,” pause for a moment. Ask yourself: is this really likely, or just memorable? And if you’re teaching or learning, try to make these biases visible—sometimes a simple visual or a quick check of the numbers can make all the difference. Thanks for joining me today, and I hope you’ll tune in next time as we keep exploring the quirks and wonders of the human mind.