This episode explores Daniel Kahneman's account of cognitive shortcuts in decision-making, focusing on how media, intuition, and logic interact to shape our judgments. Through the lenses of the availability and representativeness heuristics, as well as expert debates on risk, we'll uncover why our minds so often steer us wrong, even when we think we know better.
Chapter 1
Dr. Esther Hattingh
Welcome back to Thinking Fast and Slow. I'm Dr. Esther Hattingh, and today we're diving into a topic that, honestly, I see play out all the time: how the media shapes our sense of risk, and why our minds are so easily tricked by what's vivid or dramatic. If you've listened to our last episode, you'll remember we talked about the availability heuristic, how our judgments are swayed by what comes easily to mind. Well, today we're going to dig even deeper into that, especially how media coverage can make rare events feel much more common than they actually are.
Dr. Esther Hattingh
Kahneman points out that the news is, by its very nature, biased toward novelty and poignancy. So, you're far more likely to hear about a plane crash or a violent crime than, say, someone quietly managing their diabetes. And because those stories are so vivid, they stick in our minds. The result? We end up overestimating the likelihood of dramatic, rare events, and underestimating the risks of things that are actually much more common but less sensational. I mean, how often do you see headlines about someone dying from asthma compared to, say, a shark attack? Yet, statistically, asthma is far more dangerous.
Dr. Esther Hattingh
This is what Kahneman calls the "availability cascade." It's like a feedback loop: media coverage makes an event feel more common, which makes people talk about it more, which leads to even more coverage. Before you know it, public perception and even policy decisions are being driven by these amplified fears. I remember, actually, a few years ago when I was working on a university safety curriculum. There was this headline about a very rare but dramatic incident on a campus overseas. Suddenly, everyone on my team was convinced we needed to overhaul our entire safety plan to address this one risk. We spent weeks on it, only to realize later that we'd neglected far more likely, but less flashy, dangers, like, you know, basic fire safety or student health emergencies. It's a classic case of the availability heuristic in action, and it's something I still have to watch out for in my own work.
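To make that feedback loop concrete, here is a toy sketch in Python. The numbers and the saturating update rule are purely illustrative assumptions, not anything from the book, but they show how a little coverage and a little public concern can amplify each other until both max out.

def simulate_cascade(coverage=0.05, concern=0.0, gain=0.6, steps=8):
    # Each round, vivid coverage raises public concern, and heightened concern
    # attracts more coverage; both are capped at 1.0 so the loop saturates.
    history = []
    for _ in range(steps):
        concern = min(1.0, concern + gain * coverage)
        coverage = min(1.0, coverage + gain * concern)
        history.append((round(coverage, 2), round(concern, 2)))
    return history

print(simulate_cascade())  # both values climb from near zero toward 1.0 within a few steps

The point of the sketch is only that the loop is self-reinforcing: once coverage and concern start feeding each other, the perceived frequency of the event is driven by the loop, not by how often the event actually happens.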
Dr. Esther Hattingh
So, as we move forward, keep in mind how easily our sense of risk can be hijacked by what's most memorable, not what's most probable. And, honestly, it's not just us as individuals; this shapes entire public debates and policies. But let's shift gears a bit, because there's another mental shortcut that trips us up, especially when we're trying to judge people or situations based on stereotypes. That's the representativeness heuristic, and it's got its own set of pitfalls.
Chapter 2
Dr. Esther Hattingh
Now, the representativeness heuristic: this one's a bit sneaky. It's our tendency to judge the likelihood of something based on how much it seems to fit a certain stereotype, rather than looking at the actual statistics. Kahneman and Tversky's Tom W experiment is a perfect example. They gave people a short description of Tom: he's introverted, likes science fiction, and is, well, a bit of a nerd, if I can put it that way. Then they asked, what's Tom's field of study? Most people guessed computer science, because, you know, it just fits the stereotype. But here's the catch: computer science students are actually a tiny fraction of all graduate students. Statistically, it's much more likely Tom is in a more common field, like humanities or social sciences.
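To see what the base rate does to the numbers, here is a minimal Bayes'-rule sketch in Python. The 3% base rate for computer science and the likelihoods of the "nerdy" description are illustrative assumptions, not figures from Kahneman and Tversky's study.

def posterior(base_rate, p_desc_given_cs, p_desc_given_other):
    # P(computer science | description) via Bayes' rule
    p_desc = base_rate * p_desc_given_cs + (1 - base_rate) * p_desc_given_other
    return base_rate * p_desc_given_cs / p_desc

# Even if the description is four times as typical of CS students as of everyone else...
print(round(posterior(0.03, 0.8, 0.2), 2))  # ~0.11
# ...the low base rate still dominates: Tom is probably not in computer science.

The stereotype moves the odds, but nowhere near enough to overcome how few computer science students there are in the first place.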
Dr. Esther Hattingh
This is where we fall into the trap of ignoring base rates, the actual probabilities. We get so caught up in the story, in how "representative" Tom seems, that we forget to ask, "Wait, how many computer science students are there, really?" I see this all the time in education, too. We make assumptions about students based on a few traits, and forget to look at the bigger picture. And I'll admit, I've done it myself: jumped to conclusions because someone "seemed" like they'd fit a certain mold.
Dr. Esther Hattingh
So, what can we do about it? Well, one strategy I've found helpful, both for myself and for learners, is to pause and ask, "What's the base rate here?" In other words, what are the actual odds, regardless of how much the story fits? It's not always easy; our brains love a good narrative. But just being aware of this bias can help us make better decisions. And, honestly, sometimes it's about slowing down and letting System 2 thinking kick in, like we discussed in earlier episodes. If you're an educator, you can build this into your lessons: ask students to look up the real numbers before making a judgment, or have them challenge their own assumptions. It's a small step, but it can make a big difference.
Dr. Esther Hattingh
Alright, so we've seen how both the availability and representativeness heuristics can lead us astray. But there's one more mental shortcut that really throws people for a loop, and that's the conjunction fallacy. And, oh, the Linda problem: this one's a classic.
Chapter 3
Dr. Esther Hattingh
Let's talk about Linda. If you've ever taken a psychology class, you've probably heard this one. Kahneman and Tversky described Linda as a thirty-one-year-old, single, outspoken, and very bright woman who majored in philosophy and was deeply concerned with social justice. Then they asked: is it more likely that Linda is a bank teller, or that she's a bank teller and active in the feminist movement? And, overwhelmingly, people chose the second option. It just feels more plausible, right? But here's the thing: it's actually less probable. That's the conjunction fallacy in action.
Dr. Esther Hattingh
The mistake here is conflating plausibility with probability. "Bank teller and feminist" may sound like a better story, but it can't be more likely than "bank teller" alone. If there are 80 bank tellers in Linda's city, there can't be more than 80 feminist bank tellers. It's a simple numbers game, but our intuition just doesn't want to accept it. I've seen this play out in my own classes: when I poll students, most of them fall for the conjunction fallacy, even after we've talked about probability. It's only when I show them a visual, like a Venn diagram or a simple chart, that the logic really clicks. Suddenly, they see that the subset can't be bigger than the whole.
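The subset-can't-exceed-the-whole point can also be checked in a few lines of Python. The population size and trait frequencies below are made up for illustration; the inequality at the end holds no matter what numbers you pick.

import random

random.seed(0)

# Each simulated resident gets two traits; they are drawn independently here
# only for simplicity. The counting argument does not depend on that choice.
people = [(random.random() < 0.02, random.random() < 0.30) for _ in range(10_000)]

bank_tellers = sum(1 for teller, _ in people if teller)
feminist_bank_tellers = sum(1 for teller, feminist in people if teller and feminist)

print(bank_tellers, feminist_bank_tellers)
assert feminist_bank_tellers <= bank_tellers  # P(A and B) can never exceed P(A)

However you generate the population, the people who are both bank tellers and feminists are always a subset of the bank tellers, which is exactly the Venn-diagram picture that makes the logic click for students.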
Dr. Esther Hattingh
Now, this isn't just an academic exercise. There's a real debate about how we should handle these kinds of errors, especially when it comes to public policy and risk. Kahneman brings up Cass Sunstein and Paul Slovic, two experts with almost opposite views. Sunstein says, look, experts should stick to the data and not get swayed by public emotion. Slovic, on the other hand, argues that public fears are a kind of suffering in themselves, and we should take them seriously, even if they're not always rational. Kahneman, ever the diplomat, doesn't pick a side, but it's a fascinating debate. Should we trust the experts, or should we listen to the public's concerns, even when they're shaped by these mental shortcuts?
Dr. Esther Hattingh
So, as we wrap up, I want you to think about how the availability heuristic, representativeness, and the conjunction fallacy shape not just our personal decisions, but our collective ones, too. Next time you see a dramatic headline or hear a story that just "feels right," pause for a moment. Ask yourself: is this really likely, or just memorable? And if you're teaching or learning, try to make these biases visible; sometimes a simple visual or a quick check of the numbers can make all the difference. Thanks for joining me today, and I hope you'll tune in next time as we keep exploring the quirks and wonders of the human mind.
About the podcast
Daniel Kahneman's theory
Kahneman begins by explaining the purpose of his book: to provide people with a richer vocabulary for discussing and identifying errors in judgment. He briefly traces his professional interest in the psychology of judgment and decision-making, illustrated with examples of human intuition's successes and failures. Lastly, Kahneman offers a broad overview of Thinking, Fast and Slow, starting with the functions of two complementary "systems" of cognition and describing the heuristics, or rules of thumb, these systems depend on. In the "Origins" section of the introduction, Kahneman discusses his research and his late thought partner, Amos Tversky, at length. Tversky's contributions were central to Kahneman's work and success.