The availability heuristic describes our tendency to judge emotionally salient or memorable events as more probable than less vivid ones, even when the less vivid events are actually statistically more likely. A classic example is the estimated risk of dying in a terrorist attack: even though the odds of dying this way are exceedingly low (in most countries, at least), we spend proportionally far more resources fighting terror than on more prosaic dangers like automobile accidents or poverty. There may be other valid reasons for the disproportion: terrorism is part of a web of foreign-policy issues we need to focus on for longer-term benefits; people don't want to sacrifice the freedoms (like more restrictive speed limits) that would be necessary to make cars safer; and it's not very clear how to solve some problems (like poverty) at all. I really don't want to get into those debates. The point is just that, in all of those cases, I think nearly everyone would agree that at least part of the reason for the disproportionate attention is that dying in a terrorist attack is much more vivid and sensational than dying an early death from the accumulated woes of living in poverty. And there's plenty of actual research showing that the availability heuristic plays a role in many aspects of prediction.
There's been a lot of debate about whether this heuristic is necessarily irrational. Evolutionarily speaking, it may make a lot of sense to pay more attention to salient information. To borrow an example from Gerd Gigerenzer: if you live on the banks of a river and there have been no crocodile sightings for 1,000 days, but there was one yesterday, you'd be well-advised to disregard the "overall statistics" and keep your kids from playing near the river today. It's a bit of a just-so story, but a sensible one, and it suggests two possible morals: (a) as Steven Pinker pointed out, since events have causal structure, it can make sense to weight recent events (which tend to be more salient) more heavily; and (b) it can also make sense to pay more attention to emotionally vivid events, which give a good indication of the "costs" of being wrong.
However, I think the problem is that when our information comes from mass media, neither of these reasons applies as well. Why? If your information doesn't come from mass media, you can assume, to a good approximation, that the events you hear about are statistically representative of the events you're likely to encounter. If your information comes from mass media, you cannot assume this. Mass media reports events from all over the world in such a way that they can have the same vividness and impact as if they had happened in the next town over. And while it might be rational to worry a lot about crime if there are consistently shootings in your neighborhood, it makes much less sense to worry because there are multiple shootings in cities hundreds of miles away. Similarly, because mass media reports on news (i.e., statistically rare occurrences), it is easy to come away with the dual impression that (a) rare events are less rare than they actually are, and (b) there is a "recent trend" that needs to be paid attention to.
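That sampling asymmetry can be sketched as a toy simulation. Everything here is invented for illustration: the event names, the daily risk rates, and the "coverage" probabilities (how likely each event is to be reported) are made-up numbers, not real statistics. The point is only structural: if dramatic rare events are almost always reported and mundane common ones almost never are, counting what you see in the news makes the rare event look about as frequent as the common one.

```python
import random

# Hypothetical true daily probabilities of witnessing each event.
# These numbers are invented; only their relative sizes matter.
TRUE_RATES = {"car accident": 0.010, "shark attack": 0.0001}

# Hypothetical media coverage: the rare, dramatic event is almost
# always reported; the mundane one almost never is.
COVERAGE = {"car accident": 0.01, "shark attack": 1.0}

def personal_sample(n_days, rng):
    """Events experienced directly: counts track the true rates."""
    counts = {event: 0 for event in TRUE_RATES}
    for _ in range(n_days):
        for event, p in TRUE_RATES.items():
            if rng.random() < p:
                counts[event] += 1
    return counts

def media_sample(n_days, rng):
    """Events seen via media: an event is counted only if it both
    occurs AND gets reported, so the sample is skewed by coverage."""
    counts = {event: 0 for event in TRUE_RATES}
    for _ in range(n_days):
        for event, p in TRUE_RATES.items():
            if rng.random() < p and rng.random() < COVERAGE[event]:
                counts[event] += 1
    return counts

if __name__ == "__main__":
    rng = random.Random(0)
    print("direct experience:", personal_sample(100_000, rng))
    print("via media:        ", media_sample(100_000, rng))
```

With these made-up numbers, direct experience yields roughly 100 car accidents per shark attack, matching the true rates, while the media sample yields roughly equal counts of each: a naive observer estimating frequencies from news reports would conclude the two risks are comparable.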
In other words, while it might be rational to keep your kids in if there were crocodile attacks at the nearby river yesterday, it's pretty irrational to keep them in if there were attacks at a river a hundred miles away. Our "thinking" brains know this, but if we see those attacks as rapidly and as vividly as if they had happened right here (i.e., if we watch them on the nightly news), then it's very hard to listen to the thinking brain, even if you know about the dangers. And cable TV news, with its constant repetition, makes this even harder.
The root of the problem is the sampling structure of mass media, but it's of course far worse when the medium also makes the message more emotional and vivid.