Comment: Our flawed perceptions of risk

By Katherine Fox-Glassman

Cracked road after an earthquake. ©Tom Wang/Shutterstock.

Growing up in Southern California, I experienced earthquake drills in elementary school as routinely as fire alarm tests. Although my own neighborhood was never severely damaged by an earthquake, it was shaken — regularly — and news reports of pancaked buildings and crumbled overpasses mentioned familiar cities and freeways. There was no question in my mind about the damage earthquakes could do, and yet I was never worried.

Instead, it was my family members in Nebraska and Minnesota, my friends on the East Coast, and my international college classmates who worried for me: “Earthquakes? That must be so scary to live with!” I told them it really wasn’t that bad — that only the worst of the damage makes the nightly news. But I had seen all the same footage they had, so why wasn’t I as worried? They hadn’t been awakened by their parents in the middle of the night and pulled — with bed sheets still clutched around them — into the relative safety of a door frame, so why were they the ones who seemed more concerned about the risk?

In the time since I left the earthquake-prone West Coast for the other side of the country, which more often faces hurricanes and “snowmageddons,” cognitive scientists have found a potential answer to the paradox that governs humans’ perceptions of risk. It seems to boil down to two ways by which we can learn about the probabilities of rare events: verbal descriptions and personal experiences.

Described probabilities are the ones you see in your weather app (which might forecast a “20-percent chance of rain,” for example) or read in the fine print on a raffle ticket (which will tell you of your “1-in-500 chance of winning”). For decades now, psychologists have catalogued the ways that people fail to use described probabilities correctly. Sometimes we misuse or misunderstand them, like the angry viewer who is caught in a downpour and wants the meteorologist to be fired because she only forecast a 10-percent chance of rain. Other times we let our desires cloud our interpretation. Perhaps you’ve been the person who has bought a lottery ticket with a 1-in-100,000 chance of winning, but who didn’t buckle his or her seat belt on the way to the store because the odds of crashing seem low, when in fact they are 1 in 10,000. Sometimes, like Han Solo rebuffing C-3PO in “Star Wars,” we simply ignore probabilities entirely: “Never tell me the odds,” Han says, after being informed of the slim chances of successfully navigating an asteroid field.

Even when we do pay attention to described probabilities, there’s another way we tend to distort them. In their Nobel-Prize-winning 1979 work on prospect theory, Daniel Kahneman and Amos Tversky noted the ways in which humans assign extra weight to low probabilities. Give someone a gamble with a 2-percent chance of winning, and he’s likely to treat it the way a computer (or a perfectly rational human — an economist’s Holy Grail) would treat 7-percent, or maybe 10-percent, odds. So when you explain to New Yorkers that really bad earthquakes in California are rare, with perhaps a 3-percent chance in a given year, they overinterpret that, suspecting that serious quakes are much more likely than is implied by those odds. For two decades, research has piled up to support this over-weighting as a basic human cognitive tendency.
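To make that over-weighting concrete: Tversky and Kahneman's later (1992) work fit a probability-weighting function to behavior like this, with a curvature parameter of about 0.61 for gains. The sketch below is just an illustration of that published formula, not code from any of the studies discussed here; the function name is mine.

```python
def weight(p, gamma=0.61):
    """Probability-weighting function from Tversky & Kahneman (1992).
    With gamma below 1, small probabilities are inflated and large
    ones deflated; gamma = 0.61 is their median estimate for gains."""
    return p**gamma / (p**gamma + (1 - p)**gamma)**(1 / gamma)

# A described 2-percent chance gets treated like roughly an 8-percent one:
print(round(weight(0.02), 3))   # ≈ 0.081
```

Plugging in the 2-percent gamble from above, the function returns a decision weight of about 0.08, right in the 7-to-10-percent range that behavior suggests.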

But over-weighting of described rare events only explains half of my earthquake paradox. What about people like me, who grow up actually experiencing rare events but seem less concerned about them? It turns out it’s not just the so-called devil-you-know effect, whereby familiar threats seem less dangerous, but another systematic human reaction to probability. In a 2004 Psychological Science study, a team of psychologists (Ralph Hertwig, Greg Barron, Elke Weber and Ido Erev) discovered that even as people over-weight probabilities they’ve only heard or read about, they do the exact opposite for the probabilities they’ve learned about through personal experience.

Here’s an example: If I show you two decks of cards and tell you that pulling a random card from Deck A is guaranteed to lose you $3, while Deck B gives you a 10-percent chance of losing $32, which would you go for? If I only describe the odds, most people will stick with the sure loss from Deck A in this case, since they are over-weighting that 10-percent chance of losing big in Deck B. But, if I handed you both decks and had you test each of them out — allowing you to pull cards for free until you have a sense of how “good” or “bad” each deck is — then you’re likely to opt for the riskier Deck B. After getting experience with the probabilities in each deck, you’ll actually tend to under-weight the 10-percent chance of losing in Deck B. Experience, it seems, makes us under-weight rare events, even when we correctly estimate the probabilities.
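The arithmetic behind the two decks can be sketched in a few lines of Python. Note that the deck composition and the number of free sample draws are my own illustrative assumptions (the original study used repeated sampling, but not necessarily these numbers):

```python
import random

# Deck A: every card loses $3.  Deck B: 10% of cards lose $32, the rest $0.
ev_a = -3.0
ev_b = 0.10 * -32 + 0.90 * 0    # = -3.20, slightly worse than the sure loss

# Simulate a handful of free sample draws from Deck B:
random.seed(7)
def sample_deck_b(n_draws=10):
    return [-32 if random.random() < 0.10 else 0 for _ in range(n_draws)]

# Chance of never seeing the $32 loss in 10 free draws:
p_no_loss = 0.90 ** 10          # ≈ 0.35
```

Under these assumptions, roughly a third of samplers who take ten free draws from Deck B never encounter the big loss at all, one intuitive reason experience can leave a rare event feeling less likely than its true odds.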

What does this mean for earthquakes and other rare but inevitable natural hazards? If people have years of experience to draw on as they compute an intuitive probability of the next big disaster, chances are that they’ll be under-weighting those odds, and therefore under-preparing for the event. And although telling people the described, numeric probability does in general lead them to over-weight rare events, there’s evidence that people who have both experience and a description of the same rare event will tend to base their decisions on their experience — they’ll still under-weight.

Many researchers, those in my lab among them, are working now to better understand how the effects of described versus experienced probabilities play out in complex real-world settings. It may be that in dynamic hazard-preparedness situations, other psychological effects take over and wash out these subtle effects we see in laboratory card-game gambles. But while we work on those and other answers, just keep in mind where your probability information is coming from, and ask yourself whether your own experience might be making you feel a little bit safer than you actually are.


Fox-Glassman is a graduate student in psychology at Columbia University who is studying the interplay of risk perception, decision-making and memory. The views expressed are her own. She can be reached at: kjt2111@columbia.edu.

Monday, March 23, 2015 - 06:00
