By Catherine A. Sanderson, Amherst College
The best way to understand one of the errors we make in problem-solving is by answering a simple question: Which of the following causes the most annual deaths—sharks, bees, alligators, dogs, bears, or cows? If you guessed bees, you are right. Most of us, however, guess sharks because of what is called an ‘availability bias’: our tendency to estimate the likelihood of an event based on the ease with which instances of it come to mind.

Availability Bias
Under the influence of the availability bias, people solving problems often mistakenly rely on vivid or salient information to estimate how likely something is to occur, instead of using the actual numerical likelihood. When someone dies of a shark attack, the death is often heavily covered in the media, and that coverage makes sharks seem like a more prevalent cause of death than they actually are.
Similarly, school shootings receive far more news coverage than deaths by suicide, car accidents, or alcohol. In reality, diseases such as tuberculosis, asthma, and diabetes cause about 16 times as many deaths worldwide as all accidents combined, including drownings, car accidents, and plane crashes. And yet, there is far more coverage of deaths caused by accidents than of those caused by diseases.
Researchers in one study compared the number of deaths caused by car accidents in the months before the 9/11 terrorist attacks and the months after. As the availability bias predicts, the fate of the hijacked flights remained highly salient in the months after the attacks, leading people to believe that flying was riskier than driving. As a result, more people opted to drive. But the resulting increase in car traffic led to a corresponding increase in car-accident deaths, while people who chose to fly were actually safer.
This article comes directly from content in the video series Introduction to Psychology. Watch it now, on Wondrium.
Relying on Stereotypes for Problem-solving
A related bias that can lead us to think something is more likely than it actually is involves stereotypes. Known as representativeness, this bias describes a tendency to assume that someone is a member of a certain group if he or she fits our stereotype, or representation, of a group member, even if group membership is statistically unlikely.
Here’s an example: Sarah loves to listen to New Age music and faithfully reads her horoscope each day. In her spare time, she enjoys aromatherapy and attending a local spiritual group. Based on this description, is Sarah more likely to be a schoolteacher or a holistic healer?
Most people would say she’s probably more likely to be a holistic healer, as this description of Sarah fits with our stereotypes about what a person with this type of job would be like. But in reality, it’s far more likely that Sarah is a schoolteacher, simply based on the statistical odds; there are many, many more teachers than holistic healers.
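The base-rate logic behind this example can be made concrete with a short Bayes'-rule calculation. The figures below—100 teachers per healer, and the share of each group fitting the description—are purely illustrative assumptions, not numbers from the article:

```python
# A minimal sketch of why base rates dominate, using Bayes' rule.
# All numbers are hypothetical, chosen only for illustration.

def posterior(prior, hit_rate, other_prior, other_hit_rate):
    """P(group | description) for one of two mutually exclusive groups."""
    evidence = prior * hit_rate + other_prior * other_hit_rate
    return prior * hit_rate / evidence

# Assume (hypothetically) 100 schoolteachers for every holistic healer,
# and that the description fits 2% of teachers but 90% of healers.
p_teacher = 100 / 101
p_healer = 1 / 101

p_teacher_given_desc = posterior(p_teacher, 0.02, p_healer, 0.90)
p_healer_given_desc = posterior(p_healer, 0.90, p_teacher, 0.02)

# Even though the description fits healers far better, the huge base
# rate of teachers means Sarah is still more likely to be a teacher.
print(round(p_teacher_given_desc, 2))  # → 0.69
```

Under these assumed numbers, the stereotype-friendly evidence is outweighed by the sheer number of teachers—the same point the article makes in words.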
Framing Options
Interestingly, it’s not only our images of people or products that can lead us to make errors in thinking. In some cases, simply the wording used can influence our evaluation.
Would you prefer a medical procedure that has a 90% success rate, or one with a 10% failure rate? Would you buy a yogurt that is 95% fat-free, or one that is 5% fat?
These examples describe exactly the same thing, but the options are presented, or framed, in different ways. And yet, even this subtle framing can have a substantial impact on how we think about these choices: people see a medical procedure with a 10% failure rate as riskier than one with a 90% success rate.
The Left-digit Bias
Another shortcut, closely related to framing, is the left-digit bias: the tendency to focus on the left-most digit of a number and to pay less attention to the other digits.

This bias explains why prices of goods of all types so often end in 99 instead of 00. A car that costs $14,999 is, of course, virtually the same price as one that costs $15,000, but somehow a car with a price starting with 14 seems to cost less than one with a price starting with 15.
Life-threatening Consequences
And left-digit bias can have more serious, even life-threatening, consequences. In one study published in 2020 in the New England Journal of Medicine, researchers compared treatment decisions made by doctors for heart attack patients who were 79 but approaching 80 versus those who had recently turned 80. Surprisingly, doctors were significantly more likely to perform coronary artery bypass surgery on the younger patients.
The doctors apparently saw these patients as ‘in their 70s’ and thus better able to benefit from surgery than patients they saw as ‘in their 80s’. And here’s the most important finding from this study—patients who had the surgery were less likely to die in the next 30 days. So, at least in this case, left-digit bias by doctors led to meaningful differences in survival for their patients.
In conclusion, we clearly do rely on mental shortcuts in making decisions. And yet, even the most carefully thought-out decision or solution may end up reflecting a bias, a stereotype, or the skewed way in which we sometimes perceive things.
Common Questions about Problem-Solving and Mental Shortcuts
Q: What happens when we have an availability bias while problem-solving?
Under the influence of the availability bias, people often mistakenly rely on vivid or salient information to estimate how likely something is to occur, instead of using the actual numerical likelihood.
Q: Which bias in problem-solving involves stereotypes?
The representativeness bias describes a tendency to assume that someone is a member of a certain group if he or she fits our stereotype, or representation, of a group member, even if group membership is statistically unlikely.
Q: How does the framing of options affect our thinking?
Even subtle framing of options can have a substantial impact on how we think about choices. People see a medical procedure with a 10% failure rate as riskier than one with a 90% success rate.