I found an interesting post about cognitive shortcuts in medicine. I have a minor objection to the title of the post, which is “Cognitive Errors.”
Cognitive shortcuts, known as heuristics, are examples of fast, instinctive thinking (System 1), and they often lead to error. In some cases, however, they can be useful because they are efficient and time saving. There is an upside as well as a downside to System 1 thinking in medicine.
Let’s go down the
list. I’ve skipped some of them.
The first example given is affective error. This refers to an emotional response overriding objectivity.
Next is aggregate bias. I struggle with this one. The author says that aggregate bias is the belief that data in the aggregate don’t apply to the patient in front of you. My understanding (maybe I’m wrong) is that aggregate bias, otherwise known as the ecological fallacy, is the opposite: the inappropriate application of population data to an individual. It has more to do with treatment decisions than diagnostic error. Remember, one of the first principles of evidence-based medicine is that clinical decision making starts with the unique aspects of the individual patient. After looking at a variety of references, it would appear that both definitions have been used. Most medical references define aggregate bias the way the blog author does; those outside of medicine define it as inappropriate extrapolation.
The ambiguity effect is really a bias against ambiguity: we tend to stick with things we are more familiar with, which may cause us to ignore other possibilities and take too narrow a view. As originally conceived it had to do with probability. That is, people tend to gravitate toward choices in which the probability is known or explicitly stated. Of note, the ambiguity effect was first described by Daniel Ellsberg.
The anchoring heuristic is one of the better known cognitive biases. It refers to the tendency to stick with one’s initial hunch despite new evidence to the contrary; you may be so invested in that first impression that you ignore new information. Confirmation bias and diagnostic momentum are related concepts.
Ascertainment bias, as the author points out, is an umbrella category encompassing a lot of stereotypes and biases. In essence it’s just, well, bias. It’s not very useful as a unique category in discussions of cognitive error.
Availability bias is one of the better known cognitive shortcuts. It refers to the influence of prior experience, which biases you toward the first thing that comes to mind. For example, if you’ve been burned by missing a case of aortic dissection, you may be overly concerned about aortic dissection in every future case of chest pain. The flip side is that you may fail to consider things you haven’t seen in a long time.
Base rate neglect is a cognitive shortcut that may be considered harmful and wasteful in ambulatory medicine but may be your friend in the arena of hospital and emergency medicine. It’s a failure to consider the true prevalence of diseases in clinical reasoning, ignoring the old aphorism “common things happen most often.” In the high acuity world of the hospital, where you really need to be risk-averse, base rate neglect may be beneficial. Put another way, you and your patient may be better off if you consider the worst case scenario.
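To see why the base rate matters so much, here is a minimal sketch of Bayes’ rule using hypothetical numbers of my own (not from the post): a “90% accurate” test for a disease with 1% prevalence yields a surprisingly low positive predictive value.

```python
# Bayes' rule with hypothetical numbers: how a low base rate
# drags down the predictive value of a positive test.
prevalence = 0.01   # base rate: 1% of this population has the disease
sensitivity = 0.90  # P(test positive | disease)
specificity = 0.90  # P(test negative | no disease)

# Total probability of a positive test, sick or not.
p_positive = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)

# Posterior probability of disease given a positive test.
ppv = sensitivity * prevalence / p_positive

print(f"Positive predictive value: {ppv:.1%}")  # about 8.3%
```

Neglect the 1% base rate and the positive result looks far more conclusive than it is, which is the wasteful side of this shortcut in low prevalence ambulatory settings.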
Then there’s belief bias. I’m not sure this belongs in a discussion of diagnostic shortcuts, as it has more to do with treatment recommendations. I cringe when I hear somebody say they “believe” in a particular treatment, implying that belief surpasses reasoning from evidence.
Blind spot bias is similar to the Dunning-Kruger effect, in which we think we’re smarter than we really are. Humility is the remedy here. Does this lead to a form of cognitive shortcut? Maybe, in that we fail to pause and consider carefully that we might be wrong.
Confirmation bias is akin to anchoring. It is the tendency to be selective about the accumulating evidence you weigh; that is, you attend mainly to evidence that supports your original hunch.
The framing heuristic is another well known shortcut. We are biased toward diagnostic possibilities according to the way the initial presentation is framed. Though it can be useful, it restricts the differential diagnosis in a way that excludes a wide range of possibilities. Not every returning traveler with fever has a parasite, for example.
The gambler’s fallacy, according to the blog author, is “the erroneous belief that chance is self-correcting.” This is a cognitive error that tends in the opposite direction to the availability heuristic.
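Since the fallacy is a claim about probability, it can be checked directly. Here is a quick simulation (my illustration, not from the post) showing that a fair coin has no memory: the chance of heads is still about 50% even immediately after a run of five heads.

```python
import random

# Gambler's fallacy check: estimate P(heads | previous 5 flips were heads).
# If chance were "self-correcting," this would come out below 0.5.
random.seed(0)
flips = [random.random() < 0.5 for _ in range(1_000_000)]

streaks = next_heads = 0
for i in range(5, len(flips)):
    if all(flips[i - 5:i]):     # previous five flips were all heads
        streaks += 1
        next_heads += flips[i]  # count whether the streak continues

print(f"P(heads after 5 heads) = {next_heads / streaks:.3f}")  # close to 0.5
```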
The order effect is something I was vaguely aware of but had not considered as a category of cognitive error. It refers to the tendency to focus on information that is proximate in time, at the expense of the totality of events over time. This typically occurs at the point of handoff in a patient who has had a very long hospital course.
Premature closure
is just what it says. It’s a tendency for thinking to stop once
a tentative diagnosis has been made. It overlaps with other
categories such as anchoring. There is probably a subtle difference
between premature closure and anchoring. Anchoring implies an
emotional attachment to a diagnosis whereas premature closure implies
diagnostic laziness.
Representativeness restraint is also known as the representativeness heuristic. It is a cognitive shortcut characterized by focusing too much on the prototypical manifestations of a disease, which may cause the clinician to miss atypical presentations.
Search
satisfaction is another example of laziness in clinical
reasoning. It’s a tendency to stop searching once an answer has
been found. The author gives the example of missing a second fracture
on an x-ray once the first one is identified.
The sunk cost fallacy is a type of emotional heuristic as well as diagnostic laziness. It is the tendency to ignore new information and fail to consider alternative diagnoses once the original diagnosis has been arrived at after a great deal of time, effort, and expense (the sunk cost).
Sutton’s slip might be the dark side of Sutton’s law (going where the money is): pursuing the obvious might lead to error because other possibilities are ignored.
Zebra retreat is the avoidance of rare diagnoses to a fault. It’s the opposite of base rate neglect.