Wednesday, June 15, 2022

Diagnostic time out

What is a diagnostic time out? Succinctly defined, it’s a deliberate exercise in differential diagnosis and systematic clinical reasoning in the care of an individual patient. But wait, I hear someone say… isn’t that what we do already? Well, no. We’re all familiar with the traditional model of clinical reasoning taught in medical school, but those of us in the real world of practice, if we’re honest, realize that it seldom happens. There’s just not enough time when you’re forced to see too many patients each day, and hospitalist incentives, with their emphasis on speed and quick adoption of specific diagnostic labels, run in opposition. What do we as hospitalists do instead? Aside from all the care pathways and metric incentives that tell us what to do, we rely on clinical instincts and rules of thumb. Because they bypass formal analysis, they save time; they serve as cognitive shortcuts. We call these heuristics. This method of thinking (fast, instinctive, intuitive) is sometimes known as system 1 thinking. It has the advantage of being efficient and fast, and in critical situations it can even be life saving, but it comes at the cost of a certain error rate. To better understand the process of system 1 thinking we have given the various heuristics names and categories. I recently listed some of those in this post.


If system 1 is our usual mode of processing to get around time constraints, the alternative is system 2: formal clinical reasoning. System 2 thinking was the topic of a recent paper in Critical Care Clinics. Although based on a survey of people working in a NICU, the article has general applicability. The authors contrast system 1 and system 2 thinking in this manner:


Dual process theory holds that individuals engaging in medical decision-making use one of 2 distinct cognitive processes: a system 1 process based on heuristics – the use of rapid pattern recognition and rules of thumb – or a system 2 process, based on deliberate analytical modeling and hypothesis generation. While invoking system 1 processes, individuals can think fast and reflexively and can even operate at a subconscious level, using pattern recognition to sort vast amounts of clinical information quickly to form an illness script that allows for the rapid elaboration of a differential diagnosis. In contrast, system 2 processes require focused attention and are purposefully analytical, relying on deliberate counter-factual reasoning to generate hypotheses regarding the pathophysiologic mechanisms by which a patient’s symptoms are produced.


The authors introduced the concept of the diagnostic time out to describe this shift in thinking because it requires deliberate effort; it’s not going to arise spontaneously in the natural course of the ward routine. (The authors were not the first to use this term.) The diagnostic time out can be considered the cognitive equivalent of the better known procedural time out.


Why is a diagnostic time out needed? Research on diagnostic error has indicated that while some instances are due to system problems (such as failure to communicate test results), most are cognitive errors. These can be linked to the heuristics of system 1 thinking. The diagnostic time out, or the deliberate exercise of system 2 thinking, is a way to complement these cognitive shortcuts with a more analytical process.


Some opinion leaders in the field of diagnostic error have suggested universal adoption of system 2 thinking. This is problematic due to time constraints. Besides, there are essential benefits to system 1 thinking, particularly in acute life-threatening situations. The real trick is how best to selectively employ system 2 thinking. In other words, in what situations should system 2 thinking be used? The authors suggest handoff situations in complex patients, including ER-to-hospitalist, off-service/on-service, and ICU-to-ward transfers.


How does it work? The authors propose a template, but it’s really just the traditional clinical reasoning process. One of their points really got my attention: during the time out, diagnostic labels should be removed and replaced by signs, symptoms, manifestations, and clinical concerns. This, of course, is the opposite of what your coders and hospitalist leaders want you to do.


What are some of the barriers to implementation? In addition to time constraints, fear of ambiguity is an important factor. We are afraid to admit what we don’t know. One thing you will never hear a hospitalist say out loud is “I’ll have to think about that.”


Saturday, June 11, 2022

A little more on metacognition

This article from Academic Emergency Medicine, published in 2002, remains applicable today. It makes the point that heuristics in medicine are valuable even though they can lead to error. The article also makes the statement:


The increasing use of clinical decision rules, as well as other aids that reduce uncertainty and cognitive load, e.g., computerized clinical decision support, will improve certain aspects of clinical decision making, but much flesh-and-blood clinical decision making will remain and there will always be a place for intuition and clinical acumen.


It presents an exhaustive list with detailed descriptions of the various cognitive shortcuts.


Indulge me in a little metacognition

I found an interesting post about cognitive shortcuts in medicine. I have a minor objection to the title of the post, which is Cognitive Errors. Cognitive shortcuts, known as heuristics, are examples of fast, instinctive (system 1) thinking, and they often lead to error. In some cases, however, they can be useful because they are efficient and time saving. There is an upside as well as a downside to system 1 thinking in medicine.


Let’s go down the list. I’ve skipped some of them.


The first example given is affective error. This refers to an emotional response overriding objectivity.


Next is aggregate bias. I struggle with this one. The author says that aggregate bias is the belief that data in the aggregate don’t apply to the patient in front of you. My understanding (maybe I’m wrong) is that aggregate bias, otherwise known as the ecological fallacy, is the opposite: the inappropriate application of population data to an individual. It has more to do with treatment decisions than diagnostic error. Remember, one of the first principles of evidence-based medicine is that clinical decision making starts with the unique aspects of the individual patient. After looking at a variety of references, it would appear that both definitions have been used. Most medical references define aggregate bias the way the blog author does; those outside of medicine define it as inappropriate extrapolation.


The ambiguity effect is really a bias against ambiguity: we tend to stick with things we are more familiar with, which may cause us to ignore other possibilities and take too narrow a view of things. As originally conceived it had to do with probability; people tend to gravitate toward choices in which the probability is known or explicitly stated. Of note, the ambiguity effect was first described by Daniel Ellsberg.


The anchoring heuristic is one of the better known cognitive biases. This refers to the tendency to stick with one’s initial hunch despite new evidence to the contrary. You may be so proud of your initial hunch that you ignore new information. Confirmation bias and diagnostic momentum are related concepts.


Ascertainment bias, as the author points out, is an umbrella category. It encompasses a lot of stereotypes and biases. In essence it’s just, well, bias. It’s not very useful as a unique category in discussions of cognitive error.


Availability bias is one of the better known cognitive shortcuts. It refers to the influence of prior experience, which biases you toward the first thing that comes to mind. For example, if you’ve been burned by missing a case of aortic dissection, you may be overly concerned about aortic dissection in every future case of chest pain. The flip side is that you may fail to consider things you haven’t seen in a long time.


Base rate neglect is a cognitive shortcut that may be considered harmful and wasteful in ambulatory medicine but may be your friend in the arena of hospital and emergency medicine. It is the failure to consider the true prevalence of diseases in clinical reasoning, ignoring the old aphorism that common things happen most often. In the high-acuity world of the hospital, where you really need to be risk-averse, base rate neglect may be beneficial. Put another way, you and your patient may be better off if you consider the worst case scenario.
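
To make the base rate point concrete, here is a minimal sketch of the underlying Bayesian arithmetic. The prevalence, sensitivity, and specificity figures are purely hypothetical (they are not from the post or any study), chosen only to show how much the post-test probability of a disease depends on the base rate in a given setting:

```python
# Illustrative Bayes calculation (hypothetical numbers): how strongly the
# post-test probability of a disease depends on the base rate (prevalence).

def post_test_probability(prevalence, sensitivity, specificity):
    """Probability of disease given a positive test result (Bayes' theorem)."""
    true_positives = sensitivity * prevalence
    false_positives = (1 - specificity) * (1 - prevalence)
    return true_positives / (true_positives + false_positives)

# The same test (90% sensitive, 95% specific) applied in two settings:
for setting, prevalence in [("ambulatory clinic", 0.001), ("ICU", 0.20)]:
    p = post_test_probability(prevalence, sensitivity=0.90, specificity=0.95)
    print(f"{setting}: prevalence {prevalence:.1%} -> post-test probability {p:.1%}")

# Approximate output:
# ambulatory clinic: prevalence 0.1% -> post-test probability 1.8%
# ICU: prevalence 20.0% -> post-test probability 81.8%
```

The same positive result carries very different weight in the two settings. When the cost of missing a dangerous diagnosis is high, as it often is in the hospital, deliberately discounting a low base rate can be a defensible trade-off, which is the point above.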


Then there’s belief bias. I’m not sure this belongs in a discussion of diagnostic shortcuts as it has more to do with treatment recommendations. I cringe when I hear somebody say they “believe” in a particular treatment, implying that belief surpasses reasoning from evidence.


Blind spot bias is similar to the Dunning-Kruger effect, in which we think we're smarter than we really are. Humility is the remedy here. Does this lead to a form of cognitive shortcut? Maybe, in that we fail to pause and consider carefully that we might be wrong.


Confirmation bias is akin to anchoring. This is the tendency to be selective in what type of accumulating evidence you consider. That is, you consider mainly evidence that supports your original hunch.


The framing heuristic is another well known shortcut. We are biased toward diagnostic possibilities in accordance with the way the initial presentation is framed. Though it can be useful, it restricts our differential diagnosis in a way that excludes a wide range of possibilities. Not every returning traveler with fever has a parasite, for example.


The gambler’s fallacy, according to the blog author, is “the erroneous belief that chance is self correcting.” This is a cognitive error that tends in the opposite direction from the availability heuristic.


The order effect is something I was vaguely aware of but had not considered as a cognitive error category. It refers to the tendency to focus on information that is proximate in time, at the expense of the totality of events over time. This typically occurs at the point of handoff in a patient who has had a very long hospital course.


Premature closure is just what it says: the tendency for thinking to stop once a tentative diagnosis has been made. It overlaps with other categories such as anchoring, though there is probably a subtle difference. Anchoring implies an emotional attachment to a diagnosis, whereas premature closure implies diagnostic laziness.


Representativeness restraint is also known as the representativeness heuristic. It is a cognitive shortcut characterized by focusing too much on the prototypical manifestations of a disease, which may cause the clinician to miss atypical presentations.


Search satisfaction is another example of laziness in clinical reasoning. It’s a tendency to stop searching once an answer has been found. The author gives the example of missing a second fracture on an x-ray once the first one is identified.


The sunk cost fallacy is a type of emotional heuristic as well as diagnostic laziness. It is the tendency to ignore new information and not consider alternative diagnoses once the original diagnosis has been arrived at after a great deal of time, effort, and expense (the sunk cost).


Sutton’s slip might be the dark side of Sutton’s law (going where the money is). Pursuing the obvious can lead to error because other possibilities are ignored.


Zebra retreat is the avoidance of rare diagnoses to a fault. It’s the opposite of base rate neglect.