A catch-all for errors is to blame them on “human error.” Human error is seen as inevitable, unpredictable, and common to us all. We forget, get distracted, make judgment errors, draw false conclusions, overlook the obvious, and make other mistakes. It’s curious that something so universal seems to be so mysterious. In effect we end up taking it for granted that human errors will always happen.
One common source of human error is confirmation bias, and simply being aware of it can help prevent many errors.
Confirmation bias is our natural cognitive bias to seek, interpret, favor, and recall information that confirms our beliefs, assumptions, or expectations. It’s one reason for the ongoing success of cable news networks. Conservative viewers tend to watch stations that confirm their viewpoints; the same is true for liberals. We also tend to weigh evidence in proportion to our belief systems, ignoring or giving less importance to that which contradicts what we assume is true.
Confirmation bias can have a significant effect on laboratory results. A few examples:
A hemogram MCV is 105. When reviewing the peripheral smear are you more or less likely to see macrocytes?
If a patient has an anti-E and a unit of blood is negative for E, are you more or less likely to see a negative reaction?
A urine dipstick is positive for leukocyte esterase and nitrite. Are you more or less likely to look for white cells and bacteria, and ignore renal epithelial cells?
Confirmation bias can affect how we troubleshoot instrument problems. Based on prior experience or a perception of what is usually wrong, we can ignore evidence to the contrary.
This intellectual “cherry picking” is our brain’s shortcut to understanding patterns and information. It makes sense from a time-saving standpoint: it takes time and energy to refocus on information to be sure nothing is missed. But it can be a source of serious human error, too. Indeed it is identified in the Merck Manual as a source of error in clinical decision making: “For example, a clinician may steadfastly cling to patient history elements suggesting acute coronary syndrome (ACS) to confirm the original suspicion of ACS even when serial ECGs and cardiac enzymes are normal.”
The laboratory is complex enough to offer a multitude of opportunities for this kind of error, especially in reviewing culture plates, peripheral smears, and other testing that requires judgment. Fortunately, the strategy for avoiding confirmation bias (and other cognitive biases) is simple: stop and think. Set your assumption aside as a hypothesis and ask, "Is there anything else this could be?" This can be difficult to do in a hurry, but avoiding negative outcomes is worth the small amount of time it takes to question yourself. Do you really see those spherocytes? Was that antiglobulin crossmatch really negative? Are you sure there isn't any sample carryover with that probe?
Sometimes, trusting our assumptions just isn’t worth it.
NEXT: GFR for Everybody?