In my recent review of Peter Palmieri's book Suffer the Children, I said I would later try to cover some of the many other important issues he brings up. One of the themes in the book is the process of critical thinking and the various cognitive traps doctors fall into. I will address some of them here. This is not meant to be systematic or comprehensive, but rather a miscellany of things to think about. Some of these overlap.
Diagnostic fetishes
Everything is attributed to a pet diagnosis. Palmieri gives the example of a colleague of his who thinks everything from septic shock to behavior disorders is due to low levels of HDL, which he treats with high doses of niacin. There is a tendency to widen the criteria so that any collection of symptoms can be seen as evidence of the condition. If the hole is big enough, pegs of any shape will fit through. Some doctors attribute everything to food allergies, depression, environmental sensitivities, hormone imbalances, and other favorite diagnoses. CAM is notorious for claiming to have found the one true cause of all disease (subluxations, an imbalance of qi, etc.).
Favorite treatments
One of his partners put dozens of infants on cisapride to treat the spitting up that most normal babies do. Even after the manufacturer sent out a warning letter about babies who had died from irregular heart rhythms, she continued using it. Eventually the drug was recalled.
Another colleague prescribed cholestyramine for every patient with diarrhea: not only ineffective but highly illogical.
When I was an intern on the Internal Medicine rotation, the attending physician noticed one day that every single patient on our service was getting guaifenesin. We thought we had ordered it for valid reasons, but I doubt whether everyone benefited from it.
Recognizing warblers
Like birdwatchers, hospitalists like Palmieri learn to identify which doctor admitted a patient. Child doesn't appear sick; admitting diagnosis is "occult bacteremia"; patient was given an intramuscular injection of cefotaxime in the office: oh, that must be Dr. X.
Rapid identification vs. pareidolia
Humans are good at pattern recognition. This allows experienced clinicians to make rapid diagnoses, but it also allows us to see the Virgin Mary on a grilled cheese sandwich.
Rooster syndrome
Rooster crows, sun comes up; therefore rooster made sun come up. Baby had colic, was given treatment X, colic resolved; therefore X cures colic. In reality, colic resolves spontaneously by 3-4 months of age and X was useless.
Copycats
Mimicking what other physicians in the community are doing.
Availability
Choosing a drug because you have samples handy that the drug rep left.
Ulysses syndrome
Ulysses went from one adventure to another in the odyssey of returning home from the Trojan War. A false positive test can lead to a fruitless odyssey of further investigation: tests lead to more tests, maybe even invasive procedures and harm to the patient. Eventually it is realized that the patient has been healthy all along.
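The arithmetic behind this trap is worth spelling out. As a back-of-the-envelope illustration (the numbers here are hypothetical, mine rather than the book's): suppose a screening test has 95% sensitivity and 95% specificity, and the condition it screens for has a prevalence of 1% among the people being tested. Bayes' theorem gives the chance that a positive result reflects real disease:

    PPV = (0.95 × 0.01) / (0.95 × 0.01 + 0.05 × 0.99) ≈ 0.16

Under those assumptions, about five out of six positive results are false positives, and each one is a potential launching point for an odyssey.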
Unnecessary lab tests
Sometimes tests are done in a scattershot attempt to find something, anything. Palmieri's pathologist wife directs a laboratory and frequently gets calls from doctors who have ordered an unfamiliar test and have no idea what to do when they get an abnormal result. Instead of ordering an individual chemistry test, we get SMAC panels because the machine is there and it's so convenient and cheap. With 20 tests on these panels, there is about a 64% probability that at least one result will fall outside normal limits in a perfectly healthy person.
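That figure is a simple probability calculation (assuming, for the sake of the estimate, that the 20 tests are statistically independent, which real panels only approximate). "Normal limits" are conventionally defined so that 95% of healthy people fall within them on any single test, so:

    P(all 20 results normal) = 0.95^20 ≈ 0.36
    P(at least one "abnormal" result) = 1 − 0.36 ≈ 0.64

A perfectly healthy patient thus has roughly a two-in-three chance of leaving the lab with at least one red flag that means nothing.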
Defensive medicine
With the present legal climate, doctors sometimes do tests or treatments with an eye to how things would look in court, rather than for the direct benefit of the patient.
Showmanship
Ordering tests to impress the patient that the doctor is being thorough and is actually doing something.
Hardwired fallibility
Our brains do not function in a rational, objective, logical way. We have built-in psychological mechanisms and defects in information processing; our brains have evolved a repertoire of tricks and shortcuts that serve us well in everyday life but that must be overcome for critical thinking and science.
Confirmation bias
Once we form a belief, we seek out evidence that confirms it and reject evidence that contradicts it. We are all biased, but by being aware of our biases we can activate a self-correcting mechanism.
Over-generalization
We form opinions about the many based on our experience of a few. We may base our idea of a disease on a patient who had an atypical presentation, or tend to avoid using a drug because of a patient who had an uncommon side effect. Radiologists who have missed a diagnosis are tempted to over-interpret x-ray findings for a time afterwards.
Anchoring
We tend to reach an early diagnosis and cling to it even when subsequent evidence doesn't fit. We tend to accept the diagnosis of the referring physician rather than going back to square one to make up our own mind.
Diagnosis momentum
An early possibility becomes a presumptive diagnosis and gains legitimacy as it is repeated by more and more health care providers.
Framing
We seek a diagnosis within the context of how the information is presented to us. Palmieri tells of a boy who presented with "frequent throat infections." He was referred to ENT and even had a tonsillectomy before it was discovered that he had never had a sore throat at all, only unexplained fevers that had been falsely attributed to throat infections and that eventually turned out to be due to juvenile rheumatoid arthritis.
Miscommunication and assumptions
Palmieri describes a case where an ENT consultant was called in directly by the worried parents of a child hospitalized with an ear infection. He assumed that they and the pediatrician must have wanted him to put in PE tubes; otherwise there would have been no earthly reason for a consult. He had booked an OR and scheduled the patient for surgery before it became clear that the child had a first ear infection that was responding to treatment, that ENT input was unnecessary, and that PE tubes were clearly not indicated.
Algorithms
We simplify our approach to complex problems by following algorithms like "if the white count is over 15,000, give antibiotics." This is not always appropriate. Algorithms provide a convenient framework, not an unalterable directive.
Tunnel vision
We are cautioned against thinking of zebras every time we hear hoofbeats, but we often fall into the opposite trap: we fixate on the diagnoses we commonly see in our practice and fail to consider rare possibilities. On a recent episode of the television show "Untold Stories from the ER," a toddler was refusing to walk because of leg pain. The doctors took x-rays looking for fractures to confirm their initial diagnosis of child abuse. It turned out he had scurvy, a vitamin C deficiency that supposedly doesn't occur in the 21st century US. But it did, because he was refusing all foods but oatmeal and his uneducated parents didn't know there was anything wrong with catering to his wishes.
Conclusion
In medical school, doctors learn science but they may not learn to think like a scientist. Once out in practice, they become vulnerable to unproven claims, myths, and pseudoscience; and they are encouraged to give advice based on common sense and intuition rather than on evidence. Not just doctors but everyone needs to better understand the cognitive traps we all fall into. Since our human brains are inherently fallible, only critical thinking and good science can keep us on track. A major theme of this blog is that good science is essential for correcting our errors.