On a recent driving trip I realized at one point that I was completely relying on my GPS device to tell me, turn by turn, where to go. If my GPS lost its signal I would have no idea where I was or how to proceed. In the past I would navigate with maps and I would have a pretty good idea of the roads between me and my destination. Now I mindlessly do whatever my cell phone tells me to do.

Is something similar happening in modern medicine? There is no question that our modern technology has given us powerful diagnostic tools, but has this atrophied our most important tool of all, our brains?

I get asked this question often – most recently I was sent the following:

“How about discussing the most important part of a stethoscope ie. the part between the ear pieces. As a fellow Neurologist, I am confronted with medical students and residents who fail to take a comprehensive history and exam. This leads to deficiencies in neurologic localization and diagnostic formulation. In this day and age we rely so heavily on diagnostic testing (eg. MRI, EMG, etc…) that the art and science of the 19th century Neurologist is thrown by the wayside. Is it any wonder that the recent Medicare cuts are hurting our profession? Your thoughts?”

This is now somewhat conventional wisdom: that the availability of so many high-tech diagnostic options encourages doctors to think less and to neglect low-tech, but still highly useful, skills such as the direct examination.

I do not agree with this conventional wisdom, however.

First, medical students do often make this error of overreliance on testing, but they make many other kinds of cognitive errors as well. I see this as part of a general pattern of intellectual laziness – taking mental shortcuts based upon well-established psychological phenomena such as heuristics and common fallacies.

This is because medical students and doctors are people, and they begin their training with all the cognitive biases and shortcomings that are inherent to the human brain. These mental habits need to be beaten out of them over years of medical training.

As clinicians mature, most learn the utility of the various diagnostic options we have, including the physical exam. They learn when to use tests and when not to, and the strengths and weaknesses of the various diagnostic procedures. If all goes well they develop into at least competent physicians who can think clearly about such things. They will then see younger physicians and students relying too heavily on diagnostic tests and become part of the conventional wisdom that technology is rotting our brains (part of a more general “these kids today” phenomenon).

The other major reason I disagree with the conventional wisdom is that I think modern diagnostic testing has actually made us far superior in our ability to localize a problem based upon the neurological exam. There is a tendency to think that in the past giants walked the earth: in this case, that the 19th and early 20th century neurologists, who had to rely completely on their exams, must have been experts at examination in ways that we have lost.

Pre-imaging neurologists, however, had a significant disadvantage. They could examine the patient, decide where they thought the lesion might be and what kind of process it was, and then treat based upon that diagnosis. But in most cases they would never learn whether they were correct. They were subject to confirmation bias, the toupee fallacy, and similar errors, and so would have believed their diagnoses were highly accurate, when in truth they had no idea. The best feedback they could have received would have been from autopsies, but those were performed in only some cases and often long after the fact.

The modern neurologist, on the other hand, sees a patient, does a neurological exam, makes a preliminary localization and diagnosis, and then will often order a CT scan or MRI scan. If this is a hospital patient, within an hour or two they will likely be looking at a scan and seeing the actual lesion. This provides immediate feedback, and they quickly learn how to correlate their exam diagnosis with the lesion seen on imaging.

I call this the localization game, and I strongly encourage my students and residents to play it. Decide where you think the lesion is before looking at the scan. If you wait until you see the lesion on the MRI, it is too easy to rationalize that you would have thought all along it was there. If you commit to a localization first, however, you will rapidly learn how to localize. Part of this is learning how reliable and predictive the various aspects of the neurological exam are.

To summarize, the real issue here is a more general one: thinking in a disciplined and critical manner versus intellectual laziness and falling back on common but flawed patterns of thinking. I do think that medical training could be more thorough and systematic in teaching students the skeptical and critical thinking aspects of being a physician.

Technology itself is not the problem and, in fact, if used correctly, it improves our ability to think clinically because of the reality check it provides for our diagnostic theorizing. Overreliance on technology can be a symptom of the deeper problem described above, but it is not the cause.


Steven Novella, M.D., is the JREF's Senior Fellow and Director of the JREF's Science-Based Medicine project.