A recent Medscape piece is titled “In Search of a Better Way to Measure Quality Primary Care.” I agree with the premise of the article but would strike the word “better”: no one has found a way to measure quality at all. Quality is mentioned multiple times throughout the article, but I would call this what it really is: performance. Performance is a very poor surrogate for quality. From the article:
Although the intentions are good, the results of these measures are less clear. Measuring and reporting quality, as currently accomplished, has effects and consequences that negatively influence quality of care, patient outcomes, and clinician job satisfaction.
Note the confusion around the notion of quality: in the quote above there is one inappropriate and one appropriate use of the term. The footnote was in reference to this JAMA paper. From that paper:
Despite these plausible mechanisms of quality improvement, the value of publicly reporting quality information is largely undemonstrated and public reporting may have unintended and negative consequences on health care. These unintended consequences include causing physicians to avoid sick patients in an attempt to improve their quality ranking, encouraging physicians to achieve “target rates” for health care interventions even when it may be inappropriate among some patients, and discounting patient preferences and clinical judgment. Public reporting of quality information promotes a spirit of openness that may be valuable for enhancing trust of the health professions, but its ability to improve health remains undemonstrated, and public reporting may inadvertently reduce, rather than improve, quality.
Again, a somewhat confusing use of the word, but we get the idea. That was in 2005. I’ve seen nothing in the way of fundamental change since then that would undermine the premise of that article. There’s a lot to unpack from the text of the JAMA article quoted above. Avoiding sick patients is a way to get around not just process reports, but also outcome reports. Outcome reporting has been proposed as a way around the pervasive gaming of process report cards; however, there are ways to game outcome metrics just as there are ways to game process metrics. Finally, the JAMA piece correctly points out that artificial incentives to adhere to metric targets discount the judgment of the clinician and the preferences of the patient, two of the three key elements of evidence-based medicine. Thus these incentives actually oppose rather than promote EBM.