The first quality measure was the provision of discharge instructions on medications, diet and other aspects of heart failure care. In one study on which this recommendation was based, the instructions included a full hour of one-on-one verbal counseling, and the intervention was associated with improved outcomes. The “core quality” measure, in contrast, required only that written instructions be given to the patient. It’s one thing to hand patients a ream of paper as they are rushed out the door and quite another to provide detailed counseling. Nominal compliance may earn the hospital a perfect report card while doing little of substance to help patients.
In my recent top 10 post on quality, I examined individual performance measures and again noted, concerning heart failure instructions:
Discharge instructions: Same as for smoking cessation above. In the studies that showed discharge instructions to improve outcomes, a specialty nurse sat down and spent an hour or two with the patient and family. Hospitals, concerned about bed control and early discharge, don't feel they have time to do this, and they don't have to in order to score well on the report card.
Now the NEJM reports a disconnect between quality and performance in heart failure instructions:
Methods: We examined hospital performance on the basis of two measures of discharge planning: the adequacy of documentation in the chart that discharge instructions were provided to patients with congestive heart failure, and patient-reported experiences with discharge planning. We examined the association between performance on these measures and rates of readmission for congestive heart failure and pneumonia.
Results: We found a weak correlation in performance between the two discharge measures (r=0.05, P<0.001). Although larger hospitals performed better on the chart-based measure, smaller hospitals and those with higher nurse-staffing levels performed better on the patient-reported measure. We found no association between performance on the chart-based measure and readmission rates among patients with congestive heart failure (readmission rates among hospitals performing in the highest quartile vs. the lowest quartile, 23.7% vs. 23.5%; P=0.54) and only a very modest association between performance on the patient-reported measure and readmission rates for congestive heart failure (readmission rates among hospitals performing in the highest quartile vs. the lowest quartile, 22.4% vs. 24.7%; P<0.001) and pneumonia (17.5% vs. 19.5%, P<0.001).
Although the authors treated both as performance measures, the real performance measure was the chart-based measure (what the hospitals said they did), whereas the patients' reported experiences were closer to real quality. The correlation between what the institution said it did and what the patient reported was very weak, at r=0.05. Although the patient-reported measure was associated with a highly statistically significant reduction in readmission rates, the magnitude of the reduction was modest. This may be because the patient questionnaire itself was weak, making it easy to pass the measure (see the appendix to the article).
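To put those numbers in perspective, here is a quick back-of-the-envelope sketch in Python using only the figures quoted from the abstract above (no other data assumed): a correlation of r=0.05 means the chart-based measure explains only about a quarter of one percent of the variation in the patient-reported measure, and even the statistically significant quartile differences work out to roughly a 2-percentage-point absolute gap in readmission rates.

```python
# Rough arithmetic on the figures quoted from the NEJM Results excerpt above.
# Only those published numbers are used; this is illustration, not reanalysis.

r = 0.05  # correlation between chart-based and patient-reported measures
print(f"Variance in one measure explained by the other: r^2 = {r**2:.4f} "
      f"({r**2 * 100:.2f}%)")

# Readmission rates, highest- vs. lowest-quartile hospitals on each measure
comparisons = {
    "chart-based measure, CHF readmission": (23.7, 23.5),
    "patient-reported measure, CHF readmission": (22.4, 24.7),
    "patient-reported measure, pneumonia readmission": (17.5, 19.5),
}

for label, (top_quartile, bottom_quartile) in comparisons.items():
    absolute_diff = bottom_quartile - top_quartile          # percentage points
    relative_diff = absolute_diff / bottom_quartile * 100    # percent
    print(f"{label}: {top_quartile}% vs {bottom_quartile}% "
          f"(absolute difference {absolute_diff:.1f} points, "
          f"relative reduction {relative_diff:.1f}%)")
```

Run as written, this shows the chart-based measure explaining 0.25% of the variance, essentially no difference for the chart-based comparison, and absolute differences of about 2.0 to 2.3 percentage points for the patient-reported comparisons: statistically significant, but modest.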
1 comment:
The newest fallacy sold to the public is that medical quality will be measured and rewarded. The systems I've seen and endured, such as the PQRI program, cost us time and money and didn't contribute a whit toward quality. These bean-counting strategies will count lots of data that are easily counted, but these are not surrogates for medical quality. What really counts in medicine can't be easily counted. www.MDWhistleblower.blogspot.com