Though there’s little evidence regarding whether any CME format is more effective than others, there was no shortage of strong opinions in this Roundtable. I did make one strong evidential point:
High-quality evidence concerning the effects of CME is almost nonexistent. The notable exception is worth examining in some detail. It is a grand experiment involving an educational program that contains all the elements my Roundtable colleagues find desirable: interactive format, performance measurement, immediate feedback, and rigorous adherence to "best practices." I'm referring to advanced cardiac life support. Quality evidence exists for both performance and patient outcomes, and by both measures the program has failed. Studies have shown that learner retention deteriorates rapidly over time.[1] Real-world adherence to the guidelines is as low as 40%.[2] Survival in cardiac arrest has been dismal, with negligible improvement over decades despite multiple evidence-based updates to course content and a certification requirement for virtually all providers.[3] Exceptional improvements have been realized by only a handful of communities, which departed from the performance measures to employ methods of resuscitation developed by researchers at the University of Arizona.[4] Although considered new, these methods have been used in select communities for several years; even so, their penetration into CME has been limited to the very activity that many would abandon: the traditional lecture.