Saturday, August 23, 2014

Filtered resources versus Google to answer clinical questions

Which approach is better? Findings from a recent study might come as a surprise:

Method: In 2011 and 2012, 48 internal medicine interns from two classes at Rutgers University Robert Wood Johnson Medical School, who had been trained to use three evidence-based summary resources, performed four-minute computer searches to answer 10 clinical questions. Half were randomized to initiate searches for answers to questions 1 to 5 using Google; the other half initiated searches using a summary resource. They then crossed over and used the other resource for questions 6 to 10. They documented the time spent searching and the resource where the answer was found. Time to correct response and percentage of correct responses were compared between groups using t tests and generalized estimating equations.
Results: Of 480 questions administered, interns found answers for 393 (82%). Interns initiating searches in Google used a wider variety of resources than those starting with summary resources. No significant difference was found in mean time to correct response (138.5 seconds for Google versus 136.1 seconds for summary resource; P = .72). Mean correct response rate was 58.4% for Google versus 61.5% for summary resource (mean difference −3.1%; 95% CI −10.3% to 4.2%; P = .40).
Conclusions: The authors found no significant differences in speed or accuracy between searches initiated using Google versus summary resources.
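For readers who want to sanity-check the accuracy comparison, here is a minimal Python sketch. The per-arm counts are back-calculated approximations from the reported percentages (58.4% and 61.5% of roughly 240 questions per arm), and the authors' actual analysis used generalized estimating equations to account for clustering by intern, so this naive Wald interval is only a rough cross-check, not a reproduction of their result.

```python
# Rough check of the reported difference in correct-response rates.
# Counts are approximated from the paper's percentages; the authors used
# generalized estimating equations to handle clustering by intern, so this
# naive Wald interval will not match their CI exactly.
from math import sqrt

n_google, n_summary = 240, 240               # ~480 questions split across two arms
correct_google = round(0.584 * n_google)     # ~140 correct with Google
correct_summary = round(0.615 * n_summary)   # ~148 correct with summary resources

p1 = correct_google / n_google
p2 = correct_summary / n_summary
diff = p1 - p2

# Wald 95% confidence interval for a difference of two proportions
se = sqrt(p1 * (1 - p1) / n_google + p2 * (1 - p2) / n_summary)
lo, hi = diff - 1.96 * se, diff + 1.96 * se

print(f"difference = {diff:+.1%}, 95% CI ({lo:+.1%} to {hi:+.1%})")
# -> roughly -3.3% with a CI spanning zero, consistent with the paper's
#    reported -3.1% (95% CI -10.3% to 4.2%), i.e. no significant difference
```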

Does this mean Google is as good as filtered “evidence-based” resources? Not necessarily. The filtered resources available to the participants were quite limited: First Consult, DynaMed, and Wiley’s Essential Evidence Plus, hardly a representative sample of the best offerings in that category. Participants using Google landed on a wider variety of sites, including the primary sources themselves: journal articles.

How many of the Google users hit on social media? None! Social media are rising in perceived importance as resources for answering clinical questions and are increasingly promoted through initiatives like FOAM and Twitter. But this study, though limited by its small size, suggests that penetration remains low. So for now it would appear that most users who Google clinical questions end up at non-social-media resources, although most readers who do land on social media arrive via Google (at least for my blog). These trends are likely to change over time.
