Carney PA, Bogart A, Sickles EA, Smith R, Buist DS, Kerlikowske K, Onega T, Miglioretti DL, Rosenberg R, Yankaskas BC, Geller BM. Acad Radiol. 2013 Nov;20(11):1389-98. doi: 10.1016/j.acra.2013.08.017.


To describe recruitment, enrollment, and participation among US radiologists invited to take part in a randomized controlled trial of two continuing medical education (CME) interventions designed to improve interpretation of screening mammography.

We collected recruitment, consent, and intervention-completion information as part of a large study involving radiologists in California, Oregon, Washington, New Mexico, New Hampshire, North Carolina, and Vermont. Consenting radiologists were randomized to one of three arms: a 1-day live, expert-led educational session; a self-paced DVD with similar content; or a control group (delayed intervention). The impact of the interventions was assessed using a preintervention-postintervention test set design. All activities were institutional review board approved and HIPAA compliant.

Of 403 eligible radiologists, 151 (37.5%) consented to participate in the trial, and 119 of 151 (78.8%) completed the preintervention test set and were available for randomization to one of the two intervention groups or to the control group. Female radiologists were more likely than male radiologists to consent to and complete the study (P = .03). Consenting radiologists who completed all study activities were more likely to have been interpreting mammography for 10 years or less than radiologists who consented but did not complete all study activities or who did not consent at all. Participants in the live intervention group were more likely than those who received the DVD to report intent to change their clinical practice as a result of the intervention (50% versus 17.6%, P = .02). The majority of participants in both intervention groups felt the interventions were a useful way to receive CME mammography credits.

Community radiologists found interactive interventions designed to improve interpretive mammography performance acceptable and useful for clinical practice. This suggests that CME credits for radiologists should, in part, be granted for activities that examine practice skills.

Screening mammography, interpretive accuracy, physician education