Abstract
Diagnostic radiology training programs must produce highly skilled diagnostic radiologists capable of interpreting radiological examinations and communicating results to clinicians. Established performance tools evaluate trainees' interpretive skills, but competency in reporting is also essential. Our semi-automated, passive electronic tool, the Quantitative Reporting Skills Evaluation (QRSE), allows radiology training programs to quantify the edits that attending physicians make to trainee preliminary reports as a metric of trainee reporting performance. Consecutive report pairs and metadata extracted from the radiology information system were anonymized and exported to a MySQL database. To perform the QRSE, open-source software was used to calculate, for each report pair, the Levenshtein Percent (LP): the percentage of character changes required to convert the preliminary report into its corresponding final report. The overall average LP (ALP), the ALP for each trainee, and the standard deviations were then calculated. Eighty-four trainees and 56 attending radiologists interpreted 228,543 radiological examinations during the study period. The overall ALP was 6.38%. Trainee-specific ALPs ranged from 1.1% to 15.3%, with a standard deviation of 3.7%. Our analysis identified five trainees with trainee-specific ALPs more than 2 standard deviations above the mean and 14 trainees with trainee-specific ALPs more than 1 standard deviation below the mean. The QRSE methodology allows passive, quantitative, and longitudinal evaluation of trainee reporting skills during diagnostic radiology residency training. By identifying trainees whose preliminary reports receive unusually high or low levels of edits, as a marker of overall reporting skill, the QRSE represents a novel performance metric for radiology training programs.
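To make the metric concrete, the following Python sketch computes an LP-style value for each preliminary/final report pair, averages it per trainee, and flags trainees whose ALP lies far from the cohort mean. This is an illustration only, not the authors' implementation: the helper names (levenshtein, levenshtein_percent, trainee_alps, flag_outliers) and the choice to normalize by the preliminary report's length are assumptions made for this sketch.

```python
# Illustrative sketch (not the authors' implementation): compute a
# Levenshtein-Percent-style metric for preliminary/final report pairs
# and flag trainees whose average LP falls far from the cohort mean.
from collections import defaultdict
from statistics import mean, pstdev


def levenshtein(a: str, b: str) -> int:
    """Classic dynamic-programming edit distance (insert/delete/substitute)."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        curr = [i]
        for j, cb in enumerate(b, start=1):
            curr.append(min(prev[j] + 1,                    # deletion
                            curr[j - 1] + 1,                # insertion
                            prev[j - 1] + (ca != cb)))      # substitution
        prev = curr
    return prev[-1]


def levenshtein_percent(preliminary: str, final: str) -> float:
    """Percent of character changes needed to turn the preliminary report into
    the final report; normalizing by the preliminary report's length is an
    assumption made for this sketch."""
    if not preliminary:
        return 0.0
    return 100.0 * levenshtein(preliminary, final) / len(preliminary)


def trainee_alps(report_pairs):
    """report_pairs: iterable of (trainee_id, preliminary_text, final_text)."""
    lps = defaultdict(list)
    for trainee, prelim, final in report_pairs:
        lps[trainee].append(levenshtein_percent(prelim, final))
    return {trainee: mean(values) for trainee, values in lps.items()}


def flag_outliers(alps, high_sd=2.0, low_sd=1.0):
    """Return trainees whose ALP is more than high_sd SDs above, or more than
    low_sd SDs below, the mean of trainee-specific ALPs."""
    values = list(alps.values())
    mu, sigma = mean(values), pstdev(values)
    high = [t for t, v in alps.items() if v > mu + high_sd * sigma]
    low = [t for t, v in alps.items() if v < mu - low_sd * sigma]
    return high, low
```

At the scale reported here (hundreds of thousands of report pairs), an established open-source diff library such as google-diff-match-patch could replace the hand-rolled distance function for performance; the structure of the calculation would remain the same.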