RSNA Press Release

AI Helps Radiologists Detect Bone Fractures


Released: March 29, 2022

At A Glance

  • Missed or delayed diagnosis of fractures has potentially serious implications for patients.
  • AI is an effective tool for fracture detection that has potential to aid clinicians in busy emergency departments.
  • AI’s sensitivity for detecting fractures was 91-92%.

OAK BROOK, Ill. — Artificial intelligence (AI) is an effective tool for fracture detection that has potential to aid clinicians in busy emergency departments, according to a study in Radiology.

Missed or delayed diagnosis of fractures on X-ray is a common error with potentially serious implications for the patient. The problem is made worse by a lack of timely access to expert opinion, as growth in imaging volumes continues to outpace radiologist recruitment.

AI may help address this problem by acting as an aid to radiologists, speeding up and improving fracture diagnosis.

To learn more about the technology’s potential in the fracture setting, a team of researchers in England reviewed 42 existing studies comparing the diagnostic performance in fracture detection between AI and clinicians. Of the 42 studies, 37 used X-ray to identify fractures, and five used CT.

The researchers found no statistically significant differences between clinician and AI performance. AI’s sensitivity for detecting fractures was 91-92%.
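Sensitivity here means the proportion of true fractures that a reader, human or AI, correctly identifies. As a quick illustration of the arithmetic, the Python sketch below computes sensitivity and its companion metric, specificity, from a confusion matrix; the counts are hypothetical and are not drawn from the study.

    # Minimal sketch of the sensitivity calculation behind a figure like 91-92%.
    # All counts below are hypothetical examples, NOT data from the study.

    true_positives = 91   # fractures correctly flagged
    false_negatives = 9   # fractures missed
    true_negatives = 88   # fracture-free images correctly cleared
    false_positives = 12  # fracture-free images wrongly flagged

    # Sensitivity: share of actual fractures that were detected.
    sensitivity = true_positives / (true_positives + false_negatives)

    # Specificity: share of fracture-free images correctly cleared.
    specificity = true_negatives / (true_negatives + false_positives)

    print(f"Sensitivity: {sensitivity:.0%}")  # Sensitivity: 91%
    print(f"Specificity: {specificity:.0%}")  # Specificity: 88%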

“We found that AI performed with a high degree of accuracy, comparable to clinician performance,” said study lead author Rachel Kuo, M.B. B.Chir., from the Botnar Research Centre, Nuffield Department of Orthopaedics, Rheumatology and Musculoskeletal Sciences in Oxford, England. “Importantly, we found this to be the case when AI was validated using independent external datasets, suggesting that the results may be generalizable to the wider population.”

The study results point to several promising educational and clinical applications for AI in fracture detection, Dr. Kuo said. It could reduce the rate of early misdiagnosis in challenging circumstances in the emergency setting, including cases where patients may sustain multiple fractures. It also has potential as an educational tool for junior clinicians.

“It could also be helpful as a ‘second reader,’ providing clinicians with either reassurance that they have made the correct diagnosis or prompting them to take another look at the imaging before treating patients,” Dr. Kuo said.

Dr. Kuo cautioned that research into fracture detection by AI remains in a very early, pre-clinical stage. Only a minority of the studies that she and her colleagues looked at evaluated the performance of clinicians with AI assistance, and there was only one example where an AI was evaluated in a prospective study in a clinical environment.

“It remains important for clinicians to continue to exercise their own judgment,” Dr. Kuo said. “AI is not infallible and is subject to bias and error.”

“Artificial Intelligence in Fracture Detection: A Systematic Review and Meta-Analysis.” Collaborating with Dr. Kuo were Conrad Harrison, B.Sc., M.B.B.S., M.R.C.S., Terry-Ann Curran, M.B. B.Ch. B.A.O., M.D., Benjamin Jones, B.M.B.Ch., B.A., Alexander Freethy, B.Sc., M.B.B.S., M.Sc., M.R.C.S., David Cussons, B.Sc., M.B.B.S., Max Stewart, M.B. B.Chir., B.A., Gary S. Collins, B.Sc., Ph.D., and Dominic Furniss, D.M., M.A., M.B.B.Ch., FRCS (Plast).

Radiology is edited by David A. Bluemke, M.D., Ph.D., University of Wisconsin School of Medicine and Public Health, Madison, Wisconsin, and owned and published by the Radiological Society of North America, Inc. (https://pubs.rsna.org/journal/radiology)

RSNA is an association of radiologists, radiation oncologists, medical physicists and related scientists promoting excellence in patient care and health care delivery through education, research and technologic innovation. The Society is based in Oak Brook, Illinois. (RSNA.org)

For patient-friendly information on musculoskeletal imaging, visit RadiologyInfo.org.

Images (JPG, TIF):

Figure 1. Preferred Reporting Items for Systematic Reviews and Meta-Analyses flowchart shows studies selected for review. ACM = Association for Computing Machinery, AI = artificial intelligence, CENTRAL = Central Register of Controlled Trials, CINAHL = Cumulative Index to Nursing and Allied Health Literature, IEEE = Institute of Electrical and Electronics Engineers and Institution of Engineering and Technology.

Figure 2. Summary of study adherence to Transparent Reporting of a Multivariable Prediction Model for Individual Prognosis or Diagnosis (TRIPOD) reporting guidelines.

Figure 3. Summary of Prediction Model Study Risk of Bias Assessment Tool (PROBAST) risk of bias and concern about generalizability scores.

Figure 4. Hierarchical summary receiver operating characteristic (HSROC) curves for (A) fracture detection algorithms and (B) clinicians with internal validation test sets. The 95% prediction region is a visual representation of between-study heterogeneity.

Figure 5. Hierarchical summary receiver operating characteristic (HSROC) curves for (A) fracture detection algorithms and (B) clinicians with external validation test sets. The 95% prediction region is a visual representation of between-study heterogeneity.

Figure 6. Summary of pooled sensitivity, specificity, and area under the curve (AUC) of algorithms and clinicians comparing all studies versus low-bias studies with 95% CIs.