Since urgent care centers and emergency rooms are frequently busy, patients may wait several hours before being seen, assessed, and treated. Because radiologists must read x-rays for many patients, waiting for their interpretations can add further to that wait time.
A new study has found that artificial intelligence (AI) can help physicians interpret x-rays after an injury and suspected fracture.
“Our AI algorithm can quickly and automatically detect x-rays that are positive for fractures and flag those studies in the system so that radiologists can prioritize reading x-rays with positive fractures. The system also highlights regions of interest with bounding boxes around areas where fractures are suspected. This can potentially contribute to less waiting time at the time of hospital or clinic visit before patients can get a positive diagnosis of fracture,” explained corresponding author Ali Guermazi, MD, PhD, chief of radiology at VA Boston Healthcare System and Professor of Radiology & Medicine at Boston University School of Medicine (BUSM).
Up to 24 percent of harmful diagnostic errors made in the emergency room involve fracture misinterpretation. Such radiographic diagnosis errors are also more frequent between 5 p.m. and 3 a.m., probably due to fatigue and non-expert reading.
The AI algorithm (AI BoneView) was trained on a large number of x-rays from multiple institutions to detect fractures of the limbs, pelvis, torso, lumbar spine, and rib cage.
The performance of human readers with and without AI support was compared, with the performance of expert human readers (musculoskeletal radiologists, who are radiologists with subspecialty training in reading bone x-rays) defining the gold standard.
To imitate real-life scenarios, the readers included radiologists, orthopedic surgeons, emergency room physicians, physician assistants, rheumatologists, and family physicians, all of whom interpret x-rays in clinical practice to identify fractures in their patients.
Each reader’s diagnostic accuracy in identifying fractures, with and without AI aid, was measured against the gold standard. The researchers also evaluated the AI’s standalone diagnostic performance against that same standard.
AI aid reduced missed fractures by 29%, improved reader sensitivity by 16% for exams with more than one fracture, and improved specificity by 5%.
Guermazi believes AI can be an effective tool for radiologists and other medical professionals, improving diagnostic performance and efficiency while potentially enhancing the patient experience during hospital or clinic visits.
“Our study was focused on fracture diagnosis, but a similar concept can be applied to other diseases and disorders. Our ongoing research interest is how best to utilize AI to help human healthcare providers improve patient care, rather than having AI replace them. Our study showed one such example,” he added.