Photo by geralt on Pixabay
By Ralph Zoontjens
Artificial intelligence (AI) is a fast-evolving field, and its solutions are well suited to spotting bone fractures that human specialists may miss. Decades ago, computer systems were limited to pre-programmed routines, but today’s pattern-recognition algorithms teach themselves from colossal datasets.
Even though AI will never replace a human specialist, the improvements this new technology brings are substantial.
The Digital Doctor
Each year, there are around two million bone fractures in the U.S. Fracture interpretation errors account for roughly a quarter of severe diagnostic errors in emergency rooms. That’s a lot of misdiagnoses.
Outcomes improved with the advent of teleradiology. Able to work from any location and on a 24/7 basis, radiologists today can be assigned to cases worldwide that match their specific expertise.
AI further improves radiologists’ accuracy and helps prevent overlooked fractures. In one study using cutting-edge technology, AI reduced missed fractures by 29%. It also increased readers’ true-positive rate (sensitivity) by 16% and true-negative rate (specificity) by 5%.
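For readers less familiar with these metrics, the short sketch below shows how sensitivity and specificity are calculated from a batch of readings. The counts are purely hypothetical and are not taken from the study mentioned above.

```python
# Hypothetical counts from a batch of X-ray readings (not from the cited study).
true_positives = 90   # fractures correctly flagged
false_negatives = 10  # fractures missed
true_negatives = 180  # fracture-free images correctly cleared
false_positives = 20  # fracture-free images wrongly flagged

# Sensitivity: the share of actual fractures that were caught.
sensitivity = true_positives / (true_positives + false_negatives)

# Specificity: the share of fracture-free images that were correctly cleared.
specificity = true_negatives / (true_negatives + false_positives)

print(f"Sensitivity: {sensitivity:.0%}")  # 90%
print(f"Specificity: {specificity:.0%}")  # 90%
```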
Artificial intelligence saves time by plowing through piles of imaging data, highlighting potential lesion zones and triaging the most critical cases. This streamlines the workflow, relieving overworked clinicians and reducing error rates.
Advancements in AI
Medical AI has come a long way since the 1980s. In the early days, algorithms had to be explicitly programmed by humans to find the patterns in images that required attention. This is not unlike an automated grammar checker, where the computer checks whether the right patterns are present as dictated by human authorities.
But that doesn’t mean the system can read and interpret images on its own.
As processing power increased, a variety of advanced AI models that approach human-level performance became available for medical applications:
- Artificial neural networks (ANNs) are loosely modeled on the human brain in the way data is structured and passed along internally.
- With supervised machine learning (SML), the ANN is exposed to thousands of source images that are human-labeled with the correct answer. For example, “fracture” or “no fracture.” Based on this dataset, the system then learns to predict the outcome for newly introduced images.
- In unsupervised learning (UL), the computer is simply presented with information for it to detect patterns and relationships on its own. This blindfolded approach can take thousands of iterations, but it can be better suited for detecting anomalies.
- With deep learning (DL), the system contains several layers of processing, where images are internally transformed into increasingly abstract representations. Although these intermediate stages render the system less transparent, they do achieve greater accuracy.
- At the current forefront of software innovation is the convolutional neural network (CNN). It is a type of ANN whose layers apply shared filters that sweep across the image, letting it pick up local visual patterns, such as fracture lines, with greater refinement; a simplified sketch of such a network follows this list.
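To make the idea of a CNN a little more tangible, here is a minimal, purely illustrative sketch of a convolutional network for a binary “fracture” versus “no fracture” label, written in Python with the Keras API. The layer sizes, image dimensions, and the commented-out training call are hypothetical placeholders, not a clinically validated design.

```python
from tensorflow.keras import layers, models

# A tiny convolutional network for grayscale X-ray patches (illustrative only).
model = models.Sequential([
    layers.Input(shape=(256, 256, 1)),        # hypothetical 256x256 grayscale input
    layers.Conv2D(16, 3, activation="relu"),  # shared filters sweep across the image
    layers.MaxPooling2D(),                    # downsample, keep the strongest responses
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(1, activation="sigmoid"),    # probability of "fracture"
])

model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy"])

# model.fit(labeled_images, labels, epochs=10)  # the supervised learning step
```

The commented-out training call is where supervised machine learning comes in: the network is shown thousands of human-labeled X-rays and gradually adjusts its internal filters until it can predict the label for images it has never seen.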
Computer-aided diagnosis (CAD) thus spans a vast field of solutions across AI algorithms, analytics, and computer vision. With growing insights, connectivity, and cloud processing power, this realm is evolving to better suit the needs of healthcare professionals.
Workflow Integration
The number of CT scans performed each year has risen from 18.3 million in 1993 to around 80 million. So it’s not hard to imagine that diagnostic imaging makes up the vast majority of medical data.
All this data is handled on a dedicated AI server. After digital X-ray machines deliver their raw images to the picture archiving and communication system (PACS), the studies are automatically forwarded to that server while remaining fully integrated with the radiology information system (RIS).
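As a rough illustration of that handoff, the sketch below uses the open-source pydicom and pynetdicom libraries to push a single X-ray study from a PACS gateway to an AI server over the standard DICOM protocol. The server address, port, and file name are hypothetical placeholders; real PACS and AI vendors ship their own routing rules and interfaces.

```python
from pydicom import dcmread
from pynetdicom import AE

AI_SERVER_HOST = "ai-server.example.org"   # hypothetical AI server address
AI_SERVER_PORT = 11112                     # hypothetical DICOM listening port

# Act as a sending application entity on the PACS side.
ae = AE(ae_title="PACS_GATEWAY")
# Request the "Digital X-Ray Image Storage - For Presentation" service.
ae.add_requested_context("1.2.840.10008.5.1.4.1.1.1.1")

ds = dcmread("wrist_xray.dcm")             # a study already archived in the PACS

assoc = ae.associate(AI_SERVER_HOST, AI_SERVER_PORT)
if assoc.is_established:
    status = assoc.send_c_store(ds)        # push the image to the AI server
    print(f"C-STORE response status: 0x{status.Status:04X}")
    assoc.release()
else:
    print("Could not reach the AI server")
```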
Seamlessly embedding artificial intelligence applications into the overall healthcare IT ecosystem this way, together with a user-friendly interface, makes for a pleasant experience for both patient and clinician.
No Job Replacement
As an extra set of eyes, AI is a highly useful assistant, especially for detecting scaphoid or hairline fractures in digital X-rays. It can prevent misses that would otherwise lead to arthritis and debilitating pain.
But even though it takes human error and subjectivity out of the equation, computer vision will never function perfectly. So software will not replace the human specialist, who must provide a double check in the context of a more holistic injury assessment.
About the Author
Ralph Zoontjens is a product designer with a master’s degree in Industrial Design from the Eindhoven University of Technology. Currently based in Tilburg, the Netherlands, he specializes in 3D printing and works as a content writer covering design and technology.