The new National Radiology Data Registry announced this week by the American College of Radiology can monitor AI results and collect an array of contextual information, such as patient data, clinical metadata and radiology report results.
It’s intended to compare individual radiology practice results against aggregated national performance benchmarks from other sites using identical or similar products, ACR says.
WHY IT MATTERS
Clinical sites and AI developers can use ACR’s Assess-AI, which monitors the performance of imaging AI algorithms in real-world clinical settings, to obtain performance reports on deployed AI, the organization said Monday.
The ACR Data Science Institute will facilitate oversight of the data, which could also help developers improve future versions of algorithms, ACR said.
Radiology’s legacy systems were not built to support the work of ensuring that algorithms used in clinical settings operate as expected, according to Dr. Christoph Wald, vice chair of the ACR Board of Chancellors and chair of the ACR Commission on Informatics.
“Users of AI technology in radiological care need to ensure that algorithms perform sufficiently in their local environment,” Wald said in a statement.
“With the rising demand for imaging outpacing the supply of radiologists, AI is seen as an essential tool to help bridge the gap and enable radiologists to maintain high standards of care while meeting increasing demands,” Dr. Woojin Kim, ACR DSI chief medical officer, added.
ACR DSI-facilitated oversight offers “a tangible real-world approach to address a challenge radiologists are increasingly faced with today,” Wald said.
Participating radiology facilities will have access to:
- Algorithm stability monitoring over time, across imaging equipment, protocols and software versions.
- Continuous oversight of AI result concordance/discordance with radiology reports (see the sketch after this list).
- Aggregated observations in reports and dashboards.
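As a rough illustration of the concordance-tracking idea, the short Python sketch below shows one way a site might tally agreement between AI outputs and final radiology reports by month and software version. It is purely hypothetical and does not reflect Assess-AI’s actual interface or data model; every name in it (Case, concordance_by_group, the sample fields) is an assumption made for illustration.

```python
# Hypothetical sketch only -- not the ACR Assess-AI API or data model.
# Tracks how often an AI finding agrees with the final radiology report,
# grouped by month and algorithm software version, to spot drift over time.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Case:
    month: str             # e.g. "2024-07"
    sw_version: str        # algorithm software version in use
    ai_positive: bool      # AI flagged a finding
    report_positive: bool  # radiologist's report confirmed the finding

def concordance_by_group(cases):
    """Return {(month, sw_version): concordance_rate} for the supplied cases."""
    agree = defaultdict(int)
    total = defaultdict(int)
    for c in cases:
        key = (c.month, c.sw_version)
        total[key] += 1
        if c.ai_positive == c.report_positive:
            agree[key] += 1
    return {k: agree[k] / total[k] for k in total}

if __name__ == "__main__":
    sample = [
        Case("2024-06", "v2.1", True, True),
        Case("2024-06", "v2.1", False, False),
        Case("2024-07", "v2.2", True, False),  # discordant case
        Case("2024-07", "v2.2", True, True),
    ]
    for (month, version), rate in sorted(concordance_by_group(sample).items()):
        print(f"{month} {version}: concordance {rate:.0%}")
```

A falling concordance rate after a software version or protocol change is the kind of drift that this sort of stability monitoring, aggregated into reports and dashboards, is meant to surface.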
THE LARGER TREND
Innovation in radiology is moving quickly, with a number of clinical and financial AI use cases proving their worth, but long-term algorithm safety remains a key concern for ACR.
Scottsdale, Arizona-based SimonMed is one of many practices that have seen AI tools improve workflows. It said algorithms are turning around radiology reports 82% faster than readings without automation.
“In terms of enhanced diagnoses, the tools can be truly remarkable, so it is important to be open-minded and curious as this is a rapidly evolving field,” Dr. John Simon, CEO of SimonMed, told Healthcare IT News in February.
In July, ACR launched the Recognized Center for Healthcare AI, a first-of-its-kind quality assurance program to benchmark radiology facilities that use AI in their imaging workflows.
“Even a U.S. Food and Drug Administration-cleared AI product must be tested locally to ensure it works safely and as intended,” Wald cautioned in the quality assurance center’s announcement.
“Practice leaders must put safeguards in place to maximize the benefit of AI products while minimizing risk.”
ON THE RECORD
“Assess-AI will play a critical role in safely and effectively accelerating the clinical adoption of AI in radiology by ensuring AI products perform optimally in clinical settings so that radiologists can focus on what’s most important – providing high-quality care to their patients,” Kim said in a statement.
“Expanding on its commitment to the radiology profession, the launch of Assess-AI is ACR’s latest step in helping empower radiologists to implement AI safely, effectively and transparently.”
Andrea Fox is senior editor of Healthcare IT News.
Healthcare IT News is a HIMSS Media publication.