By Elizabeth Gardner
A standing-room-only crowd turned out to hear leading experts discuss how artificial intelligence (AI) and machine learning (ML) are reshaping radiology at Tuesday's RSNA/American Association of Physicists in Medicine (AAPM) symposium.
The discipline now upending radiology as we know it, the application of big-data processing power and techniques to digital images, is developing rapidly, but it will need the trained brains of radiologists to become a daily tool for diagnosis and treatment.
“There is a lot of excitement and also a lot of questions,” said Paul E. Kinahan, PhD, moderator of the symposium, “Machine Learning in Radiology: Why and How?”
Dr. Kinahan, vice chair of radiology research and head of the Imaging Research Laboratory at the University of Washington, Seattle, noted that dozens of exhibitors at this year’s meeting, many of them exhibiting for the first time, are offering AI-based products.
During the session “Harnessing Artificial Intelligence,” presenter Keith Dreyer, DO, gave a high-level explanation of the complexities involved in teaching computers to read images. The audience was split among radiologists, physicists and vendors of AI tools.
“Machines are getting smarter faster than people are,” said Dr. Dreyer, vice chair of radiology and director of the Center for Clinical Data Science at Massachusetts General Hospital, Boston, and chair of the American College of Radiology’s Commission on Informatics.
He said early science-fiction depictions of AI that all but replace the human brain have given way to less complex but immediately useful ML, which, for example, tells Amazon customers what other products might interest them based on their past purchases.
“It’s not as sexy as AI, but it is a necessary foundation for AI to take hold,” Dr. Dreyer said.
Getting Involved in AI
A major obstacle to developing AI for imaging is the lack of what Dr. Dreyer called a “healthcare AI ecosystem.” He said radiology needs universally accepted ways to develop and incorporate AI, similar to the DICOM image standard, in order to make it easy for developers to create new applications and integrate them into imaging devices and clinical information systems.
Dr. Dreyer urged audience members to get involved in the first important task of developing medical imaging use cases. One of his slides showed a matrix of thousands of tiny squares, each one representing the intersection of a radiology specialty, an imaging modality, a part of the body and the lab and/or pathology findings that inform the imaging study. Each square is a possible use case for AI that will need clinicians to help develop the rules.
AI for Cancer Treatment
In his presentation, “Assistive AI for Cancer Treatment,” Antonio Criminisi, PhD, a principal researcher at Microsoft in Cambridge, U.K., described the company’s InnerEye project and how it can help pinpoint the location of tumors for targeted radiotherapy.
The technology is based on a principle similar to the one that drives the Microsoft Kinect game system, which senses human movement and "builds" a replica of the body inside the game. Dr. Criminisi's team is teaching the InnerEye software to analyze pelvic images and identify anatomical structures, particularly the prostate, to speed the now laborious task of locating exactly which spots to irradiate.
In another example, the team combined images of a brain tumor from six different imaging modalities in order to quantify changes in tumor volume.
“This technology is not for doing things that you already know how to do well, but to do things that you wish you were able to do,” Dr. Criminisi said.
But perfecting AI isn’t as important as getting it to work well enough for daily use, Dr. Dreyer said, noting the broad spectrum of accuracy among human radiologists.
“Do you need AI to be at the top?” he asked. “Or would it be enough for it to be somewhere in the middle, which is easier to achieve? I would argue there are many places on the globe where it would be adequate to have a good solution.”