Will AI replace doctors who read X-rays, or just make them better?

WASHINGTON

Associated Press

How good would an algorithm have to be to take over your job?

It’s a new question for many workers amid the rise of ChatGPT and other AI programs that can hold conversations, write stories and even generate songs and images within seconds.

For doctors who review scans to spot cancer and other diseases, however, AI has loomed for about a decade as more algorithms promise to improve accuracy, speed up work and, in some cases, take over entire parts of the job. Predictions have ranged from doomsday scenarios in which AI fully replaces radiologists, to sunny futures in which it frees them to focus on the most rewarding aspects of their work.

That tension reflects how AI is rolling out across health care. Beyond the technology itself, much depends upon the willingness of doctors to put their trust — and their patients’ health — in the hands of increasingly sophisticated algorithms that few understand.

Even within the field, opinions differ on how much radiologists should be embracing the technology.

“Some of the AI techniques are so good, frankly, I think we should be doing them now,” said Dr. Ronald Summers, a radiologist and AI researcher at the National Institutes of Health. “Why are we letting that information just sit on the table?”

Summers’ lab has developed computer-aided imaging programs that detect colon cancer, osteoporosis, diabetes and other conditions. None of those have been widely adopted, which he attributes to the “culture of medicine,” among other factors.

Radiologists have used computers to enhance images and flag suspicious areas since the 1990s. But the latest AI programs can go much further, interpreting the scans, offering a potential diagnosis and even drafting written reports about their findings. The algorithms are often trained on millions of X-rays and other images collected from hospitals.

Across all of medicine, the FDA has OK’d more than 700 AI algorithms to aid physicians. More than 75 percent of them are in radiology, yet just 2 percent of radiology practices use such technology, according to one recent estimate.

For all the promises from industry, radiologists see a number of reasons to be skeptical of AI programs: limited testing in real-world settings, lack of transparency about how they work and questions about the demographics of the patients used to train them.

“If we don’t know on what cases the AI was tested, or whether those cases are similar to the kinds of patients we see in our practice, there’s just a question in everyone’s mind as to whether these are going to work for us,” said Dr. Curtis Langlotz, a radiologist who runs an AI research center at Stanford University.

To date, all the programs cleared by the FDA require a human to be in the loop.

In early 2020, the FDA held a two-day workshop to discuss algorithms that could operate without human oversight. Shortly afterwards, radiology professionals warned regulators in a letter that they “strongly believe it is premature for the FDA to consider approval or clearance” of such systems.
