Written by Denise Grady
Artificial intelligence can help doctors do a better job of finding breast cancer on mammograms, researchers from Google and medical centers in the United States and Britain are reporting in the journal Nature.
The new system for reading mammograms, which are X-rays of the breast, is still being studied and is not yet available for widespread use. It is just one of Google’s ventures into medicine. Computers can be trained to recognise patterns and interpret images, and the company has already created algorithms to help detect lung cancers on CT scans, diagnose eye disease in people with diabetes and find cancer on microscope slides.
“This paper will help move things along quite a bit,” said Dr Constance Lehman, director of breast imaging at the Massachusetts General Hospital in Boston, who was not involved in the study. “There are challenges to their methods. But having Google at this level is a very good thing.”
Tested on images for which the diagnosis was already known, the new system performed better than radiologists. On scans from the United States, it reduced false negatives, in which a mammogram is mistakenly read as normal and a cancer is missed, by 9.4 per cent. It also cut false positives, in which a scan is incorrectly judged abnormal though there is no cancer, by 5.7 per cent.
On mammograms performed in Britain, the system also beat the radiologists, reducing false negatives by 2.7 per cent and false positives by 1.2 per cent.
Google paid for the study, and worked with researchers from Northwestern University in Chicago and two British medical centers, Cancer Research Imperial Centre and Royal Surrey County Hospital.
Last year, 268,600 new cases of invasive breast cancer and 41,760 deaths were expected among women in the United States, according to the American Cancer Society. Globally, there are about 2 million new cases a year, and more than half a million deaths.
About 33 million screening mammograms are performed each year in the United States. The test misses about 20 per cent of breast cancers, according to the American Cancer Society, and false positives are common, resulting in women being called back for more tests, sometimes even biopsies.
Doctors have long wanted to make mammography more accurate. “There are many radiologists who are reading mammograms who make mistakes, some well outside the acceptable margins of normal human error,” Lehman said.
To apply artificial intelligence to the task, the authors of the Nature report used mammograms from about 76,000 women in Britain and 15,000 in the United States, whose diagnoses were already known, to train computers to recognise cancer.
Then, they tested the computers on images from about 25,000 other women in Britain, and 3,000 in the United States, and compared the system’s performance with that of the radiologists who had originally read the X-rays. The mammograms had been taken in the past, so the women’s outcomes were known, and the researchers could tell whether the initial diagnoses were correct.
“We took mammograms that already happened, showed them to radiologists and asked, ‘Cancer or no?’ and then showed them to AI, and asked, ‘Cancer or no?’” said Dr Mozziyar Etemadi, an author of the study from Northwestern University.
This was the test that found AI more accurate than the radiologists. Unlike humans, computers do not get tired, bored or distracted toward the end of a long day of reading mammograms, Etemadi said.
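For readers curious about the arithmetic behind figures like the 9.4 per cent reduction: because each woman's eventual outcome was known, error rates can be tallied directly by checking every read against the truth. The sketch below is a minimal illustration with made-up labels; it is not the study's code or data, and the function name is a placeholder.

```python
# Illustrative only: tallying false-negative and false-positive rates
# when each mammogram's true outcome is already known.
# The example labels are synthetic, not data from the Nature study.

def error_rates(truth, calls):
    """truth/calls: lists of booleans (True = cancer present / cancer called)."""
    cancers = sum(truth)                 # scans where cancer truly existed
    normals = len(truth) - cancers       # scans that were truly normal
    false_neg = sum(t and not c for t, c in zip(truth, calls))  # missed cancers
    false_pos = sum(c and not t for t, c in zip(truth, calls))  # false alarms
    return false_neg / cancers, false_pos / normals

# Hypothetical outcomes for five scans and one reader's calls:
truth = [True, True, False, False, False]
reader = [True, False, False, True, False]
fn_rate, fp_rate = error_rates(truth, reader)
```

Comparing such rates for the radiologists' original reads and for the AI system's reads, on the same held-out scans, is what yields the percentage reductions the study reports.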
In another test, the researchers pitted AI against six radiologists in the United States, presenting 500 mammograms to be interpreted. Overall, AI again outperformed the humans. But in some instances, AI missed a cancer that all six radiologists found — and vice versa.
“There’s no denying that in some cases our AI tool totally gets it wrong and they totally get it right,” Etemadi said. “Purely from that perspective it opens up an entirely new area of inquiry and study. Why is it that they missed it? Why is it that we missed it?”
Lehman, who is also developing AI for mammograms, said the Nature report was strong, but she had some concerns about the methods, noting that the patients studied might not be a true reflection of the general population.
A higher proportion had cancer, and the racial makeup was not specified. She also said that “reader” analyses involving a small number of radiologists — this study used six — were not always reliable.
The next step in the research is to have radiologists try using the tool as part of their routine practice in reading mammograms. New techniques that pass their initial tests with flying colors do not always perform as well out in the real world.
“We have to see what happens when radiologists have it, see if they do better,” Etemadi said. Lehman said: “We have to be very careful. We want to make sure this is helping patients.”
She said an earlier technology, computer-aided detection, or CAD, provided a cautionary tale. Approved in 1998 by the Food and Drug Administration to help radiologists read mammograms, it came into widespread use. Some hospital administrators pressured radiologists to use it whether they liked it or not because patients could be charged extra for it, increasing profits, Lehman said. Later, several studies, including one that Lehman was part of, found that CAD did not improve the doctors’ accuracy, and even made them worse.
“We can learn from the mistakes with CAD and do it better,” Lehman said, adding that AI has become far more powerful, and keeps improving as more data is fed in. “Using computers to enhance human performance is long overdue.”
She and Etemadi said that a potentially good use of AI would be to sort mammograms and flag those most in need of the radiologist’s attention. The system may also be able to identify those that are clearly negative, so they could be read quickly and patients could promptly be given a clean bill of health.
Although developers of AI often say it is intended to help radiologists, not replace them, Lehman predicted that eventually, computers alone will read at least some mammograms, without help from humans.
“We’re onto something,” she said. “These systems are picking up things a human might not see, and we’re right at the beginning of it.”