After Breast Cancer Diagnosis, Computer Science Professor Teaches Computers to Read Mammograms
Regina Barzilay is a computer science professor at the Massachusetts Institute of Technology, where she teaches a popular class on how computers can decipher ancient texts. She says the work has little real-world application, but she loves it anyway, and so do her students.
“This is clearly of no practical use,” she says. “But it was really cool, and I was really obsessed about this topic, how machines could do it.”
But a wake-up call was coming, one that would show her how to apply her expertise to a far weightier problem.
The idea first came to Barzilay when she was diagnosed with breast cancer in 2014. She realized how much room for human error there is in the simple fact that mammograms are read by doctors, who also judge the patient’s breast density, a highly subjective call.
“I was really surprised how primitive information technology is in the hospitals,” she says. “It almost felt that we were in a different century.”
Even beyond diagnosis, Barzilay says she kept coming across questions that should have been answerable, if only the right technology had been available to analyze the data.
“At every point of my treatment, there would be some point of uncertainty, and I would say, ‘Gosh, I wish we had the technology to solve it,’” she says. “So when I was done with the treatment, I started my long journey toward this goal.”
When she was able to return to her job, it seemed only natural for Barzilay to reconsider the direction of her research. Getting started was difficult, because the National Cancer Institute and other funders weren’t interested in contributing. But she knew the project could help people immensely if she could get it off the ground.
Then she found Connie Lehman, a Harvard University radiologist who is chief of breast imaging at Massachusetts General Hospital. The pair began collaborating on the project, and now they’re in the middle of teaching a machine all the steps it needs to know to accurately assess a mammogram image.
The first thing Barzilay and Lehman taught a computer how to do was to determine breast density using a deep-learning algorithm. “We’re excited about this because we find there’s a lot of human variation in assessing breast density,” Lehman says, “and so we’ve trained our deep-learning model to assess the density in a much more consistent way.”
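Radiologists grade density on the four-level BI-RADS scale, from “almost entirely fatty” to “extremely dense.” As a toy illustration of the kind of categorization such a model automates (not the team’s actual deep-learning system, whose details aren’t described here), this sketch simply maps the fraction of bright tissue in an image to a BI-RADS letter:

```python
import numpy as np

# BI-RADS density categories, ordered from least to most dense.
BIRADS = ["a: almost entirely fatty",
          "b: scattered fibroglandular density",
          "c: heterogeneously dense",
          "d: extremely dense"]

def density_category(image, bright_threshold=0.5):
    """Toy density grader: classify by the fraction of bright pixels.

    `image` is a 2-D array of pixel intensities in [0, 1]. A real
    deep-learning model learns this mapping from tens of thousands of
    labeled mammograms instead of using hand-picked thresholds.
    """
    dense_fraction = float((image > bright_threshold).mean())
    # Cut points are arbitrary, for illustration only.
    if dense_fraction < 0.25:
        return BIRADS[0]
    elif dense_fraction < 0.50:
        return BIRADS[1]
    elif dense_fraction < 0.75:
        return BIRADS[2]
    return BIRADS[3]

# A mostly dark (fatty) synthetic "scan" vs. a mostly bright (dense) one.
fatty = np.full((64, 64), 0.1)
dense = np.full((64, 64), 0.9)
print(density_category(fatty))   # → a: almost entirely fatty
print(density_category(dense))   # → d: extremely dense
```

The point of the learned version is exactly what Lehman describes: unlike two radiologists eyeballing the same scan, the model applies one consistent rule every time.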
Now that the machine is pretty good at accurately and consistently labeling dense breasts, the team is moving on to giving it lessons in spotting changes between a new mammogram and an older one from the same woman.
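Comparing a new mammogram against a prior one can be pictured as an image-differencing step. This is a hedged NumPy toy, not the team’s method: a real model would learn which changes are clinically meaningful rather than thresholding raw pixel differences on pre-aligned scans.

```python
import numpy as np

def changed_regions(prior, current, diff_threshold=0.2):
    """Flag pixels whose intensity shifted markedly between two scans.

    `prior` and `current` are aligned 2-D arrays in [0, 1]. Returns the
    (row, col) coordinates of changed pixels.
    """
    diff = np.abs(current.astype(float) - prior.astype(float))
    rows, cols = np.nonzero(diff > diff_threshold)
    return list(zip(rows.tolist(), cols.tolist()))

# Two identical synthetic scans except one new bright spot at (2, 3).
prior = np.zeros((5, 5))
current = prior.copy()
current[2, 3] = 0.9
print(changed_regions(prior, current))  # → [(2, 3)]
```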
“These are the sorts of things that we can also teach a model, but more importantly we allow the model to teach itself,” Barzilay says. “That’s the power of artificial intelligence — it’s not simply automating rules that the researchers provide but also creating its own rules. The optimist in me says in three years we can train this tool to read mammograms as well as an average radiologist. So we’ll see. That’s what we’re working on.”
The technology still has a way to go, but Barzilay and other experts are confident it can reach the point of reading most mammograms in place of a radiologist, freeing radiologists to tackle other problems. The real challenge may be making sure patients are comfortable having their scans read by a machine, and determining whether the technology will be FDA-monitored or regulated in some other way. There are also the legal and ethical questions that could arise when the algorithm errs and misdiagnoses a patient.
Yes, there are still many hurdles to clear. But technologies like automated mammogram analysis are advancing at a remarkable pace, thanks in part to women like Barzilay who have been brave enough to speak up when things aren’t as they should be. Let’s keep making the world a better place together!