Posted on May 17, 2022

MIT, Harvard Scientists Find AI Can Recognize Race From X-Rays — And Nobody Knows How

Hiawatha Bray, Boston Globe, May 13, 2022

A doctor can’t tell if somebody is Black, Asian, or white, just by looking at their X-rays. But a computer can, according to a surprising new paper by an international team of scientists, including researchers at the Massachusetts Institute of Technology and Harvard Medical School.

The study found that an artificial intelligence program trained to read X-rays and CT scans could predict a person’s race with 90 percent accuracy. But the scientists who conducted the study say they have no idea how the computer figures it out.

At a time when AI software is increasingly used to help doctors make diagnostic decisions, the research raises the unsettling prospect that AI-based diagnostic systems could unintentionally generate racially biased results. For example, an AI (with access to X-rays) could automatically recommend a particular course of treatment for all Black patients, whether or not it’s best for a specific person. {snip}

The research effort was born when the scientists noticed that an AI program for examining chest X-rays was more likely to miss signs of illness in Black patients. “We asked ourselves, how can that be if computers cannot tell the race of a person?” said Leo Anthony Celi, a coauthor of the study and an associate professor at Harvard Medical School.

Once the software had been shown large numbers of race-labeled images, it was then shown different sets of unlabeled images. The program identified the race of the people in those images with remarkable accuracy, often well above 90 percent. Even when it analyzed images from patients of the same size, age, or gender, the AI accurately distinguished between Black and white patients.

{snip} Perhaps X-rays and CT scanners detect the higher melanin content of darker skin, and embed this information in the digital image in some fashion that human users have never noticed before. It’ll take a lot more research to be sure.

In any case, Celi said doctors should be reluctant to use AI diagnostic tools that might automatically generate biased results.

“We need to take a pause,” he said. “We cannot rush bringing the algorithms to hospitals and clinics until we’re sure they’re not making racist decisions or sexist decisions.”