Machine can tell patients' racial identity by scanning X-rays

A physician, however experienced, cannot tell if a patient is black, Asian, or white just by looking at their X-ray images. But can a computer? According to a new study by an international team of scientists, including researchers from MIT (Massachusetts Institute of Technology) and Harvard Medical School, the answer is yes.

The study, published in The Lancet Digital Health, found that an AI (Artificial Intelligence) program trained to read X-rays and CT scans could identify a person’s race with 90% accuracy, even without additional data about the patient. There’s just one problem: the scientists who conducted the study say they have no idea how the computer figures this out.

The researchers trained an artificial intelligence model on hundreds of thousands of existing medical images labeled with details of the patient's race. They then tested the AI with X-rays that the program had not seen before.

“Our goal was to conduct a comprehensive assessment of AI’s ability to recognize a patient’s racial identity from this type of image,” the researchers write in their published paper.

The images used in the tests came from different parts of the body, including the chest, hand and spine, and did not contain markers of racial identity such as skin color or hair texture.
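The study's own code is not reproduced here, but the setup described is a standard supervised image-classification recipe. The sketch below is a hypothetical PyTorch illustration of that kind of experiment, not the paper's actual pipeline: it fine-tunes an off-the-shelf ResNet on X-rays sorted into folders named after race labels (the "xrays/train" and "xrays/test" paths and the folder layout are assumptions) and then reports accuracy on images the model has never seen.

```python
# Minimal, hypothetical sketch (not the authors' code): fine-tune a standard
# image classifier on X-rays labeled with race, then measure accuracy on a
# held-out test set. Paths and folder names are assumptions for illustration.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

# Grayscale X-rays are replicated to 3 channels for an ImageNet backbone.
tfm = transforms.Compose([
    transforms.Grayscale(num_output_channels=3),
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

# Hypothetical layout: train/ and test/ each contain one subfolder per
# race label (e.g. "asian", "black", "white") holding X-ray images.
train_ds = datasets.ImageFolder("xrays/train", transform=tfm)
test_ds = datasets.ImageFolder("xrays/test", transform=tfm)
train_dl = DataLoader(train_ds, batch_size=32, shuffle=True)
test_dl = DataLoader(test_ds, batch_size=32)

# Standard ResNet backbone with a new classification head for the race labels.
model = models.resnet34(weights="IMAGENET1K_V1")
model.fc = nn.Linear(model.fc.in_features, len(train_ds.classes))
device = "cuda" if torch.cuda.is_available() else "cpu"
model.to(device)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(5):  # small epoch count, just for the sketch
    model.train()
    for images, labels in train_dl:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()

# Accuracy on X-rays the model has never seen.
model.eval()
correct = total = 0
with torch.no_grad():
    for images, labels in test_dl:
        images, labels = images.to(device), labels.to(device)
        preds = model(images).argmax(dim=1)
        correct += (preds == labels).sum().item()
        total += labels.numel()
print(f"held-out accuracy: {correct / total:.2%}")
```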

The AI's ability was discovered by chance

According to a statement given to the American newspaper The Boston Globe, the research effort began when scientists noticed that an AI program used to examine chest X-rays had a tendency to miss signs of disease in black patients.

“We wondered: how can this happen if computers can’t tell a person’s race?” said Leo Anthony Celi, co-author of the study and an associate professor at Harvard Medical School.

At the moment, scientists aren’t sure why the AI system is able to identify race so accurately from images that don’t contain this information.

Even with images containing more limited information, with no clues about bone density, for example, or showing only a small part of the body, the computer still performed surprisingly well.
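Continuing the hypothetical sketch above (and reusing its model, test_dl, and device), a robustness check of this kind might heavily blur the test images or crop them down to a small patch before re-measuring accuracy. The specific degradations below are illustrative assumptions, not the paper's exact tests.

```python
# Hypothetical robustness check: degrade the test X-rays (heavy blur to wash
# out fine detail such as bone density cues, or a crop down to a small patch)
# and re-measure the already-trained model's accuracy.
import torch
from torchvision import transforms

degradations = {
    "heavy_blur": transforms.GaussianBlur(kernel_size=21, sigma=8.0),
    "small_patch": transforms.Compose([
        transforms.CenterCrop(64),        # keep only a small central region
        transforms.Resize((224, 224)),    # scale back up for the backbone
    ]),
}

model.eval()
with torch.no_grad():
    for name, degrade in degradations.items():
        correct = total = 0
        for images, labels in test_dl:
            images = degrade(images).to(device)   # degrade, then classify
            labels = labels.to(device)
            preds = model(images).argmax(dim=1)
            correct += (preds == labels).sum().item()
            total += labels.numel()
        print(f"{name}: accuracy {correct / total:.2%}")
```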

Among their hypotheses, the researchers suggest that the system may be picking up on signs of melanin, the pigment that gives skin its color.

Can artificial intelligence be racist?

The findings raise some troubling questions about the role of AI in medical diagnosis, assessment and treatment.

Scientists fear that the racial biases of human beings may be unwittingly reproduced by computer software when it studies images like these.

By its very nature, AI mimics human thinking to quickly identify patterns in data. However, this also means it can unintentionally absorb the same kinds of prejudices, whether related to race, gender, or other characteristics.

For the researchers, this raises a red flag. As AI is increasingly used to assist in medicine because of its enormous potential for data analysis, the concern is that its results could lead to distorted diagnoses driven by racial or gender bias.

“We need to take a break,” Leo Anthony Celi told the Boston Globe. “We can’t rush to get the algorithms into hospitals and clinics until we’re sure they’re not making racist or sexist decisions.”
