'If you have a face, you have a place in the conversation about AI,' expert says

By Tonya Mosley

Computer scientist Joy Buolamwini was a graduate student at MIT when she made a startling discovery: The facial recognition software program she was working on couldn't detect her dark skin; it only registered her presence when she put on a white mask.

It was Buolamwini's first encounter with what she came to call the "coded gaze."

"You've likely heard of the 'male gaze' or the 'white gaze,'" she explains. "This is a cousin concept really, about who has the power to shape technology and whose preferences and priorities are baked in — as well as also, sometimes, whose prejudices are baked in."

Buolamwini notes that in a recent test of Stable Diffusion's text-to-image generative AI system, prompts for high-paying jobs overwhelmingly yielded images of men with lighter skin. Meanwhile, prompts for criminal stereotypes, such as drug dealers, terrorists or inmates, typically resulted in images of men with darker skin.

In her new book, Unmasking AI: My Mission to Protect What Is Human in a World of Machines, Buolamwini looks at the social implications of the technology and warns that biases in facial analysis systems could harm millions of people — especially if they reinforce existing stereotypes.