By Steve Lohr
February 9, 2018
The New York Times
Facial recognition technology is improving by leaps and bounds. Some commercial software can now tell if a person in a photograph is male or female 99 percent of the time.
But it achieves that accuracy only if the person in the photograph is a white man.
The darker the skin, the more errors arise — up to nearly 35 percent for images of darker-skinned women, according to a new study that breaks fresh ground by measuring how the technology performs on people of different races and genders.
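The kind of disaggregated audit the study performed — measuring error rates separately for each demographic subgroup rather than reporting a single overall accuracy — can be sketched as follows. This is an illustrative sketch with made-up records, not the study's actual data or code; the group labels and function name are hypothetical.

```python
# Sketch of a disaggregated accuracy audit: compute gender-classification
# error rates per demographic subgroup (hypothetical data, not the study's).
from collections import defaultdict

def error_rates_by_group(records):
    """records: iterable of (group, predicted_gender, true_gender) tuples.

    Returns a dict mapping each group to its fraction of misclassifications.
    """
    totals = defaultdict(int)
    errors = defaultdict(int)
    for group, predicted, actual in records:
        totals[group] += 1
        if predicted != actual:
            errors[group] += 1
    return {g: errors[g] / totals[g] for g in totals}

# Made-up example records for illustration only.
records = [
    ("lighter-skinned male", "male", "male"),
    ("lighter-skinned male", "male", "male"),
    ("darker-skinned female", "male", "female"),   # misclassification
    ("darker-skinned female", "female", "female"),
]
print(error_rates_by_group(records))
```

A single aggregate accuracy over all four records would be 75 percent, masking the fact that every error falls on one subgroup — which is exactly the disparity a per-group breakdown exposes.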
These disparate results, calculated by Joy Buolamwini, a researcher at the M.I.T. Media Lab, show how some of the biases in the real world can seep into artificial intelligence, the computer systems that inform facial recognition.