
Three commercially released facial-analysis programs from major technology companies demonstrate both skin-type and gender biases, according to a new paper researchers from MIT and Stanford University will present later this month at the Conference on Fairness, Accountability, and Transparency.

In the researchers’ experiments, the three programs’ error rates in determining the gender of light-skinned men were never worse than 0.8 percent. For darker-skinned women, however, the error rates ballooned: to more than 20 percent in one case and more than 34 percent in the other two.

The findings raise questions about how today’s neural networks, which learn to perform computational tasks by looking for patterns in huge data sets, are trained and evaluated. For instance, according to the paper, researchers at a major U.S. technology company claimed an accuracy rate of more than 97 percent for a face-recognition system they’d designed. But the data set used to assess its performance was more than 77 percent male and more than 83 percent white.

“What’s really important here is the method and how that method applies to other applications,” says Joy Buolamwini, a researcher in the MIT Media Lab’s Civic Media group and first author on the new paper. “The same data-centric techniques that can be used to try to determine somebody’s gender are also used to identify a person when you’re looking for a criminal suspect or to unlock your phone.”
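The gap the paper highlights is easy to reproduce in miniature: an aggregate accuracy above 97 percent can coexist with a one-in-three error rate for a small subgroup, if that subgroup is barely represented in the evaluation data. The sketch below uses made-up counts (not the study's data or code) purely to show the arithmetic of disaggregating error rates.

```python
# Illustrative only: invented per-group counts, loosely echoing the
# skew the article describes (a mostly male, mostly light-skinned
# evaluation set). Each entry is (test images, misclassifications).
groups = {
    "lighter-skinned men":   (800, 6),
    "lighter-skinned women": (120, 6),
    "darker-skinned men":    (50, 5),
    "darker-skinned women":  (30, 10),
}

# Aggregate accuracy over the whole (imbalanced) test set.
total = sum(n for n, _ in groups.values())
errors = sum(e for _, e in groups.values())
print(f"aggregate accuracy: {1 - errors / total:.1%}")   # 97.3%

# Per-group error rates tell a very different story.
for name, (n, e) in groups.items():
    print(f"{name}: error rate {e / n:.1%}")
```

Because the dominant group contributes most of the test images, its low error rate drives the headline number, while the smallest group's 33 percent error rate is nearly invisible in the aggregate.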
