Facial recognition software is biased towards white men
From:
Loren Stocker -- 800 Number Expert
For Immediate Release:
Dateline: Los Angeles, CA
Friday, February 16, 2018

 

It looks like racism and bias are turning into biased code, making a neutral technology like software a weapon of white supremacy.

New research out of MIT’s Media Lab is underscoring what other experts have reported or at least suspected before: facial recognition technology is subject to biases based on the data sets provided and the conditions in which algorithms are created.

Joy Buolamwini, a researcher at the MIT Media Lab, recently built a dataset of 1,270 faces of politicians, selected based on their countries' rankings for gender parity (in other words, having a significant number of women in public office). Buolamwini then tested the accuracy of three facial recognition systems: those made by Microsoft, IBM, and Megvii of China. The results, which were originally reported in The New York Times, showed inaccuracies in gender identification dependent on a person's skin color.

Gender was misidentified in less than one percent of lighter-skinned males; in up to seven percent of lighter-skinned females; in up to 12 percent of darker-skinned males; and in up to 35 percent of darker-skinned females.
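Error rates like those above come from comparing a system's gender predictions against ground-truth labels within each demographic group. A minimal sketch of that calculation is below; the records are illustrative placeholders, not Buolamwini's actual benchmark data.

```python
# Sketch: per-group gender-misclassification rates, as in the MIT study.
# The records below are hypothetical examples, not the real dataset.
from collections import defaultdict

# Each record: (demographic group, true gender, predicted gender)
records = [
    ("lighter_male",   "male",   "male"),
    ("lighter_female", "female", "male"),    # misclassified
    ("darker_male",    "male",   "male"),
    ("darker_female",  "female", "male"),    # misclassified
    ("darker_female",  "female", "female"),
]

totals = defaultdict(int)
errors = defaultdict(int)
for group, truth, predicted in records:
    totals[group] += 1
    if truth != predicted:
        errors[group] += 1

# Report the misclassification rate for each group
for group in sorted(totals):
    rate = 100.0 * errors[group] / totals[group]
    print(f"{group}: {rate:.1f}% misclassified")
```

Disaggregating accuracy by group in this way, rather than reporting a single overall number, is what exposed the disparities the study describes.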


News Media Interview Contact
Name: Loren C. Stocker
Group: Vanity International
Dateline: Del Mar, CA United States
Direct Phone: 858-792-5000
Main Phone: 800-438-8264 *