#GAReads | Machines Taught By Photos Learn a Sexist View of Women

Last fall, University of Virginia computer-science professor Vicente Ordóñez noticed a pattern in some of the guesses made by image-recognition software he was building. “It would see a picture of a kitchen and more often than not associate it with women, not men,” he says.

That got Ordóñez wondering whether he and other researchers were unconsciously injecting biases into their software. So he teamed up with colleagues to test two large collections of labeled photos used to “train” image-recognition software.


GenderAvenger is a 501(c)(3) nonprofit organization that focuses on gender balance at all levels in public dialog. We support inclusion of all people regardless of gender, race, sex, age, and ability.