Group 3 Pressure Project 1: Racist Face Detection Cameras

Description: Some digital camera features fail to work correctly for subjects of certain ethnicities, e.g., blink detection.

Short description and illustration/diagram of the analyzed experience:

Here an Asian subject is being photographed. Although his eyes remain open, the camera insists that he is blinking: the software misinterprets his facial features because of his ethnicity.

Presentation of design process:

We chose a design process aimed at increasing the diversity of both our test groups and the development team. Anticipating possible challenges, one quick-and-dirty way to test with diversity when only Caucasian subjects are available would be to find pictures of people from different backgrounds on the internet and feed them into the algorithm.
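That quick-and-dirty approach can be sketched in code. The snippet below is a hypothetical illustration, not the camera's actual software: `detect_blink` is a stand-in for the real blink-detection algorithm, and the test records stand in for labelled internet photos. The idea is simply to tally, per group, how often open-eyed subjects are wrongly flagged as blinking.

```python
# Hypothetical sketch: run a blink detector over a labelled, diverse
# test set and compare false-"blinking" rates across groups.
from collections import defaultdict

def detect_blink(image):
    # Placeholder for the camera's real blink-detection algorithm;
    # here we fake its verdict from a field in the test record.
    return image["detector_says_blinking"]

def false_blink_rates(test_images):
    """Fraction of open-eyed subjects flagged as blinking, per group."""
    flagged = defaultdict(int)
    total = defaultdict(int)
    for img in test_images:
        if img["eyes_open"]:          # only open-eyed photos count
            total[img["group"]] += 1
            if detect_blink(img):
                flagged[img["group"]] += 1
    return {group: flagged[group] / total[group] for group in total}

# Toy stand-in for photos gathered from the internet.
test_images = [
    {"group": "caucasian", "eyes_open": True, "detector_says_blinking": False},
    {"group": "caucasian", "eyes_open": True, "detector_says_blinking": False},
    {"group": "asian",     "eyes_open": True, "detector_says_blinking": True},
    {"group": "asian",     "eyes_open": True, "detector_says_blinking": False},
]

print(false_blink_rates(test_images))
```

A large gap between the groups' rates would flag exactly the kind of bias described above, without needing in-person access to a diverse pool of subjects.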

We also noticed that the diversity of our own group affected how we designed the tests. For example, having a female group member allowed for alternate leg poses for the female participants.

Description and concept sketches of proposed intervention:

More diverse testing scenario:
The development of this software was heavily biased toward Caucasian faces. To combat this, we designed our testing scenario to include males and females, various facial-hair attributes, multiple ethnicities, and so on.

In addition to a testing scenario that includes a more diverse population, it may also help to include a diverse set of developers, who may be more attentive to these sorts of details.