There is a huge diversity problem in the technology sector, and it doesn’t just affect revenue and creativity – it creates flaws in products that could have been avoided had more women and people from minority ethnic groups been involved in the design process.

What do the statistics say?

Let’s start with some statistics. In 2016, Google’s diversity report showed that 56% of its overall workforce was white and only 3% of its new hires were black. The proportion of men to women was also unsurprising, with males making up 69% of the workforce.

The whiteness and maleness of these sorts of companies can lead to flawed products, such as voice recognition software that has trouble understanding female voices and a soap dispenser that doesn’t work on dark skin tones.


Systems with in-built prejudice?

Now, as companies begin to seriously develop self-driving cars, research is finding that these cars are better at detecting light-skinned pedestrians. If left unfixed, this could lead to countless deaths and injuries of black and other dark-skinned pedestrians once these cars become mainstream.

A study from the Georgia Institute of Technology found that an automated vehicle could fail to spot somebody with a darker skin tone and proceed to drive into them. According to the findings, the systems tested were 5% less accurate at detecting dark skin tones than light skin tones. However, the systems tested were created by academics rather than by big companies such as Google.

The authors conclude the study with a warning for tech companies that do not consider skin tone when building recognition technology:

“We hope this study provides compelling evidence of the real problem that may arise if this source of capture bias is not considered before deploying these sort of recognition models.”

Is the technology sector’s inequality having a much wider impact? Let us know what you think below.