Could Your Big Data Project Put the Company in Legal Jeopardy?

The big buzz right now is about the contents of the President’s Council of Advisors on Science and Technology (PCAST) report, “Big Data and Privacy: A Technological Perspective.” The report looks at the technical aspects of Big Data technology and assesses how these capabilities might affect privacy for U.S. citizens here and abroad.

The report examines Big Data capabilities and technical considerations such as data from sensors, data mining, data fusion and information integration, image and speech recognition, social media data, the cloud and encryption. Then, it discusses notice and consent laws in light of these new capabilities and reviews potential tools and solutions for protecting privacy. It’s a short list.

It would be tempting to think of this report as a policy problem for the lawyers or a mere curiosity. Do not be so hasty—it actually offers a valuable implementation lesson for CIOs and other technologists.

Jeffrey Kelly, principal research contributor at The Wikibon project, points out in his Silicon Angle piece how the report uses the City of Boston’s Street Bump mobile app as an example of Big Data’s unintended consequences.

The app used the smartphone’s sensors to detect when a driver hit a pothole and record its location. The information was then sent to the city’s public works department, which would send someone to fix it.
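The report doesn’t describe Street Bump’s actual algorithm, but the basic idea is straightforward. A minimal sketch, with an assumed threshold and made-up readings, might look like this: flag any accelerometer sample where the vertical jolt between consecutive readings exceeds a threshold.

```python
# Hypothetical sketch of Street Bump-style detection (not the app's
# actual algorithm): flag timestamps where the change in vertical
# acceleration between consecutive samples exceeds a threshold.

JOLT_THRESHOLD = 3.0  # illustrative cutoff, in g


def detect_bumps(samples):
    """samples: list of (timestamp, vertical_accel_g) tuples.
    Returns timestamps where the jolt crosses the threshold."""
    hits = []
    for (t0, a0), (t1, a1) in zip(samples, samples[1:]):
        if abs(a1 - a0) > JOLT_THRESHOLD:
            hits.append(t1)
    return hits


# A smooth ride with one sharp spike at t=0.2.
readings = [(0.0, 1.0), (0.1, 1.1), (0.2, 4.8), (0.3, 1.0)]
print(detect_bumps(readings))  # → [0.2, 0.3]
```

Note that the spike registers twice (once going up, once coming down); a real system would cluster nearby hits into a single pothole report before sending anything to public works.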

The problem? The app created a heavy bias toward fixing street problems in wealthy areas, where smartphones are more common.

Now, at first glance, that may seem like no big deal to you. Maybe you want wealthier clients or maybe you think, “We’re not the government so that kind of issue isn’t a big deal for us.”

Think again. As Silicon Angle points out, ignoring that kind of bias could lead to discriminatory practices, which in turn could lead to lawsuits, even for private companies.

“But if such a program (unintentionally or not) results in price discrimination based on race, sex, class or some other factor proscribed by law, the retailer could (and should) face legal consequences,” Silicon Angle reports. “It is the retailer’s responsibility to identify such discriminatory practices and put a stop to them even if they are the result of just one of thousands of Big Data analytics projects.”

Fortunately, the report also offers a relatively simple fix: Boston managed to identify the unintended bias by testing the app internally before rolling it out.
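What might such an internal test look like in practice? One simple approach is to compare report rates across neighborhoods before launch and flag any large skew for human review. The sketch below uses entirely made-up neighborhood names, counts, and thresholds; it is an illustration of the idea, not Boston’s actual analysis.

```python
# Hypothetical pre-rollout bias check: compare pothole-report rates
# per road mile across neighborhoods and flag a large skew.
# All figures below are invented for illustration.

reports_by_area = {
    "Wealthy Area A": {"reports": 120, "road_miles": 30},
    "Lower-Income Area B": {"reports": 25, "road_miles": 60},
}

# Reports per road mile, so areas of different sizes compare fairly.
rates = {
    name: area["reports"] / area["road_miles"]
    for name, area in reports_by_area.items()
}

# Crude skew metric: ratio of highest to lowest per-mile rate.
skew = max(rates.values()) / min(rates.values())

if skew > 3:  # the cutoff here is arbitrary
    print(f"Possible coverage bias: report-rate skew is {skew:.1f}x; "
          "review before rollout.")
```

A check like this only surfaces the skew; deciding what it means, and whether to delay or redesign the rollout, is exactly the "mindset and will" question the report raises.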

Identifying these sorts of potential legal minefields “requires not just the technology and manpower necessary to identify such discriminatory practices, but the mindset and will to do so as well,” writes Kelly.