This sounds nice enough on its face. Facebook says that in the future, its users will be able to see whether they were affected by other apps on the same page. The Facebook employee also said that developers whose apps misuse data will be banned. An audit won’t change this fact. Now Facebook has announced that it may have flagged up to 200 more potentially problematic apps – and it won’t yet say which ones they are.


Facebook stressed that the app developer who sold data to Cambridge Analytica did not have the right to do so, adding that the sale violated Facebook’s terms of service. The United Kingdom’s data watchdog told New Scientist that it is also investigating the matter.

Facebook, the University of Cambridge, the Psychometrics Centre and Aleksandr Kogan did not immediately respond to requests for comment. A report by TechCrunch provided most of the information used in this article. The data was supposed to be anonymized, but responses and results were packaged together under a unique ID, making it easy to trace the data back to the person it belonged to.

More than 280 people from almost 150 institutions had access to the full data set, including researchers at universities and at companies like Facebook, Google, Microsoft and Yahoo.

In Canada, for example, AggregateIQ executives recently faced questioning by a federal Parliamentary ethics committee over whether they had sufficiently co-operated with the United Kingdom’s information commissioner. As it did with Cambridge Analytica, Facebook’s website will show people whether they or their friends installed an app that misused data before 2015.

The suspensions are part of an app investigation and audit that Chief Executive Officer Mark Zuckerberg promised on March 21.

And perhaps more importantly, knowing which organization extracted your information is far from equivalent to regaining control over that information.

Third, we want to make sure you understand which apps you’ve allowed to access your data.

Even though Facebook is trying to appear transparent, it is not clear how many apps will be investigated, or over what period of time.


Facebook prohibited apps from doing this in a 2014 policy change. The data was used to understand voter psychology and influence people’s perspectives, which is against Facebook’s policies.