UK Gov Report: Micro-targeting In Political Campaigns Is Manipulation

Analysis. Micro-targeting and data analytics in political campaigns show disregard for voters’ personal privacy, leaving voters unable to make truly informed choices. Facebook is lying when it claims it does not sell data, and it has chosen profits over security. And social media companies should be held accountable and regulated as something in between a platform and a publisher. These are key points from the most comprehensive report yet from an official authority on the threatening sides of data manipulation and voter influence.

Despite the title of the newly published report by the House of Commons Digital, Culture, Media and Sport Committee, Disinformation and ‘fake news’, it is about much more than disinformation and ‘fake news’. It is highly relevant for everyone interested in the political regulation of online, micro-targeted campaigning.

The media coverage of the report has largely focused on its harsh critique of Facebook, the key representative of the dark economy of data manipulation. However, as some critics point out, this dominant focus on Facebook risks becoming counterproductive: if we ignore other platforms like Google and YouTube, and surveillance capitalism itself, we risk sending regulation in the wrong direction. It is indeed an important point that diagnosing disinformation as essentially a problem with Facebook misses the bigger picture. The report, however, brutally discloses that Facebook’s problems are a systemic issue, emerging in part from the pollution of online spaces by the business model Facebook shares with others: the surveillance and modification of human behaviour for profit.

The report initially takes a broader perspective on the “negative impacts of technology that do not show up on the balance sheets of companies, but on the balance sheet of society”. These include loss of attention, mental health issues, confusion over personal relationships, risks to our democracies, and issues affecting children. And among these negative impacts we also find micro-targeting:

“This proliferation of online harms is made more dangerous by focusing specific messages on individuals as a result of ‘micro-targeted messaging’—often playing on and distorting people’s negative views of themselves and of others”.

According to the report, the inquiry into the use of micro-targeting and data analytics in political campaigns uncovered:

“a disturbing disregard for voters’ personal privacy (…) citizens can only make truly informed choices about who to vote for if they are sure that those decisions have not been unduly influenced. For that reason, when personal data is used to target political messages, that use should be both transparent and lawful”.

A New Category of Tech Company
The report emphasises that “someone” must be liable. This leads to a request that the government consider formulating a new category of tech company that is not necessarily either a ‘platform’ or a ‘publisher’. The aim is to make tech companies assume legal liability for content identified as harmful after it has been posted by users, because:

“Social media companies can no longer hide behind the claim of being merely a ‘platform’ and maintain that they have no responsibility themselves in regulating the content of their sites”.

The report then turns to its critique of Facebook and concludes that, despite specific requests, Facebook has not provided a single example of a business excluded from its platform because of serious data breaches:

“We believe that is because it only ever takes action when breaches become public. We consider that data transfer for value is Facebook’s business model and that Mark Zuckerberg’s statement that “we’ve never sold anyone’s data” is simply untrue.”

But Zuckerberg is not only accused of lying. According to the report, he has also shown contempt for Parliament by refusing three separate demands to give evidence, instead sending junior employees unable to answer the committee’s questions. And the critique goes further: “Facebook continues to choose profit over data security, taking risks in order to prioritize their aim of making money from user data”.

The report repeatedly calls for putting pressure on social media companies and holding them accountable – also with regard to Russian electoral interference and the creation and distribution of disinformation. This is formulated as a specific request to the Government to “put pressure on social media companies to publicise any instances of disinformation. The Government needs to ensure that social media companies share information they have about foreign interference on their sites—including who has paid for political adverts, who has seen the adverts, and who has clicked on the adverts—with the threat of financial liability if such information is not forthcoming”.

The report even concludes that British electoral law needs a serious update because it is far too vulnerable to interference by hostile foreign actors, including agents of the Russian government attempting to discredit democracy.

The Committee expects the Government’s response to the report and all its recommendations within two months. Many citizens, laypeople and experts await it just as eagerly.

We Need Agile Action
But can we wait two months to act? The upcoming European Parliament election is at risk of being subjected to micro-targeting mechanisms if regulators and authorities do not act on the fundamental critique presented in the report.
Various initiatives on cybersecurity and election interference in the digital age have already been launched. Recognising the essential need to bolster Europe’s democratic resilience, and to ensure that the offline rules on transparency and on protecting the electoral process from foreign interference also apply online, the EU Commission has proposed measures for securing free and fair European elections. A transatlantic think tank, the Transatlantic Commission on Election Integrity, was established in 2018.

In July 2018, the UK Information Commissioner published a report, Democracy Disrupted, setting out findings and recommendations arising from the ICO’s 14-month investigation into the use of data analytics in political campaigns. Elizabeth Denham, UK Information Commissioner and chair of the International Conference of Data Protection and Privacy Commissioners, stated at CPDP 2019 that when the ICO opened this investigation in mid-2017, it had little idea what it was looking at. The lack of in-house expertise in technology and micro-targeting mechanisms forced the Commissioner’s office up a steep learning curve. The ICO was simply not fit for purpose to grasp how deeply marketing data analytics had become embedded in political campaigning.

The report on Disinformation and “fake news” is not just another addition to these initiatives and reports; it takes the threat that surveillance capitalism poses to democracy and free elections to a new level.

About the report:

The DCMS Committee’s Interim Report, “Disinformation and ‘fake news’”, was published in July 2018.

This Final Report builds on the principle-based recommendations made in the Interim Report.

This Final Report is the culmination of many months of collaboration with other countries, organisations, parliamentarians and individuals from around the world.

In total, the Committee held 23 oral evidence sessions, received over 170 written submissions, heard evidence from 73 witnesses, asking over 4,350 questions at these hearings, and had many exchanges of public and private correspondence with individuals and organisations.