Roses are red, gave Violet pick 'n' mix. She didn't like the milk bottles. Now back to ethics

The UK's advisory body for biometrics and forensics ethics has had another chunk of oversight added to its already laden basket – instructing the Home Office on ethical issues in large, complex data sets.

The group currently consults with Home Office ministers on the use of tech that generates biometric and forensic data, assessing existing and proposed kit, and how to collect, manage and analyse data produced.

But the government has confirmed the group will also be asked to consider ethical issues relating to volumes of complex data sets, and to provide oversight of an ethics framework designed to direct the Home Office's use of data.

(El Reg has asked the Home Office to clarify whether this document, referred to as the Data Ethics Governance Framework, is publicly available and how it relates to the digital department's Data Ethics Framework.)

The extension of the remit of the Biometrics and Forensics Ethics Group (BFEG) comes about 18 months after the body was handed further responsibilities and a name change, having been the National DNA Database Ethics Group until July 2017.

BFEG's remit

This is currently defined as including, but not limited to, consideration of the ethical aspects of:

the application and operation of technologies which produce biometric and forensic data and identifiers

services currently provided and techniques employed and proposals for new services and techniques

applications for research involving access to biometric or forensic data

other matters relating to the management, operation and use of biometric or forensic data

A statement from the Home Office on the latest addition to the group's remit said the move "is aimed at strengthening the public's assurance on the use of data within the department".

This is something it sorely needs, given its propensity to slurp any and all data it can, often for immigration enforcement. The department has come under fire for deals to suck up data from both NHS Digital and the Department for Education.

The Home Office was also panned for failing to implement some of the most basic data management practices. The National Audit Office said in December that data in its "migration refusal pool" of failed asylum seekers was inaccurate, and found issues with data governance and management.

Concerns are heightened because the Home Office has plenty of data to mismanage. It has been repeatedly criticised for retaining more than 20 million custody images, including some of people who were never charged – in spite of a 2012 High Court decision that this was unlawful, and repeated requests that it address the problem.

The department is also working to merge the Police National Computer and the Police National Database to form a mega-database, the Law Enforcement Data Service – a project civil rights group Liberty has said it has lost faith in.

On top of these data sets, the Home Office is in the process of gaining even more data, including biometric information, as potentially millions of EU citizens in the UK apply for settled status ahead of Brexit.

Adding another string to the BFEG's bow might indicate the department is at least aware of the issues arising from its access to vast swathes of highly personal information on people in the UK.

For some, though, the question is whether the task requires more time and heft than a six-person advisory group that already has a lot on its plate can summon – and whether it gives the impression the situation is simply a matter of ethics.

"While oversight of the government's use of enormous banks of data is welcome, the BFEG has its work cut out if it hopes to meaningfully scrutinise the vast number of ways biometrics are already being used by the Home Office, and are looking to use them in the future," Hannah Couchman, advocacy and policy officer for Liberty, told The Reg.

"Oversight of the state's use of big data must go beyond ethics if it is to protect our rights and freedoms. We need to urgently consider how to prevent the harvesting of enormous amounts of our private information that can only be used by the state in conjunction with human rights-abusing algorithms."

Another potential issue is that as data harvesting, artificial intelligence and algorithms, and the ethics and regulation of their use become mainstream, governments are encouraged to set up no end of groups to demonstrate they are taking the matter seriously.

The list now includes the government-sponsored Centre for Data Ethics and Innovation, the AI Council, the Office for Artificial Intelligence, the National Data Guardian and the Information Commissioner's Office, as well as the Alan Turing Institute and the Nuffield Foundation's Ada Lovelace Institute.

But there is little clarity on how the burgeoning number of both government and non-government groups will work together, or how they will avoid repeating each other's work.

"The UK is in the baffling position of having far too many data and AI governance bodies for anyone to make heads or tails of them," said Michael Veale, a tech policy and data protection researcher at University College London.

"There's a real risk that this fragments already scarce expertise and resource... In the race to claim institutional ownership of data ethics, there's a real risk substantive examination of the issues, and individuals' rights and freedoms, will be left by the wayside." ®