How not to do data-driven due diligence

A powerful new VICE News investigation has blown open the secretive world of risk management and the most influential database you've never heard of: World-Check.

Over 300 government and intelligence agencies, 49 of the 50 biggest banks, and 9 of the top 10 global law firms use World-Check to conduct due diligence, including checking compliance with anti-terrorism financing and sanctions laws. World-Check gathers and analyses open source information on individuals and organisations of interest. Analysts then designate these profiles with categories of risk related to various crimes, including terrorism and cybercrime.

Those listed in Thomson Reuters' subscription-only 'risk management solution' database usually never find out about it. But the database has serious privacy and human rights implications – and being branded a 'terrorist' is just one of them.

Garbage in, garbage out

World-Check relies on millions of pieces of open source media, including blog posts and opinion pieces, distilled into what it claims are “well structured, highly detailed profiles.” A human World-Check analyst makes the call on whether to ascribe a profile category – from 'terrorism' to 'narcotic offences' and 'organized crime' – to the person or group.

Algorithms and web crawlers that scrape data off the internet make this work possible. Algorithms are great. We use them every day: they help our dating apps match us with people we are likely to click with, and we use one every time we run a search. Machine learning – where computers use external input and data they generate themselves to 'learn' to become more effective or efficient without being explicitly programmed – is also great. It is how the supermarket knows to give me offers on what I buy. It is how Google improves my search results and gets me to the information I want, faster. It can be used by hospitals to calculate more accurate emergency room waiting times for patients.

But when it comes to law enforcement and intelligence gathering, machine learning and other algorithm-based techniques gamble with our most fundamental rights – in part because the quality of the intelligence depends on the quality of the information it is based on. Predictive policing – the use of quantitative techniques to identify likely targets and areas for police intervention and prevent crime – relies in part on machine learning. The data fed into these systems is supposedly neutral but is in fact riddled with bias: baseline data on where crimes 'usually' occur reflects entrenched selection biases, focusing the algorithm on particular profiles and areas – in the US, usually young, poor, and black. Even the conservative-leaning American RAND Corporation found problematic the complete lack of clarity about what constitutes the 'high crime' areas that the US Supreme Court has ruled acceptable to monitor, as opposed to what is a blatant violation of the US Constitutional right to be “secure in their persons, houses, papers, and effects, against unreasonable searches and seizures.” Police authorities in the UK, too, have been trialling various predictive policing software on an unclear legal basis. No word yet on what this means for our rights to privacy or equal protection under the law.

If you put garbage in, you get garbage out – just as your maths teacher warned. That is what seems to have happened with World-Check, with life-altering effects. World-Check terror-listed Mohamed Iqbal Asaria CBE, a former World Bank and Bank of England advisor and a member of the Queen's Honours List for services to international development. One of the sources justifying his profile is Militant Islam Monitor – a blog that frequently rails against “Muslim savages” and refers to the UN agency working with Palestinian refugees (UNRWA) as the United Nazis Weapons Resource Agency. World-Check's 'terrorism' list is indeed a who's who of the Muslim elite. Nihad Awad, president of the Council on American-Islamic Relations (CAIR) and a well-known anti-extremist speaker, is also terrorism-designated. So is one of Britain's most prominent Muslim voices, the Liberal Democrat parliamentary candidate Maajid Nawaz. These are individuals who actively campaign against extremism, yet their profiles hold the same designation as those of known ISIS fighters.
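The garbage-in, garbage-out failure mode can be sketched in a few lines of code. This is a deliberately naive toy, not World-Check's actual method, and every name, source, and keyword in it is hypothetical: a screener that flags anyone mentioned alongside 'risk' words, without weighing the credibility of the source, ends up branding a campaigner against extremism the same way as an actual fighter.

```python
# Toy illustration of garbage in, garbage out (NOT World-Check's real
# pipeline). All names, sources, and texts below are hypothetical.

ARTICLES = [
    # (source, subject, text) – one fringe blog, two wire reports
    ("wire-service", "J. Fighter", "joined an ISIS unit in 2014"),
    ("fringe-blog", "A. Campaigner", "this terrorist sympathiser spoke at a mosque"),
    ("wire-service", "A. Campaigner", "campaigns against violent extremism"),
]

RISK_TERMS = {"terrorist", "isis", "extremism"}

def screen(subject):
    """Flag a subject if ANY article about them contains a risk term.
    Note what is missing: no weighting of source quality, no context,
    no distinction between being extremist and opposing extremism."""
    for source, who, text in ARTICLES:
        if who == subject and RISK_TERMS & set(text.lower().split()):
            return "terrorism"
    return "clear"

# Both the fighter and the campaigner receive the identical designation,
# because the screener inherits whatever bias its sources carry.
for name in ("J. Fighter", "A. Campaigner"):
    print(name, "->", screen(name))
```

A human analyst signing off on such output at speed rubber-stamps the bias rather than correcting it – which is the core of the complaint against keyword-driven due diligence.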

Is this just? Is it even useful, as its supporters claim? Some of the 93,000 'terrorism' entries in World-Check have reportedly not been updated in eight years. VICE News told Privacy International that 1,400 new 'terrorism' entries had been added in the past two weeks – around 100 each day. One source said they individually approved up to 600 profiles every day – roughly one per minute across a long 12-hour working day. At this frenzied pace, it is a wonder that World-Check employees can do their job accurately at all.

Media and police, arm in arm

The World-Check case is another example of how serious investigations – whether for business or government intelligence – are being privatised. What is particularly concerning in this case is the collusion of a major media corporation, Thomson Reuters, in such a flawed endeavour.

Beyond World-Check, Thomson Reuters now offers its own OSINT search, analytics and visualization capabilities, according to a report by Insider Surveillance. Law enforcement agencies can purchase the company’s “Integrated Solutions” – CLEAR System-to-System and Integrated Batch Processing – which would allow them to “scan millions of public records and apply complementary tools that quickly lead to arrests during exigent circumstances.” Thomson Reuters has reportedly teamed up with some of the big players in the surveillance industry to provide these, including surveillance technology and big data companies IBM i2, Pen-Link, Intellaegis, and Palantir. The latter received start-up funds from the CIA's venture capital arm. This is media and policing, arm in arm.

Slave to the algorithm

Being 'terror-listed' has real consequences.

Being linked to terrorism, often on the basis of scurrilous and outdated sources, can cause you to be financially penalised. Bank advisor Asaria CBE told VICE News, "I believe it [being terror-listed] cut off 50 percent of potential business for me...Several people in my industry said, 'I'll get so-and-so to contact you but never came back.' I'm sure they consulted World-Check and asked, do we want this?"

Those who are 'terror-listed' are unlikely to know about it, and there is effectively no way to challenge what may be an unjust and empirically wrong designation. Nor is there any accountability over how World-Check's profiles are used and to what ends – they are created on the basis of automatically gathered open source information and, according to VICE News, hastily rubber-stamped. World-Check purports to offer “more precision, less noise.” But it also admits to storing profiles of individuals and organizations with “no independent input or opinion” – essentially, very little quality control.

Services like these are eroding core democratic notions – the presumption of innocence and the ability to challenge one's accusers – with potentially devastating effects on individuals. We need to seriously reconsider the value and justness of such due diligence databases, before we automate away our rights.