A British researcher has uncovered an ironic security hole in the EU’s General Data Protection Regulation (GDPR) – right of access requests.

Right of access, also called subject access, is the part of the GDPR that allows individuals to ask organisations for a copy of any data held on them.

This makes sense because, as with any user privacy system, there must be a legally enforceable mechanism that allows people to check the accuracy and quantity of the personal data held about them.

Unfortunately, in what can charitably be described as a massive GDPR teething problem, Oxford University PhD student James Pavur has discovered that too many companies are handing out personal data when asked, without checking who’s asking for it.

In his session entitled GDPArrrrr: Using Privacy Laws to Steal Identities at this week’s Black Hat show, Pavur documents how he decided to see how easy it would be to use right of access requests to ‘steal’ the personal data of his fiancée (with her permission).

After he contacted 150 UK and US organisations posing as her, the answer turned out to be: not hard at all.

According to accounts from journalists who attended the session, he contacted the first 75 organisations by letter, impersonating her using only information he was able to find online – full name, email address, phone numbers – and some companies responded by supplying her home address.

Armed with this extra information, he then contacted a further 75 by email, which satisfied some to the extent that they sent back his fiancée’s social security number, previous home addresses, hotel logs, school grades, whether she’d used online dating, and even her credit card numbers.

Pavur didn’t even need to fake identity documents or forge signatures to back up his requests, and he didn’t spoof her real email address to make them seem more genuine.

Lateral thinking

Pavur hasn’t revealed which companies failed to authenticate his bogus right of access requests, but he did name three – Tesco, Bed Bath and Beyond, and American Airlines – that performed well, challenging his requests after spotting missing authentication data.

Nevertheless, a quarter handed over his fiancée’s data without any identity verification, 16% asked for an easily forged form of identification (which he decided not to provide), while 39% asked for strong proof of identity.

While far too often no proof of identity is required at all, even in the best cases the GDPR permits someone capable of stealing or forging a driving license nearly complete access to your digital life.

The danger is that criminals might already have been exploiting this without anybody noticing.

As Pavur points out, automating bogus standardised access requests wouldn’t be hard to do at scale by using the sort of basic name and email address data that many people make public on social media.

Whose fault?

If Pavur’s research shows a failing, it’s that too many organisations still don’t understand GDPR.

It isn’t enough to secure data in a technical sense if you don’t also secure access to it. If someone phones up requesting to know what data is held on them, failing to authenticate that request becomes a bypass that ends up endangering privacy rather than protecting it.
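To make the point concrete, here is a minimal sketch (all names hypothetical, not from Pavur’s research) of the gate his findings suggest is missing: an access request should never be fulfilled on the strength of publicly discoverable details alone, only after a stronger verification step has succeeded.

```python
# Hypothetical sketch: gating a subject access request on identity
# verification before any data is released.

from dataclasses import dataclass

@dataclass
class AccessRequest:
    name: str
    email: str
    identity_verified: bool  # e.g. confirmed via a login to the account
                             # the data belongs to, not via details alone

def handle_access_request(req: AccessRequest) -> str:
    # A matching name and email prove nothing: that is exactly the data
    # an impersonator can scrape from social media. Refuse until a
    # stronger check has succeeded.
    if not req.identity_verified:
        return "Verification required: please confirm via your account."
    return "Request accepted: compiling your data export."
```

The key design choice is that verification is a precondition checked by the request handler itself, not an optional step left to whoever happens to process the letter or email.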

While it’s true that this could have been happening long before GDPR existed, giving citizens the legal right to request data has handed people with bad intentions a standardised mechanism to exploit.

But there are deeper failures here too – if organisations try to verify someone’s identity, what should they ask for? GDPR or not, there is still no universal and reliable identity verification system to check that someone is who they say they are.

3 comments on “GDPR privacy can be defeated using right of access requests”

This is an awful regurgitation of poor content first seen (by me, at least) on the BBC website.

Especially the final paragraph is atrocious, since data subject access requests were part of the 95/46/EU directive (and the UK data protection act of 1998).

But even if data subject access requests were something new, the conclusion that this is somehow the result of unintended consequences is like saying the banking payments system has the unintended consequence of being vulnerable to scammers using social engineering to steal people’s money.

(For clarity – we removed the last paragraph to which you referred.) I think the point of the article is pretty clear and important – namely that complying with one cybersecurity rule may leave you non-compliant when it comes to others. GDPR has the same unavoidable tension when it comes to deleting data: you have a statutory right to “be forgotten” but the organisation holding that data might be compelled by a statutory obligation to retain it (certain financial records, for example; court transcripts; contract details; and so on).