Meet PINLogger, the drive-by exploit that steals smartphone PINs

Sensors in phones running both iOS and Android reveal all kinds of sensitive info.

Smartphones know an awful lot about us. They know if we're in a car that's speeding, and they know when we're walking, running, or riding in a bus. They know how many calls we make and receive each day and the precise starting and ending time of each one. And of course, they know the personal identification numbers we use to unlock the devices or to log in to sites that are protected by two-factor authentication. Now, researchers have devised an attack that makes it possible for sneaky websites to surreptitiously collect much of that data, often with surprising accuracy.

The demonstrated keylogging attacks are most useful at guessing the digits in four-digit PINs, with 74-percent accuracy the first time a PIN is entered and a 94-percent chance of success by the third try. The same technique could be used to infer other input, including the lock patterns many Android users rely on to secure their phones, although the accuracy rates would probably differ. The attacks require only that a user open a malicious webpage and enter the characters before closing it; no malicious app needs to be installed.

Malicious webpages—or depending on the browser, legitimate sites serving malicious ads or malicious content through HTML-based iframe tags—can mount the attack by using standard JavaScript code that accesses motion and orientation sensors built into virtually all iOS and Android devices. To demonstrate how the attack would work, researchers from Newcastle University in the UK wrote attack code dubbed PINLogger.js. Without any warning or outward sign of what was happening, the JavaScript was able to accurately infer characters being entered into the devices.
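The attack needs nothing exotic: the standard DeviceMotion API delivers readings to any page that registers an event listener, with no permission prompt at the time. A minimal sketch of that kind of collection loop (illustrative only; this is not the researchers' actual PINLogger.js source) might look like:

```javascript
// Buffer of raw sensor readings accumulated by the page.
const samples = [];

function recordMotion(event) {
  // accelerationIncludingGravity and rotationRate are standard
  // DeviceMotionEvent fields exposed to any page's JavaScript.
  const a = event.accelerationIncludingGravity;
  const r = event.rotationRate;
  samples.push({
    t: Date.now(),
    ax: a.x, ay: a.y, az: a.z,                    // acceleration per axis
    alpha: r.alpha, beta: r.beta, gamma: r.gamma  // rotation rates
  });
}

// In a browser, the listener would be attached like this:
// window.addEventListener('devicemotion', recordMotion);
```

The buffered samples would then be shipped off for analysis; everything sensitive happens server-side, which is why nothing visible occurs on the page.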

"That means whenever you are typing private data on a webpage and this webpage for example has some advert banners at the side or the bottom, the advert provider as part of the page can 'listen in' and find out what you type in that page," Siamak F Shahandashti, one of the Newcastle University researchers who demonstrated the attack, told Ars. "Or with some browsers as we found, if you open a page A and then another page B without closing page A (which most people do) page A in the background can listen in on what you type in page B."

No easy fix

The specific conditions under which the various attacks worked varied from browser to browser and, to a lesser extent, depended on the operating system the browser ran on. The browser provided by the Chinese Web services company Baidu, whether running on iOS or Android, gave the greatest access to sensors. As a result, that browser served sensitive sensor data when a malicious webpage was loaded directly into an open or background tab, was loaded as an iframe into an open or background tab, or even when the malicious page was loaded directly or as an iframe while the device screen was locked. More widely used browsers, by contrast, restricted access to sensor data, but they still presented an opportunity for abuse.

Chrome for iOS, for instance, served sensor data to malicious sites that were loaded directly, as an iframe, or as an ad into an active tab. The Google browser, however, blocked access for all sites loaded into background tabs or when the iPhone was locked. Chrome for Android worked similarly, with the exception that it did not provide sensor data to ad servers. Firefox on Android likewise exposed the sensors when the JavaScript was hosted directly or through an iframe on an active tab, but not in background tabs. Safari, meanwhile, had the same access as Firefox, but the Apple-made browser also served sensor data to the code when the iPhone screen was locked. A full summary of the conditions for the various browsers is in the table below:

Mobile browser access to the orientation and motion sensor data on Android and iOS under different conditions. A yes (in italics) indicates a possible security leakage vector. A yes (in italics and underlined) indicates a possible security leakage vector only when the browser was active before the screen was locked.

The researchers reported the results to the makers of Chrome, Firefox, Safari, and Opera. Mozilla issued a partial fix in Firefox 46, which restricted JavaScript access to motion and orientation sensors to top-level documents and same-origin iframes. In the Apple security updates for iOS 9.3, released in March 2016, Safari took a similar countermeasure by suspending the availability of motion and orientation data when a page is hidden. While these updates are an effective means of thwarting the PINLogger attack, they also prevent the browsers from supporting useful features, such as those provided by fitness and exercise websites. Chrome, meanwhile, continues to make sensor data available to webpages loaded into an active tab, and Chrome developers have publicly acknowledged the issue. (It wasn't immediately clear how Opera developers responded.)

"There is no straightforward fix to the problem without also breaking potentially useful Web applications in the future," another Newcastle University researcher, Feng Hao, told Ars. "No one is able to come up with a definite solution yet."

It knows when you are sleeping

By accessing accelerometer and gyroscope sensors, the Web-hosted JavaScript measures subtle changes in a phone's angle, rotation, movement speed, and similar characteristics. The data, in turn, can reveal sensitive information about the phone and its user, including the precise start and end of each phone call and whether the person using it is stationary, walking, running, or riding a bus, car, or train. The researchers experimented with the Maxthon browser on a Nexus 5 phone running Android 5.1.1; different devices running other browsers would likely behave similarly. The title of the paper is Stealing PINs via mobile sensors: actual risk versus user perception.

Three dimensions of acceleration data, taken from the motion sensor during 22 seconds of sitting, 34 seconds of walking, and 25 seconds of running.
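The separation visible in a trace like that is what makes activity inference possible. As a toy illustration (not the paper's trained classifier, and with made-up thresholds), even the variance of the acceleration magnitude alone roughly distinguishes sitting from walking from running:

```javascript
// Toy heuristic: classify activity from accelerometer samples using only
// the variance of the acceleration magnitude. A real attack would use a
// trained classifier; the thresholds here are illustrative, not measured.
function activityGuess(readings) {
  // readings: array of {ax, ay, az} accelerometer samples
  const mags = readings.map(s => Math.sqrt(s.ax * s.ax + s.ay * s.ay + s.az * s.az));
  const mean = mags.reduce((a, b) => a + b, 0) / mags.length;
  const variance = mags.reduce((a, b) => a + (b - mean) ** 2, 0) / mags.length;
  if (variance < 0.5) return 'stationary'; // nearly constant gravity vector
  if (variance < 10) return 'walking';     // modest periodic swings
  return 'running';                        // large, fast oscillations
}
```

The point is not that this heuristic is accurate; it is that the signal is so discriminative that even crude processing leaks the user's activity.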

Of all the information the sensors reveal, the keystrokes being entered are almost certainly the most sensitive. The researchers used artificial neural network training to tie certain sensor measurements to specific characters contained in 50 four-digit PINs. PINLogger was able to infer subjects' PINs with 74 percent accuracy on the first attempt and with nearly 100 percent accuracy within five attempts. By comparison, a random guess from a set of 50 PINs would have only a 2-percent chance of being correct on the first attempt and only a 6-percent chance within three attempts.
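The baseline in that comparison is simple counting: k distinct guesses drawn from a pool of N equally likely PINs succeed with probability k/N. A one-line sketch (`randomGuessRate` is an illustrative name, not anything from the paper):

```javascript
// Random-guessing baseline: k distinct guesses from a pool of N equally
// likely PINs hit the right one with probability k/N.
function randomGuessRate(guesses, poolSize) {
  return guesses / poolSize;
}
// For the 50-PIN set used in the experiment:
// randomGuessRate(1, 50) -> 0.02 (2 percent on the first attempt)
// randomGuessRate(3, 50) -> 0.06 (6 percent within three attempts)
```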

"One might argue that the attack should be evaluated against the whole 4-digit PIN space," the researchers wrote. "However, we believe that the attack could still be practical when selecting from a limited set of PINs since users do not select their PINs randomly. It has been reported that around 27 percent of all possible 4-digit PINs belong to a set of 20 PINs, including straightforward ones like '1111', '1234', or '2000'."

The researchers went on to perform a separate round of training that evaluated all possible four-digit PINs. The training included two modes. The first, known as multiple-users mode, was trained using several subjects. The other mode, known as same-user mode, relied on the training of the individual being targeted in the attack. The researchers wrote:

The results in our multiple-users mode indicate that we can infer the digits with a success probability of 70.75, 83.27, and 94.03 percent in the first, second, and third attempts, respectively. This means that for a 4-digit PIN and based on the obtained sensor data, the attacker can guess the PIN from a set of 3^4 = 81 possible PINs with a probability of success of 0.9206^4 = 71.82 percent. A random attack, however, can only predict the 4-digit PIN with the probability of 0.81 percent in 81 attempts. By comparison, PINlogger.js achieves a dramatically higher success rate than a random attacker.

In same-user mode, the success probability of guessing the PIN in 81 attempts is 85.46 percent.
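The arithmetic in the quoted passage compounds a per-digit success rate into a whole-PIN rate: if each of the four digits is independently recovered with probability p within the allotted tries, the full PIN falls inside the 3^4 = 81 candidate set with probability p^4. A sketch, using the per-digit figure implicit in the quoted calculation (roughly 0.9206):

```javascript
// Compound a per-digit recovery probability into a whole-PIN probability,
// assuming the digits are inferred independently.
function pinSuccess(perDigitRate, numDigits) {
  return Math.pow(perDigitRate, numDigits);
}
// pinSuccess(0.9206, 4) is about 0.7182 -- the 71.82 percent quoted above.
```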

"Crucial open problem"

Right now, there's no ideal way to prevent attacks. That's because, as noted earlier, browsers that can't access sensor data at all are likely to prevent many useful sites from working properly. For people using Chrome, a good practice is to close tabs as often as possible. The researchers warned that unless browser and OS makers figure out a better, long-term solution, the threat is likely to grow.

"Access to mobile sensor data via JavaScript is limited to only a few sensors at the moment," they wrote. "This will probably expand in the future, especially with the rapid development of sensor-enabled devices in the Internet of things. Hence, designing a general mechanism for secure and usable sensor data management remains a crucial open problem for future research."

Promoted Comments

For iOS, why not add accelerometers as a toggle option on the privacy screen so the user can block web browser's access to them? Some of us have already done that for the camera, location, and microphone sensors. (Why would random web pages ever need that info?) Why not accelerometers too? They are conspicuously absent from the privacy settings. Let the user decide. Some of us don't mind if we break a web site, if it means keeping private info private.

Boss level privacy in iOS Safari will be achieved when we can whitelist web sites, rather than the app (entire browser).

This should be a clear permissions-based approach. The first level of permissions should be for the browser to actually have access to the sensors. The second level is to force acknowledgements from the browser that the website is requesting data that a web page normally should not have access to.

Otherwise block the browser from having access to the sensors and require that only apps have access to that data.

Maybe I'm being naive but why not implement something similar to when a browser page wants location data? Or are the pop-ups I get asking to allow the page to get my location not as straight forward as I assume? In my brain it seems like you could have a different pop-up to "Allow sensor data access?" or something.

We've already seen from acoustic pickups of keyboard clicks that data processing can indeed reveal more than we'd think. It is not at all surprising, nor should it have been unexpected, that giving accelerometer access to JS running on any web site was a colossally bad idea, and can be used to harvest touch keypad input such as PINs.

In iOS 10.3 (and probably earlier), I have a setting under Privacy called Motion and Fitness. If I tap that, I find a list of apps that have asked to use the motion sensors. I have allowed my fitness apps access to them, but nothing else. Safari has never even asked. So I'm not really sure what the problem is here, at least for people who don't grant all requested permissions regardless of whether they are needed.

Safari uses the sensors and doesn't ask. Follow that link I posted earlier. You'll see that the web site will have access to accelerometer data, Safari won't ask for permission, and that list will still not have Safari listed.

That alone is bad enough. Add to that the web site having access to the data in a background tab, or on the lock screen, and that's how they can get the PIN.

95 Reader Comments

I can literally go to my phone provider and pick up a better (hardware-wise) phone for free from credit (so no paying anything and it's mine forever). Just can't make myself do it.

It's like me and my Amiga 1200. Sure, I had to give up a 4 times more powerful 386DX40 to keep it, and it cost me an arm and a leg to add a 2.5" HDD to it, but I loved that technology, damn it, and I didn't let it go until after replacing the PSU, adding an external (internal) CD drive, adding a filter circuit to cut down on noise (and integrate the CD drive's audio), and god alone knows what else (that I don't remember anymore).

Looking at the paper, the researchers had to train their algorithm on the device they're capturing pin data from. It's possible iPhones are so similar that you wouldn't have to train on each individual phone, but it seems very unlikely that training on a Nexus 5 will work well for anything iOS. Until someone shows that they can make this attack work without prior device calibration, it's more theoretical than practical.

I understand how malicious website A could listen in on stuff you enter on website B - though I doubt the recognition rates remain very high for full-text-entry.

But to steal the PIN, wouldn't the JavaScript need to be running while the device is locked? I would surely hope Android/iOS don't keep all those websites and their JS up and running when I lock my device.

If I understood this correctly, then the attack can't guess already saved passwords in the browser's password manager?

Those are vulnerable to weak passwords or man in the middle attacks if they're cloud based, but it can be a mitigation for this type of thing.

And the same would be the case for fingerprint lock and unlock, so I guess an extra security measure that could be put in place for now is perhaps to kill all background apps if the pin or password is required to unlock the device?

Looking at the paper, the researchers had to train their algorithm on the device they're capturing pin data from. It's possible iPhones are so similar that you wouldn't have to train on each individual phone, but it seems very unlikely that training on a Nexus 5 will work well for anything iOS. Until someone shows that they can make this attack work without prior device calibration, it's more theoretical than practical.

Probably more a state-sponsored attack than anything else at this point. Get a profile for a given handset used in your target organization and go to town. Coupled with on the ground wetwork agents that would work well, but the risk exposure would probably be considered too high under most circumstances.

Possibly good for industrial espionage as well, but social hacking is just so much more fruitful on that front, with much lower risk.

If I understood this correctly, then the attack can't guess already saved passwords in the browser's password manager?

Those are vulnerable to weak passwords or man in the middle attacks if they're cloud based, but it can be a mitigation for this type of thing.

And the same would be the case for fingerprint lock and unlock, so I guess an extra security measure that could be put in place for now is perhaps to kill all background apps if the pin or password is required to unlock the device?

Freeze all background apps during login. Even that would feel clunky for many, and would have to be worked around any timing apps.

Maybe I'm missing something, but wouldn't this be completely mitigated by shuffling the numbers in the PIN pad input when entering a code? I know some Android ROMs allow that. Definitely less speedy if you don't know where the numbers are ahead of time, but if this is the alternative...
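The shuffled-keypad mitigation the comment describes is straightforward: if the digit under each touch position changes every unlock, motion data that reveals where you tapped no longer reveals which digit you entered. A sketch (assumed function name, standard Fisher-Yates shuffle):

```javascript
// Return the digit to display at each of the ten keypad positions, in a
// fresh random order for every unlock attempt (Fisher-Yates shuffle).
function shuffledKeypad() {
  const digits = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9];
  for (let i = digits.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1));
    [digits[i], digits[j]] = [digits[j], digits[i]];
  }
  return digits;
}
```

The usability cost the commenter notes is real: muscle memory no longer works, so entry is slower every time.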

why is it when any idiot researcher wants to make a report about something they use versions in the past? give us recent systems.

Because it takes time to learn the system and figure out how to get anything done on it. This is different than the patched Windows exploit in that the researchers here had to make the exploit themselves from scratch.

Also, remember that Android and iOS don't get updated as often as Windows does for security vulnerabilities, not to mention that there are any number of devices out there that are not eligible for update to the latest and greatest.

So to be clear, on Android, running Chrome, the script can eventually guess the pin if you keep entering it while your phone is unlocked? That seems like a vulnerability that is impossible to exploit. Like someone will be able to copy your housekey if you let them in and then repeatedly show them your key. You only use that pin to unlock your phone normally.

The keyboard guessing implications are a much bigger issue but there is no indication that sensors would have any ability to guess keypresses based on motion.

More people are on Marshmallow than any other single version, but a majority is on KitKat and Lollipop combined. So APIs 19, 21, and 22 account for ~52% of the market, while API 23+ accounts for ~36.1% as of today.

So to be clear, on Android, running Chrome, the script can eventually guess the pin if you keep entering it while your phone is unlocked? That seems like a vulnerability that is impossible to exploit. Like someone will be able to copy your housekey if you let them in and then repeatedly show them your key. You only use that pin to unlock your phone normally.

The keyboard guessing implications are a much bigger issue but there is no indication that sensors would have any ability to guess keypresses based on motion.

You realize the number of Android devices not running Chrome and instead something like Baidu is not insignificant? The numbers are harder to get, particularly as I don't even know where to start getting accurate numbers for non-Google-certed devices, but I've run across enough cheap off-brand Android devices to appreciate just how prevalent they can be. Those usually do not run Chrome.

Then there's Samsung and its willingness to create shitty software no matter how many security holes it adds to their stack.

Is this about logging what I enter in another website while the malicious one is in another tab? And ALSO that if I leave my browser running, lock the device, and then unlock the device it can guess my device password? Kinda unclear on the details. Presumably just closing the browser is enough to mitigate the risk.

This should be a clear permissions-based approach. The first level of permissions should be for the browser to actually have access to the sensors. The second level is to force acknowledgements from the browser that the website is requesting data that a web page normally should not have access to.

Otherwise block the browser from having access to the sensors and require that only apps have access to that data.

Doesn't iOS already have a feature where it temporarily brings up the stock keyboard for entering passwords, even if you have another keyboard set as your default keyboard? It seems like they could do something similar with selectively turning off access to the accelerometers when entering things like PINs and passwords.

Is this about logging what I enter in another website while the malicious one is in another tab? And ALSO that if I leave my browser running, lock the device, and then unlock the device it can guess my device password? Kinda unclear on the details. Presumably just closing the browser is enough to mitigate the risk.

Right, because you and everyone else has never-ever-ever left a browser open until the phone auto-locks. Rarely if ever in the history of the world has this thing happened, and it is so very rare that on the few occasions that it does happen, the user will not hesitate to turn off/restart the phone.

Because we can always count on the average user being willing to go to great lengths to ensure their security.

Maybe I'm being naive but why not implement something similar to when a browser page wants location data? Or are the pop-ups I get asking to allow the page to get my location not as straight forward as I assume? In my brain it seems like you could have a different pop-up to "Allow sensor data access?" or something.

That's definitely a good option, and hopefully what will end up happening. The VAST majority of webpages have no business (and derive no benefit from) having sensor data.
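The prompt-gated model the commenters propose mirrors how the geolocation API already works: nothing is handed to the page until the user explicitly says yes. A hypothetical sketch of that policy; `promptUser` and `attachListener` are assumed hooks (injected here so the policy itself can be exercised), not real browser APIs from the article's time frame:

```javascript
// Gate sensor access behind an explicit user prompt, analogous to the
// geolocation permission flow. The listener is attached only after the
// user grants access; a denied page gets nothing.
async function requestMotionAccess(promptUser, attachListener, handler) {
  const granted = await promptUser('Allow this page to read motion sensor data?');
  if (!granted) return false;                 // no consent, no data
  attachListener('devicemotion', handler);    // attach only after grant
  return true;
}
```

In a real browser, `attachListener` would be `window.addEventListener` and `promptUser` would be the browser's own permission UI.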

"There is no straightforward fix to the problem without also breaking potentially useful Web applications in the future," another Newcastle University researcher, Feng Hao, told Ars. "No one is able to come up with a definite solution yet."

This is complete bullshit. Javascript does not need access to these sensors, ever.

I dare anyone to come up with a future functionality for websites using these sensors that is both not a security nightmare with dubious actual benefits, and not better done with an app.

We've already seen from acoustic pickups of keyboard clicks that data processing can indeed reveal more than we'd think. It is not at all surprising, nor should it have been unexpected, that giving accelerometer access to JS running on any web site was a colossally bad idea, and can be used to harvest touch keypad input such as PINs.

So the sensors are measuring acceleration. Preventing movement => no acceleration => no measurements => no spying possible. One way to stop the phone moving when entering your PIN might therefore be to hold it firmly against a fixed surface — put it on a table or hold it up against a wall or a tree. That probably gives an advantage to phones with a flat back.

Another question is whether these scripts can tell when the phone is prompting the user to enter their PIN? My guess is that they can't ask to be told when the PIN prompt appears, so they would have to distinguish PIN entry from regular typing or doing other kinds of stuff just by looking at the accelerometer data. Can you say battery drain?

This is fantastic - quite impressive accuracy from sensors that aren't directly near the touch area. If that team could quit security research and go into product design, they could make my dream smartwatch/wearable: the one where you just wiggle the fingers of your hand to do things. (I don't consider a watch to be smart if it needs twice as many hands to use as my phone does.)