Meet PINLogger, the drive-by exploit that steals smartphone PINs

Sensors in phones running both iOS and Android reveal all kinds of sensitive info.

Smartphones know an awful lot about us. They know if we're in a car that's speeding, and they know when we're walking, running, or riding in a bus. They know how many calls we make and receive each day and the precise starting and ending time of each one. And of course, they know the personal identification numbers we use to unlock the devices or to log in to sites that are protected by two-factor authentication. Now, researchers have devised an attack that makes it possible for sneaky websites to surreptitiously collect much of that data, often with surprising accuracy.

The demonstrated keylogging attacks are most useful at guessing digits in four-digit PINs, correctly inferring a PIN 74 percent of the time on the first try and 94 percent of the time by the third try. The same technique could be used to infer other input, including the lock patterns many Android users rely on to secure their phones, although the accuracy rates would probably differ. The attack requires only that a user open a malicious webpage and enter the characters before closing it; it doesn't require the installation of any malicious apps.

Malicious webpages—or depending on the browser, legitimate sites serving malicious ads or malicious content through HTML-based iframe tags—can mount the attack by using standard JavaScript code that accesses motion and orientation sensors built into virtually all iOS and Android devices. To demonstrate how the attack would work, researchers from Newcastle University in the UK wrote attack code dubbed PINLogger.js. Without any warning or outward sign of what was happening, the JavaScript was able to accurately infer characters being entered into the devices.
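The sensor access being abused is exposed through the standard W3C device motion and orientation events. The sketch below is illustrative, not PINLogger's actual code — the helper name and buffering scheme are invented — but it shows how little is needed for a page to start collecting readings:

```javascript
// Pure helper so the record shape is easy to test outside a browser.
function toSample(ts, acc) {
  return { t: ts, x: acc.x, y: acc.y, z: acc.z };
}

const samples = [];

// In a browser, these events fire many times per second with no
// permission prompt; the page simply starts receiving readings.
if (typeof window !== 'undefined') {
  window.addEventListener('devicemotion', (e) => {
    // Acceleration along three axes, including gravity, in m/s^2.
    samples.push(toSample(e.timeStamp, e.accelerationIncludingGravity));
  });
  window.addEventListener('deviceorientation', (e) => {
    // alpha/beta/gamma describe the device's tilt in degrees.
    samples.push({ t: e.timeStamp, alpha: e.alpha, beta: e.beta, gamma: e.gamma });
  });
}
// An attacker could then periodically POST the `samples` buffer to a remote server.
```

Note that nothing above requires user interaction with the page itself: the events arrive as long as the script is allowed to run.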

"That means whenever you are typing private data on a webpage and this webpage for example has some advert banners at the side or the bottom, the advert provider as part of the page can 'listen in' and find out what you type in that page," Siamak F Shahandashti, one of the Newcastle University researchers who demonstrated the attack, told Ars. "Or with some browsers as we found, if you open a page A and then another page B without closing page A (which most people do) page A in the background can listen in on what you type in page B."

No easy fix

The specific conditions under which various types of attacks worked varied from browser to browser, and, to a lesser extent, also depended on which operating system each browser ran on. The browser provided by the Chinese Web services company Baidu, whether running on iOS or Android, had the greatest access to sensors. As a result, the browser provided sensitive sensor data when a malicious webpage was loaded directly into an open or background browser tab, was loaded as an iframe into an open or background tab, or even when the malicious page was loaded directly or as an iframe while the device screen was locked. More widely used browsers, by contrast, restricted access to sensor data, but they still presented an opportunity for abuse.

Chrome for iOS, for instance, served sensor data to malicious sites that were loaded directly, as an iframe, or as an ad into an active tab. The Google browser, however, blocked access for all sites loaded into background tabs or when the iPhone was locked. Chrome for Android worked similarly, with the exception of not providing sensor data to ad servers. Firefox on Android also had access to sensors when the JavaScript was hosted directly or through an iframe on an active tab, but not in background tabs. Safari, meanwhile, had the same access as Firefox, but the Apple-made browser also allowed the code to read sensor data when the iPhone screen was locked. A full summary of the conditions for various browsers is in the table below:

Mobile browser access to the orientation and motion sensor data on Android and iOS under different conditions. A yes (in italics) indicates a possible security leakage vector. A yes (in italics and underlined) indicates a possible security leakage vector only when the browser was active before the screen was locked.

The researchers reported the results to the makers of Chrome, Firefox, Safari, and Opera. Mozilla issued a partial fix in version 46, in which Firefox restricted JavaScript access to motion and orientation sensors to top-level documents and same-origin iframes. In Apple Security Updates for iOS 9.3 released in March 2016, Safari took a similar countermeasure by suspending the availability of motion and orientation data when a page is hidden. While an effective means of thwarting the PINLogger attack, the updates also prevent the browsers from supporting useful features, such as those provided by fitness and exercise websites. Chrome, meanwhile, continues to make sensor data available to webpages that are loaded into an active tab. Chrome developers publicly acknowledged the issue here. (It wasn't immediately clear how Opera developers responded.)
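Mozilla's fix, as described, amounts to an origin check before sensor events are delivered. A simplified sketch of that policy (illustrative, not Mozilla's actual code) might look like:

```javascript
// Deliver motion/orientation events only to the top-level document or to
// iframes whose origin matches the top-level page's origin.
function mayReceiveSensorEvents(isTopLevel, frameOrigin, topOrigin) {
  return isTopLevel || frameOrigin === topOrigin;
}
```

Under such a policy a cross-origin ad iframe never sees sensor events, which is exactly what blocks the malicious-ad variant of the attack while leaving first-party uses intact.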

"There is no straightforward fix to the problem without also breaking potentially useful Web applications in the future," another Newcastle University researcher, Feng Hao, told Ars. "No one is able to come up with a definite solution yet."

It knows when you are sleeping

By accessing accelerometer and gyroscope sensors, the Web-hosted JavaScript measures subtle changes in a phone's angle, rotation, movement speed, and similar characteristics. The data, in turn, can reveal sensitive information about the phone and its user, including the precise start and end of each phone call and whether the person using it is stationary, walking, running, on a bus, in a car, or on a train. The researchers experimented using the Maxthon browser running on a Nexus 5 phone with Android version 5.1.1. Different devices running other browsers would likely behave similarly. The title of the paper is Stealing PINs via mobile sensors: actual risk versus user perception.

Three dimensions of acceleration data, taken from the motion sensor during 22 seconds of sitting, 34 seconds of walking, and 25 seconds of running.

Of all the information the sensors reveal, the keystrokes being entered are almost certainly the most sensitive. The researchers used artificial neural network training to tie certain sensor measurements to specific characters contained in 50 four-digit PINs. PINLogger was able to infer subjects' PINs with 74 percent accuracy on the first attempt and with nearly 100 percent accuracy in five attempts. By comparison, a random guess from a set of 50 PINs would have only a 2-percent chance of being correct on the first attempt and only a 6-percent chance in three attempts.
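The network in such a pipeline consumes feature vectors computed over short windows of sensor readings. The feature set below is hypothetical — simple magnitude statistics, not the researchers' actual features — but it shows the general shape of the first stage:

```javascript
// Illustrative only: turn a window of accelerometer samples into a small
// feature vector that a classifier could map to a likely keypad digit.
function features(windowSamples) {
  // Overall acceleration magnitude per sample.
  const mags = windowSamples.map(s => Math.hypot(s.x, s.y, s.z));
  const mean = mags.reduce((a, b) => a + b, 0) / mags.length;
  const peak = Math.max(...mags);
  const variance =
    mags.reduce((a, m) => a + (m - mean) ** 2, 0) / mags.length;
  return [mean, peak, variance];
}
```

Taps in different screen regions tilt and jolt the device in subtly different ways, so vectors like this separate in feature space well enough for a trained classifier to rank candidate digits.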

"One might argue that the attack should be evaluated against the whole 4-digit PIN space," the researchers wrote. "However, we believe that the attack could still be practical when selecting from a limited set of PINs since users do not select their PINs randomly. It has been reported that around 27 percent of all possible 4-digit PINs belong to a set of 20 PINs, including straightforward ones like '1111', '1234', or '2000'."

The researchers went on to perform a separate round of training that evaluated all possible four-digit PINs. The training included two modes. The first, known as multiple-users mode, was trained using several subjects. The other mode, known as same-user mode, relied on the training of the individual being targeted in the attack. The researchers wrote:

The results in our multiple-users mode indicate that we can infer the digits with a success probability of 70.75, 83.27, and 94.03 percent in the first, second, and third attempts, respectively. This means that for a 4-digit PIN and based on the obtained sensor data, the attacker can guess the PIN from a set of 3^4 = 81 possible PINs with a probability of success of 0.9206^4 = 71.82 percent. A random attack, however, can only predict the 4-digit PIN with the probability of 0.81 percent in 81 attempts. By comparison, PINlogger.js achieves a dramatically higher success rate than a random attacker.

In same-user mode, the success probability of guessing the PIN in 81 attempts is 85.46 percent.
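The compounding in the quoted passage is easy to check: a per-digit success probability raised to the fourth power gives the whole-PIN probability (the 0.9206 base is reconstructed from the quoted percentages):

```javascript
// Per-digit success within three guesses, compounded over four independent digits.
const perDigit = 0.9206;
const pinSuccess = perDigit ** 4;  // ≈ 0.7183, the quoted ~71.82 percent

// A blind attacker trying 81 PINs out of 10,000 equally likely ones.
const randomSuccess = 81 / 10000;  // 0.0081, the quoted 0.81 percent
```

The roughly 89-fold gap between those two numbers is what makes the sensor side channel worthwhile to an attacker.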

"Crucial open problem"

Right now, there's no ideal way to prevent attacks. That's because, as noted earlier, browsers that can't access sensor data at all are likely to prevent many useful sites from working properly. For people using Chrome, a good practice is to close tabs as often as possible. The researchers warned that unless browser and OS makers figure out a better, long-term solution, the threat is likely to grow.

"Access to mobile sensor data via JavaScript is limited to only a few sensors at the moment," they wrote. "This will probably expand in the future, especially with the rapid development of sensor-enabled devices in the Internet of things. Hence, designing a general mechanism for secure and usable sensor data management remains a crucial open problem for future research."

Promoted Comments

For iOS, why not add accelerometers as a toggle option on the privacy screen so the user can block web browsers' access to them? Some of us have already done that for the camera, location, and microphone sensors. (Why would random web pages ever need that info?) Why not accelerometers too? They are conspicuously absent from the privacy settings. Let the user decide. Some of us don't mind if we break a web site, if it means keeping private info private.

Boss level privacy in iOS Safari will be achieved when we can whitelist web sites, rather than the app (entire browser).

This should be a clear permissions-based approach. The first level of permissions should be for the browser to actually have access to the sensors. The second level is to force acknowledgements from the browser that the website is requesting data that a web page normally should not have access to.

Otherwise block the browser from having access to the sensors and require that only apps have access to that data.

Maybe I'm being naive, but why not implement something similar to when a browser page wants location data? Or are the pop-ups I get asking to allow the page to get my location not as straightforward as I assume? In my brain it seems like you could have a different pop-up to "Allow sensor data access?" or something.

We've already seen from acoustic pickups of keyboard clicks that data processing can indeed reveal more than we'd think. It is not at all surprising, nor should it have been unexpected, that giving accelerometer access to JS running on any web site was a colossally bad idea, and can be used to harvest touch keypad input such as PINs.

In iOS 10.3 (and probably earlier), I have a setting under Privacy called Motion and Fitness. If I tap that, I find a list of apps that have asked to use the motion sensors. I have allowed my fitness apps access to them, but nothing else. Safari has never even asked. So I'm not really sure what the problem is here, at least for people who don't grant all requested permissions regardless of whether they are needed.

Safari uses the sensors and doesn't ask. Follow that link I posted earlier. You'll see that the web site will have access to accelerometer data, Safari won't ask for permission, and that list will still not have Safari listed.

That alone is bad enough. Add to that the web site having access to the data in a background tab, or on the lock screen, and that's how they can get the PIN.

That site doesn't really tell you whether it's getting readings from your phone while it's on the lock screen or on another tab, which is where the security issue lies.

Doesn't iOS already have a feature where it temporarily brings up the stock keyboard for entering passwords, even if you have another keyboard set as your default keyboard? It seems like they could do something similar with selectively turning off access to the accelerometers when entering things like PINs and passwords.

Why not just disable reading from the accelerometer altogether any time the keyboard is active, or temporarily switch to some sort of "reduced accuracy" mode that obliterates the detail needed for these kinds of attacks?
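The "reduced accuracy" idea the commenter floats could be as simple as capping the sample rate and quantizing readings to a coarse grid. The function name and thresholds below are invented for illustration:

```javascript
// Return a degraded copy of a sample, or null if it arrives too soon
// after the previously delivered one.
function degrade(sample, lastDeliveredT, minIntervalMs, grid) {
  if (lastDeliveredT !== null && sample.t - lastDeliveredT < minIntervalMs) {
    return null; // rate cap: drop keystroke-scale detail entirely
  }
  // Snap each axis to a coarse grid, erasing the subtle tilts that
  // distinguish one keypad tap from another.
  const q = v => Math.round(v / grid) * grid;
  return { t: sample.t, x: q(sample.x), y: q(sample.y), z: q(sample.z) };
}
```

Coarse uses like detecting device orientation or a shake gesture would survive this; inferring individual keystrokes, which depends on fine-grained timing and amplitude, should not.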

In iOS 10.3 (and probably earlier), I have a setting under Privacy called Motion and Fitness. If I tap that, I find a list of apps that have asked to use the motion sensors. I have allowed my fitness apps access to them, but nothing else. Safari has never even asked. So I'm not really sure what the problem is here, at least for people who don't grant all requested permissions regardless of whether they are needed.

This maybe won't help in the general cases, but for PIN locks in particular, it seems like the OS could temporarily disable sensors when the unlock screen is presented, at least for the system lock screen. A temporary blackout on sensor access would be minimally disruptive for legitimate web apps running in the background I think.

That and increased use of biometric unlock could help mitigate this particular edge case.

"Let's use the accelerometer to measure the device's movement to measure the fingers' movement when they type, and predict what they typed!"

How do you get your mind in the right place for thinking up this stuff?

I fear side-channel attacks may always be a form of mysticism to me

I've come up with these sorts of ideas before (including some with some pretty serious security implications...fortunately, those have been as a developer on the affected product and I could patch the holes before they got noticed in the wild)...I could never think up that kind of stuff on demand, but sometimes when you let your mind wander you come up with some weird ideas, and on rare occasion those ideas are even good.

Why is it that when any idiot researcher wants to make a report about something, they use versions in the past? Give us recent systems.

Because it takes time to learn the system and figure out how to get anything done on it. This is different than the patched Windows exploit in that the researchers here had to make the exploit themselves from scratch.

Also, remember that Android and iOS don't get updated as often as Windows does for security vulnerabilities, not to mention that there are any number of devices out there that are not eligible for update to the latest and greatest.

Android gets monthly security updates, just like Windows.

Sure, but down to which version? And do off-brand versions of Android get them?

I can understand how a smart person came up with such an idea. The thing I more wonder about is who thought it made sense to allow random websites running javascript to have access to a phone's various sensors.

I'm prompted if I want to allow a website access to my GPS, but not if I want to allow access to my phone's gyroscope. It just seems silly to allow one without prompting, but not the other.

Might one solution simply be a system-level restriction that turns off the relevant sensors completely whenever the on-screen keyboard or number pad is visible on the screen?

Also, as a side note, this just adds one more reason that running ad-blocking software is an important security measure these days. It just makes sense to minimize your exposure to unknown JavaScript served up to your machine from some random third-party ad server.

Hindsight is 20/20, but really it comes down to making the device as accessible to developers as possible. Sure, it opens you up, but if the competition is doing it and everything seems fine for them...


Not "just like Windows".

Android gets updates monthly, but users can't directly install those updates the way you can with Windows. Your phone maker and often your carrier have to review those updates and often spend significant time updating them for specific models. Many models never get updates or only after significant delays.

- Make the sensor APIs opt-in only, on a per-site basis.
- Better yet, make JavaScript opt-in only, on a per-site basis.

(The latter point is usually seen as "too extreme" because it makes life mildly more challenging for the average user. That said, it would mitigate a huge percentage of web attacks, but it would also be bad for advertising. It's worth noting that most of the world's smartphones use operating system software made by a huge advertising firm.)

Well...most users. People running unlocked Nexus/Pixels can get updates on release day. (Which is a major reason I switched to that...security updates are too important to put up with OEM & carrier BS.)

Just ditched my budget BLU phone, in part because it will never receive a software/security update (stuck on Android 5.0.1). Funny enough, being a budget phone, it's lacking a gyroscope and accelerometer - so, at least concerning this exploit, it's more secure than my phone running 7.1.1.

At least in this case chrome doesn't need system level patching anymore.

I'm actually surprised that Safari on iOS (of all things) allows JS on websites to grab sensor data while the iPhone is being unlocked... Probably Apple allows things here that no other app would be able to do, but come on: locking EVERYTHING out while the PIN keypad is on should be just sensible. Even just stopping all code on the phone while the screen is on and the keypad is active would be reasonable.

The fact that I use my fingerprints to unlock my phone and rarely have to type my PIN doesn't exactly calm me here. If they miss this they may miss other things too.

While an effective means of thwarting the PINLogger attack, the updates also prevent the browsers from supporting useful features, such as those provided by fitness and exercise websites....

"There is no straightforward fix to the problem without also breaking potentially useful Web applications in the future," another Newcastle University researcher, Feng Hao, told Ars....

By accessing accelerometer and gyroscope sensors, the Web-hosted JavaScript measures subtle changes in a phone's angle, rotation, movement speed, and similar characteristics. The data, in turn, can reveal sensitive information about the phone and its user, including the precise start and end of each phone call, if the person using it is stationary, walking, running, on a bus, in a car, or on a train.

What, exactly, are all these 'websites' that all of us are thrilled to be using, the core features of which would be 'broken' without access to this sensor data? I don't personally know many users who use the web page of a fitness service over long periods of time (or sites that encourage them to do so, rather than continually suggesting their mobile app). In fact, it feels like the uses this access enables are primarily location-based advertising or gaming? Curious to know what I've been missing.

The lock screen on the iPhone has always been just a UI element and not any actual security. Remember how many times the lock screen has been bypassed just by tapping the screen in the right order and timing?

So the sensors are measuring acceleration. Preventing movement => no acceleration => no measurements => no spying possible. One way to stop the phone moving when entering your PIN might therefore be to hold it firmly against a fixed surface — put it on a table or hold it up against a wall or a tree. That probably gives an advantage to phones with a flat back.

Another question is whether these scripts can tell when the phone is prompting the user to enter their PIN? My guess is that they can't ask to be told when the PIN prompt appears, so they would have to distinguish PIN entry from regular typing or doing other kinds of stuff just by looking at the accelerometer data. Can you say battery drain?

Personally, I hold my phone with an iron grip and an enraged visage--cursing existence--whilst entering my PIN. This way, whatever private data the Soulless They might gather, They will also ingest my hatred.

Just ditched my budget BLU phone, in part because it will never receive a software/security update (stuck on Android 5.0.1). Funny enough, being a budget phone, it's lacking a gyroscope and accelerometer - so, at least concerning this exploit, it's more secure than my phone running 7.1.1.

Wow, that must be a serious budget phone to lack those sensors. Even my budget phone I bought for $20 refurbished has them.

You should at least read the text instead of relying on the pretty pictures. Heck, even the full paper is linked and available to read.

TL;DR - access to these sensors is part of the W3C spec.

If you read the text, you'd have spied this fact:

Quote:

In Apple Security Updates for iOS 9.3 released in March 2016, Safari took a similar countermeasure by suspending the availability of motion and orientation data when a page is hidden.

Which seems to suggest more recent versions of iOS are in fact immune to this exploit and have been for about a year. The information in the table is out of date and misleading, regardless of the W3C spec.

Yes, you can install the app. A lot of people don't or can't install apps, though.

I, for example, have dozens of apps on my private phone, but my work phone is a measly 16GB iPhone 6, which forces me to delete all but the barest essentials I need for work. At 94.8MB Google Maps is far from the biggest app out there, but it is not insignificant. It did survive the last round of app culling, but if I have to do it again it'll probably get deleted, and replaced with a bookmark icon to maps.google.com on the home screen.

And then there's my mother, who has finally had her Internet epiphany with an iPad I gave her a few years ago. She loves surfing with Safari, but she remains overwhelmed by the App Store and intimidated by apps; she will not open them, even if I download them for her. When she needs to look at maps, she googles for "Map" and opens the web page.

You can't entirely blame Google for wanting to serve people like me and my mother, nor the Safari team to want to facilitate that.

At least in this case chrome doesn't need system level patching anymore.

A fair point. However, many Android phones—including Samsung's—ship with browsers other than Chrome set as the default. These vendor-specific browsers are usually based on a slightly older version of Chromium (open-source Chrome) and are not part of any monthly security updates Google does.