Developers for the open-source smartphone firmware have released a patch.

CyanogenMod, an alternative open-source firmware distribution for Android smartphones and tablets, has been recording the swipe gestures used to unlock the devices, a developer involved in the open-source project said. The issue has been corrected in an update.

Gabriel Castro noticed an extra line of code that appears to have been added in August, when CyanogenMod was updated to make the grid size for screen locks configurable rather than fixed. In the process, a line was introduced that caused the unlock pattern to be recorded in a log file stored on the device.

"I'm really surprised nobody caught this," Castro wrote in comments accompanying a fix. An alternative to removing the line is adding a character to the code so it's treated as a comment and isn't executed. Either approach can be done "without breaking anything," he added.

The undocumented screen-lock logging isn't as serious for most users as many reported vulnerabilities are. To exploit the weakness, an attacker would first have to gain access to a device, or possibly a device backup stored on a hard drive. That makes it hard for an attacker to exploit the weakness on a large number of devices.

Still, the logging behavior could be of use in targeted attacks. Further, security experts have long cautioned against storing passwords, personal identification numbers, and other passcodes in plaintext, or even in an encrypted format that can be reversed. Logging the pattern required to unlock a phone is unnecessary and can only diminish security.

As many as 2.5 million phones have installed CyanogenMod, which offers easier updating and is therefore viewed by some users as a more secure alternative to official carrier releases that often can't be upgraded regularly or at all. There's no benefit to having unlock patterns logged, so it makes sense to upgrade to the latest nightly build available here.

This isn't a big deal, Log.v() is used for verbose logging and is commonly used for debugging. They probably added that line in while doing some testing and forgot to remove it when they were finished. Besides, not only do you need to *first* gain access to the device to get to the archived logcat logs, but this doesn't even reveal anything insecure if you had the log file and were just reading through it. You would see a verbose debugging line with some coordinates for which you'd have no idea what they are for. They could literally be for anything.

The offending line would print: SMASHER816/([column], [row])

Where [row] is the row value and [column] is the column value. This would mean nothing to someone just perusing the logs at random, assuming they had access to them in the first place. I guess you could figure out what it was for by looking at a stack trace and the source code side-by-side eventually, but I doubt anyone would make the effort. The point still stands that this is logged locally anyway, so to get the unlock pattern you would first have to... Unlock the phone with the pattern.
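To make the above concrete for non-developers, here is a minimal, hypothetical sketch of what such a debug line looks like (the class and method names are illustrative, and a small stub stands in for android.util.Log, which only exists on Android):

```java
public class PatternLogDemo {
    // Stub for android.util.Log: on a device, Log.v() appends to the in-memory logcat buffer.
    static final class Log {
        static void v(String tag, String msg) {
            System.out.println("V/" + tag + ": " + msg);
        }
    }

    // Builds the message described above: SMASHER816/([column], [row]).
    static String formatCell(int row, int column) {
        return "SMASHER816/(" + column + ", " + row + ")";
    }

    // Hypothetical callback invoked as the user's finger reaches each cell of the pattern.
    static void onCellAdded(int row, int column) {
        Log.v("LockPatternView", formatCell(row, column)); // the offending debug line
    }

    public static void main(String[] args) {
        // Simulate swiping three cells of an unlock pattern.
        onCellAdded(0, 0);
        onCellAdded(1, 0);
        onCellAdded(2, 1);
    }
}
```

Deleting the Log.v call, or commenting it out, is the whole fix; nothing else depends on it.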

No big deal, just patch it and move on. The alternative is to have the carriers install rootkits and then threaten anyone who exposes them. HTC in particular modified CarrierIQ to make it even more invasive, but hey, at least they're 'taking it seriously'.

Things like this are what motivated me to root my phones and seize control back from these clowns. I trust the enthusiast community much more than the carriers/manufacturers, simple as that.

It looks like Android is becoming the IE of the 2010s. Every week there is a new security vulnerability found.

It's not a security vulnerability. It's a line added to a log file stored on the device, which is inaccessible unless you can gain access to the device and then unlock it to read it. And since by then you've unlocked it anyway, this information would already be known to you before you read it.

Somebody who knows their way around GitHub ought to be able to identify the account that put this line in, right? I'd be curious to hear a little forensics on that part of the story.

The highlighted line of code actually shows the 'culprit'. smasher816 was the last person to commit code to that file (on September 1st), when he "Add[ed] support for variable size pattern lockscreen". He even placed his username in the log statement to show that it was from him.


Also, smasher816 is Rowan Decker on GitHub.

ciwrl from CyanogenMod here. This sums up the topic succinctly. The code was implemented for testing of a patch that enabled changing from a 3x3 grid to a wider grid (ie 6x6) http://goo.gl/6pOrs implemented by smasher816. The offending line was missed in review http://goo.gl/aYF97

While the risk of exploitation is low, we aren't fans of leaving identified gaps open, and it was patched within 8 hours of being reported.

To the author: Log.v() is a verbose log method that adds a log entry to LogCat. It does not write anything to a file, and the entry will be erased from memory after a few minutes due to memory constraints (Logcat is similar to ProcMon for Windows, but developers put logs there instead). This article completely ignores the fact that it never writes to a disk, and is entirely false.

To the author: Log.v is a verbose log call that adds an entry to LogCat. It is not written to a file; it is erased from memory after a few minutes and is never stored on disk. This article completely ignores that and is wrong.

It's possible, though, to have an app installed that stores a logcat file to disk for its own purposes, but it would catch everything else as well. Even in that case the file would be secure, since you would have to gain access to the device before you could read it. That's the worst-case scenario, and even then it's not a big deal.

I honestly don't know what the author of this article was thinking or why it's even an article at all. It's not newsworthy.

Yeah, as others have noted, the log is stored in memory, so it's lost if the device loses power or is rebooted. And this patch has no impact on being able to extract a lock pattern from a backup (again, since the log isn't on disk, it isn't part of a backup). But if you have access to an unencrypted backup, do you really need the phone itself anymore?


Just wanted to say thanks for your wonderful work (not just this, but for CM in general, love you guys!)

Brass2TheMax wrote:

I honestly don't know what the author of this article was thinking or why it's even an article at all. It's not newsworthy.

Mgamerz wrote:

To the author: Log.v is a verbose log call that adds an entry to LogCat. It is not written to a file; it is erased from memory after a few minutes and is never stored on disk. This article completely ignores that and is wrong.

You two must be new here. Ars likes to show the evils of Android quite regularly while sweeping certain stuff from a certain fruit company under the rug or preaching how good fruity is while everything else is evil.

Swipe locks as implemented by stock ROMs and clones thereof are vulnerable to physical access anyway. Notwithstanding the lack of security from a factorial-based pattern (you can't swipe to the same "node" twice, so your security options are only 9!) and the ability to pick up the oil trail on the screen, swipe locks can be bypassed easily if USB debugging is enabled, or reset entirely through ADB, USB debugging enabled or not. Given the closure of this "window" in 8 hours, I'd say open source and CyanogenMod are working as intended. Not to mention, the increase in pattern size from 3x3 to 6x6 increases your security lock options from 9! to 36!
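Taking the factorial model above at face value (it's a simplification; real pattern locks also impose minimum lengths and adjacency rules, so the actual counts differ), the jump from 9! to 36! is easy to quantify:

```java
import java.math.BigInteger;

public class PatternSpace {
    // n! as a BigInteger, since 36! overflows every primitive integer type.
    static BigInteger factorial(int n) {
        BigInteger result = BigInteger.ONE;
        for (int i = 2; i <= n; i++) {
            result = result.multiply(BigInteger.valueOf(i));
        }
        return result;
    }

    public static void main(String[] args) {
        System.out.println("9!  = " + factorial(9));  // 362880
        System.out.println("36! = " + factorial(36)); // a 42-digit number
    }
}
```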

Brass2TheMax wrote:

I honestly don't know what the author of this article was thinking or why it's even an article at all. It's not newsworthy.

Yeah, I'm looking for the 'thumbs down' button for the article. If there was one I'd definitely hit it.

While this is easy to fix, it does make me wonder if anybody else has embedded anything secret in the code.

A clever group could embed seemingly innocuous hooks into CyanogenMod that work in tandem with malware to do all sorts of Bad Things.

Except something like that would be caught in a code review quite easily. Here we're talking about one line of code that got by because 1) it's only one line, so it's easily missed; 2) it's the type of line developers use all the time for debugging, so it would be missed simply because reviewers are so used to seeing it; and 3) some developers leave debugging lines in for a while until they know for sure the code is stable, then remove them later. It's possible the reviewer even saw the line and thought it was meant to be left in for a while.

You couldn't get away with committing malicious changes to CyanogenMod, or any other open-source project, really. You couldn't do anything truly malicious, such as stealing information from the user, in only one or two lines of code, and anything bigger would be almost impossible to sneak past unnoticed.

Now, however, a disgruntled employee from a certain company named after a fruit could maliciously slip in some code to their closed-source OS which end-users could never even hope to be aware of, and if that got by no one would be the wiser. That would be much easier to pull off, especially if such an employee was in a higher position than the lowest rung on the ladder. Just saying.


Thanks for the great response ciwrl (in your links and quick fix)! Here's the xda post on the feature if anyone's interested.


Okay, so the key finding here is that the logged line was legitimate debugging code, and was almost certainly not some step down the road toward an 'advanced persistent threat' or the like. Good to know.


Yeah pretty much. So basically this article is completely irrelevant to any sort of news whatsoever.

Is Ars going to start a new section where they "report" on every single Git commit for Cyanogenmod ever and give their inexperienced commentary on how they think it's insecure when it's really not? Because that would be a great April Fool's joke. You should save it until then.

In all seriousness, I'm still waiting for this article to be deleted. I'm surprised Ars doesn't even seem to be embarrassed about it. It's a shining example of a perfect journalistic fuck-up.

This is not nearly as big of a deal as it seems. First of all, in nightly builds like this it is normal for there to be increased logging until a release is ready. Second, in Jelly Bean/CM10 applications can no longer be granted the READ_LOGS permission, so the only way to read these logs is to have root access. If a malicious app or person already has root access, being able to read your unlock pattern is the least of your concerns. Finally, even hashed versions of unlock patterns are easily cracked: there are fewer than 15,000 possible combinations for unlock patterns, and even the most complex unlock pattern hashes can be cracked in less than a millisecond on modern GPUs.

TL;DR If a malicious someone already has root access, who cares if your unlock pattern is stored in plaintext?
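The "cracked in less than a millisecond" point is easy to illustrate. Older Android versions reportedly stored the pattern as an unsalted SHA-1 hash of the cell-index byte sequence (in gesture.key); the sketch below assumes that format and, for brevity, ignores adjacency rules and only tries length-4 patterns:

```java
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.util.Arrays;

public class PatternBrute {
    // Hash a pattern the way old Android's gesture.key reportedly did:
    // an unsalted SHA-1 over the byte sequence of cell indices (0-8).
    static byte[] hashPattern(byte[] cells) {
        try {
            return MessageDigest.getInstance("SHA-1").digest(cells);
        } catch (NoSuchAlgorithmException e) {
            throw new RuntimeException(e); // SHA-1 is always available on the JVM
        }
    }

    // Try every length-4 sequence of distinct cells until one hashes to the target.
    static byte[] crack(byte[] target) {
        for (int a = 0; a < 9; a++)
            for (int b = 0; b < 9; b++)
                for (int c = 0; c < 9; c++)
                    for (int d = 0; d < 9; d++) {
                        if (a == b || a == c || a == d || b == c || b == d || c == d) continue;
                        byte[] candidate = {(byte) a, (byte) b, (byte) c, (byte) d};
                        if (Arrays.equals(hashPattern(candidate), target)) return candidate;
                    }
        return null; // not a length-4 pattern
    }

    public static void main(String[] args) {
        byte[] secret = {0, 1, 2, 5}; // an L-shaped pattern: top row, then down one
        byte[] recovered = crack(hashPattern(secret));
        System.out.println(Arrays.toString(recovered)); // prints [0, 1, 2, 5]
    }
}
```

Whatever the exact count, the pattern space is tiny by password-cracking standards, which is why hashing without salt or stretching buys essentially nothing here.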

This is the first time I've heard that Log.v stores information in a file in production. I only catch those messages in the emulator's LogCat view in the IDE, or on the command line with adb (the Android Debug Bridge) connected over USB.

To connect via USB and access LogCat you have to put the phone in debug mode, so you can hardly get an attack vector out of this.

I thought LogCat's internal format was a circular buffer in memory. Are you sure the article is right and Log.x() is writing to a file on the phone's SD card?

If an app has the READ_LOGS permission, can't it read Log.v? (Not that you should ever install something with that permission.)

IagoRubio wrote:

Anyway, they should use ProGuard to strip all those Log calls; even though nobody views them, the calls are still executed for nothing.

With ProGuard, which is nicely integrated with the Android SDK, stripping those calls is quite easy. Or at least put them behind an if (BuildConfig.DEBUG) guard so they are only executed in debug builds.
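For reference, the ProGuard approach described above is commonly done with an -assumenosideeffects rule (a sketch; the exact method list depends on which log levels you want stripped from release builds):

```
# Tell ProGuard these calls have no side effects, so its optimizer
# may remove the invocations entirely from optimized (release) builds.
-assumenosideeffects class android.util.Log {
    public static int v(...);
    public static int d(...);
}
```

Note that this only takes effect when ProGuard's optimization step is enabled.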

That's probably true for Log.v, but you wouldn't want to remove all logs or error reports would be much less helpful. (I don't know where it goes for CyanogenMod, but stock Android has Google Feedback.)

Guys, being open source isn't proof against malicious and surreptitious code. Any reasonably clever individual can obfuscate such that their code doesn't do what it looks like it does.

This exists as a proof of concept that either:

1. People are actively trying to embed weird shit into CyanogenMod, and this one in particular was lazy enough to be caught.

2. CyanogenMod developers are ineptly leaving in all sorts of debug hooks and code which expose more than the user intends to software that isn't playing nice.

Take your pick.

You don't know what you're talking about. CyanogenMod changes go through a code review. If the reviewer can't tell what the code is supposed to do, it gets called into question.

1. In this instance, it was a legitimate DEBUGGING LINE that was left in and removed afterwards for *cleanliness* and *reduction in overhead*. It wasn't malicious, nor could it really be used in a malicious way... unless you've already unlocked the device somehow *and* the logcat logs were being stored, which they are not by default.

2. A "debug hook"? I don't know what the hell you're talking about here. You should read up and actually get a working understanding of Android and programming in general, because you're obviously very inexperienced, if at all.

You're right, though, that being open source doesn't make a project bulletproof against malicious code being put in. But that's why they do code reviews. It's not a "whoever wants to commit, just commit, and we won't bother checking it" process. If the reviewers can't tell what the code does or why it's there, it's questioned. A debugging line can be forgiven if it's not questioned; it's a legitimate tool that's often left in, even in production builds, usually until the devs are sure a new feature or enhancement is stable.

This is basic knowledge in the industry, man. If you don't know how it works, don't comment. You end up looking like an uninformed ass.

Slightly off-topic, but this "exploit" (as others have pointed out, it's pretty innocuous) gave me an idea for a study:

It would be interesting to have an anonymous, opt-in investigation using a logging mechanism like this to determine if there is a hierarchy of more or less common lock screen patterns. This would be in the interest of science, of course, but maybe also to show if lock screen patterns are actually secure on average.