
NeverVotedBush writes with this excerpt from CNet:
"A security firm disclosed holes today in mobile apps from Bank of America, USAA, Chase, Wells Fargo and TD Ameritrade, prompting a scramble by most of the companies to update the apps. ... Specifically, viaForensics concluded that: the USAA's Android app stored copies of Web pages a user visited on the phone; TD Ameritrade's iPhone and Android apps were storing the user name in plain text on the phone; Wells Fargo's Android app stored user name, password, and account data in plain text on the phone; Bank of America's Android app saves a security question (used if a user was accessing the site from an unrecognized device) in plain text on the phone; and Chase's iPhone app stores the username on a phone if the user chose that option, according to the report. Meanwhile, the iPhone apps from USAA, Bank of America, Wells Fargo, and Vanguard and PayPal's Android app all passed the security tests and were found to be handling data securely."

"But how is Chase's App on iPhone "insecure" when it is the user's responsibility to not leave their username laying around ?"

... for the same reason that there isn't a little box to write your PIN in on ATM cards. If you offer people a less secure but simpler alternative, then many of them will use it out of sheer, if understandable, ignorance of the implications. Since leaving your username information "laying around" is a security concern, the only way to keep the mass of people from making things less secure is to not offer the option in the first place. It is the responsibility of the banks, who have security experts, to make things more secure. It cannot sit on the shoulders of the masses, as you suggest it should, because it is a known fact that most people using the app are not security experts.

Indeed, by offering the option, they are implying that there is no issue with using it.

This is not a platform battle. The banks clearly take shortcuts or hire developers unfit for the task. Maybe the iPhone developers also developed the Android apps and were not properly educated on Android development (just a thought).

You mean the same review process that keeps letting apps through that clearly violate Apple's policies (WiFi sharing, emulators, etc.), only to have them pulled within a day when Apple finally hears about it from users?

Apple is clearly incapable of determining reliably what the major function of an application is, so you're incredibly naive if you think they are able to figure out something like whether an app stores a password somewhere.

Most institutions are concerned with whether they are legally covered, and covered adequately for insurance purposes. Actually preventing customers from having money stolen is much, much less important. The concern of the higher-ups will be "did they sign our agreement that says we're protected?" more than "are our customers actually protected?"

IT systems are a tool, like an axe or a chainsaw. The problem is you may not realize you want steel-toed boots until your foot protests strenuously at being attacked.

These apps have the ability to remember the user's credentials. The program can either store them in plain text or in a reversibly encrypted manner. There is only marginal benefit to encryption, as someone can quickly figure out how to reverse it. The solution is to not store the username or password, but then people would simply ask for that feature. Any bet the apps transmit the username/password in cleartext as well?

Agreed - it drives me nuts when app makers deny me the ability to cache passwords, etc. It drives me nuts that Chromium stopped caching passwords at the request of the pages themselves a few months ago.

If I want it to remember my password, I'll tell it to. If I don't want it to remember my password, I'll tell it not to. Either way my password gets recorded in a safe place - I can't set and remember 487 unique passwords for all the sites I visit. I just don't want app writers dictating that choice for me.

What practical alternative do you offer, beyond making the user type it every time? At best you can encrypt it using a different password, which you have to type every time (at least that is only one password to remember, but only if the phone provides some kind of central support for this).

Plaintext or encrypted makes no difference - if the key is stored on the phone anyway. What passes for "encrypted" in most applications is really just obfuscation.

If the site accepted a cryptographic certificate from the device the first time you log in and select "remember me", then future logins could be done with your saved user name and the cryptographic identity of your device. An attacker would have to have a real-time rootkit on your device, or take the crypto chip out of your device without you noticing it being destroyed.
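A minimal sketch of this idea in Python, using a symmetric secret and HMAC for brevity (the proposal really calls for public-key certificates in tamper-resistant hardware; all names here are illustrative):

```python
import hashlib
import hmac
import secrets

# Enrollment ("remember me"): the device generates a secret and the bank
# keeps a copy server-side. With public-key certificates, as proposed
# above, the bank would only need the public half.
device_secret = secrets.token_bytes(32)
bank_copy = device_secret

def respond(secret: bytes, challenge: bytes) -> bytes:
    # The device proves possession of its key without ever sending it.
    return hmac.new(secret, challenge, hashlib.sha256).digest()

# Later login: the bank issues a fresh nonce, the device answers it.
challenge = secrets.token_bytes(16)
ok = hmac.compare_digest(respond(device_secret, challenge),
                         respond(bank_copy, challenge))
# A stolen username is useless without the device key; an attacker needs
# a live rootkit on the device, exactly as described above.
```

Because the nonce is fresh each time, replaying an old response gets an attacker nowhere.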

Not a bad idea. I'd also have to insist that all devices be delivered without keys, so that there is no way to know whether a key is the original one or one that was generated later. Also, as the owner of the device I should be able to choose whether the device generates a key that never leaves the device, or one that it gives me a copy of for safe keeping. The device should never remember or disclose which option was chosen.

This is necessary to defeat trusted computing approaches.

The chip should have the following functions: loadKey, wipeKey, signData, cipherData. You give the device any key you want, but it will never under any circumstance output a plaintext key; this should be enforced in silicon such that the chip cannot read the key store for output.

I'd add generateKey as well - for secure key generation (without any way to distinguish which way the key got there). However, what you suggest seems fine.

I'd also arrange that, by law, devices come without any key and that one is generated when the consumer first uses the device. The wipeKey function would need to be easily accessible by an end user as well. Again, the purpose is to make the chip very useful for security and useless for trusted computing.
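The interface sketched in the last few comments (loadKey, generateKey, wipeKey, signData) could look like the following Python class. This is a software illustration only; real hardware would enforce the no-export rule in silicon, and cipherData is omitted for brevity:

```python
import hashlib
import hmac
import secrets

class CryptoChip:
    """Toy model of the proposed chip: keys go in, signatures come out,
    and plaintext key material never comes out (no getter exists)."""

    def __init__(self) -> None:
        self._key = None  # private key store; nothing reads it for output

    def load_key(self, key: bytes) -> None:
        # Owner-supplied key, e.g. one they escrowed for safe keeping.
        self._key = key

    def generate_key(self) -> None:
        # From the outside this is indistinguishable from load_key,
        # which is the anti-trusted-computing property argued for above.
        self._key = secrets.token_bytes(32)

    def wipe_key(self) -> None:
        # Must be easily reachable by the end user.
        self._key = None

    def sign_data(self, data: bytes) -> bytes:
        if self._key is None:
            raise RuntimeError("no key loaded")
        return hmac.new(self._key, data, hashlib.sha256).digest()
```

Signing with HMAC here merely stands in for whatever signature primitive the real silicon would use.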

I take it you've never heard of the OS-level security feature called Keychain, present on both OS X and iOS - basically, it's a way of storing data in an encrypted form, using the user's login password (or PIN) as the seed for the encryption key. Not unbreakable, but surely a hell of a lot better than plaintext.

Considering this ships as default with the OS, it's inexcusable to not use it. Morons.

The encryption bit of the keychain is AES, which is effectively unbreakable as far as current tech goes. However, as I mentioned it uses the user's login password (by default) as the seed to the session key, which might reduce the keyspace quite a bit, if the user has a weak password.

So no, not quite the same. In case a device is stolen, the attacker would need to have access to the user's password or PIN (I think the iPhone encrypts the device's whole storage anyway if it is PIN-protected, so you can't just read the data off directly).

I'll be honest and say I don't really know much about the details of iPhone encryption, but would it be fair to say that without the PIN code, it is not possible to get access to the Keychain database on the phone?

And trying to bruteforce the PIN code would cause the device to get wiped / locked?

I'm asking, because if this is not the case, then bruteforcing the Keychain might be trivial assuming most people use 4-5 number PINs...
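The worry in the last few comments can be made concrete. If the encrypted blob can be copied off the device, so that the wipe/lock-out logic never runs, a key derived only from a 4-digit PIN falls to an offline search. A sketch, with a deliberately cheap key derivation and a hypothetical PIN:

```python
import hashlib

def derive_key(pin: str) -> bytes:
    # Low iteration count to keep the demo quick; real KDFs use far more
    # iterations, which slows a 10,000-PIN search but cannot stop it.
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), b"fixed-salt", 100)

secret_pin = "4821"              # hypothetical user PIN
target = derive_key(secret_pin)  # what an attacker extracted from the device

# Offline brute force: try every 4-digit PIN until the derived key matches.
found = next(f"{p:04d}" for p in range(10_000)
             if derive_key(f"{p:04d}") == target)
```

This is exactly why on-device lockout/wipe matters: it is the only thing standing between a 4-digit PIN and a ten-thousand-guess search.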

In iOS, an application always has access to its own keychain items and does not have access to any other application’s items. The system generates its own password for the keychain, and stores the key on the device in such a way that it is not accessible to any application. When a user backs up iPhone data, the keychain data is backed up but the secrets in the keychain remain encrypted in the backup. The keychain password is not included in the backup. Therefore, passwords and other secrets stored in the keychain on the iPhone cannot be used by someone who gains access to an iPhone backup. For this reason, it is important to use the keychain on iPhone to store passwords and other data (such as cookies) that can be used to log into secure web sites.

I'm not sure where the password is stored. I'd be curious to know if it is on the filesystem (and thus accessible when jailbroken) or if it is perhaps stored in silicon somewhere, and whether this password survives system restores.


Keychain in iOS has limitations; for example, it's always unlocked and the user doesn't have to authenticate for apps to get their keys back out. Still, I agree that apps should use the built-in key services when possible.

Also, on an iPod Touch you should lock it when not in use, so that if it's stolen the thief doesn't have automatic access to your bank account.

"...But that password is plain text!""Well, the program has to read it. I can encrypt it, but then the app will just have to decrypt it, which means there will be a decryption key in plain text""Then encrypt the key!""...errr...."

etc etc.

Either you allow the user to save their login and password, and store it REVERSIBLY, or you don't allow it. If the encryption is reversible then it is totally irrelevant and might as well be plain text, since the "encryption" can be trivially undone.

No, if you know anything about programming the decryption key does not have to be in plaintext.

You could always XOR it and store it in the registry. (Bonus points to those getting this reference). My point is that the decryption method or key is a known quantity and only a mild impediment to getting the password.
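For anyone who missed the reference, the XOR trick looks like this in Python. The point is that the exact same operation that hides the password also reveals it, so it is obfuscation, not encryption (the key and password here are made up):

```python
KEY = 0x5A  # hard-coded "secret" an attacker can read straight out of the binary

def xor_obfuscate(data: bytes) -> bytes:
    # XOR is its own inverse: applying it twice returns the original.
    return bytes(b ^ KEY for b in data)

stored = xor_obfuscate(b"hunter2")   # what lands in the "registry"
recovered = xor_obfuscate(stored)    # the identical call undoes it
```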

You've over-simplified the problem and created a false dichotomy. There are many solutions that are more secure than plain-text. It's not a binary decision. You are correct in that you can't get perfect security, but that doesn't mean you can't do better than plain-text. Perfect is the enemy of good.

First, while you cannot achieve true security through obfuscation, you can certainly improve your odds. If I steal a computer and scan cookies and documents looking for passwords, I'm more likely to find and use the ones sitting in plain text.

As for your first paragraph - this is just obfuscation and no better than ROT-13. It doesn't make anything "harder", it just provides a very very false sense of security that is trivially defeated. Such things are better off NOT DONE so at least the user realizes how insecure it is to store their passwords on the device.

As for the second: do you know how many cell phone users have their phone password-protected at screen-on? I would venture it is close to zero, as it is horrifically inconvenient to do so.

There is a third option. On a successful login, store a token that is unique to that device. If you don't store the password, it's impossible to obtain it. I believe Steam uses an approach like this now, after lots of malware was written to steal users' passwords.
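A sketch of that token scheme in Python, with server and device collapsed into one process for illustration (all names are made up). The phone stores only a revocable random token, so the password itself never touches the device:

```python
import hashlib
import secrets

server_tokens: dict[str, str] = {}   # server side: device id -> token hash

def issue_token(device_id: str) -> bytes:
    # Called once, after a successful password login from this device.
    token = secrets.token_bytes(32)
    server_tokens[device_id] = hashlib.sha256(token).hexdigest()
    return token                     # the device stores this, not the password

def check_token(device_id: str, token: bytes) -> bool:
    # Later logins present the token; the server can also revoke it
    # per-device without forcing a password change.
    return server_tokens.get(device_id) == hashlib.sha256(token).hexdigest()

t = issue_token("phone-1")
```

A stolen token can be revoked server-side without touching the password, which is the practical advantage over caching credentials.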

"Any bet the apps transmit the username/password in cleartext as well?"

Ignoring for the moment that such a phenomenally stupid move would have not only made the article, but also surely have been the focus of the article title, it is absurd to suggest that they aren't using https, as they have been doing properly for years.

Part of the security of an application can be attributed to the underlying platform. It is very difficult to write a secure application on an operating system that doesn't require a user to log on to access the files on the system. On such a system, anybody who can access a terminal can compromise any unencrypted data from any application, and the application developer must work that much harder to secure the data. On the other hand, an operating system that requires log-in and protects files from unauthorized access does much of that work for the developer.

All?! [wikipedia.org] I mean, seriously, what else can you get away with today?

I'm not suggesting that poor security is acceptable, but when is it time for individuals to take some responsibility for the security of their own information? It is precisely because of "chicken with their head cut off" people like you that we have this stupid SOX compliance to deal with in public corporations. SOX should have a narrow focus, because Enron was the result of bad accounting practices, not IT policies.

I read this story immediately since I use one of the listed apps but it wasn't until the end that I saw that it was only the apps running on Android. Should have had the modifier "Android" in the title.

Are Apple's policies or requirements for their app store responsible for this not being an iPhone problem?

How is this only an Android problem? The summary clearly states it relates to both Android and iPhone apps. Should the title have a brand modifier based on the number of times said brand appears in the summary? Ridiculous.

I wouldn't trust those banking apps to not rip me off or expose me, since they're made by the banks. The banks are untrustworthy.

What we need is a standard for consumer banking transactions with any bank server. Then a single client could connect to multiple banks, or to a single one even when it changes its style and services. I would install the banking client app that I trusted and preferred. One view of all my finances, including my IRA, insurance, mortgage, savings, checking, stock market, even perhaps debts owed to/from individual people. In fact I'd like such a client to keep a database of all my financial transactions, including all bills. I'd like it to keep records of every "automatic withdrawal". I'd like it to use my phone to alert me to deposits and withdrawals if I wish, including "OK/Cancel" per transaction. I'd like it to lock each payment with a one time password it generates and sends, instead of using my credit card number in the clear all the time.

Some desktop apps, like Quicken, already do some things like this. But it's time that all my finances are handled by an app I trust that doesn't come from the server that has an interest conflict with me in reporting transactions, that is simple enough without lots of "financial planning" baggage necessarily coming with it. This has been true for email and websites for decades, as well as every other successful kind of info transaction over networks for even longer. It's long past time to leave the consumer side of the banking to businesses actually in the business of serving consumers. Banks are not in that business, haven't been in a long time, and show less and less real interest or reliability in returning to it.

While the regulators need changing to truly protect us from banks, we just took a big step backwards this week by putting Republicans back in charge of that legislation. They are busy deregulating again, though the most they'll probably get is monkeywrenching the new regulations. The reason the legislators can't be trusted is because Americans are stupid, and vote for corrupt legislators, even when that's obviously what they're getting.

While the regulators need changing to truly protect us from banks, we just took a big step backwards this week by putting Republicans back in charge of that legislation. They are busy deregulating again, though the most they'll probably get is monkeywrenching the new regulations.

The Democrats control the Senate and the White House. Since any legislation has to pass both houses and be signed by the President, I think your scenario is pretty unlikely to occur....but don't let me get in the way of an uninformed rant. Carry on.

Forget that: how long have you been reading the news? How could you think that banks are either trustworthy or reliable? If it weren't for the public (FDIC, FSLIC, Federal Reserve, Treasury) bailing them out every 10 years, they'd have lost more money than ever existed. It doesn't get more untrustworthy than that.

I wouldn't trust those banking apps to not rip me off or expose me, since they're made by the banks. The banks are untrustworthy.

Then if I were you, I would find another place to store my money. If I didn't trust my bank to make an app for me to manage my account and not rip me off [assuming I believe them to be competent developers], why the hell would I let them hold my money?
Other than that, I agree with the rest of your post.

None of the banks actually write software themselves - certainly not any retail consumer bank. All of their software, especially something like a consumer mobile app, is written by an independent software firm, usually for hire but possibly an exclusive license of a premade app. As far as I can tell, every banking app is tied exclusively to a single bank that allows it access, which is how banks do business. I know - I've had bank customers for software of all kinds for going on 20 years, including lots of

I use the Chase iPhone app and am perfectly happy with its security. I did not opt to store my username on the phone, and therefore my security was never in a perilous state. People who chose to store their username on the phone have a SLIGHTLY less secure system, but probably chose to do so because their password is very secure or they just don't care. I think this is more about people than systems.

Really? You got your PayPal login information stolen by a sniffer app on a non-jailbroken iPhone? Where was the news story? (I am serious.) Oh right, there isn't one, because you are not telling the truth. Either A: you jailbroke your iPhone, installed a bunch of random crap, and then decided it was a good idea to use it for financial transactions too, or B: you are full of shit. I vote B...

Meanwhile, the iPhone apps from USAA, Bank of America, Wells Fargo, and Vanguard and PayPal's Android app all passed the security tests and were found to be handling data securely.

This article is attempting to make the iPhone look less problematic than Android-based phones.

Examples:
- Why don't they list the unaffected Android apps as they do for iPhone?
- Why don't they mention that the Android PayPal app is unaffected, unlike its iPhone counterpart?
- Why would they provide a link to "Google Android" and not "iPhone iOS", other than to highlight "Android" in bright blue along with the title of this article?

Tell me about it. I wouldn't even consider these "holes", as they aren't (immediately) remotely exploitable. If that were the case, Firefox has the same "major security hole", as it remembers my bank username and passwords. I'm not even sure the iPhone apps are locally exploitable without an OS-level jailbreak first?
It's simply the loudest "researcher" that gets the most headlines and generates the most noise.

I've had numerous discussions with my credit union about their inadequate response to computer security.

For instance, their customer service messaging is handled through a third party, so e-mails will reference a third party URL that seemingly has nothing to do with the credit union. I've tried to explain phishing attacks to the credit union to no avail.

At the same time, their customers are pressuring them to support transactions via the phone. What a disaster. The sad part is, as a credit union me

A hole can be fixed. An app that saves data, plain-text or not, is just that: an app that saves data. The real issue is WHERE these apps are running: on mobile devices which, if not properly secured, serve as an easy target when stolen or lost.
Is a non-secured iPhone full of holes because all of your contacts, e-mail, etc. are stored in plain text?

Really, I mean come on: after 20 years of online banking, they come back with these junior-level mistakes. Someone should lose their job for lack of competence... and I am not talking about the junior programmers! Quality control is a management-level responsibility, and if you have no unit testing or diagnostics tools, you hire a firm that does, especially when dealing with banking info like this.

Thanks to the company that did the research, we nipped it in the bud. How many more problems like this are still out there?