Last year, a Googler named Dr. Elie Bursztein noticed that Apple's App Store protocols weren't very secure.
Much of the interaction your iDevice had with the App Store was conducted via plain old HTTP.
Apple should really have been using HTTPS, or secure HTTP.
HTTPS, as you probably know, is HTTP traffic carried inside a Secure Sockets Layer (SSL) or Transport Layer Security (TLS) wrapper.

SSL/TLS uses public-key cryptography to create a secure data channel, even between users or websites that have never corresponded before. Conventional encryption, like a door lock, relies on a single key that can both lock and unlock; how to share that one-size-fits-all key before you start using it is a security problem all of its own. Public-key cryptography instead relies on an algorithm that uses two keys: one is kept private, and the other made public. What the public key locks, only the private key can unlock.
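To make the lock-and-key metaphor concrete, here's a toy sketch of the RSA idea in Python. The primes are deliberately tiny so you can follow the arithmetic by hand; real keys are hundreds of digits long and real implementations add padding and much more, so treat this as illustration only, never as actual security:

```python
# Toy RSA: numbers small enough to follow by hand.
# Utterly insecure - for illustration only.

p, q = 61, 53            # two (tiny) secret primes
n = p * q                # 3233: the public modulus
phi = (p - 1) * (q - 1)  # 3120: used to derive the private exponent
e = 17                   # public exponent  -> public key is (n, e)
d = pow(e, -1, phi)      # 2753: private exponent -> private key is (n, d)

def encrypt(m):
    # Anyone who knows the public key can lock a message...
    return pow(m, e, n)

def decrypt(c):
    # ...but only the holder of the private key can unlock it.
    return pow(c, d, n)

message = 65
ciphertext = encrypt(message)        # 2790
print(decrypt(ciphertext))           # 65 - the original comes back
```

Notice that publishing `(n, e)` lets anyone encrypt to you, yet recovering `d` from them requires factoring `n` into `p` and `q`, which is what makes the scheme tick.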

The problem with HTTP is that if you're on someone else's network, whether it's wired or wireless, they can probably listen into all your web traffic.
Likewise, if someone else is on your network, they can do the same thing, eavesdropping undetectably.
Worse still, it's very likely that they'll be able not only to watch what you're doing, but also to modify the traffic you send and receive.
So, in an ideal world, there would be HTTPS only, since the encryption layer inhibits both eavesdropping and unauthorised modification. Nobody would use HTTP for anything.
And why not? SSL/TLS encryption can be made largely transparent both to the programmer and the user, so the difference in online experience between encrypted and unencrypted web sessions is pretty modest.
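To see just how thin that layer of transparency is for the programmer, here's a hedged sketch using Python's standard `ssl` module: the difference between a plaintext channel and an encrypted one is essentially a single `wrap_socket()` call, and the caller reads and writes the returned socket identically either way (the host name is whatever server you're talking to):

```python
import socket
import ssl

def open_channel(host, port=443, use_tls=True):
    """Open a TCP connection, optionally wrapped in TLS.

    The only difference between the plaintext and the encrypted
    version is the wrap_socket() call - everything after that
    looks the same to the calling code.
    """
    sock = socket.create_connection((host, port))
    if not use_tls:
        return sock
    context = ssl.create_default_context()
    return context.wrap_socket(sock, server_hostname=host)

# A default context already insists on certificate validation
# and hostname checking - the programmer gets both for free.
context = ssl.create_default_context()
print(context.verify_mode == ssl.CERT_REQUIRED)   # True
print(context.check_hostname)                     # True
```

The defaults shown are why the experience can be "largely transparent": the library, not the application author, handles the handshake, the certificate checks and the encryption.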
In practice, however, HTTPS isn't quite as convenient for your IT department as HTTP.
You need to get your certificates signed, store your private keys securely, and more.
That means an operational change, which means paperwork, implementation time and (you can guess what comes next) money.
Also, because every HTTPS download is encrypted uniquely for each user each time they fetch it, it's much harder to cache HTTPS traffic.
If 2000 users from the USA pull down the same image file from your server in New Zealand, you can't rely on a web cache on the USA side to serve up an identical copy of the file to 1999 of them, because each download is individually negotiated and encrypted.
As a result, a sort-of HTTP/HTTPS hybrid evolved.
You use HTTPS for the parts of the transaction that really have to be secret, such as sending passwords, credit card numbers and other Personally Identifiable Information (PII).
For everything else, you use HTTP.
That was the model used by many online services, including webmail providers and social networks, until fairly recently.
Things started to change after the release of Firesheep, security researcher Eric Butler's mildly controversial effort to push the envelope of web encryption.
Implemented as a Firefox plugin, Firesheep listened on the network until the HTTPS-protected part of your social networking session was complete.
Then it sniffed out your session cookie, the magic token embedded in your post-authentication HTTP requests that tells Facebook, Twitter and others that you're an authorised user.
Firesheep could then pretend to be you, posting status updates, links, tweets and more from your accounts as if you had done it yourself.
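The trick needs no cryptographic wizardry at all. As a hedged, offline illustration (the site, cookie name and values below are made up, and a real attacker would capture the traffic off the network rather than from a string), this sketch shows how an eavesdropper who sees just one plain-HTTP request can lift the session cookie and stamp it into a forged request of their own:

```python
# Offline sketch of a Firesheep-style cookie replay.
# All traffic below is fabricated for illustration.

sniffed_request = (
    "GET /home HTTP/1.1\r\n"
    "Host: social.example\r\n"               # hypothetical site
    "Cookie: session_id=abc123deadbeef\r\n"  # the magic token
    "\r\n"
)

def extract_cookie(raw_http):
    """Pull the Cookie header out of a plaintext HTTP request."""
    for line in raw_http.split("\r\n"):
        if line.lower().startswith("cookie:"):
            return line.split(":", 1)[1].strip()
    return None

stolen = extract_cookie(sniffed_request)

# The attacker now builds a request of their own. No password is
# needed: to the server, the cookie alone proves "authentication".
forged_request = (
    "POST /status HTTP/1.1\r\n"
    "Host: social.example\r\n"
    f"Cookie: {stolen}\r\n"
    "\r\n"
    "Posting as you now..."
)

print(stolen)  # session_id=abc123deadbeef
```

Because the cookie travelled over unencrypted HTTP, no certificates needed forging and no passwords needed cracking, which is exactly why wrapping the *whole* session in HTTPS, not just the login, matters.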
Of course, even without actively hijacking your social networking accounts, an eavesdropper can learn an awful lot about you from your HTTP traffic.
After all, not everything you upload to Facebook or Twitter is inevitably intended for public consumption, so it oughtn't really to be uploaded without being wrapped in an SSL/TLS session.
Facebook, Twitter and others, bless them all, eventually bit the bullet and simply switched to HTTPS for everything. (At least, they did for web-based clients. Special-purpose mobile apps were, and some still are, a different story, but we shall ignore that issue here.)
But Apple, it seems, didn't bother with HTTPS everywhere, even for its own App Store, until 2013.
Since there's no other place to shop when you're buying or selling iDevice software, and since Apple likes it that way, you might think that Cupertino would have set the bar a bit higher.
You might also have expected Apple to react a bit more quickly after Dr. Bursztein's fairly detailed explanations of why the bar really needed to be higher.
In July 2012, he reported several problems to Apple, details he has now made public, including active attacks (where an attacker modifies HTTP content in transit between server and client) by which a malcontent could steal your password, trick you into buying the wrong App, deliver you a bogus update, or quietly prevent you from applying a needed one.
Bursztein also showed that the App Store routinely uploaded an unencrypted list of already-installed Apps from your device.
That doesn't sound like much, but it is.
Firstly, some of those Apps will identify aspects of your life that would be handy for a social engineer to know: the bank you use, the newspapers you like, the games you play, the share-trading services you invest with, and more.
Secondly, the complete selection of Apps on your device may very well be unique to you, thus making it a handy form of digital fingerprint for an attacker.
Earlier this year, Apple finally made a start on the change that many of its competitors for web traffic, such as Google, Facebook and Twitter, made some time ago, and bumped all the App Store's active content to HTTPS.
Good. (Better yet would have been to serve everything using HTTPS, but let's be thankful for what we've got.)
If you're a web developer and your web services rely on users sending you traffic that contains anything at all that oughtn't to be public, you should be doing the same.
Even data that isn't legally considered PII can be pure gold to cybercrooks, and so leaking it could be putting your customers at risk.
And you wouldn't want that, would you?