I made a mistake in this column–I misread an Apple tech-support note about restoring an iPhone in an Apple Store as evidence that you could also borrow a computer there to back up your iPhone and then restore it. That’s not the case, as two people pointed out, so I’ve asked my editor to correct the piece.

This post started life as a simpler, shorter unpacking of a report about the limits to Internet providers’ visibility of their subscribers’ online activity, but the topic and the word count expanded a bit from there.

As part of this long guide to wedding presents, Casey Johnston interviewed my wife and me about the stand mixer that (I think) some of her parents’ friends gave us, and which I use to make bread every week.

Federal Communications Commission chairman Tom Wheeler offered some not-too-sweeping proposals to limit what your ISP can do with the data it collects about your online activity, and Big Telecom is not amused.

I was pleasantly surprised to see some large Internet providers support IMAP syncing and TLS encryption–but others have horribly obsolete and insecure setups. Think about that when you hear somebody insist that the only way to get good, reliable service online is to pay for it.

This column became a lot more work to report when financial-industry PR types clammed up after I asked what I thought was a simple question about their sites’ security. And then Google wasn’t much more help itself.

But when you think you’ve uncovered an obvious error in a site that’s been out for over a week, it’s usually your own setup at fault. And within minutes of my tweeting about those warnings, I got a reply from the guy who configured the site saying he couldn’t reproduce the problem.

@robpegoraro I configured the site. I can't reproduce, and neither can SSL Labs. Got any more information?

After some quick testing on this computer, my MacBook Air, my iPad and my phone (during which I silently congratulated myself for editing some accusatory sarcasm out of that tweet before posting it), I realized this fault was confined to Safari and Chrome on my two Macs. Every other browser, including Firefox on my iMac, got through to that HTTPS-Only site normally.

Both Macs had an old copy of Comodo Group’s root certificate, one not listed on Apple’s inventory of trusted root certs. I tried deleting that certificate, figuring it probably wouldn’t make things worse–and that was all it took for the HTTPS-Only site to work as advertised and for one or two other sites to stop coughing up security warnings.
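
One way to vet a cached root certificate before deleting it is to compare its fingerprint against the vendor’s published trust list. Here’s a minimal sketch of that check, assuming the suspect cert has already been exported as DER bytes (the byte strings and the trusted-fingerprint set below are hypothetical placeholders, not real certificates):

```python
import hashlib

def sha256_fingerprint(der_bytes: bytes) -> str:
    """Return a SHA-256 fingerprint in colon-separated hex form."""
    digest = hashlib.sha256(der_bytes).hexdigest().upper()
    return ":".join(digest[i:i + 2] for i in range(0, len(digest), 2))

def is_trusted(der_bytes: bytes, trusted_fingerprints: set) -> bool:
    # A root whose fingerprint isn't on the published list is a
    # candidate for deletion, like my stale Comodo certificate.
    return sha256_fingerprint(der_bytes) in trusted_fingerprints
```

Comparing fingerprints rather than names matters here, since the whole problem was two certificates sharing an issuer but not contents.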

With my encrypted browsing back to normal, I’m left to wonder how my system keychains got tangled up like that. Any theories? Before you ask: Yes, I’ve done a full scan with the ClamXav malware scanner and haven’t found any issues.

I enjoyed crafting the photo for this, and not just because it gave me an excuse to flip through old postcards. I did not enjoy reading the comments as much: the repeated assertion there that nothing online can be made secure is both incorrect on a technical level and fundamentally defeatist.

Yes, you read about this topic earlier this year in my USAT column. But this time around the remedy may work a little more reliably. There’s also a tip about watching Netflix on a computer without Microsoft’s Silverlight plug-in–if you’re running Windows 8.1.

For at least the last decade, I’ve been telling readers that open-source development matters and helps make better software. If everybody can read the code of an application or an operating system, there can’t be any hidden backdoors; if anybody can rewrite that code to fix vulnerabilities and add features, the software’s progress can’t be thwarted by any one company’s distraction, fraud or bankruptcy.

My experience using open-source software tells me this is true–even if that doesn’t guarantee a constant rate of improvement or an elegant interface.

And if any genre of software should benefit from this method of development, it ought to be code that Web sites use to secure their interactions with users against eavesdropping: Everybody sending or storing private information needs this feature, billions of dollars of transactions are at stake, and you don’t even have to worry about wrapping a home-user-friendly UI around it.

And despite out-of-bounds buffer reads being a well-known risk with documented defenses, nobody caught this for two years. Two years! It took a Google researcher and engineers at the Finnish security firm Codenomicon, each working separately, to find the bug and report it to the OpenSSL team.

“Catastrophic” is the right word. On the scale of 1 to 10, this is an 11.

It seems that everything that could go right in open-source development went wrong in this case. As an excellent story from Craig Timberg in the Post outlines, the free nature of OpenSSL made it an obvious choice for hundreds of thousands of sites and something of a natural monopoly; that same enormous deployment encouraged people to assume that they themselves didn’t need to inspect the code that carefully; and OpenSSL’s developers got so little financial support from the corporations relying on their work that they couldn’t even subject their code to a proper security audit.

The stupid thing is, we knew this could happen. See John Viega’s 2000 essay, “The myth of open source security,” in which he outlines how thousands of users failed to catch “a handful of glaring security problems” in code he’d contributed to the Mailman mailing-list manager:

Everyone using Mailman, apparently, assumed that someone else had done the proper security auditing, when, in fact, no one had.

That doesn’t mean that closed-source development suddenly looks better. (When all this is done, Microsoft’s proprietary and hideous Internet Explorer 6 may still have greased the skids for more successful attacks than OpenSSL.) But it does mean that selfishness/laziness/distraction and open source can become a toxic mix, one we should have seen coming.

This is a story I kind of missed during the show, but it also took me a day or two to realize how dangerous CBS’s rationales for interfering with CNET’s editorial decisions would be for tech journalism in the traditional (read: media conglomerate-owned) media. I was glad this little rant got as much attention as it did; I wish that had been followed by accountability for the twit or twits in CBS’s executive suite who thought this stunt would work.

Friday marked the first anniversary of the Internet rearing up and kicking Big Copyright in the hindquarters during the battle to quash the Stop Online Piracy Act. That’s worth celebrating, but a week after the death of net-freedom advocate Aaron Swartz I also thought it necessary to point out all the items remaining on the tech-policy to-do list if you value a more open Internet and technology economy. I hope the result doesn’t make me sound like a total Eeyore.

I discussed the things I saw at CES, Apple’s stock price and other tech-news topics on Gene Steinberg’s podcast. I haven’t heard Kirk McElhearn’s segment yet, but I’m sure the Macworld and TidBITS contributor had insightful things to say too.

I returned to the topic I covered in my USAT column last spring, this time with more context about what Java was supposed to do and how it became the nuisance it is–plus a few remaining, non-Web uses for this software I hadn’t addressed in detail in that earlier piece. There’s also a tip about enabling a security feature Yahoo finally added to its Yahoo Mail service, some five years after Google had provided the same option to Gmail users.