Apple on Saturday said it is working to fix a flaw in OS X that could in some cases allow hackers to intercept communication sent using the SSL/TLS security protocols. The same error was patched in an iOS update the company rolled out on Friday.

CVE ID description for Apple's iOS security flaw.

In a statement provided to Reuters, Apple confirmed researcher findings that the same SSL/TLS security flaw fixed with the latest iOS 7.0.6 update is also present in OS X. The Cupertino company said it expects to have a software update ready for release "very soon."

"We are aware of this issue and already have a software fix that will be released very soon," said Apple spokesperson Trudy Muller.

On Friday, Apple quietly pushed out iOS 7.0.6, with accompanying release notes saying the software "provides a fix for SSL connection verification." A support document issued alongside the update read:

Impact: An attacker with a privileged network position may capture or modify data in sessions protected by SSL/TLS

Description: Secure Transport failed to validate the authenticity of the connection. This issue was addressed by restoring missing validation steps.

End users not running the latest patched iOS software may be open to attacks when connected to a shared network. Nefarious users could potentially view, alter or download email and other data sent via the Secure Sockets Layer (SSL) protocol or its successor, Transport Layer Security (TLS).

As noted in the security document, iOS Secure Transport "failed to validate the authenticity of the connection." At its core, the issue stems from the mishandling and faulty recognition of digital certificates used to establish secure encrypted connections.

In the case of iOS and OS X, Apple's implementation is missing code, causing a failure to verify these certificates. When a user visits what they believe to be a trusted site, hackers can potentially pose as a legitimate certificate holder and collect data sent over the connection before handing it off to the real site.

While it is unclear exactly when Apple discovered the flaw, the CVE (Common Vulnerabilities and Exposures) identification code for the iOS version was reserved by an unknown party on Jan. 8. The CVE is a publicly available standardized reference for known software security vulnerabilities.

So what news articles were there, before the patch was released, about actual attacks using this exploit?

Well, that's a fair point. Almost all of these so-called major flaws or bugs never see the light of day in the real world. They are just ginned up to paranoia level by trolls and security software hawkers.

For someone to exploit this, they need to be on a network between you and your destination. If you're on your home network, that's just people on your router and your ISP. They also would need to know that the exploit exists and how to exploit it to their advantage.

The worst case is public wifi, if you check email or do any digital banking. But someone would have to be dumping pretty much all traffic from a public hotspot at all times, in the hope that someone doing something worthwhile comes along with a vulnerable device, and then exploit it. Now that the exploit is known, it's more likely someone will try targeted attacks, but they'd still be in for a long wait dumping public wifi traffic.

Just exactly how serious is this? The threads at Mac Rumors make it seem like the biggest breach in the history of software.

The worst part about it is that it's a simple, fairly obvious typo (presumably). It shows poor software engineering practices at Apple all around: a coding style that's inconsistently applied throughout the file, poor code review, and poor software testing. Worse still, it's in a security-critical piece of software which should have been third-party audited. If they can't get this right, what else is wrong?

Just so you're aware, the engineering team, design team and marketing team do not work on iOS's code.

Really? I didn't know that! So, I guess they don't all work for Apple. Probably the iOS code is some kind of external OS and Apple has nothing to do with it. In that case I have no complaints at all. It's not Apple's fault. Not their OS, sorry! Please Apple, please, concentrate all of your resources on making the next iPhone 0.00000001 mm thinner! That's what I really want!

Just so you're aware, the engineering team, design team and marketing team do not work on iOS's code.

There's no engineering team working on software? I wonder who wrote this code, then. Trained monkeys?

Things like security, coding style, and review are taxes. You have to pay your taxes because it's necessary, but you don't see an immediate benefit from them. If they had discipline and required braces on if statements, then the programmer would have gotten an error and instantly fixed it. The bug would have lasted a whole minute, and nobody would have ever known about it. (In the strictest organizations, those who do safety-critical work, the coder would have had to log that error so they would have had metrics.) They didn't pay their taxes, and now look what happened.

The product managers, design teams and engineers, and most importantly senior leadership, need to understand the value of these taxes and ensure they are paid. Otherwise they will gain a reputation for being slow and unreliable (Blackberry) or insecure (Android) and people will stop buying their products. What's the use of designing such a thin phone if nobody buys it?

Really? I didn't know that! So, I guess they don't all work for Apple. Probably the iOS code is some kind of external OS and Apple has nothing to do with it. In that case I have no complaints at all. It's not Apple's fault. Not their OS, sorry! Please Apple, please, concentrate all of your resources on making the next iPhone 0.00000001 mm thinner! That's what I really want!

Um, the people involved with the thickness of the iPhone are not the same employees involved with the source code. Last time I checked, mechanical engineers are not software engineers.

The worst part about it is that it's a simple, fairly obvious typo (presumably). It shows poor software engineering practices at Apple all around: a coding style that's inconsistently applied throughout the file, poor code review, and poor software testing. Worse still, it's in a security-critical piece of software which should have been third-party audited. If they can't get this right, what else is wrong?

Maybe Apple should fire their entire software engineering team since, according to you, they obviously have poor engineering practices all around. What I find ironic is that this apparently first appeared in iOS 6, which was released under Forstall, and yet there are people who claim Apple is doomed if they don't bring Forstall back.

Maybe Apple should fire their entire software engineering team since, according to you, they obviously have poor engineering practices all around. What I find ironic is that this apparently first appeared in iOS 6, which was released under Forstall, and yet there are people who claim Apple is doomed if they don't bring Forstall back.

Thanks for putting words in my mouth. I said the entire incident shows poor engineering practices, not that all the software engineers there are bad. I didn't say anybody should be fired, but when things like this happen, responsibility starts at middle management and above.

This is where Google shines: they're fundamentally run by nerds. It's also where Google fails: it seems they are more interested in the taxes, like new programming languages and codecs and HTML extensions, than in actual product development.

If this is as serious as some suggest, why hasn't Apple released a patch for Mavericks yet? One would assume if someone gets compromised it wouldn't take two seconds to file a lawsuit. I can't imagine Apple would want to expose itself to that. I guess I'm trying to understand if this really is as bad as some are suggesting, or, if it's just the weekend with nothing else to talk about and this will find its way to the back page come tomorrow when MWC starts and Samsung introduces their new phone.

If this is as serious as some suggest, why hasn't Apple released a patch for Mavericks yet? One would assume if someone gets compromised it wouldn't take two seconds to file a lawsuit.

This was obviously not published on Apple's timeline, somebody found or exploited it. Why was the Apple TV, which doesn't have anything important on it, patched before Mavericks? If Apple was half competent and they were in control, they would have released everything at once.

Have you ever seen anybody sued for software defects? Microsoft? It doesn't work that way. Your license agreement says in big letters that they are not liable, and there's never been a precedent for holding a company liable for negligence in consumer-grade PC software.

This was obviously not published on Apple's timeline, somebody found or exploited it. Why was the Apple TV, which doesn't have anything important on it, patched before Mavericks? If Apple was half competent and they were in control, they would have released everything at once.

Have you ever seen anybody sued for software defects? Microsoft? It doesn't work that way. Your license agreement says in big letters that they are not liable, and there's never been a precedent for holding a company liable for negligence in consumer-grade PC software.

So let's see, Apple has poor engineering practices and is not competent or in control. Guess that means someone should be fired then?

So let's see, Apple has poor engineering practices and is not competent or in control. Guess that means someone should be fired then?

More words in my mouth. I said Apple is not in control of the disclosure timeline of this bug. The cat got out of the bag before Apple had a chance to fully react.

I didn't say anybody should be fired. I am saying there are serious software engineering issues, evident in the open source code of security-critical software, and that a comprehensive organizational review and changes are necessary.

I predict your next claim will be that I am blaming Tim Cook for this bug, and that it's not really a bug, just something made up by Android fans.

More words in my mouth. I said Apple is not in control of the disclosure timeline of this bug. The cat got out of the bag before Apple had a chance to fully react.

I didn't say anybody should be fired. I am saying there are serious software engineering issues, evident in the open source code of security-critical software, and that a comprehensive organizational review and changes are necessary.

I predict your next claim will be that I am blaming Tim Cook for this bug, and that it's not really a bug, just something made up by Android fans.

You said: "If Apple was half competent and they were in control, they would have released everything at once." "If Apple was half competent" seems to me like you don't think that they are competent (at least in this area). I never suggested that you think someone should be fired. I said that. IF this is as serious as some suggest and millions of people are/have been at risk since iOS 6 over a line of code that should have been caught in code review, then yes, I think someone should be fired over it.

The worst case is public wifi, if you check email or do any digital banking. But someone would have to be dumping pretty much all traffic from a public hotspot at all times, in the hope that someone doing something worthwhile comes along with a vulnerable device, and then exploit it. Now that the exploit is known, it's more likely someone will try targeted attacks, but they'd still be in for a long wait dumping public wifi traffic.

That would explain Square sending out an iOS update alert. Finding a retailer using them as their CC processor over wi-fi would be pretty easy. As quickly as Square responded, perhaps they've already seen this vulnerability in action?

I was most impressed that Apple had an iOS patch for this that covered iOS 6 and my aging iPhone 3GS. That's the kind of long-term support, I tell people, that puts iPhones ahead of their competition. Most smartphone makers seem to give no support once a product is discontinued.

I hope Apple does the same with the OS X patch. My MacBook, which I still find useful, won't run any OS X past Lion. It needs a patch too.

Does that mean all my 1Password data being synced via iCloud was sent unencrypted under Mavericks and 7.0.x?

It means that if an application used Apple's secure framework for HTTPS connections, someone with access to your network or any network in between could have replaced the certificate with one they control, seeing the plain text of your communications.

However, 1Password could also encrypt its data on top of this, which would frustrate any analysis, and being in a position to do this would normally require something like the NSA or a poisoned open wifi AP.