Google builds bigger crypto keys to make site forgeries harder

No one has cracked a 1024-bit key yet, but Google isn't taking any chances.

Google is upgrading the digital certificates used to secure its Gmail, Calendar, and Web search services. Beginning on August 1, the company will start upgrading the RSA keys used to encrypt Web traffic and authenticate its services to 2048 bits, twice as many as are used now.

The rollout affects the transport layer security (TLS) certificates that underpin HTTPS connections to Google properties. Sometimes still referred to as secure sockets layer (SSL) certificates, after TLS's predecessor protocol, they prevent attackers from reading the contents of traffic passing between end users and Google. They also provide a cryptographic assurance that servers claiming to be Google.com are in fact operated by Google, as opposed to being clones created by attackers exploiting age-old weaknesses in the way the Internet routes traffic.

There are good reasons for Google to upgrade the strength of these crucial digital keys. The weaker an RSA key pair, the easier it is for an attacker to mathematically derive the "private key." Such attacks work by taking the certificate's "public key" that's published on the website and factoring its modulus into the two prime numbers from which the private key is derived. Once the private key for a Google certificate has been recovered this way, the attacker can impersonate an HTTPS-protected Google server and provide the same indications of cryptographic security as the legitimate service. Someone who was able to derive the secret primes behind Google's private key, for instance, would be able to create convincing fakes that would fool many browsers and e-mail clients.
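As a rough illustration of why factoring is fatal, the toy Python snippet below (with absurdly small numbers, nothing like real 1024- or 2048-bit moduli) shows how an attacker who has recovered the two primes behind a public modulus can immediately reconstruct the private exponent:

    # Toy RSA example: once the public modulus n is factored into p and q,
    # the private exponent d follows directly. (Needs Python 3.8+ for pow
    # with a negative exponent.)
    p, q = 61, 53                 # real certificate primes are hundreds of digits
    n = p * q                     # public modulus, published in the certificate
    e = 17                        # public exponent (typically 65537 in practice)
    phi = (p - 1) * (q - 1)       # only computable if you know p and q
    d = pow(e, -1, phi)           # private exponent: modular inverse of e mod phi

    message = 42
    ciphertext = pow(message, e, n)   # anyone can encrypt with the public key
    print(pow(ciphertext, d, n))      # prints 42 -- the attacker can now decrypt and sign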

The prime factors of an RSA modulus are extremely time-consuming to find, but increases in computing power are making the task gradually easier. In 2009, researchers were able to factor a 768-bit RSA key, according to Wikipedia. The online encyclopedia went on to say that a 1024-bit key has not yet been factored. While it may take years for that to happen, it's only a matter of time until it is. And of course, secretive agencies within powerful nation states may already have the ability to factor larger key sizes.

Another type of attack used to defeat cryptographic certificates is known as a collision attack. Rather than deriving a private encryption key, it forges a digital key used to sign software. It works against hashing algorithms by finding two pieces of plaintext that generate the same cryptographic message digest. While it's extremely time-consuming to find such collisions, certain algorithms—particularly MD5—have known weaknesses that significantly reduce the requirements. The Flame espionage malware that targeted Iran wielded a never-before-seen collision attack to hijack Microsoft's Windows Update mechanism.
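The danger is easiest to see from the verifier's side: a signature covers only the message digest, never the message itself, so any two inputs with the same hash carry the same signature. The short Python sketch below illustrates the principle with hashlib; the "forged" message here is just a placeholder, since real MD5 collisions are specially crafted binary blocks, not readable text:

    import hashlib

    def verify(message: bytes, signed_digest: bytes) -> bool:
        # Verification compares digests only; it never inspects the message.
        return hashlib.md5(message).digest() == signed_digest

    benign = b"harmless software update"          # what the signer actually approved
    signed_digest = hashlib.md5(benign).digest()  # stands in for a real signature

    forged = b"malicious update"                  # placeholder, not a real collision
    print(verify(benign, signed_digest))          # True
    print(verify(forged, signed_digest))          # False here -- but True for a genuine
                                                  # MD5 collision, which is the attack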

Researchers have estimated that the SHA1 algorithm, which is considered more resistant to collision attacks and underpins huge parts of Internet security, could fall by 2018.

The new certificates being rolled out by Google could create headaches for some software developers, particularly those whose code is embedded in certain types of phones, printers, set-top boxes, gaming consoles, cameras, and other types of devices. For those developers, Google provided the following advice:

Include a properly extensive set of root certificates. We have an example set, which should be sufficient for connecting to Google, in our FAQ. (Note: the contents of this list may change over time, so clients should have a way to update themselves as changes occur);

Support Subject Alternative Names (SANs).

Also, clients should support the Server Name Indication (SNI) extension because clients may need to make an extra API call to set the hostname on an SSL connection. Any client unsure about SNI support can be tested against https://googlemail.com—this URL should only validate if you are sending SNI.
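As a rough sketch of what both recommendations look like in practice, here is a minimal example using Python's standard ssl module; the default context handles chain verification and SAN-based hostname matching, and passing server_hostname is what puts the SNI extension on the wire:

    import socket, ssl

    ctx = ssl.create_default_context()   # system trust store, full verification

    with socket.create_connection(("googlemail.com", 443)) as sock:
        # server_hostname sends SNI; per Google's note, this host should only
        # validate if SNI is actually being sent by the client.
        with ctx.wrap_socket(sock, server_hostname="googlemail.com") as tls:
            print(tls.version(), tls.getpeercert()["subject"])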

On the flip side, here are some examples of improper validation practices that could very well lead to the inability of client software to connect to Google using SSL after the upgrade:

Hard-coding the expected Root certificate, especially in firmware. This is sometimes done based on assumptions like the following:

The Root Certificate of our chain will not change on short notice.

Google will always use Thawte as its Root CA.

Google will always use Equifax as its Root CA.

Google will always use one of a small number of Root CAs.

The certificate will always contain exactly the expected hostname in the Common Name field and therefore clients do not need to worry about SANs.

The certificate will always contain exactly the expected hostname in a SAN and therefore clients don't need to worry about wildcards.

Any software that contains these improper validation practices should be changed. More detailed information can be found in this document, and you can also check out our FAQ if you have specific questions.
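As a contrast to the hard-coded Common Name assumptions listed above, the short Python sketch below shows what correct clients look at instead: the Subject Alternative Name entries (including wildcards) reported for the peer certificate. Python's default context already performs this matching; the snippet merely makes the data visible. The hostname is only an example:

    import socket, ssl

    ctx = ssl.create_default_context()    # enforces SAN/wildcard hostname matching

    with socket.create_connection(("www.google.com", 443)) as sock:
        with ctx.wrap_socket(sock, server_hostname="www.google.com") as tls:
            cert = tls.getpeercert()
            sans = [name for typ, name in cert.get("subjectAltName", ()) if typ == "DNS"]
            print(sans)   # e.g. ['*.google.com', 'google.com', ...], not a single CN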

People using mainstream browsers and e-mail clients aren't likely to notice any difference at all.

Promoted Comments

I'm pretty sure that new certificates issued from a public CA like Verisign, Thawte, GoDaddy, etc. have had the requirement to be 2048 bit since at least last year.

I'm not sure why this is a story. We've had 2048-bit certs on our external-facing services since 2010, if not earlier.

This. I can confirm that our CA issuer will stop accepting cert requests with 1024-bit keys on January 1st, 2014. The minimum key length will be 2048 bits.

Actually, just recently they would not renew our certs with 1024-bit keys because their expiration date would have fallen past the 2048-bit cutoff date. Which means that already, today, we are unable to get Web server certs with 1024-bit keys, because the minimum expiration for them is one year.

This is a bit of a pain, because I can't just renew an existing cert, as they are all 1024-bit keys. I need to request a new one with the same common name, then load it on the server, then switch the service from the old 1024-bit cert to the new 2048-bit one, then delete the old cert. But there are definite long-term benefits to better securing SSL transactions, so we are willing to put up with short-term pain.

Only if your span of time is longer than the age of the universe. Any crypto that can hold its own against Moore's law and new maths for the next 100 years is good enough for anyone.

You left out quantum computing as a potential death knell for RSA. If they do ever get a quantum computer working, RSA breaking becomes a logarithmic, rather than polynomial, exercise.

I believe, though, that there are asymmetric algorithms that are not susceptible to quantum speedup. At least not susceptible to Shor's algorithm. If quantum computers ever become reality, then your "new math" category would really have two sub-categories: "new classical algorithms" and "new quantum algorithms". Since quantum computer science is newer, it may be the case that new quantum algorithms are more likely than new classical algorithms.

The classes of problems that quantum computers could potentially be faster at than classical is relatively well defined. You are not suddenly going to find a way to extend that set; for all other problems, QC will remain irrelevant. The asymmetric algorithms that are believed to be QC-resistant appear to have no solutions that fall in the above set of problems. We could be wrong, but I don't see any reason why we are any more likely to find that to be the case than to find that there is some polynomial-time solution to them after all. Either our assumptions about the problem domain are correct or they are not: QC may be new, but this kind of analysis really isn't. P.S. We still can't really prove that the RSA problem is fundamentally dependent on the integer factorisation problem, and someone who solved that problem wouldn't even need a QC to attack RSA.

The article title is wrong - this change does not make Google any more difficult to forge.

As long as your browser has *any* trusted root CAs with 1024 bit private keys, 2048 bit keys make no sense. You are only as strong as the weakest link. Why brute-force a 2048 bit key when you can go after a 1024 bit one and sign yourself a google.com certificate with it?

Google's move isn't completely pointless, it's just that until everyone moves away from 1024 bits, we're all just as secure/insecure as before.

As others have already said, this is not news. Using a 2048-bit private key has been a NIST recommendation for years now, and the CABForum's baseline guidelines banned the issuance of 1024-bit certs that expire past 12-31-2013 (and mandate the revocation of any that expire past that date starting Jan 1, 2014).

More interesting is why it took Google this long to switch. I'd guess they ran into trouble with older Asian smartphones, many of which didn't support keys >1024. Several Japanese telecoms approached the CAs during discussion of 1024 sunsetting with that concern.

Not a bad move (if a bit late, I've used 2048 for my SSL certs since what, 2011?) but on a long enough span of time all cryptography can be defeated. Still, if you had been working tirelessly to crack the 1024 cert for the past few years, you're back to square one.

As long as your browser has *any* trusted root CAs with 1024 bit private keys, 2048 bit keys make no sense. You are only as strong as the weakest link. Why brute-force a 2048 bit key when you can go after a 1024 bit one and sign yourself a google.com certificate with it?

Google's move isn't completely pointless, it's just that until everyone moves away from 1024 bits, we're all just as secure/insecure as before.

This is where pinning (and similar techniques) comes in, allowing sites to declare not only that they should only be accessed through HTTPS connections (this is HSTS), but only through connections from an enumerated list of root authorities:
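One pinning mechanism along these lines is HTTP Public Key Pinning (HPKP), in which a site publishes base64-encoded SHA-256 hashes of the public keys it will accept (browsers have since moved away from HPKP, but it illustrates the idea). Below is a rough Python sketch of computing such a pin, assuming a recent version of the third-party cryptography package and using www.google.com purely as an example host:

    import base64, hashlib, ssl
    from cryptography import x509
    from cryptography.hazmat.primitives import serialization

    # Fetch the server certificate and compute an HPKP-style pin: the base64 of
    # the SHA-256 hash of the DER-encoded SubjectPublicKeyInfo.
    pem = ssl.get_server_certificate(("www.google.com", 443))
    cert = x509.load_pem_x509_certificate(pem.encode())
    spki = cert.public_key().public_bytes(
        serialization.Encoding.DER,
        serialization.PublicFormat.SubjectPublicKeyInfo,
    )
    pin = base64.b64encode(hashlib.sha256(spki).digest()).decode()
    print('Public-Key-Pins: pin-sha256="%s"; max-age=2592000' % pin)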

You left out quantum computing as a potential death knell for RSA. If they do ever get a quantum computer working, RSA breaking becomes a logarithmic, rather than polynomial, exercise.

I'm pretty sure you mean it becomes polynomial rather than exponential. Breaking RSA on a classical computer takes exponential time with respect to the number of bits in the RSA key; and breaking it with Shor's algorithm on a quantum computer takes polynomial time.

By the way, although the scaling of Shor's algorithm is exponentially faster than the best known classical algorithms, it is still pretty slow. Breaking RSA-2048 on a full-scale quantum computer could take many years (depending on the clock-speed of this hypothetical QC).
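For the curious, the asymptotics behind that claim, roughly: the best known classical attack (the general number field sieve) runs in sub-exponential but super-polynomial time in the modulus N, while Shor's algorithm is polynomial in the bit length:

    % Heuristic running time of the general number field sieve for factoring N,
    % i.e. L_N[1/3, (64/9)^{1/3}]:
    \exp\!\left(\left(\sqrt[3]{64/9} + o(1)\right)(\ln N)^{1/3}(\ln\ln N)^{2/3}\right)
    % Shor's algorithm, by contrast, factors N in time polynomial in the bit
    % length n = \log_2 N, on the order of O(n^3) with straightforward arithmetic.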

It's amazing that it comes down to an inability to factor numbers. I feel like somewhere a new freshman will say "Factoring, well duh, you just have to....." and then the whole house of cards will collapse.

This is news; I thought Google was already using 2048-bit RSA. I was expecting from the headline that Google was going to 4096-bit RSA. This is really disappointing. They're waiting until August?

I haven't found one big-name website that still uses 1024-bit RSA; they have all moved to 2048-bit: Wikipedia, eBay, PayPal, Amazon, Microsoft, Yahoo, and, you guessed it, even Ars Technica. Oh, forgot about Facebook :O 1024-bit... That's odd. Just checked other social networks: Twitter, LinkedIn, Pinterest, and MySpace all have 2048-bit.

Wait, this makes sense: Google and Facebook are the Internet, and they run HTTPS everywhere. This is probably a big investment for them.

On another note, I need to change my Ars Technica password, since I just logged in on an HTTP page that sent an HTTP request. Hmm, apparently you need to click login at the top of the page, then click the login button (without credentials), and then you get to the HTTPS login page... You guys should fix that.

The classes of problems that quantum computers could potentially be faster at than classical is relatively well defined.

No. It's not. There are certain known quantum algorithms, and their application to various problems has been well-studied. But the existence of other useful quantum algorithms is still an open question.

Quote:

You are not suddenly going to find a way to extend that set; for all other problems, QC will remain irrelevant. The asymmetric algorithms that are believed to be QC-resistant appear to have no solutions that fall in the above set of problems. We could be wrong, but I don't see any reason why we are any more likely to find that to be the case than to find that there is some polynomial-time solution to them after all. Either our assumptions about the problem domain are correct or they are not: QC may be new, but this kind of analysis really isn't. P.S. We still can't really prove that the RSA problem is fundamentally dependent on the integer factorisation problem, and someone who solved that problem wouldn't even need a QC to attack RSA.

I like how you make the absolute claim that quantum computing will not be extended but then go on to propose the possibility of new classical algorithms.

Way more work has gone into such classical algorithms than has gone into quantum algorithms. If there's low-hanging fruit anywhere, it's in the quantum computing world.

It's amazing that it comes down to an inability to factor numbers. I feel like somewhere a new freshman will say "Factoring, well duh, you just have to....." and then the whole house of cards will collapse.

Ah yes, the armchair cryptographer. The reality is not quite as simple: there are secure cryptographic methods that use hard number-theory problems to make attacks hard. RSA and its variants basically use computations modulo a large number that is the product of two large primes (of roughly equal bit size; for security they probably need to be, though attacks have not proved that conclusively). These reduce to factorization. They are indeed not proven secure, due to the freshman problem you mention; however, that does not mean the freshman can actually be born, either.

RSA does not scale well. In particular, to stay secure, the RSA key length (the 2048 bits in this article) grows too fast relative to the symmetric key it protects. Roughly, a 1024-bit RSA key corresponds to an 80-bit symmetric key, 3072 bits to a 128-bit key, and 15,360 bits to a 256-bit key. So we see that 2048 bits is not even adequate for a 128-bit key, and 15K bits for a 256-bit key is just out of the question for a lot of applications.

Instead there is a general shift to Elliptic Curve Cryptography (ECC). Same cryptographic methods, but a different hard problem needs solving as part of an attack. The problem is much harder and scales better, so, for instance, you only need 256 bits for the 128-bit key instead of RSA's 3072. This article would instead read "Google upgrades to 256-bit ECC keys." NSA Suite B (google it) gives an idea of what the NSA publicly thinks of various schemes. You may also sense some impatience between the lines at the slowness to adopt ECC and dump the older, more vulnerable schemes.
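To make that size gap concrete, here is a small Python sketch, assuming a recent version of the third-party cryptography package, generating one key of each type at roughly the same ~128-bit security level:

    from cryptography.hazmat.primitives.asymmetric import ec, rsa

    ecc_key = ec.generate_private_key(ec.SECP256R1())       # 256-bit curve, ~128-bit security
    rsa_key = rsa.generate_private_key(public_exponent=65537,
                                       key_size=3072)        # ~128-bit security
    print(ecc_key.curve.key_size, rsa_key.key_size)          # 256 vs 3072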

Sadly, per the "Stubabe vs Chuckstar Fight!" above, ECC is also vulnerable to a modified Shor's algorithm once we actually have quantum computers.

However, this just means we need to use quantum-hard problems in our crypto methods. The bottom line in the quantum world will be that, despite the advances in previously intractable problems we can then solve, there will remain intractable problems that can form the basis for quantum-era crypto.

RRob is right to point out that the factoring problem has not been proven hard. It is only believed to be hard. Unless it can be proven hard, there is always the possibility of the naive freshman saying "but what if you..." and breaking any crypto that relies on factoring being hard. This is a long-studied problem, however, so we believe the odds of that happening are very small.