I work for a large international organisation based outside the US. Following the recent leaks about the NSA breaking much of the encryption used on the internet, we now know the US and UK governments are actively weakening encryption standards and backdooring hardware and software. Exactly which encryption is broken is unknown, but we do know the encryption protocols currently in use on the internet come from Suite B. We are highly concerned that the NSA may be using this capability to steal trade secrets, intellectual property and other information from our organisation for the benefit of the US government, companies and industry, or to patent an invention before we have had the chance. We need stronger encryption programs that will give us more protection.

Apparently "strong encryption works", according to Edward Snowden. But what counts as "strong encryption"? There is clear evidence of it working: he managed to communicate with the Guardian reporters about his plans and leak highly classified information without the NSA knowing about it. It is not known what encryption he was using for that. I've seen posts saying TLS is highly likely to be broken, as a secret court order could compel Certificate Authorities in the US to hand over their root certificates, allowing a transparent MITM attack on most communications as they transit US/UK networks.

There have also been discussions about backdoored RNGs pushed by the NSA, e.g. Dual EC DRBG. Obviously these should be avoided and proper random number generators used. The problems potentially go deeper than that, however. The TLS protocol is built from several algorithms: a Diffie-Hellman key exchange is used to establish symmetric keys between two parties, and the symmetric block ciphers those keys feed could also be compromised.
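As an aside, the key exchange mentioned above can be illustrated in a few lines of Python. This is only a toy sketch with a deliberately small prime; real deployments use vetted groups of 2048 bits or more (e.g. the RFC 3526 groups):

```python
import secrets

# Toy Diffie-Hellman: both parties derive the same secret without sending it.
# p is a small demonstration prime and g a small base; NOT safe for real use.
p, g = 0xFFFFFFFB, 5

a = secrets.randbelow(p - 2) + 1      # Alice's private exponent
b = secrets.randbelow(p - 2) + 1      # Bob's private exponent

A = pow(g, a, p)                      # Alice sends this to Bob (public)
B = pow(g, b, p)                      # Bob sends this to Alice (public)

shared_alice = pow(B, a, p)           # both values equal g^(a*b) mod p,
shared_bob = pow(A, b, p)             # so the two sides now share a secret
```

The shared value would then be hashed down to a symmetric key for the block cipher, which is exactly where a compromised cipher would undermine the whole exchange.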

Block-cipher encryption may be vulnerable to unknown attacks. As the DES story showed, the NSA is at least 20 years ahead of academia and the commercial world in cryptanalysis and employs the best mathematicians in the world, not to mention many supercomputers and viable quantum computers. We also know the US government from the 1970s to the late 1990s blocked the export of cryptography over 64 bits. Those restrictions have now been "conveniently" lifted, and they now recommend Suite B, which has a maximum of 256 bits.

Suite B comprises the Advanced Encryption Standard (AES) with key sizes of 128 and 256 bits. The two specified modes are Counter Mode (CTR) and Galois/Counter Mode (GCM). Because the government recommends 256 bits as the highest, "top secret" security level, and given that their mandate is to backdoor and weaken encryption systems so they can exploit that capability in their surveillance network, clearly 256-bit keys, or the algorithms in particular (e.g. AES), cannot be trusted. It also appears we can no longer trust anything written or endorsed by the US government, as they could be pushing standards with secret weaknesses known only to them.

This post is not inviting speculation, modification or opinion on the information presented. I am asking for answers to these specific questions:

Will doubling or even quadrupling the 256-bit block size to 512 or 1024 bits make it considerably more difficult for a powerful attacker to attack? I assume a rework of the key schedule will be required, correct? When the block size is increased, the key size should match as well, correct?

I know Threefish exists with 512-bit and 1024-bit key and block sizes. What other non-US-government-approved, public-domain block-cipher algorithms are there which still have reliable security analysis done on them by the community? Are there any open source libraries using these algorithms?

I understand TrueCrypt can chain algorithms for better protection of storage, e.g. Twofish-Serpent-AES in XTS mode. Does this provide actual, verifiable protection against algorithm weaknesses, or are there attacks against it? Can this method be used for communications as well?

If the number of rounds in the encryption process were increased beyond the standard number, perhaps by a variable number pre-agreed between sender and recipient, would this significantly increase the security margin of current algorithms?

What other modes of block-cipher encryption are considered more secure than the government-recommended ones (CTR, GCM and XTS)?

What other encryption methods or open source programs could be used as an alternative to block ciphers for encrypting data more securely?



This is an incredibly broad question; one could write many books and still not cover all the angles. Please narrow the scope of your question to a level suitable for this Q&A format. Thanks!
– TildalWave, Sep 11 '13 at 4:18

@TildalWave: Well, it is also a very interesting one, and (as the first answer shows) some could give very interesting and useful answers... But they are now prevented from doing so by the "hold" :(. And the OP presented things both clearly and with interesting details and notes. I wish this would fit on this site (where we can't always "fragment" a question into smaller ones?)
– Olivier Dulac, Sep 12 '13 at 15:15

1 Answer

Most of the problems outlined by Mr Snowden are essentially side-channel attacks. The algorithms themselves aren't suddenly less robust; rather, the software and infrastructure that use them are.
Root CAs, for example, have always been open to the nation they reside in; the protection they provide is against other ordinary civilians, not government agencies.

The raising of the key-strength bar in the 1990s might have been because the cat was already out of the bag on stronger algorithms (most other developed nations already had far better encryption). Blocking keys above 40 bits, and later 56 bits, would simply have stifled the USA's growth in eCommerce and ICT products. Still, it could also mean the quantum computers in the basement went online at that point; after all, they did sell cracked Enigma machines to other countries in the 1950s.

I recall a comment on a different question mentioning that the NSA's primary responsibility is to protect US assets rather than raid foreign ones, hence an incentive to release ciphers to the American public that they haven't backdoored.

Question 1A: Will doubling or even quadrupling the 256-bit block size to 512 or 1024 bits make it considerably more difficult for a powerful attacker to attack?

Answer for Question 1A: Not really. Having cracked a 256-bit encrypted message in a time-useful fashion would imply O(N) decryption or similar; increasing the key or block size in that scenario wouldn't change the time complexity of the problem. Increasing the bit strength is only useful against normal, linear expectations of the NSA's ability.

Question 1B: I assume rework of the key schedule will be required, correct?

Question 1C: When the block size is increased the key size should match as well, correct?

Answer for Question 1C: Err, sort of. Shannon's information theory suggests as much, but you would need to overhaul the algorithm, and this would change the cipher from well-known to not well-known, lowering its security rating.

Question 2: I know Threefish exists with 512-bit and 1024-bit key and block sizes. What other non-US-government-approved and public-domain block-cipher algorithms are there which still have reliable security analysis done on them by the community? Are there any open source libraries using these algorithms?

Answer for Question 2: Don't know. One would have to count the public research papers released for specific ciphers.

Question 3: I understand TrueCrypt can chain algorithms for better protection of storage, e.g. Twofish-Serpent-AES in XTS mode. Does this provide actual, verifiable protection against algorithm weaknesses, or are there attacks against it? Can this method be used for communications as well?

Answer for Question 3: Same outcome as Question 1. If you can crack one class of algorithm in time O(N), then you can crack all ciphers of that class regardless of nesting or ordering.

Question 4: If the number of rounds in the encryption process were increased beyond the standard number, perhaps by a variable number pre-agreed between sender and recipient, would this significantly increase the security margin of current algorithms?

Answer for Question 4: Yes, if new key material is supplied. But as your question posits, either the NSA has been using side-channel attacks (most likely) or has broken the underlying time-complexity assumptions (unlikely but possible). A strengthened algorithm wouldn't stop that.

Question 5: What other modes of block-cipher encryption are considered more secure than the government-recommended ones (CTR, GCM and XTS)?

Answer for Question 5: Don't know.

Question 6: What other encryption methods or open source programs could be used as an alternative to block ciphers for encrypting data more securely?

Answer for Question 6:

If your organisation is willing to invest in this problem, you should establish a TCP/IP network tunnel that uses (massive) one-time pads for communication between company offices.

Your organisation can generate the one-time pads at head office onto hard drives and have trusted staff act as periodic data couriers between offices, possibly while on other company business.

If an open source TCP/IP VPN application for one-time pads doesn't already exist, it shouldn't take more than a medium-sized software development project to create one.
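A minimal sketch of the pad-consumption logic such a tool would need, assuming the pad has already been couriered to both ends (the function name is illustrative, not from any existing tool):

```python
import secrets

def xor_with_pad(data: bytes, pad: bytes, offset: int) -> bytes:
    """XOR data against pad[offset:]; the same call encrypts and decrypts."""
    if offset + len(data) > len(pad):
        raise ValueError("pad exhausted - pad bytes must never be reused")
    return bytes(d ^ k for d, k in zip(data, pad[offset:]))

pad = secrets.token_bytes(1 << 20)            # stand-in for a courier-delivered pad
message = b"quarterly results"
ciphertext = xor_with_pad(message, pad, 0)
recovered = xor_with_pad(ciphertext, pad, 0)  # receiver uses the same offset
```

Both sides must also persist the consumed offset, because reusing even one pad byte for two messages destroys the one-time pad's provable security.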

Note that you still need to protect the endpoints against operational weaknesses or a targeted attack. Sometimes spy agencies use very low-tech methods: buying off an employee or pickpocketing a mobile phone.

Thanks for the answer. Sounds like one-time pads are the way to go. They offer provable security rather than trying to stay ahead of a guessing game with ever-increasing bit sizes, especially considering quantum computer clusters are now viable for the US government, regardless of what some random blogger with no hands-on knowledge of the system says. The actual capabilities of the computer have likely been classified anyway. How do you recommend creating that much key data? I found a post mentioning one method here.
– jcnrm, Sep 11 '13 at 10:40


When I last looked at this a few years ago, the best thought-out implementation was LavaRnd. It's open source, and I'd imagine you'd want to port it to a more modern programming language. Webcams make a good source of one-time-pad material, provided the deterministic signal is stripped out and only the noise kept. An inefficient, simple hack of a solution: record raw WAV audio from an unplugged line-in, slice it into 512-bit chunks, SHA-512 each chunk, and concatenate.
– LateralFractal, Sep 11 '13 at 12:45
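The slice-and-hash step from the comment above can be sketched as follows (the input bytes stand in for recorded line-in noise; any trailing partial chunk is simply discarded):

```python
import hashlib

def whiten(raw_noise: bytes, chunk: int = 64) -> bytes:
    """SHA-512 each 512-bit (64-byte) slice of raw noise and concatenate."""
    digests = []
    for i in range(0, len(raw_noise) - chunk + 1, chunk):
        digests.append(hashlib.sha512(raw_noise[i:i + chunk]).digest())
    return b"".join(digests)

# Placeholder input; in practice this would be raw WAV sample bytes.
pad_material = whiten(bytes(range(256)) * 2)   # 512 input bytes -> 512 output bytes
```

Note that hashing only conditions the samples; it cannot add entropy beyond what the noise source actually provides, so the raw recording must itself be genuinely unpredictable.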


Of course, your OS may have a decent RNG that uses CPU temperature oscillation or hard-drive head timing for random noise. Amusingly, the forward development of computing has been to eliminate the natural analogue noise inherent in the circuitry (especially transistor waste heat at the nanometre level), rather than record it.
– LateralFractal, Sep 11 '13 at 12:51