Mozilla plans to establish an automated process which would verify that binaries contain only the code found in the official source repositories, and not spyware secretly added during the build process at the behest of government intelligence agencies. In a blog post entitled Trust but Verify, CTO Brendan Eich and R&D VP Andreas Gal note that governments "may force service operators [such as Mozilla] to enable surveillance (something that seems to have happened in the Lavabit case)" and pledge to develop systems which will make Firefox resistant to this form of tampering.

Well, for most of us it's probably even more secure than compiling the code ourselves, because most of us don't even conduct a cursory inspection of the source first. There's a very good chance that a source-code-based backdoor could go undetected even if it weren't concealed at all. If no one looks at the code, it might as well be commented in all caps: "HERE BE A BACKDOOR".

I see your point: the equivalent would be installing from source, where I have some certainty that the source I'm using is the same as the source other people are using (and presumably auditing).

If ANYONE on the network is doing a good job monitoring the code for backdoors, then a hash-verified binary copy is probably more secure than a copy compiled from source by an end user.
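To make "hash-verified" concrete, here is a minimal sketch of checking a downloaded binary against a digest published out-of-band (the function names and the idea of a mirrored, signed digest list are my own illustration, not anything Mozilla has specified):

```python
import hashlib
import hmac

def sha256_of(path):
    """Compute the SHA-256 digest of a file, reading in 64 KiB chunks
    so arbitrarily large binaries don't need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify(path, published_digest):
    """Check a local binary against an independently published digest.

    In practice `published_digest` would come from an out-of-band
    channel, e.g. a signed release manifest mirrored by several parties,
    so a single compromised download server can't lie consistently.
    """
    return hmac.compare_digest(sha256_of(path), published_digest.lower())
```

The point is that the trust moves from "whoever served me the bytes" to "whoever publishes and cross-checks the digests", which is exactly where the network of auditors comes in.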

Yes, but it does still rely on this happening. Whether I compile from integrity-checked source, or use a binary that has been verifiably generated from a given source tree, I still have to rely on the assumption that someone else audited it, the libraries it relies on, and all previous versions of the compiler (since there's no way I'm doing that myself!).

It's a good initiative and reducing the requirement to trust a single organisation makes a lot of sense. If only I could apply the same technique to all of the other technologies I use regularly.

"I still have to rely on the assumption that someone else audited it, the libraries it relies on, and all previous versions of the compiler (since there's no way I'm doing that myself!)."

Yes, and of course it might go even deeper, down to firmware and hidden hardware (like System Management Mode, which is inaccessible even to the OS itself). I think this is where diversity helps overcome the security implications of monoculture.

If I can cross-compile a binary on ARM hardware running Linux and you can compile it on amd64 running Windows, and we end up with the same binaries, then that rules out large swaths of the system that might be compromised. The compiler itself is a weak link here, however, and I don't know how to solve that part of the monoculture problem. If we used different compilers, we know right off the bat that we'd end up with different binaries even if nothing was wrong. How can we prove that nothing is hidden in the compiler?
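The cross-checking described above can be sketched as a simple quorum over digests reported by independent builders. This is only an illustration under my own assumptions (the builder names, the `consensus_digest` helper, and the quorum rule are all hypothetical, not part of Mozilla's plan):

```python
from collections import Counter
from typing import Dict, Optional

def consensus_digest(reports: Dict[str, str], quorum: int) -> Optional[str]:
    """Given {builder_name: hex_digest} reports from independent build
    environments (different OS, CPU architecture, owner), return the
    digest if at least `quorum` builders agree on it, else None."""
    if not reports:
        return None
    # most_common(1) yields the single most frequently reported digest
    digest, votes = Counter(reports.values()).most_common(1)[0]
    return digest if votes >= quorum else None

# Hypothetical reports from two very different build environments:
reports = {
    "arm-linux":     "d2c1f0ab",  # digests truncated for display
    "amd64-windows": "d2c1f0ab",
}
assert consensus_digest(reports, quorum=2) == "d2c1f0ab"
```

An attacker would then have to compromise a quorum of unrelated toolchains and machines at once, which is why build diversity matters even though it leaves the shared-compiler question open.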

"It's a good initiative and reducing the requirement to trust a single organisation makes a lot of sense. If only I could apply the same technique to all of the other technologies I use regularly."

I agree; this might be a good model for all open source projects in the future. Mozilla is just one small part of the large collection of software we use. This model depends on open source, though: closed-source software cannot be independently compiled outside the influence of those who want the backdoors added.

Perhaps we could have a very strict meta-compiler to sign code: something like a single pass with no code rearrangement or optimization, totally independent of the target architecture. Once the base system was audited, we could scale it up to the higher levels. That may work for the software stack but, of course, not at the hardware/firmware level. For that, diversity and network auditing will continue to be required.