Could a decentralized autonomous organization use this to safely store private keys?

Quote from one of the authors of the research: "The right way to think of what secure obfuscation allows is to create, *under many technical conditions*, software that has secrets built into it. These secrets are used by the software to compute output, and yet the secrets remain hidden even if an attacker obtains the entire machine-level code of the software, which of course the attacker could run and analyze. Thus the attacker would be able to *use* the secrets only in the way that the software allows, but not recover these secrets in any way beyond that."

Perhaps you could create decentralized autonomous organizations where nobody holds the keys that control them. Keys could perhaps be generated automatically and stored in the software itself, so that nobody could access the organization's funds or shut it down.
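If it helps to make that concrete, here's a toy Python sketch of the idea of "secrets built into software, usable only through its interface." Everything here is invented for illustration (the `ObfuscatedSigner` class, the use of HMAC as a stand-in for signing), and crucially it does NOT actually hide anything the way real obfuscation would: anyone reading this source sees the key. It only shows the *interface* the quote describes, where the secret can be used but never extracted through the exposed operations.

```python
import hashlib
import hmac
import os

class ObfuscatedSigner:
    """Toy stand-in for an obfuscated program holding a secret key.

    The key is generated inside the "program" and never exposed;
    the only capability offered is signing. Real obfuscation would
    hide the key even from someone reading the machine code --
    here the hiding is merely by interface convention.
    """

    def __init__(self):
        # Key generated at creation time; no argument, no getter.
        self.__key = os.urandom(32)

    def sign(self, message: bytes) -> bytes:
        # The only way the secret is ever used.
        return hmac.new(self.__key, message, hashlib.sha256).digest()

    def verify(self, message: bytes, tag: bytes) -> bool:
        return hmac.compare_digest(self.sign(message), tag)

signer = ObfuscatedSigner()
tag = signer.sign(b"transfer 1 coin to alice")
assert signer.verify(b"transfer 1 coin to alice", tag)
assert not signer.verify(b"transfer 1 coin to mallory", tag)
```

In the DAO scenario, the organization's software would be the only thing able to produce valid signatures over its transactions, with no human ever in possession of the key.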

The implications of that article go far beyond just storing private keys in the blockchain. If they manage to perfect their obfuscation, it would pretty much invalidate the need for Ethereum, since you could just pass some obfuscated code to somebody that reports back to you via a signed message when it gets run. Not to say that there wouldn't still be use cases for protocols like Ethereum, but it would no longer be the only way of validating that a series of events happened.
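A rough sketch of that "reports back via a signed message" idea, again in toy Python with HMAC standing in for a real signature scheme (the function names and the attestation protocol here are my invention, not anything from the paper; and again, without real obfuscation the embedded secret is plainly visible in the source):

```python
import hashlib
import hmac
import os

# In the real scenario this secret would be baked into the
# obfuscated binary you hand out, invisible to the runner.
EMBEDDED_SECRET = os.urandom(32)

def run_and_attest(inputs: bytes) -> tuple:
    """What the handed-out program does: perform some work,
    then return the result plus a receipt only the secret
    holder can forge."""
    result = hashlib.sha256(inputs).digest()  # stand-in for the "work"
    receipt = hmac.new(EMBEDDED_SECRET, result, hashlib.sha256).digest()
    return result, receipt

def verify_receipt(result: bytes, receipt: bytes) -> bool:
    """What you do when the runner reports back: check the receipt,
    which proves your code actually executed on that input."""
    expected = hmac.new(EMBEDDED_SECRET, result, hashlib.sha256).digest()
    return hmac.compare_digest(expected, receipt)

result, receipt = run_and_attest(b"some input")
assert verify_receipt(result, receipt)
```

The point being: a valid receipt can only come from the embedded secret, so a run of the program is itself evidence that the events happened, without any network of validators.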

While this *might* be usable for storing AND operating on data securely in an untrusted environment, it is unlikely to perform well enough to be used for something like near-real-time trading on markets. The jury is still out on whether its approach is even secure.

With Ethereum, you get trusted computing because peers validate the computations, plus an assumption of decent performance (someone might point us to performance estimates of Ethereum here). Those two properties working together are what give you a utopian computing environment.

I'd hesitate to say this new technique 'invalidates the need for Ethereum'.

Yeah, but a compiler also turns short, simple programs into giant, unwieldy albatrosses if you choose to think about it like that. I don't know exactly what the performance hit of a program scrambled like this would look like, but assuming it grows roughly linearly with the size of the program being obfuscated, Moore's law should make this just as viable as any other kind of CPU-intensive cryptography.

That said, it seems unlikely to me that this form of scrambling will ever truly be able to hide the inner content of a program. I imagine it'll hit the same roadblock that every other DRM scheme has run into, which is that, at some point, the program has to reassemble itself in order to work. But who knows? If I can scramble a piece of text beyond recognition, maybe that can work on programs as well. It's an interesting premise, and I'd be interested to see what further research on it can dig up.

If it's anything like homomorphic encryption, it is likely to be slow. A compiler turns source code into compiled, often optimized, machine code. Machine code is far from slow, and I'd say 'albatross' (something burdensome) is a bad analogy. Also, compiled code is usually far from obfuscated; people decompile code all the time. Hackers.

The primary takeaway from this conversation is that there are basically two types of 'trusted' computing. The first is being able to do computations in a way where you get a reliable (trustworthy) result. The second is being able to do computations in a way where nobody can see the secrets involved, even if they run the program themselves on their own (untrusted) hardware.
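The first type can be sketched crudely: rerun the same computation on several independent nodes and trust the majority answer. This is a toy illustration of "peers validating the computations," not Ethereum's actual consensus mechanism; the names and the simple majority rule are my own.

```python
from collections import Counter

def trusted_result(computation, replicas):
    """Run the same computation on every (possibly faulty) replica
    and accept the answer only if a strict majority agrees."""
    results = [replica(computation) for replica in replicas]
    answer, votes = Counter(results).most_common(1)[0]
    if votes <= len(replicas) // 2:
        raise RuntimeError("no majority; result untrusted")
    return answer

honest = lambda f: f()
faulty = lambda f: f() + 1  # a node that lies

print(trusted_result(lambda: 2 + 2, [honest, honest, faulty]))  # 4
```

The second type is the one obfuscation aims at, and no amount of replication buys it for you: the replicas in the sketch above see everything they compute.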

My first point touched on the second concept, in which keeping things hidden becomes an expensive process. We already know large prime numbers are hard to find. There's a correlation here between 'privacy' and 'work'.

I'd digress into a 'chasing power' rant, but that would require more space than allowed here. Basically, you won't see the entire infrastructure of the 'internets' switching to something simply because it allows for secure storage, and use of said storage, without revealing its nature. It's simply not required in most use cases.