"It's never good to scare away your customers. It's even worse if you don't realize you're doing it. That was me. Like most folks in the developer community, it's been years since I last used Internet Explorer as my daily browser. Oh sure, we all keep copies around for web development work, but Firefox, Chrome, and Safari now rule the web roost. Unfortunately, that was not the case with the Blurity userbase." Wise lesson from Jeff Keacher.

A measure need not be technically foolproof to be useful. From a game theory standpoint, the certificate requirement will disproportionately affect malware authors.

Once a certificate is blacklisted, all other malware signed with it is also blocked. Thus, malware authors have only a limited window in which to reuse a certificate before it becomes invalid. They essentially have to buy a new certificate for every few malware strains they release.

Contrast this with the present situation, in which they can release as many variants as they want, for free. Even when one strain is detected, the antivirus signature may not block the others.

In contrast, non-malware software publishers only need to buy one certificate for all their software -- every release, every hotfix.
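That cost asymmetry can be made concrete with a back-of-the-envelope sketch. The certificate price, strain count, and release count below are illustrative assumptions, not real figures:

```python
# Hypothetical cost model for the signing asymmetry: all numbers are
# assumptions chosen only to illustrate the shape of the argument.

CERT_PRICE = 200          # assumed price of one code-signing certificate, USD
STRAINS_PER_CERT = 3      # assumed strains shipped before the cert is blacklisted
RELEASES_PER_YEAR = 50    # assumed releases/hotfixes by a legitimate publisher

# A malware author must re-buy a certificate every few strains...
malware_cost_per_strain = CERT_PRICE / STRAINS_PER_CERT

# ...while a legitimate publisher amortizes one certificate over every release.
publisher_cost_per_release = CERT_PRICE / RELEASES_PER_YEAR

print(f"Malware author: ~${malware_cost_per_strain:.2f} per strain")
print(f"Legitimate publisher: ~${publisher_cost_per_release:.2f} per release")
```

Whatever the actual prices, the per-release cost for the malware author grows with every blacklisting, while the legitimate publisher's per-release cost shrinks as they ship more software.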

Certificates can only identify WHO wrote a piece of code, not what it does or what the author's intentions are. Even the most "trusted" CAs are compromised from time to time - it only makes the news when fraudulent Microsoft or Google certificates are issued, but I'm pretty sure this happens every day with other brands that aren't under a microscope.

Even when certificates are issued legitimately to legitimate developers, how are end users supposed to know? The certificates really don't tell us what is safe to install. Furthermore, even signed code from known sources can be compromised and exploited by hackers. Developers may or may not be aware of it, and even if they are, they now face revoking the certificate used to sign all their software, potentially causing interruptions for their existing customers (which is why a single certificate shouldn't be shared across all of a publisher's software in the first place, as you suggested).
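A toy illustration of that revocation trade-off (the product names are hypothetical): when every product is signed with the same certificate, revoking it after a compromise invalidates the signatures on all of them at once.

```python
# Hypothetical products, all signed with the publisher's single certificate.
signed_with = {
    "EditorPro 1.0": "cert-A",
    "EditorPro 1.1 hotfix": "cert-A",
    "PhotoTool 2.3": "cert-A",
}

# The publisher revokes the compromised certificate...
revoked = {"cert-A"}

# ...and every existing install now fails signature validation.
still_valid = [p for p, cert in signed_with.items() if cert not in revoked]
print(still_valid)  # → []
```

Per-product certificates would confine the damage to the one compromised product line, at the cost of buying and managing more certificates.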

So certificates do provide some additional trust, but they aren't a security mechanism. If you have any doubt about this, just recall the IE COM component debacle - a prime example of why certificate "identity" does not translate into "security".

We should place more emphasis on fine-grained application sandboxing to keep dangerous applications from having their way on users' systems, regardless of the code's "identity".

In other words, the OS should ideally let us download and play a game from any source without concern for the safety of the rest of our files and applications - not much different from how we visit web sites.

Sandboxing (presumably combined with prompting users for permissions) will probably just bring "UAC fatigue" - maybe even multiplied, training people either to accept everything or to block everything in a panic.

No way out of (more or less) walled gardens for the general population, I'm afraid.