Good, Bad or just Different?

Aaron Stebner has a rather interesting article discussing a novel approach used to author recent Microsoft packages such as Silverlight 1.0 and the .NET Framework 2.0 SP1/3.0 SP1.

Basically, these setups are authored as fake MSIs that then get Major Upgrades applied as patches. The UI is outsourced to an external UI handler, and the whole thing is packaged up into a custom self-extracting bootstrapper.
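As a rough sketch of what that pattern boils down to (the file names here are invented for illustration, not the actual Microsoft package names), the sequence the bootstrapper drives looks something like this:

```shell
rem Hypothetical sketch of the empty-MSI + patch pattern (file names invented).

rem 1. The bootstrapper silently installs a near-empty placeholder MSI,
rem    which registers the ProductCode and gets the base package cached.
msiexec /i placeholder.msi /qn

rem 2. The real payload ships as a patch (.msp) carrying a Major Upgrade,
rem    applied against the installed placeholder product.
msiexec /p payload.msp REINSTALL=ALL REINSTALLMODE=omus /qn
```

The net effect is that the very first install already exercises the patching machinery, which is the point the post goes on to make.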

I have to admit, when I first read the article my first reaction was something along the lines of "What the heck???". The design is so different from other servicing stories that I've seen that I had to email a trusted friend of mine for his reaction. His answer to me was:

I don’t think you’re missing anything. My initial reaction is a mix of “wow, what a hack” and “does this mean they admit MSI’s model is too limited?” The empty MSI + patch is cute, because it gets around InstallShield’s lingering cache-web-download file problem, but it just highlights that MSI should have had better options for caching the original MSI rather than wasting one of your patches as a workaround. A custom bootstrapper, custom external UI code, etc., are the only way to make installs like this bearable.

I want to be honest, I'm trying to keep a very open mind with this design. I am somewhat saddened that it has come down to this. I sometimes find myself wishing that merge modules and concurrent installs worked. It would be so nice if you really could isolate and package all of your third-party dependencies into a nice, clean, single MSI package that could be loaded into a GPO advertisement. It would also be nice if GPO could understand how to apply transforms and/or set public properties. It would be nice not to have chainers and external UI handlers, and not to have to decide what our servicing strategy will be in advance.

I guess in a sense this latest design is the ultimate expression of these two rules. For example, the .NET 2.0 SP1 package performs a Major Upgrade of the 2.0 RTM and in effect calls a mulligan. For clean installs it basically accomplishes the initial deployment using the expected servicing strategy, thereby eliminating the potential pitfall of being in production before you figure out what your strategy will be. Furthermore, by not deploying components in the base MSI you avoid running afoul of rule 42.

Still, part of me is scratching my head, thinking "Wow, is this really how MSI is meant to be?". After all, to a systems administrator these packages look and feel a lot more like legacy setup.exe packages than MSI packages. For example, the Microsoft Silverlight Deployment Guide makes no mention of GPO advertisement using MSI. Instead, it mentions wiring a batch file up to a machine startup script to fire the exe from a UNC path.
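For illustration, that suggested approach amounts to a startup script along these lines (the share path is a placeholder, and the silent switch is my assumption, not quoted from the guide):

```shell
rem Hypothetical machine-startup batch script (UNC path invented).
rem Fires the self-extracting bootstrapper silently from a network share,
rem bypassing MSI-based GPO advertisement entirely.
"\\server\share\Silverlight.1.0.exe" /q
```

Compare that to assigning an MSI through a GPO, where the deployment, repair and removal story is handled by the Windows Installer service itself.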

I sure would love to see a follow-up discussion from various parties, including the Windows Installer team. Is this just an ad-hoc approach used by certain teams inside the Microsoft firewall, or is this the future best practice that tools like ISHNMET should be supporting?


5 thoughts on “Good, Bad or just Different?”

In my original post, the discussion I meant was that of the MSI + Patch pattern. Does the MSI team think this is a good thing? Will this pattern be published anywhere? Will tools vendors adopt it into their authoring environments somehow?

I would like the conversation to be town-hall style, with Microsoft, Tools Vendors and MSI Experts (End-User Developers).

To my knowledge, this never occurs; only conversations between Microsoft and Tools Vendors occur. Some of us have backdoor channels with tools vendors to try to inject our opinions, but this is the exception, not the norm.

Yes, I'm aware of that white paper and the resulting MSI updates coming. I've also spoken with several tools vendors on the subject.

BTW, trivia question: who knows who invented the merge module specification that is so busted?

Most of my thoughts were geared to the fake/empty MSI followed by a patch to Major Upgrade (true install) the application. As far as I know, that's not addressed by this white paper, but if it is and I missed it, please feel free to chime in.

BTW, when MSI first came out the bootstrapper was really only intended to deploy MSIEXEC. All the other package integration was intended to be performed by merge modules and concurrent installs. Of course, we've now completely acknowledged that the design is flat-out busted from a servicing perspective.

Personally, I don't think we should need a bootstrapper to host an external UI. There should be some way of loading a library into the MSI so that when the client-side process fires up, it automatically uses it if present. Further, I think there should be some way of storing all of these micropackages inside the MSI and chaining them all together without the help of a bootstrapper.

In other words, an MSI should still be an MSI, and it should still support things like GPO and a standardized deployment experience (/qn, /qb, public properties, et cetera) without going way off the reservation by hiding contents inside an EXE and creating complex XML files to drive the configuration.
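By way of contrast, that standardized experience is the plain-vanilla MSI command-line surface administrators already know (the package name, transform and property below are examples only, since public properties vary by package):

```shell
rem The conventional, tool-agnostic MSI deployment surface (names illustrative).

rem Silent install with a transform and a public property override.
msiexec /i product.msi TRANSFORMS=custom.mst INSTALLDIR=D:\Apps\Product /qn

rem Basic-UI repair and silent uninstall use the same predictable switches.
msiexec /fomus product.msi /qb
msiexec /x product.msi /qn
```

Every well-behaved MSI answers to these same switches, which is exactly the predictability the exe-wrapped micropackage design gives up.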

Hope that's lucid enough. I'm rarely accused of being that. 🙂 I usually come across much better while drinking coffee for several hours.

My personal opinion is that the concepts are sound, but their implementation by MS, or the lack thereof, feels unsatisfactory, as you point out. ("We are grateful for the effort our partners extend to our mutual customers through our technology" – and to answer your last question, I'm sure ISHNMET has been working on supporting this "new way" for some time now.)

I would love to hear your opinion on that paper. Your blog is one of my favorites covering setup, mainly for the lucidity.