Question about core apps on Standard Image

Background information: At my company we have about 10,000 desktops and 2,000 laptops. We use one standard image for both, so this one standard image is installed on all computers. On this image we include our "core" applications - the ones that are used by everyone in the enterprise. We also run about 200 other applications that are installed ad hoc on a per-user basis.

The problem: We've been directed to develop "application layers" that would each consist of several apps that are generally installed together, reducing the number of these ad-hoc installs from 200 to maybe 30 or 40. I'm interested in hearing from other people about how they do this sort of thing. First, how do you decide if an application should go on an "image" - is it based on percentages of installs? If so, what percentage justifies putting it on the image? Second, does anyone do this sort of application layering? And if so, do you have any suggestions or recommendations on what sorts of things to layer?

Comments

Answers


I think, like in many situations, there is no one right answer here - just things to consider. Personally, I like to keep all applications off images. The reason is I like to keep the distribution points to a minimum and keep administration as centralized as possible.

Example: let's say you decide to make Office a part of your 5 images. Barring WUS, you now have five different images on which you must maintain service packs and updates. As the images age, you have a little more work to do to update Office. Or you have to bring those images down, update them, then send them back up again.

If you have an administrative installation point and a push deployment/update method, all you have to do is update one of them, copy it over to your other AIPs, then deploy or redeploy.
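For Windows Installer packages, the "update one AIP, then copy" workflow uses `msiexec /a <package>.msi /p <patch>.msp` to re-patch the administrative image in place. Below is a sketch of scripting that step; the UNC share paths, MSI filename, and patch name are all hypothetical, and the commands would have to run on a Windows box with access to the shares.

```python
import subprocess

# Hypothetical AIP shares -- substitute your own UNC paths.
AIPS = [
    r"\\server1\apps\office",
    r"\\server2\apps\office",
]

def build_patch_command(aip, patch):
    """msiexec /a <admin image>.msi /p <patch>.msp re-patches an AIP in place."""
    return ["msiexec", "/a", aip + r"\office.msi", "/p", patch, "/qb"]

def patch_all(patch, run=subprocess.run):
    """Apply one patch to every AIP; 'run' is injectable for testing."""
    for aip in AIPS:
        run(build_patch_command(aip, patch), check=True)
```

After patching, clients can be redeployed or repaired from the updated AIPs, so there is still only one place where the service-pack work actually happens.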

Thanks for your input, and I agree. Having only our 'core' applications - Office, anti-virus, etc. - on our one standard image makes updating that image only a two- or three-times-a-year job. Now I'm just trying to figure out how to naturally group our ad-hoc apps into 10-20 installs versus 150!

Thanks for your thoughtful response. I agree with everything you state. We use Altiris for software distribution and it works well for us. The issue we face is that in our company there are no defined "groups". We have about 350 locations, and a "clerk 1" at one location may have a need for software that a "clerk 1" at a different location doesn't need. I think our use of one standard image with only core apps installed is the right way to go. We're going to investigate the use of a "software portal" that will allow users to download and install their own applications - but of course there are issues there also, as few of our users are Admins on their boxes. Anyway, thanks for your responses everyone.

I work in an enterprise with 30,000 workstations and laptops, and I always use the word "SCALABILITY".

Does the plan scale?

The more layers you place in your deployment the better. It is better to affect 200 users with an adverse reaction to a software package you deploy than to generate 10,000 help desk calls because you deployed an ActiveX control that breaks your intranet for everyone. This is just an example, but you get my meaning.

You also need a way to test changes against a standard, and adding apps to images actually breaks the standard: they will need constant patching, and it's very hard to troubleshoot an app problem down the track when there are so many variables in place.

Images should have the OS and the IE version and patch level set, and be left at that. This way you are starting from a rock-solid base. As you add apps, hotfixes, etc., you can go backwards to see what breaks what.
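The "go backwards to see what breaks what" idea amounts to keeping an ordered ledger of everything layered on top of the base image. A minimal sketch of that bookkeeping follows; in practice this record would live in your deployment tool's database rather than a script, and the base/layer names are invented for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class BuildRecord:
    """Ordered record of what was layered on top of a base image."""
    base: str                      # e.g. "XP SP1 + IE6 SP1"
    layers: list = field(default_factory=list)

    def apply(self, name, version):
        """Record a layer (app, hotfix, etc.) as it is added."""
        self.layers.append((name, version))

    def rollback_candidates(self):
        """Newest first: the most recently added layer is the first suspect."""
        return list(reversed(self.layers))
```

When something breaks, you peel the layers off (or at least suspect them) in reverse order, and the untouched base image stays a known-good reference point.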

I think everyone here makes a valid point; it really depends on your organisation.
I have to say I'm converted to msimaker's suggestion: I like to treat the OS just as another application to deploy, with no apps or configuration. This means, like a house, you start with a really firm foundation to build upon, with limited complexity.

The more modular your design the easier it is to maintain and upgrade.
As soon as you start integrating things together, you can easily lose control of the changes you have applied.
I'm unsure if I'm exactly up with your situation when you say you include core applications - are these core applications distributed via a managed distribution system, or are they embedded (imaged, etc.)?

If you include the applications into the OS you have to think of the upgrade path for those core applications.
After the initial release, you would need to package an uninstall pre-installation task, then an upgrade package, for each core application.
If you don't do this and instead rely on, say, updating the applications on the master image and having your support staff slowly migrate clients, there would then be multiple builds in your environment, and you would have to test every new change against each legacy image.

I think the benefits of embedding are:
faster development timelines, easier testing (rolled-up change), and less reliance on a mature software distribution infrastructure.

Some downsides:
lack of freedom, as it's not as modular, and it's harder to keep all your clients matched to the current revision level of applications,
as they require a new OS migration.

Anyway, my next large release will be fully modular.
Bring on the Lego... :-)

OK, so you deploy Windows XP Pro SP1 with the drivers for 3 machine types - say Dell, IBM and Compaq.

Now at this stage you only have to deal with the MS OS and whatever drivers the OEMs need.

Now that's a solid rock.

If you add, say, Office 2003 and an IE version to this base build, then ANY change to the OEM drivers will require a regression test to make sure you're not totally causing a FUBAR to your image because of these extra apps.

But if you had left the image as OS and OEM drivers, then if you load Office 2003 and it's fine, adding SP1 for Office means the baseline troubleshooting starts at that patch and not at the image.