Debian packages are by far the best of the Linux packaging / distribution formats, but I've still found too many dependencies and hang-ups to say that they compare to what you find in Windows / OSX for the average user.

TBH, I miss the days when 90+% of installs on OSX were just dragging and dropping an icon. I always thought that was the most logical and user-friendly install process I've seen, and the same goes for handling updates inside the application itself (instead of through an application control panel), but apparently no one designing OSes agrees.

YES, since I moved to OS X, I love how everything is in the ONE folder!

There's nothing wrong with Debian packages, but if you'll be installing packages with no dependencies on phones and tablets, you don't need the overhead of building indexes, which sometimes takes more time than the package install itself.

It's really a question of speed and ease versus security and integration. Self-contained packages like the ones Canonical is considering introducing should be much faster to package because everything is contained in a single directory, and much faster to install because there is no indexing, hashing, or system integration to slow it down. On the other hand, there is an inherent lack of system security features that can be applied to this approach, such as strict checksums and signature checking, and the packages do not integrate well with the system. Some of the benefit of shared libraries and other deduplication features of modern package managers is lost with this less integrated approach. Strict control over all installed programs is also sacrificed.
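To make that tradeoff concrete, here's a minimal Python sketch - hypothetical, not any real package format - of the kind of checksum manifest a traditional package manager records at install time so it can later verify files (dpkg does something similar with per-file md5sums). A bare self-contained bundle skips this step entirely, which is where both the speed gain and the security loss come from:

```python
import hashlib
import os

def build_manifest(root):
    """Walk an installed tree and record a SHA-256 digest per file,
    similar in spirit to the per-file checksums dpkg records."""
    manifest = {}
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            with open(path, "rb") as f:
                digest = hashlib.sha256(f.read()).hexdigest()
            manifest[os.path.relpath(path, root)] = digest
    return manifest

def verify(root, manifest):
    """Re-hash the tree and report manifest entries whose contents
    changed or disappeared. Files added after install are not checked;
    this only validates what was recorded."""
    current = build_manifest(root)
    return [p for p, d in manifest.items() if current.get(p) != d]
```

The hashing pass is exactly the "indexing" overhead mentioned above: it costs a full read of every installed file, but without it there is no way to detect tampering later.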

Canonical's proposed package system makes sense - and even looks attractive - from a proprietary-software perspective, but it is also what Debian and most other distributions would consider the "wrong way" of doing things. Distributions strive to make software more integrated, not less. Despite being a successful commercial vendor of open-source software, or perhaps because of it, Red Hat has rejected this approach in favor of its more traditional package management system. In fact, one of Red Hat's certifications covers properly patching and packaging software for RHEL - not working around it with a secondary packaging system.

On a side note, I'd be interested to hear what the Gentoo developers think of this plan. They tend to consider even Debian's policies too liberal, so I'm sure they will disagree with Canonical's proposed approach. The interesting part will be their technical reasoning. That's always my favorite part of talking to Gentoo developers and users: they are generally well informed and very opinionated.

I was thinking something similar as far as security goes. One of Linux's "safety nets" against malware is that the majority of software a user gets comes from the respective repository.. not 100% of course, but I'd guess the overwhelming majority. "Portable" programs like these are convenient, but I could easily picture (just as an example) tricking gullible users into installing things - say, a torrent for a Half-Life 3 beta or Microsoft Office for Linux. You know it's fake, I know it's fake, but the random clueless guy who decided to try Linux because he saw Steam was available might not, and a self-contained portable app with no fear of missing dependencies just makes it that much easier. Click, click, done - infected with a lovely rootkit or popup ads.

I'm also curious about the relative bloat.. some dependencies can be rather large depending on the program. It's heavy enough mixing, say, GTK and Qt stuff, but multiple copies/versions for the various programs? That could potentially get rather fat. WinSxS and the GAC have a similar problem.. absurdly handy, especially if you were around for the 9x days, but with enough programs installed they can get rather piggish.

I also get what you're saying about possibly not integrating well. Portable programs work rather well on Windows because you have a known set of APIs available and can easily have your program "fit in".. it's extremely unlikely you'll find somebody running a shell other than Explorer, for example. With Linux you have an absurd number of possibilities to deal with, which can cause a given program to either run perfectly or explode on startup.

On the plus side, it would solve a couple of problems with dependency issues/conflicts.. dependency hell isn't specific to old versions of Windows and can be a pain in the butt depending on the distro.

I can see Canonical's reasoning more or less.. personally I'd be OK with it as long as it's "in addition to" and not "instead of" their existing package management. There if you want/need it, but it won't get in your face if you don't.. think something like Chakra's bundle system: a one-click self-contained installer for "out of repository" programs that doesn't pollute the file system but isn't a replacement for the standard repos.

There are no dependency problems on Linux so long as you stay within your distribution's repository. Problems occur when you try to grab software outside of the package management system, which Canonical's proposed system would pseudo-fix for closed-source programs not in the main repository. It's somewhat like the basic premise behind personal package archives, but taken to the next level.
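The repository point can be sketched with a toy resolver (hypothetical package names; a real manager like apt also handles versions and conflicts). It works precisely because the index knows every package; for an out-of-repo download there is no index to consult, and a missing entry is the classic dependency-hell failure:

```python
def install_order(pkg, deps):
    """Depth-first resolution of a package's dependency closure.
    `deps` stands in for the repository index: it maps each package
    name to its direct dependencies. A name absent from the index
    raises, modeling an unsatisfiable dependency."""
    order, seen = [], set()

    def visit(p):
        if p in seen:
            return
        seen.add(p)
        if p not in deps:
            raise KeyError(f"{p} not in repository")
        for d in deps[p]:
            visit(d)
        order.append(p)  # dependencies first, then the package itself

    visit(pkg)
    return order
```

With a complete index, `install_order("app", ...)` yields an order where every library lands before anything that needs it; drop one entry and resolution fails immediately, which is exactly the situation a self-contained bundle sidesteps by shipping everything itself.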

That was not my point with regards to bloat, however. The point of shared libraries is that only one copy is loaded in memory no matter how many programs are using it. When each application bundles its own copy of a library, there can be many copies loaded in memory simultaneously, negating the performance gain resource sharing would otherwise provide. There is also the matter of keeping each dependency up to date. Flaws are regularly found in software, and application developers are far less likely to stay on top of patches issued for the libraries they use so long as their application "just works". The maintainers of libraries and other development packages in a distribution, by contrast, are dedicated to their piece of software and generally keep it updated and secure. Those two points summarize the reasoning behind Debian's policy of banning software in the main repository from embedding libraries or using internal forks without bulletproof justification.
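As a rough illustration of the duplication cost, this sketch (hypothetical bundle layouts, plain byte counts) compares the total bytes stored when every bundle ships its own copies of its libraries against the bytes needed when identical files are stored - or mapped into memory - only once:

```python
import hashlib
import os

def bundle_footprint(bundle_roots):
    """Return (per_bundle_total, deduplicated_total) in bytes for a
    set of self-contained bundle directories. Identical file contents
    are counted once in the deduplicated figure, roughly how a shared
    library costs one copy of memory regardless of its user count."""
    total = 0
    unique = {}  # content digest -> size; one entry per distinct file
    for root in bundle_roots:
        for dirpath, _dirs, files in os.walk(root):
            for name in files:
                with open(os.path.join(dirpath, name), "rb") as f:
                    data = f.read()
                total += len(data)
                unique[hashlib.sha256(data).hexdigest()] = len(data)
    return total, sum(unique.values())
```

If ten bundles each carry the same copy of a library, the first figure counts it ten times and the second once; the gap between the two numbers is the bloat (and, analogously, the extra resident memory) being discussed above.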