"The Portable Linux Apps project brings the ideal of "1 app, 1 file" to Linux. Applications are able to run on all major distributions irrespective of their packaging systems - everything the application needs to run is packaged up inside of it. There are no folders to extract, dependencies to install or commands to enter: "Just download, make executable, and run!"" A follow-up article describes how it works, and how to transform debian packages into AppImages. The packages don't include libraries, so the system won't need to update the same library in each individual app.

This is not good for security. The package manager (or the Apple App Store) takes care of maintenance and updates. If this system becomes mainstream, it will be a mess, just like it was when this was the preferred way of installing software. The stuff in /opt is left to rot until it gets hacked. Having a single file is just as bad.

What's new about this? Klik has had the exact same thing for 5 or 6 years under Linux - if not longer. A single .cmg file that is an "app image" and contains everything to run the app - no install required. Double click the .cmg file and the app runs.

Currently, package maintainers are assumed to either:
a. Package all of an application's dependencies inside its AppImage (excluding those already present in the intended base system)
OR
b. Provide users with instructions to resolve dependency issues.

vs OSNews article

There are no folders to extract, dependencies to install or commands to enter:

The packages don't include libraries

Either packages include all the libraries (except for basic ones assumed to exist on every intended target distro) and users really don't have to install any, OR packages don't include libraries and users are required to deal with dependencies themselves.

The two don't apply at the same time.
Personally, I only see merit in this when packaging with libraries.

The huge thing that the NeXTSTEP/OSX style .app directories give you is that everything is packaged up in a single directory. Note, this DOES NOT PRECLUDE YOU FROM USING SHARED LIBRARIES!!! All the OSX style .app directory does at a minimum is package the config / icons up into a single directory. This means that you do not have to copy / edit .desktop files, copy icons, create shortcuts, or any of that BS.

From what I've read, the Portable Linux Apps project does something similar. I have not checked in a while, but I think the Linux Standard Base specifies that both the GTK and Qt libraries are installed. This should be sufficient for many if not most apps out there, so there should be little need for static linking.
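One quick way to check how far a guaranteed base set of libraries would get a given app is to list a binary's shared-library dependencies with ldd (using /bin/sh here purely as an example binary that exists on any Linux system):

```shell
# Print the shared libraries a dynamically linked binary needs at run time.
# /bin/sh is just a convenient example binary; substitute any app of interest.
ldd /bin/sh
```

Any library in that list that isn't part of the assumed base system is exactly what would either have to be bundled in the AppImage or resolved by the user.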

Hope this takes off, as installing apps (say a newer version than is shipped with the package manager) is my biggest sore spot with Linux.

The huge thing that the NeXTSTEP/OSX style .app directories give you is that everything is packaged up in a single directory.

Decades ago, when working with DOS app directories, I remember thinking, "this is so much like Nextstep/OS-X."

All the OSX style .app directory does at a minimum is package the config / icons up into a single directory. This means that you do not have to copy / edit .desktop files, copy icons, create shortcuts, or any of that BS. From what I've read, the portable linux apps project does something similar.

Gobolinux and other Linux distros already do this.

Hope this takes off, as installing apps (say a newer version than is shipped with the package manager) is my biggest sore spot with Linux.

I have never had any problem with package updates and package managers. In fact, I find the package manager system speedier, safer and more convenient than the standard OS-X and Windows methods (and similar Linux methods).

Don't forget, the Apple app stores are just knock-offs of Linux package managers, except you have to pay.

I have never had any problem with package updates and package managers. In fact, I find the package manager system speedier, safer and more convenient than the standard OS-X and Windows methods (and similar Linux methods).

Package updates typically work just fine, I agree. However, the sore spot for many is that there's no dead simple way to install newer versions of a program than the package manager offers. Take OpenOffice for example, if your distribution comes with 3.0 and you want 3.2, currently there are two things you can do on the package manager end:
1. Search for packages of the new OO version you want. If you're dealing with a common package format, like deb or rpm, as well as a common enough bit of software like OO, these usually aren't hard to find. However, dependencies can then become a problem, though thankfully dependency hell isn't as common as it once was. Still, in OO's case it requires downloading many packages, figuring out which ones you need, and installing them. Far more complicated than Windows or OS X. This becomes easier if someone is maintaining a repository for the software, of course, but comparatively few common software packages have one, and it's still highly distribution-dependent.
2. Upgrade your entire OS to get the new software you want. In all honesty, this isn't practical, nor should it be; it's a ridiculous way of operating. Even Apple got this right with the iPhone, in that app updates are separated from the OS itself. A separation such as this in a Linux system would be invaluable, as then it would be easy to do application software updates separately from system updates.
Of course there are rolling release distributions such as Archlinux (which is what I use) but these generally aren't suited to most computer users. They require somewhat regular updating, and there's always a chance something will break if it hasn't been tested as well as it should.
Considering no one in the Linux community will ever agree on a standard package format, self-contained apps might be the best way to deal with this. If everyone used deb, or rpm, or insert-package-name-here and followed a set list of system dependencies, then installing new software would be easy. As it stands now though, self-contained bundles or statically linked binaries seem to be the only chance of distributing something that will work for sure on 99% of the Linux install base.

Of course there are rolling release distributions such as Archlinux (which is what I use) but these generally aren't suited to most computer users. They require somewhat regular updating, and there's always a chance something will break if it hasn't been tested as well as it should.

As the parent you replied to rather clearly said: it's difficult to install versions newer than what's available in the repos! As such your apt-get won't help you a bit.

Sidux is Debian Sid -- latest versions. Plus one can use many other cutting-edge Debian repos with Sidux. The other distros that I mentioned also put the latest versions in the repos. There are other distros that do likewise.

Furthermore, most of the Linux repo packages are modified and updated way more frequently than their Windows and OS-X counterparts.

Dude. You just don't want to get it. How about someone who wants to install 3.0 and Sidux has 3.2?

Package managers rock - but you lose flexibility in what you can install, period. My repositories don't offer the last year of OO.o installations - only the latest. openSUSE and Ubuntu offer these user repositories, which might help, but if it doesn't you're out of luck.

Either way I agree with what was said before: nothing new here, move along - klik did that years and years ago and nobody cared then either.

For this to work properly you need to have a heavy base of libs installed - all of the Gnome and KDE libs by default at the very least (and all their dependencies, including optional ones). If you do it like that, yes, this works - generally speaking. You could define a stable API and ABI for it through the LSB and only update it every 3 years, demanding backwards compatibility. The Gnome and KDE communities would provide it; everyone else probably wouldn't, so you'd quickly have to depend on outdated libs - and you're screwed.

IOW it simply doesn't work unless all libs provide EXCELLENT backwards compatibility and the ability to keep older versions installed next to the new ones.

Dude. You just don't want to get it. How about someone who wants to install 3.0 and Sidux has 3.2?

Are you the Dell spokesman?

There are about a dozen different ways to do that within Sidux and within most Debian-based distros. Many of these methods use a GUI.

The same capabilities exist with other package managers used in other distros.

My repositories don't offer the last year of OO.o installations - only the latest. openSUSE and Ubuntu offer these user repositories, which might help, but if it doesn't you're out of luck.

Hint: Use the distro that offers what you want.

Again, you are not necessarily "out of luck" if the package version that you want is not in your default repository. There are several easy ways to install the latest version or an earlier version of a package, with many distros.

Either way I agree with what was said before: nothing new here, move along - klik did that years and years ago and nobody cared then either.

So did DOS, Windows 3.1, Gobolinux, Zero Install, etc. By the way, none of these systems preclude the use of a package manager to retrieve/install their packages.

For this to work properly you need to have a heavy base of libs installed - all of the Gnome and KDE libs by default at the very least (and all their dependencies, including optional ones). If you do it like that, yes, this works - generally speaking.

Yes. And such a scenario requires no more resources than the "self-contained" package systems.

However, you don't really need the "optional" dependencies and you only need Gnome/KDE libs if you have applications that need those libs. So, in this sense, one could operate with fewer resources than those required for a system that is designed around completely self-contained packages.

IOW it simply doesn't work unless all libs provide EXCELLENT backwards compatibility and the ability to keep older versions installed next to the new ones.

"Dude," when I used to multi-boot, I would run applications from the other distros (some compiled years before) located on other partitions, with very few problems.

Anyway, I think it is a good idea. I would prefer to have the libraries included inside the package simply because then there would never be any dependency issues, ever. Or overwritten libraries, or 40 different libraries with almost the same name in /usr/lib and /usr/local/lib.

"This is great, now all the dependencies (including libraries) are included in the .app directory!" VS. "This is horrible, now you have 17 versions of the same library and 17 times the bloat!"

Seems like a fanatical flame war, like vi vs. Emacs or Gnome vs. KDE, to me.

Here is why.

Construct the "autopackage" or whatever you want to call it this way:

Upon being run the first time, it checks what major Linux branch you run (e.g. Debian-based .deb, Red Hat-based .rpm, etc.; LSB should help here, as lsb_release -icr outputs distro name, version, and code name). Yes, I realize that would encourage fewer distros with fewer unique package formats, and no, I don't think this is all that bad. If you have a distro that doesn't fit, send the package developer info that says "my distro has library x in location y" or "my distro has .deb packages but the libraries are in the Fedora default locations", and voila - if they use your input instead of ignoring you, the next revision of the autopackage WILL work for you too.

If any of its dependencies (libraries or whatever) aren't present in their default location (for that distro), maybe even pop up a question "where is this library in your system?" with a default answer already filled in by piping from locate "name of file", or a hint for you to run this short command yourself and paste the result into the question dialog. Once successful, symbolically link to the system-supplied files. Done!

If it can't be found anywhere, warn that it is so, and offer a choice:

A) Please install this from your package manager (supplying names of debian/redhat packages that contain this as hints), or

B) automatically wget said library and place it in the .app directory, with a brief warning (with a never-show-this-warning-again checkbox/command line switch) that using the downloaded "standalone" dependency may mean that this app is outdated and insecure, even when the rest of your system is properly up to date and secure.

And of course a preference item inside the app (toggle switch) between 1) "use built-in dependencies" (more likely to run the first time) vs. 2) "use system files for dependencies" (more secure and up to date).

No bloat or duplication for anyone who can resolve dependencies now, AND no "I can't run this junk, it's missing some obscure library (or I don't know why it won't run!)". Everyone is happy!
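The first-run probe described above could be roughed out like this. It's a minimal sketch, not a real packaging tool: the library name, messages, and fallback behaviour are all illustrative assumptions.

```shell
# Hypothetical first-run dependency probe for an "autopackage"-style bundle.
LIB="libexample.so"   # illustrative dependency name, not a real requirement

# 1. Identify the distro family; on LSB-compliant systems lsb_release -i
#    prints a "Distributor ID:<tab>Name" line we can split on.
distro=$(lsb_release -i 2>/dev/null | cut -f2)
echo "Detected distro: ${distro:-unknown}"

# 2. Is the library already known to the dynamic linker's cache?
if ldconfig -p 2>/dev/null | grep -q "$LIB"; then
    found=yes
    echo "$LIB present -- symlink the system copy into the .app directory"
else
    found=no
    echo "$LIB missing -- ask the user to install it, or fetch a private copy"
fi
```

A real implementation would then perform the symlink or download step; the sketch only shows the decision logic.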

Yes, the initial packaging is longer and more complex (but the complexity is so structured I'd be surprised if it can't be scripted away), but only the very first time! All updates, whether from version 3.0.1 to 3.0.2 or 1.0 to 5.6, only need the changed files, so delta packages start to become easier and more common - now updating Open Office isn't a 200 MB download, it's 25 MB! So much more sane in places with slow/metered/capped/too-expensive connections. Exactly the kind of places where open source can shine!

Imagine there is a library that is used in a lot of packages, just not so widely that the system will assume it is there. Hence the library gets packed with the apps.
Then a fatal security flaw is unveiled in this library. The user will then have to update each application that uses the library instead of just a single library package...

Why not throw all rational thought away and just link everything statically instead of dynamically?

Then a fatal security flaw is unveiled in this library. The user will then have to update each application that uses the library instead of just a single library package...

Security flaws are rarely in libraries, and the entire updating system could be automated.

There is also a major security flaw with the shared library system in that an application cannot immediately patch itself. We've seen this plenty of times with Firefox, where the Windows and OSX versions were patched faster.

Why not throw all rational thought away and just link everything statically instead of dynamically?

I wouldn't call the current system rational when:

1. Dependency issues can break programs.

2. Dependency issues can require an OS update just to run a newer version of a program.

3. Program files are allowed to be scattered across the system.

4. The entire system requires far more labor via package maintenance compared to systems where programs and libraries are independent.

5. It increases porting costs for ISVs.

You know it was one thing to defend the status quo back in 2001 but when Linux has been at 1% since then I say it is time to give the reformers a chance. The status quo isn't working.

You know who the status quo works for? Microsoft. When you work against the reformers you keep Linux exactly where Microsoft wants it to be.

There are two main problems that I see with this type of app installation. First, as mentioned, you get multiple copies of the same library, which becomes a security nightmare - especially if you want to limit users' actions. If they are allowed to load any library into their personal directories and have it work, there are all kinds of things they could do. Second, app folders seem like a technology designed for single-user systems. How well does this work on multi-user systems?

You also have the problem where major subsystems are being upgraded still. Sound, video, the kernel, they are all changing quickly. Some dependencies go well beyond minor version changes, requiring system updates anyway.