In Windows, I download an .exe file and install it.
When I install, I define the path where I want that application to be stored.
In that folder, I have all the files required for the application.

However, when I install a package on Linux using yum or apt-get, I don't know where the package is installed or where the files required for that application are stored.
I have seen that most of the configurations are in the /etc directory.
But why does Linux store the required files for an application in different directories?

Can someone tell me how packages are installed, and where and how they are stored?
And if my understanding about package management is wrong, please correct me.


If your distribution uses rpm, you can use rpm -q --whatprovides FILE to find the package name for a particular file and then rpm -ql PKGNAME to find out what files a package installed.
– David Schwartz, Aug 15 '12 at 1:00
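For example, a sketch of both queries (the file and package names are illustrative, and the commands only exist on RPM-based systems, so the sketch guards for that):

```shell
# Query examples for RPM-based systems; guarded so this is a no-op
# where rpm isn't installed. /bin/ls and coreutils are illustrative.
if command -v rpm >/dev/null 2>&1; then
    rpm -q --whatprovides /bin/ls   # which package owns /bin/ls
    rpm -ql coreutils               # every file that package installed
else
    echo "rpm not available on this system"
fi
```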

With apt-get: if the package is installed, use dpkg -L PKGNAME; if it isn't, use apt-file list PKGNAME.
– Thor, Aug 15 '12 at 2:03
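The Debian-family equivalents look like this (guarded for portability; "coreutils" is just an example package present on virtually every Debian system):

```shell
# List a package's installed files and reverse-look-up a file's owner;
# a no-op where dpkg isn't installed.
if command -v dpkg >/dev/null 2>&1; then
    dpkg -L coreutils | head -n 5     # files installed by the package
    dpkg -S "$(command -v ls)"        # which package owns the ls binary
else
    echo "dpkg not available on this system"
fi
```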

@KOU I don't know the history of this, but it could be so that programs can be updated without disturbing the configuration, since the config lives in a different directory. That way different versions can use the same configuration (assuming its format stayed compatible). I am just surmising here; you'd have to research the history of the Unix/Linux filesystem for a definite answer.
– Levon, Aug 15 '12 at 0:10

I'm not sure of the history of the decision to put all system-wide config files in /etc/, but having one central location for config files makes them very easy to back up. (Imagine backing up system-wide config files in Windows, where they're scattered all through the filesystem and registry...)
– Li-aung Yip, Aug 15 '12 at 2:12

Under Windows, particularly older versions, it was common for programs to store configuration files and non-constant data in their C:\Program Files directory. This derives from how programs were usually installed and run under single-user, non-networked, non-file-permission DOS.

From a security standpoint, this is a bad idea: places where executable code lives should be separated from modifiable data. That way it's easier to apply file permissions that prevent unauthorized users from modifying installed binaries. Similarly, library directories, which may be updated separately from the main executables, should also be kept separate.
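You can see this split on any Linux system; a quick sketch (paths per the standard hierarchy):

```shell
# System binary directories are root-owned and world-readable but not
# world-writable, so ordinary users can run but not modify programs.
ls -ld /usr/bin                # directory permissions, e.g. drwxr-xr-x
ls -l "$(command -v ls)"       # an installed binary, root-owned
```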

With the advent of Vista and UAC annoyances, this tradition is finally starting to seriously lose traction.

Unix, and Linux after it, was a multiuser system from much earlier on, so it had reason to separate executable directories from other directories much sooner: users other than root had to be prevented from modifying installed binaries. This is also why /usr and even /sbin are sometimes separate partitions - a particularly security-conscious admin can mount those partitions read-only and remount them read/write only when an install or uninstall needs to happen.

Packages are usually installed through a package manager. There are various package managers, such as aptitude/apt-get (Debian and derived distributions), yum (Red Hat and derived distributions), pacman (Arch Linux), and others.

The package manager lets you browse repositories, download, install, query, and remove software, much like a sophisticated (and free) "app store." It assumes responsibility for ensuring dependencies are taken care of and tracking what is currently installed.

Usually the package manager will also allow the same operations on a package you downloaded manually, outside of any repository. Tools are also available if you want to create your own packages from software you wrote or compiled yourself.
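A typical read-only session with Debian-style tools might look like this (yum has direct equivalents: "yum search", "yum list installed"; guarded so it is a no-op where these tools don't exist):

```shell
# Browse the repositories and inspect what's installed, without
# changing anything. "editor" is an arbitrary search term.
if command -v apt-cache >/dev/null 2>&1; then
    apt-cache search editor | head -n 3   # search the repositories
    dpkg -l | head -n 8                   # what is currently installed
else
    echo "apt tools not available on this system"
fi
```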

Since the package itself is NOT an executable file, you don't have to run an untrusted executable whose behavior you can't verify. (Windows is finally coming around, distributing updates as .msu files instead of .exe files - though .msi files have been around for a while...)

In Linux/Unix, most programs don't end up in a single directory; the different parts of a program (executables, configuration files, log files, documentation, other resources) are spread across the filesystem, each in its conventional location. The Wikipedia article on the Filesystem Hierarchy Standard describes the standard directory structure in more detail, showing the different directories and what you can expect to find in each.

The /opt directory is reserved for Windows-like installations where each package has its own directory tree. In practice it sees little use; I'm not sure why, but it might be that adding /opt/PACKAGE/bin to your $PATH every time you install a package is just too annoying.
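To make the annoyance concrete, here is what per-package /opt installs would demand - one PATH entry each (the package names are hypothetical):

```shell
# Every /opt-style package needs its own bin directory on PATH.
PATH="/opt/foo/bin:/opt/bar/bin:$PATH"
echo "$PATH" | tr ':' '\n' | head -n 2   # the two added entries
```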

Software in Linux follows a different paradigm from Windows or Mac. On those systems, an executable and all its supporting files are installed into a single folder: Windows normally keeps them in C:\Program Files\program name, Apple in /Applications/program.app. Under Linux, there's a more ... communal structure. The binaries are generally in /usr/bin, system-wide configuration is in /etc, user-specific configuration is usually at ~/.program. Libraries are in /usr/lib, supporting files (e.g. artwork) are often in /usr/share/program, and so on. There's even a standard, the Filesystem Hierarchy Standard, suggesting where things should go.

Programs are generally installed by package managers: rpm (Red Hat family) and dpkg (Debian family) at the low level, with yum and aptitude/apt-get layered on top to search repositories, retrieve packages, and handle dependency management automatically. On a more technical level, packages are archives: a .deb is an ar archive containing compressed tarballs, while an .rpm wraps a compressed cpio archive. Either way, the payload mirrors the piece of the filesystem where the files go (e.g. a file meant to be installed at /usr/bin/program is stored as usr/bin/program inside the package).
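You can imitate that payload layout with plain tar - files stored relative to the root they'll be unpacked into ("program" is a placeholder name):

```shell
# Sketch: build an archive the way a package payload is built, with
# paths relative to the filesystem root.
mkdir -p pkgroot/usr/bin
printf '#!/bin/sh\necho hello\n' > pkgroot/usr/bin/program
tar -czf data.tar.gz -C pkgroot usr
tar -tzf data.tar.gz    # lists usr/bin/program, rooted at usr/
```

Unpacking such an archive at / would drop the file into /usr/bin/program, which is essentially what the package manager does (plus bookkeeping).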

To find information on a particular package, use the package manager for your system, as others here have explained.

While this link may answer the question, it is better to include the essential parts of the answer here and provide the link for reference. Link-only answers can become invalid if the linked page changes.
– MaQleod, Aug 15 '12 at 3:43

@MaQleod If you are concerned that the link may become invalid: I linked it just for convenience. I could simply have written "man hier" in plain text, since you can find this manpage in any mainstream Linux distribution, I guess.
– AnonymousLurker, Aug 15 '12 at 4:41

"See the manual" is also not an appropriate answer; it is a comment. Answers should actually answer the question. How does this answer the question posed? How is the hierarchy listing significant? Why should the OP (or anyone else) consider this answer noteworthy? Copy the relevant portions of the link (or man page) and explain why they are significant in the context of the question; then you will have answered the question and not simply made a comment.
– MaQleod, Aug 15 '12 at 4:53