
With newer versions of apt-get, a new command, 'source', was introduced, which downloads the original sources used to generate the deb files on the repository servers. If you as a user have the time to spare for building software with custom CFLAGS, etc., then you can simply do

apt-get source packagename
cd packagename-vblah

and recompile as much packages as you want.
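A fuller sketch of that workflow might look like the following. The package name `htop` is just an illustration (any package in the repos works the same way), and this assumes a reasonably recent Debian/Ubuntu where dpkg-buildflags honors the DEB_CFLAGS_APPEND variable:

```shell
# Fetch the source package (downloads and unpacks the upstream
# tarball plus the Debian packaging into the current directory)
apt-get source htop

# Install everything needed to build it
sudo apt-get build-dep htop

cd htop-*/

# Rebuild the .deb, appending custom optimization flags
DEB_CFLAGS_APPEND="-O2 -march=native" dpkg-buildpackage -us -uc -b

# Install the freshly built package
sudo dpkg -i ../htop_*.deb
```

The advantage over a bare ./configure && make install is that the result is still a normal .deb that dpkg knows about and can cleanly remove or upgrade later.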

Also, some users insinuated that you can't compile software on Ubuntu (Debian-based) systems, but that just sounds insane: you can always compile from source there like anywhere else.

Meh, that's mostly a myth, I did it for a long time without a single breakage.

You shouldn't break the system if you always install software into its own new directory, compiled with ./configure --prefix=/usr/local/software-123
You can easily break the system if you try to replace software that was installed from a deb package with a newer version compiled with ./configure --prefix=/


I'm aware, but it's really simple to avoid those problems. make install surely won't break your system if you do things right, and doing things right is still considerably easier than packaging...
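A minimal sketch of the safe pattern described above (the program name and version are made up for illustration):

```shell
# Build and install into an isolated prefix instead of /
./configure --prefix=/usr/local/foo-1.2
make
sudo make install

# Put the isolated install on the PATH for this shell session
export PATH=/usr/local/foo-1.2/bin:$PATH
```

Because nothing lands in /usr or /bin, the package manager's files are never overwritten, and removing the experiment is just `rm -rf /usr/local/foo-1.2`.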


I was talking about upgrading/replacing software that is already installed from a deb package. That isn't safe even if you create your own package; make install is dangerous in that case.

Careful with that, it's a bit misleading. Yes, Arch is a binary distro, and while it does let you include only the things you want, it isn't as flexible as compiling your own... However, there is nothing stopping you from making your own packages or using the AUR. Though I guess you could do some of that stuff in Ubuntu as well. God I love Linux.


Originally Posted by nightmarex

Careful with that, it's a bit misleading. Yes, Arch is a binary distro, and while it does let you include only the things you want, it isn't as flexible as compiling your own... However, there is nothing stopping you from making your own packages or using the AUR. Though I guess you could do some of that stuff in Ubuntu as well. God I love Linux.

Yes, no one stops me from recompiling software manually with just the features I want and then packaging it in my own repo.
Theoretically it's possible, and it works in most cases.

Gentoo, however, does that for me automatically and allows me to set _global_ USE-flags, which is a tremendous simplification!

If I wanted to strip PAM from all packages, I would first have to identify all packages depending on it (to be fair, that's still quite easy on Arch and Debian).
Then I would have to get source tarballs for the respective programs and compile them without PAM. I'd then have to package them and explicitly declare each package as independent of the PAM libs, to prevent aptitude from accidentally pulling the library back in.
So there I go. If everything works, I can install the new packages and remove the PAM components manually.
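On Debian, the first step (identifying the reverse dependencies) can be sketched like this; `libpam0g` is the usual name of the PAM runtime package there, but check your release:

```shell
# List installed packages that depend on the PAM runtime library
apt-cache rdepends --installed libpam0g
```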
In Gentoo, I just need to add "-pam" to $USE (in /etc/portage/make.conf) and the rest is done automatically.
Removing PAM is just a trivial example. What about libpng? What about removing all traces of ConsoleKit?
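As a sketch, the whole change on the Gentoo side is a one-line make.conf edit followed by a rebuild (a hypothetical /etc/portage/make.conf fragment; flag and command names as documented by Portage):

```shell
# /etc/portage/make.conf -- disable PAM support globally
USE="-pam"

# Rebuild every installed package whose USE flags changed
emerge --ask --changed-use --deep @world
```

Portage then recompiles only the affected packages and adjusts the dependency tree itself, which is exactly the manual bookkeeping the Debian route requires.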

To be fair, we are talking about a binary distribution, and you have your freedoms in being able to package your own stuff. But if there is a library that many programs pull in as an unnecessary dependency (which can be a security risk and is a performance problem), then there's no way around a source-based distribution like Gentoo, as repackaging is a waste of time when many packages are involved.

I like to put it this way: on Gentoo, the easy stuff is complex, but the more complex the task gets, the easier it becomes. Overall, you don't have to worry about the easy stuff once you've taken care of it.

Overall, I love GNU/Linux for being this flexible. This would never be possible with Windows or Mac OS.

I am interested in Gentoo, but I couldn't find benchmarks comparing it to recent versions of binary-based distros such as Debian, Ubuntu, Fedora...

Do you know how much performance you gain by going from a generic amd64 build to compiling with optimizations? The Gentoo site itself recommends the distro for owners of 8-core machines, but doesn't give any detail on why.

Go for it!

Originally Posted by juanrga

I am interested in Gentoo, but I couldn't find benchmarks comparing it to recent versions of binary-based distros such as Debian, Ubuntu, Fedora...

Do you know how much performance you gain by going from a generic amd64 build to compiling with optimizations? The Gentoo site itself recommends the distro for owners of 8-core machines, but doesn't give any detail on why.

You should really go for it! I started with Gentoo in December with a quad-core Mac mini, which effectively appears as an 8-core machine when hyper-threading is enabled. But normally you could use Gentoo even on a single-core machine, as compiling doesn't need to be supervised. For very large programs, binary versions are available in case of problems.

Installing it for the first time is a _real_ challenge. I don't want to talk it down in any way: my first installation took over 8 hours and the additional setup took days, but it was definitely worth it.
You can't compare it to normal GNU/Linux distributions: my system boots in 3 seconds and you can literally tweak anything.

It's not about extreme compiler flags (optimizations and the like), but more about what gets compiled into your software (shared libraries; generally speaking, dependencies).
If you use a binary distribution and install GIMP, for instance, it pulls everything in: support for udev, image libraries, ACLs and the like.
You don't need most of it, and compiling your own version of a given program can definitely yield positive results in terms of memory consumption, performance and startup speed.
On top of that come a tremendous package manager (Portage), great documentation and an awesome community.

I reinstalled Gentoo a few months ago (don't get me wrong, one setup can literally be used for decades) and knew a lot about the system by then. I was finished pretty quickly, as I could easily move all the relevant configuration files over to the new setup.