Hello everyone, I'm an Italian guy with a passion for programming (and Linux, of course).

I like to try new distros on my PC, but when I do so I have to reinstall all the libraries (often recompiling them by hand), and having the same library in two different places is not space efficient (I've ended up with more than 5 GB in my lib folders).

I would like to have a couple of folders (lib32 and lib64) on a different partition, so that I can install libraries there and let every distro I install use them.
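A minimal sketch of the mechanics, assuming the shared partition is mounted at /shared (a hypothetical mount point) with the two lib folders inside it:

```shell
# Hypothetical setup: libraries installed once on a shared partition
# mounted at /shared, containing /shared/lib64 and /shared/lib32.

# Per-session: make the dynamic linker search the shared folders first.
export LD_LIBRARY_PATH=/shared/lib64:/shared/lib32

# Per-distro, persistent alternative: register the paths with ld.so
# (would be run once on each installed distro):
#   printf '/shared/lib64\n/shared/lib32\n' | sudo tee /etc/ld.so.conf.d/shared.conf
#   sudo ldconfig
```

This only makes the libraries findable; whether each distro's binaries are actually compatible with those copies is a separate question.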

Before I get technical: is this even possible (and why)?

I ran some experiments using libraries from a different distro on a different filesystem, and it worked, but would it work with every other distro?
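One way to sanity-check such an experiment is to ask the dynamic linker which copy of each library a binary would actually load; a sketch (library paths vary per distro):

```shell
# Show which shared libraries a binary resolves to, and from where.
ldd /bin/ls

# To inspect a specific library's compatibility metadata (SONAME,
# required GLIBC_* symbol versions), something like this helps
# (the path is only an example and differs between distros):
#   objdump -p /lib/x86_64-linux-gnu/libc.so.6 | grep SONAME
```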

I thought different distros would probably have their own optimizations when compiling the same library. Is that correct?

Thanks in advance for your help.

12-07-2012

daark.child

Hi, and welcome to the forum. I think your idea will be difficult to implement, because different distributions do not necessarily use the same version of a library, and if a library is compiled against one distribution, there is no guarantee that it will work correctly on another.

12-07-2012

dogiordano

Thank you for your answer.

Is it possible to sum up these differences?
I mean, would it be possible to recompile the libraries with no distro-specific optimizations, so that I can use them on every distribution?

12-07-2012

Irithori

Using the same lib on every distro is not the goal.
There are a lot of distributions, for various use cases.

For example, Fedora is a testbed for Red Hat and usually has quite new software.
Gentoo and other rolling-release distros compile their software according to their package manager's configuration.

At the other end, RedHat/CentOS offer a stable platform.
The tools/libs, as well as their versions and features, are chosen and then frozen.
There will be no version or feature upgrades during the lifecycle of a major version,
which is valuable for mass deployments and SLAs.

So currently it works (very roughly) this way:
- A tool/lib is made available as source code via SourceForge/GitHub/Freshmeat/etc.
- Package maintainers evaluate it.
- They choose a version, possibly patch it, and then compile and package it into RPMs or DEBs.
- The RPMs and DEBs are copied to repositories.
- From then on, the packages are installable by users.
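The packaging step in the list above can be sketched for the DEB case. This is a deliberately minimal example with made-up names (real packages are built with debhelper or rpmbuild and carry much more metadata), and it assumes dpkg-deb is available:

```shell
# Minimal .deb layout: a control file plus the files to install.
mkdir -p mylib-1.0/DEBIAN mylib-1.0/usr/lib
cat > mylib-1.0/DEBIAN/control <<'EOF'
Package: mylib
Version: 1.0
Architecture: amd64
Maintainer: Example Maintainer <maint@example.com>
Description: Example shared library package
EOF
# cp libmylib.so.1.0 mylib-1.0/usr/lib/   # the built library would go here
dpkg-deb --build mylib-1.0                # produces mylib-1.0.deb
```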

In your use case:
IMHO, as long as you are actively developing against these libs, use a VM and create a makefile to compile and install them quickly.
Once things stabilize, or you want to distribute them to multiple machines in a controlled manner,
building packages is the clean way to go.

12-07-2012

dogiordano

Thanks a lot for all this information.

Programming is just my hobby, not my job (at least not yet).
I'm not developing libraries, just using them to develop software (and just for myself).

Anyway, your idea of using a virtual machine is a very nice solution to my problem: this way I can download the libraries I need only once and keep the system image on a separate partition, visible to any other system installed on my PC.