Back in April 2008, Canonical's Mark Shuttleworth pitched the idea of major open source projects synchronising their release cycles on a six-month cadence. Projects like gcc, the Linux kernel, GNOME, and KDE, as well as the distributions, would work out an acceptable release schedule. This would allow for easier collaboration between the various projects, and hardware vendors would be better able to support Linux since all major distributions would ship with the same kernel version.

Are we talking about dependencies here? Aren't people already working as fast as they can to use new features of libraries their stuff depends on?

Let's say there is a program you use that is written with PyQt. The authors want to use a new feature of Qt that is about to come out. As a user of the program, you have to wait first for the Qt release, then for the PyQt release, and then for the program's author to actually use the new feature.

How could this happen in any other order and still get proper testing?

It's more of a business thing. If all major distributions end up using the same version of gcc, the Linux kernel, etc. in their releases (which in turn ship in the same time frame), then hardware vendors, for example, will have an easier time supporting those releases with drivers.

It's not really about bringing new features of underlying libraries to users in a different or faster way; it's more about making Linux attractive to enterprises.