I guess I'm missing the point... IBM is all excited at a better way to use thermal grease to cool CPUs, so they can be "smaller and faster." Isn't the objective to make them use less power?

I just spent 12 years (on a ten week contract) working in a big data center, and power use was a constant issue. Intel and AMD are making the CPUs deliver more performance per watt, and I'm really dubious that better cooling is going to be a big deal.

I'm old enough to remember SOS (Silicon On Sapphire) and chips on diamond to get better cooling. I'm pretty sure the answer is not in more cooling, but in less power.