It has recently been a year since I started working on my pet OS project, and I often find myself looking back at what I have done, wondering what made things difficult in the beginning. One of my conclusions is that while there is a lot of documentation on OS development from a technical point of view, more should be written about the project management side of it: namely, how to go from a blurry "I want to code an OS" idea to either a precise vision of what you want to achieve, or the decision to stop following this path before you hit a wall. This article series aims to put those interested in hobby OS development on the right track, while keeping that aspect in mind.

I quite doubt that. There's been plenty of discussion of this, and the general consensus nowadays is that it's really, really hard to beat at least GCC's optimizations anymore. Though I admit I personally have never even tried.

"I quite doubt that. There's been plenty of discussion of this and the general consensus nowadays is that it's really, really hard to beat atleast GCC's optimizations anymore."

I feel you've quoted my phrase entirely out of context. I want to re-emphasize my point: C compilers are constrained by strict calling conventions, which force more shuffling of variables between registers and the stack than hand-written code would need.

I'm not blaming GCC or any other compiler for this; after all, calling conventions are essential for both static and dynamic linking. However, it can result in code that performs worse than the same routine written by hand.

As for your last sentence, isn't the consensus that GCC's output performs poorly compared to commercial compilers such as Intel's?

They're all from 2009 or 2010, and in all of them icc beats GCC by quite a large margin; not to mention icc is much faster at the actual compiling, too. Quite surprising. What could the reason be, then? Why does an open-source compiler fare so poorly against a commercial one?