"When I started writing programs in the late 80s it was pretty primitive and required a lot of study and skill. I was a young kid doing this stuff; the adults at that time had it even worse, and some of them did start in the punch card era. This was back when programmers really had to earn their keep, and we newer generations are losing appreciation for that. A generation or two ago they may have been better coders than us. More importantly, they were better craftsmen, and we need to think about that." I'm no programmer, but I do understand that the current crop of programmers could learn a whole lot from older generations. I'm not going to burn my fingers on whether they were better programmers or not, but I do believe they had a far greater understanding of the actual workings of a computer. Does the average 'app developer' have any clue whatsoever about low-level code, let alone something like assembly?

The point I'm making, and my feeling about it, is that the tool is irrelevant: what you produce in relation to what is possible to produce is what matters.

I feel that last sentence is dead on, and it's what I was feeling throughout reading the article but couldn't put into words. The best I could do was an analogy:

Just because Darwin was "first" in describing evolution doesn't make him a better biologist than Rosalind Franklin, whose work helped uncover the structure of DNA, or Boyer and Cohen, who were the first to modify it in a living organism.

We think everyone in the past was great, but that's because all of the mediocre ones are lost to history; like all the kids who copied code from computer magazines and painstakingly, manually, "pasted" it into their BASIC machines. :-)