Judging by the harsh reaction to your first post, I think many of our colleagues have become a little too obsessed with the programming part of their profession. We should be focused on building better solutions. Programming is probably no more than 20% of the work in delivering a solution. Projects rarely fail because of technical issues.
As others have blogged, we should be spending more time becoming better communicators and solution builders (and encouraging others to do the same). Instead, it's easier to rehash the same old programming arguments.
I think the issue some take with the "everyone should learn to code" movement is that it seems to put coding at the center of the information age, while in reality it's just a small part of what goes into solving problems. It's also the part that's least likely to look the same in 20 years as it does today.

I didn't intend for Please Don't Learn to Code to be so controversial, but it seemed to strike a nerve. Apparently a significant percentage of readers stopped reading at the title. So I will open with my own story. I think you'll find it instructive. My mom once told me that the only reaso...

I'm shocked by the number of acquaintances who have had their email compromised in the last couple of years. Notably, none of the victims were computer-savvy people.
I think the audience that most needs two-factor authentication is the least likely to use it, and encouraging friends and family to use strong passwords would be a better, and more readily accepted, first step. A little education regarding safe computing would also go further and be less burdensome.
I've always wondered how these exploits are happening on such a mass scale. Weak passwords? Brute-force attacks? Keystroke-logging malware? Phishing? The similarity of the hacks I've seen leads me to believe they're all being done with the same automated tools. Why isn't this being covered more by the tech press? Understanding the attack vector would help us better defend against these hijackings.
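For what it's worth, the two-factor codes mentioned above aren't magic. Here's a minimal sketch of a time-based one-time password generator (TOTP, RFC 6238, the scheme apps like Google Authenticator implement) using only the Python standard library; the `totp` helper name and its parameters are my own:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, interval=30, digits=6, now=None):
    """Compute a time-based one-time password per RFC 6238 (HMAC-SHA1)."""
    # The shared secret is conventionally exchanged as a base32 string.
    key = base64.b32decode(secret_b32, casefold=True)
    # The moving factor is the number of whole intervals since the epoch.
    counter = int((time.time() if now is None else now) // interval)
    msg = struct.pack(">Q", counter)  # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    # Dynamic truncation (RFC 4226): pick 4 bytes at an offset taken
    # from the low nibble of the last digest byte.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

Against the published RFC 6238 test vectors (secret `12345678901234567890` in base32, time 59 seconds), this produces the expected `94287082` at 8 digits. The point is that the server and the phone share nothing but a secret and a clock, so there's nothing for an attacker to phish out of a one-time code after it expires.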

It's only a matter of time until your email gets hacked. Don't believe me? Just read this harrowing cautionary tale. When [my wife] came back to her desk, half an hour later, she couldn’t log into Gmail at all. By that time, I was up and looking at e‑mail, and we both quickly saw what the re...

Best wishes, Jeff. As you noted, too many in our field don't realize what's really important until it's too late. I don't think many people find themselves on their deathbed saying, "I wish I had worked more."

I am no longer a part of Stack Exchange. I still have much literal and figurative stock in the success of Stack Exchange, of course, but as of March 1st I will no longer be part of the day to day operations of the company, or the Stack Exchange sites, in any way. It's been almost exactly 4 ...

Wow, looks like the telco lobbyists are out in full force today.
More seriously, network neutrality is a more complicated subject than it first appears. Most people can't even define it properly.
I think a big reason for the Internet's success is the fact that network neutrality has been a de facto principle to date. We can lament the fact that some intervention may be needed to keep the Internet neutral, but I think it is critical to do so. However, any regulation should be measured and only target clear and present threats to neutrality.
The Internet has leveled the playing field: everyone can have an equal voice on the Internet. This clearly makes some of those in power uncomfortable. That's exactly why we need to preserve network neutrality.

Although I remain a huge admirer of Lawrence Lessig, I am ashamed to admit that I never fully understood the importance of net neutrality until last week. Mr. Lessig described network neutrality in these urgent terms in 2006: At the center of the debate is the most important public policy you...

Wow, I'm surprised by the amount of anger in response to this post. Sure, the quote "Algorithms are for people who don't know how to buy RAM" is provocative, and probably meant to be somewhat tongue-in-cheek, but I have to agree overall.
Most of us are already coding on top of many layers of abstraction, each of which comes at some cost in terms of efficiency. However, hardware is cheap and getting cheaper, and each layer helps us to be more productive as developers.
The overall response here really reinforces my opinion that, as an industry, we're often too focused on the geeky details, such as optimizing for minimal RAM usage, while so often failing to understand the problem domain and our customers' needs.

Are you familiar with this quote? 640K [of computer memory] ought to be enough for anybody. — Bill Gates It's amusing, but Bill Gates never actually said that: I've said some stupid things and some wrong things, but not that. No one involved in computers would ever say that a certain amou...