
"The bigger injustice," Edwards writes, "is that programming has become an elite: a vocation requiring rare talents, grueling training, and total dedication. The way things are today if you want to be a programmer you had best be someone like me on the autism spectrum who has spent their entire life mastering vast realms of arcane knowledge — and enjoys it. Normal humans are effectively excluded from developing software. The real injustice of developer inequality is that it doesn't have to be this way." Edwards concludes with a call to action, "The web triumphalists love to talk about changing the world. Well if you really want to change the world, empower regular people to build web apps. Disrupt web programming! Who's with me?" Ed Finkler, who worries about his own future as a developer in The Developer's Dystopian Future, seconds that emotion. "I think about how I used to fill my time with coding," Finkler writes. "So much coding. I was willing to dive so deep into a library or framework or technology to learn it. My tolerance for learning curves grows smaller every day. New technologies, once exciting for the sake of newness, now seem like hassles. I'm less and less tolerant of hokey marketing filled with superlatives. I value stability and clarity."

"In the old days there was a respected profession of application programming. There was a minority of elite system programmers who built infrastructure and tools that empowered the majority of application programmers. Our goal was to allow regular people without extensive training to easily and quickly build useful software. This was the spirit of languages like COBOL, Visual Basic, and HyperCard. Elegant tools for a more civilized age. Before the dark times, before the web."

"The web is just an enormous stack of kluges upon hacks upon misbegotten designs. This Archaeology of Errors is no place for the application programmers of old: it takes a skilled programmer with years of experience just to build simple applications on today’s web. What a waste. Twenty years of expediency has led the web into a technical debt crisis."

I have seen plenty of these "tools" and they are worthless for anything complex. If you need to put a nail in, I can give you a hammer to make your job easier, but what happens when you need to put the nail someplace the hammer doesn't fit?

"Complex" is not for laymen. There is only so much that you can do with any "appliance". Beyond that, you actually have to know what you are doing. This "problem" has nothing to do with programming.

Once you get into "complex", you really do want something along the lines of a profession where people have to be licensed and can be held accountable for their failures. For the "complex" stuff, we should be striving MORE for something comparable to real engineering or medicine.

"Complex" is not for laymen. There is only so much that you can do with any "appliance". Beyond that, you actually have to know what you are doing. This "problem" has nothing to do with programming.

This. Thinking about the web apps I've written, most of them required fairly deep knowledge in the area of the app -- auroras, photography, specialized group management, history, genealogy, measuring instruments, Chinese, retail procedure -- all areas an interested party could potentially bring to the table.

How about instead of giving you a hammer, I give you a toolbox. That's what all of these 'tools' are, they're toolboxes. And unless you get training in the specific tools, you will probably get the job done eventually... poorly. A craftsman will know which tools to use and when to use them.

There is no difference in programming. Everyone can program these days. There are plenty of languages that are easily understood. However, when you can buy a toolbox at Home Depot for $300, everyone becomes a "craftsman".

Well, to expand on your analogy, when nail guns were new, they were huge, heavy, hard to operate, and required investment in hoses and, for most jobsites, an expensive gas-powered air compressor. The nails were also much more expensive as they required special rolls/loaders, and those input mechanisms were completely proprietary. Even today, the good nail guns that will last a long time are not cheap, the gas-powered air compressors are still expensive, and the nail rolls/loaders are often still proprietary. One can easily get $2,000 into a system right now just to hammer in nails.

By contrast, a hammer, ranging between $5 at Harbor Freight Tools to $80 for a top-of-the-line deluxe framing hammer forged from olympus steel and quenched in the tears of angels will drive in almost any conventional nail that one needs, and unless abused will probably last as long as the owner will.

I'm working with some web software at the moment. It's the kludgiest amalgamation of crap that I've seen in quite some time. It's got OS library dependencies, but they need to be newer than one stable distribution's version and older than another stable distribution's version, so one has to use unsigned third-party repositories for those. Then for Ruby on Rails and for Node.js it needs two other sets of proprietary repositories, and it needs specific versions of packages from those repositories too, not the defaults, and it installs some redundant packages that were already covered by the OS in slightly different versions. Then once you go to deploy it, it requires MySQL for some of the dependencies, but the main program itself only runs on PostgreSQL, so you're stuck with two DBs running, one doing almost nothing but required to be there.

This is sickening. This will make it almost impossible to do OS updates, and will cause all manner of problems if those third-party repositories ever go away, or if the developers for them stop maintaining those specific versions. It's dangerous and stupid to do this.

I'm working with some web software at the moment. It's the kludgiest amalgamation of crap that I've seen in quite some time.

It sounds like some poor decisions have led to that situation for you. Ruby and Node both have fairly flexible package management solutions that let you pin dependencies and provide private repos for your specific dependency versions when for some reason you can't use official ones.

However, one thing that has always bothered me is when we say "well we're using ruby xx.xx (or node xx.xx or php xx.xx or whatever) on our development machines, so we must install that version on production" and then the hoops taken to do that. It should be "production can run ruby xx.xx so that's what you have to develop against".

However, one thing that has always bothered me is when we say "well we're using ruby xx.xx (or node xx.xx or php xx.xx or whatever) on our development machines, so we must install that version on production" and then the hoops taken to do that. It should be "production can run ruby xx.xx so that's what you have to develop against".

I doubt that will ever be the case.

The main issue is, developers usually have a work backlog and those in charge have very little interest in what version everyone is running. If it already works on _a_ platform version, then chances are that the users will get better value out of the developers' time through adding another feature to the web app itself than from whatever benefits the upgrade or downgrade in platform version will bring.

You can try negotiating with the development team before the work commences, though.
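To make the "develop against what production runs" rule concrete, one common trick is a startup guard that refuses to run on an interpreter outside the range production supports. A minimal sketch in Python (the same idea exists as Bundler's `required_ruby_version` and npm's `engines` field); the version bounds here are invented for illustration:

```python
import sys

# Hypothetical bounds: whatever production actually supports.
SUPPORTED_MIN = (3, 8)
SUPPORTED_MAX = (3, 99)

def runtime_supported(version=None, lo=SUPPORTED_MIN, hi=SUPPORTED_MAX):
    """Return True if the (major, minor) interpreter version is in range."""
    if version is None:
        version = sys.version_info[:2]
    return lo <= tuple(version) <= hi

if __name__ == "__main__":
    # Fail loudly at startup instead of mysteriously later in production.
    print("runtime ok" if runtime_supported()
          else "refusing to start: interpreter outside production's range")
```

The point is simply that the constraint lives in one declared place, checked early, rather than being tribal knowledge about which box runs what.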

So, what you're saying in effect is that you might put in a large investment on the tool (nailgun=$, framework=$time) from which you're hoping to get a long useful life, and perhaps buy those tools from a reputable company (nailgun=DeWalt, Craftsman; framework=Google, Adobe) with the expectation that the tool won't be discontinued/EOL'd and parts/repos will remain available. The reality is that the nailgun/shiny IDE might not last as long as the older, simpler stuff (hammers are older than Neanderthals; vi is >30 yrs old, Eclipse is 10, WebStorm is 2? 3?). And company reputation is no guarantor of longevity [wikipedia.org].

However, if the DeWalt Model XJ-9 nailgun lasts 5 yrs, you can finish a helluva lot more roofs in that time than you could with a hammer. Perhaps then we should look at Angular, PhoneGap, and Node.js as specific models of nailguns from which we should extract as much 'juice' as we can in the 2-5 yrs they might be useful, and presume that we'll be using something else after that.

Unfortunately, the roof/nailgun analogy completely falls apart when you realize that if some of the shingles fall off after the XJ-9 has been discontinued you can still use a regular hammer to fix it; whereas if Angular 3 is EOL'd in 2017 then your PhoneGap app built on it might be left with some vulnerability (all geolocation requests are hacked to only report your current location as the nearest strip club) that Google is not going to fix (having sold off their money-losing software biz in 2016 to focus on crowd pacification robots).

And perhaps, instead of waking up every day wondering if today is the day the Yellowstone supervolcano [goo.gl] or a planet-killer comet wipes us all out, we should just dance (and code) while the sun shines and not worry so much about the future.

Bad example. The $10,000 hammer was because of the paperwork required to buy a single hammer for a high security project. Yes, it was extreme idiocy, but it WAS following the rules as specified, and the CIA wasn't involved.

If they'd been buying 100,000 hammers it would have made a lot more sense, and the increment in the cost wouldn't have been so absurd.

What's really sickening is that there was a project that carefully specified the particular alloys and heat treatment that the nuts and bolts were to have, paid for them, and the contractor supplied off-the-shelf nuts and bolts from a hardware store. This was determined after the cause of failure was found to be a split nut. The spec'd one wouldn't have failed. The cheap nut ended up costing a lot more than $10,000.

3D printers are a fundamental game-changing technology. What you're seeing now is basically proof-of-concept stuff, but the technology is developing rapidly and it won't be long before pretty much anything can be built from scratch on demand. 3D printing and associated CNC mills have the potential to revive American manufacturing in a localized, small-batch paradigm rather than wasteful mass production and national/international distribution chains.

No, he has a point. Back in the day, we had few tools and we learned how to use them.

Now, we have a tool for every hour of the week, and as soon as you've mastered one, someone comes along and says "your skills are sooo obsolete, you must learn now or fall behind", so you get to grips with it and start to master it, and then realise it's a pile of poop and hunt around for a new, cooler tech to use instead.

Software projects today are littered with the corpses of technologies that were, each in turn, the silver bullet that would make your life as a developer so much better, easier, and more productive. Constantly.

That's the problem - we're not productive, we spend all our time learning new crap that is little better than the ancient stuff we used to use and got stuff done with.

The tools, well, I know people who swear vim is easier to use than the latest IDE that has full intellisense and refactoring built in, and they are probably right - in that they have learned their craft using that tool and actually are more productive than the bloated and slow IDE could make them. The trouble is that newbies start with the IDE and don't know anything else, so they stay in the "it's easy" camp and never progress to real mastery of their art. Which is understandable when you need to re-skill every couple of years, but not beneficial to the software industry.

No, he has a point. Back in the day, we had few tools and we learned how to use them.

Now, we have a tool for every hour of the week, and as soon as you've mastered one, someone comes along and says "your skills are sooo obsolete, you must learn now or fall behind", so you get to grips with it and start to master it, and then realise it's a pile of poop and hunt around for a new, cooler tech to use instead.

Apologies, but we still have all those old tools. We just don't use them any longer, because you can't use Turbo Pascal to make web pages, but you can use jQuery. If you were working on the same problems today as you were working on 20 years ago, you probably would be using many of the same tools. The only reason you're using the new tools is that you'd rather spend 20 hours throwing something together versus 20 weeks writing it from scratch.

Honestly, if you think this is different than it was in the 90s and 80s, then you weren't paying attention in the 90s and 80s. The technical periodicals were FULL of the new stuff that was going to change everything. The only real difference is that it's easier to find stuff and get distracted these days, simply because the industry is much larger. I assume it was similar as you go back further, I just am not old enough to remember it first hand.

You are right, of course: it is similar to the '80s and '90s in that companies that wanted to steal the sales of other companies simply created newfangled languages and marketed the hell out of them instead of embracing what worked and adapting it to the new paradigms. The only reason you can't use Turbo Pascal to make web pages is that the compiler was never updated for the functionality, but it very well could have been. In fact its progeny Delphi [embarcadero.com] is alive and well and building apps for almost every popular platform.

The only reason you can't use Turbo Pascal to make web pages is that the compiler was never updated for the functionality, but it very well could have been.

The web is not a runtime environment.

But web servers are.

The reason you can't use TurboPascal is because web pages run in the browser virtual machine,

Web pages are served by a web server, and the OP is exactly correct: TP was not updated to function well in a web server environment, unlike things like Perl that have modules to deal with CGI.

Of course TP doesn't execute in a browser like the JavaScript that is common, and web browsers will never see a PascalScript. But that wasn't the claim you responded to. "Make web pages" isn't just "run scripts in a browser".

Web languages, on the other hand, are predominantly for programming code on a server to generate markup, which is then interpreted by the browser to render output.

Right. And there is no reason that TP couldn't generate that output, except that it didn't get updated to deal with CGI and you'd have to write your own library to do that. Or maybe someone has written one, I don't know. I don't care enough to look. I never programmed in it, I used TurboC.
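For what it's worth, the server-side bar for "making web pages" really is just emitting text: a CGI program prints a header block, a blank line, then markup, and the web server relays that to the browser. A minimal sketch in Python, standing in for any language that can print (Turbo Pascal included, had it been updated); the function name is made up:

```python
def cgi_response(title, body):
    """Build a minimal CGI response: one header, a blank line, then HTML."""
    html = ("<html><head><title>{}</title></head>"
            "<body><p>{}</p></body></html>").format(title, body)
    # The blank line (\r\n\r\n) separates headers from the document body.
    return "Content-Type: text/html\r\n\r\n" + html

if __name__ == "__main__":
    # A web server invoking this script via CGI would send stdout to the client.
    print(cgi_response("Hello", "Generated by an ordinary program."))
```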

Basically, if you are thinking your browser is a "platform", or you are thinking "the web" is "a platform" in the traditional programming sense, as the OP obviously is, then you are an idiot.

No, actually, he's quite right. It's a different method of programming, a different paradigm altogether. He didn't talk about programming the browser so that part of your statement is irrelevant, but as a design platform the web truly is different. At least before people tried to change a markup language into a full page layout and presentation language.

The tools, well I know people who swear vim is easier to use than the latest IDE that has full intellisense and refactoring builtin, and they are probably right - in that they have learned their craft using that tool and actually are more productive than the bloated and slow IDE could make them.

I would add that very little of my programming time is spent writing code, which is what an IDE is most helpful with; refactoring, code skeletons, reminding you of the order of args, etc. Most of the time I spend programming -- at least on anything that I expect will have a long service life -- is spent thinking through the right way for the code to work so it will be clear, fast, easy on memory, and work in a way that makes sense when we apply it in a different context. There is no IDE or language that can help with that part of the problem.

Tools are simpler and easier to use, and yet a master can make those tools do things your average user cannot. And that makes a world of difference. A spreadsheet "can" be used as a simple database, but actual real databases are more complex and can do things a spreadsheet simply cannot handle. Is the difference "database" or something else? Is it the tool or something else?

The problem isn't where the author thinks it is, masters always make their work look easy, but it takes skill and talent both. Skill is practice, talent is skill with artistry.

Tools are simpler and easier to use than ever, and this guy is mistaking nostalgia and innocence for actual difference.

When websites like TopCoder are offering $100-200 bounties for something as simple as changing how a table is sorted, it really shows how it's anything but easier than ever. It's just layer after layer of needlessly complex code, and we don't realize how inefficient, poorly designed/coded, and horrible it really is because the speed of the hardware masks it.

Working with a framework and some spreadsheet code, things just didn't "feel" right to me; it felt sluggish. No one else noticed it; they said it was fine, I was imagining things. It took me almost a full day of digging to discover that a single 37-character line of code was slowing everything down - no one bothered to do any efficiency testing on it because it was a "low-level function". Changing it sped up the application a thousandfold. Had the code been well designed and not set on mounds of anonymous functions, hacks, and bad practices, it probably would have taken about 20 minutes to locate and fix.

Just because something can accomplish the tasks you need it to simply/easily doesn't mean it's well designed, simple, or easy from a coding perspective.
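The "one short line slowing everything down" failure mode is easy to reproduce. A contrived Python sketch (not the poster's actual code): both functions return the same string, but the first hides conceptually quadratic work in a single innocuous line, which a profiler such as `cProfile` would flag in minutes rather than a day:

```python
def join_records_slow(records):
    # The innocuous `out += ...` rebuilds the whole string on every pass:
    # quadratic work hiding in one short line.
    out = ""
    for r in records:
        out += r + ","
    return out.rstrip(",")

def join_records_fast(records):
    # Same result, one allocation.
    return ",".join(records)

if __name__ == "__main__":
    data = ["row%d" % i for i in range(1000)]
    assert join_records_slow(data) == join_records_fast(data)
```

Nothing about the slow version looks wrong in review, which is exactly why untested "low-level functions" deserve profiling.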

The Internet was done so well that most people think of it as a natural resource like the Pacific Ocean, rather than something that was man-made. When was the last time a technology with a scale like that was so error-free? The Web, in comparison, is a joke. The Web was done by amateurs.

"Application Programming" is today done in things like Excel spreadsheets. You don't need to write a COBOL app to keep track of interest payments and such. I'd argue that computers are more accessible than ever, and thanks to Google, routine coding often becomes this exercise in searching for already-solved problems and applying the solutions to your similar problem.

"The web is just an enormous stack of kluges upon hacks upon misbegotten designs. This Archaeology of Errors is no place for the application programmers of old: it takes a skilled programmer with years of experience just to build simple applications on today's web. What a waste. Twenty years of expediency has led the web into a technical debt crisis."

I know, right? We had it so much easier back when we could just write our own interrupt handler (and pray we didn't step on DRAM refresh or vice-versa) to pull bytes directly off the 8250 - And once we had those bytes, mwa-hahaha! We could write our own TCP stack and get the actual data the sender intended, and then do... something... with it that fit on a 40x25 monochrome text screen (yeah, I started late in the game, those bastards working with punchcards spoiled all the really easy stuff for me!).

And now look where we've gone: Anyone using just about any major platform today can fire up a text editor and write a complete moderately sophisticated web app in under an hour. Those poor, poor bastards. I don't know how I can sleep at night, knowing what my brethren have done to the poor wannabe-coders of today. Say, do I hear violins?

If you don't want to get left behind the fads, don't choose an area that's all about fads.

Any kernel developer will currently be using basically the same toolset as they used in 1980. Any driver developer will currently be using basically the same toolset as they used in 1980. Any game developer will currently be using basically the same toolset as they used in 2000.

Not everyone jumps on a new shiny framework every 2 years because they're struggling to overcome the limitations of a crappily designed language like JavaScript. If you don't want to jump from fad to fad... just don't be a web dev.

In 1980, a kernel or driver developer was entering data into a mainframe using punchcards in binary (or if they were lucky, an assembler was available for the architecture they were targeting). Version control consisted of a row of 7 cabinets, one for each day of the week, where you stored your most recent stacks of punchcards. They most certainly weren't using vim/emacs, gcc and git and debugging in a VM.

I don't think it's fair. A modern web application is expected to do a whole heck of a lot more than COBOL as originally designed ever envisioned. You can still bang out a simple shell script or procedural program in Ruby today without knowing much of anything, but we just don't consider those things 'applications' anymore.

Hell, COBOL (proper) isn't really even interactive; it reads in records and writes out some other records. You needed something like CICS to do much of anything interactive, and guess what, it's not so easy to use or understand anymore once you go there.

Let's not even talk about the job control stuff needed to get your program running in the first place; normal people were never expected to handle that. It was the job of the OPERATOR, who HAD EXTENSIVE TRAINING, to do that.

So really, it's just not true.

Applications are more complicated to build today fundamentally because they are more complex in terms of what they do. Could it be simplified? Yes: we could fix lots of the technical kludges by replacing HTTP and other web technologies with some truly stateful application delivery protocol and languages plus libraries, but while it would be simpler, it would not be simple.

His view of the past is skewed; things were never really available to regular people. There were always specialized professionals in the background handling the details, except for a brief period in the late '80s and early '90s during the height of the PC revolution. Those machines, though, were a great leap backward in terms of their limitations compared to the mainframe, and in leaving those limitations (like single-user) behind, we have put all the complexity back in.

I wonder if the writer has ever seen the monstrosities programmed in BASIC/VB, COBOL or HyperCard by the resident business manager. People in general have no clue about programming or mathematics. People in general, don't go for higher education. People in general have an IQ of about 100. People in general can't work with a computer when the outline of things changes or the buttons move around. And you want those people to program a math equation that requires 2 years of college math... and they need to place the buttons themselves?

Hell, take things "programmed" in Excel for that matter. I've seen people use 3 columns to do things which could've been written in 1 operation, especially when it comes to adding percentages to a value (they'll calculate 4%, then add its outcome to the source value to get +4%, and then hide the other 2 columns instead of just doing 104%). That will take them 2 hours to complete.
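The spreadsheet anti-pattern translates directly into code: compute 4%, add it back, hide the intermediates, versus taking 104% in one operation. A trivial sketch:

```python
def add_four_percent_three_steps(value):
    # The hidden-columns route: compute the increment, then add it back.
    increment = value * 0.04
    return value + increment

def add_four_percent_one_step(value):
    # The one-operation route: take 104% of the value directly.
    return value * 1.04

if __name__ == "__main__":
    # Both routes agree (up to floating-point rounding).
    print(add_four_percent_one_step(250.0))
```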

The Web is fine. Plenty of people understand HTML, even without much education. People UNDERSTAND that things within a document need to be described at some point. Plenty of people can even understand basic JavaScript, even without much education.

The reason the web, and most of programming in general, is so kludgy and broken in many places is because we've let those people that understand HTML and basic JavaScript make websites and entire applications. We have told business managers that they can describe their business in a common and easily understood language, and the business managers did describe their business, but then they got in way over their heads, to where they themselves can't even understand what they've done. And then those business managers moved on, started claiming they had programming experience, and went to another company to make ever bigger monstrosities. And REAL programmers get a bad name, because programming these days is "so easy, anyone can do it".

"In the old days there was a respected profession of application programming. There was a minority of elite system programmers who built infrastructure and tools that empowered the majority of application programmers."

I think this is more of a deluded statement than anything.
In the old days you typically had to have an Electrical Engineering degree to do programming - at a time when having a college degree was not the norm.
This only filtered out of that circle as geeks took interest before college and tools became easier and costs were greatly reduced.
The point: programming has always been done by a small group - the "elite" - at any time in the history of computer systems.

"Our goal was to allow regular people without extensive training to easily and quickly build useful software. This was the spirit of languages like COBOL, Visual Basic, and HyperCard. Elegant tools for a more civilized age. Before the dark times, before the web."

Again, progress has certainly occurred towards this, but the fact of the matter is that most people are not interested in being creative the way programming requires you to be. They'll be happy to play around with HyperCard or Excel long enough to get some basic thing done, but they'll be at least equally happy to pass it on to someone else so they can focus on their actual job instead of trying to figure out how to make a fancy little graph.

"The web is just an enormous stack of kluges upon hacks upon misbegotten designs. This Archaeology of Errors is no place for the application programmers of old: it takes a skilled programmer with years of experience just to build simple applications on today’s web. What a waste. Twenty years of expediency has led the web into a technical debt crisis."

Many of those things are there because people not skilled enough were making the decisions: not understanding what was there and trying to fix it, only to realize later, when they did understand it better, that they had royally screwed it up.

That may be true, but you miss the deeper underlying issue that TFA (the friendly article) is whining about.

They want to be able to be a programming superstar by reading a book such as:
* Learn Programming in 24 Hours!
* Learn Brain Surgery in 24 Hours!
* Learn Rocket Science in 24 Hours!
* Learn To Be A Concert Pianist in 10 EASY Lessons!

Various programming boards are flooded with people who want to know how to break into programming for big bucks, quick, overnight, but don't want to actually do the hard learning [norvig.com].

Whining about how hard the tools are to use and how, if only the tools could be made as simple as a hammer then everyone could program, is as naive as suggesting that if word processors were as simple as pencils, anyone could write poetry.

What these utopian visions of programmatic democracy all lack is any notion that attacks the essential complexity of the problems being solved by code. Problems that have, if anything, grown more complex with increasing memory and CPU power. All the forays into "graphical programming" or other tools to take the programminess out of programming have shown that it doesn't matter whether you're expressing a solution in text or little icons connected by arrows - the essential complexity of the problem remains. The only way we're going to democratize programming is if AI gets to the point where the thoughtwork of breaking down the essential complexity of problems can be offloaded to some other intelligence.

Is now a good time to admit that I learnt BASIC programming from a book titled "Teach Yourself BASIC in 8 Hours" or something very much like that? To be fair, it was sometime around 1980 and you really could learn BASIC in that amount of time, with change left over to learn fun memory locations to PEEK and POKE. I was writing my own games before I made it to the end of the book.

You can learn Lua in under 8 hours, but really, that's just the syntax. The hard part of learning any language is learning its libraries.

And I was writing a few BASIC programs shortly thereafter. But they are today what I would call TRIVIAL: things that I would do in a single method of a modern language, with much better style, correctness, comprehensibility, and maintainability.

Having just learned programming myself doesn't mean I was by any means an expert ready to work on big commercial problems worth lots of money. It took years more to learn a lot of important things.

I wonder if anyone in the architecture profession has ever proclaimed "Well if you really want to change the world, empower regular people to build skyscrapers." Probably not. And yet the programming profession seems to be constantly obsessed with making the field accessible to everybody and her sister, as if programming should be something any idiot off the street can do easily.

I wonder if anyone in the architecture profession has ever proclaimed "Well if you really want to change the world, empower regular people to build skyscrapers." Probably not. And yet the programming profession seems to be constantly obsessed with making the field accessible to everybody and her sister, as if programming should be something any idiot off the street can do easily.

I think you're mixing your metaphors somewhat.

There's a difference between learning to build a basic house and learning to build a skyscraper.

It IS accessible. Every copy of Windows since 2006 has included PowerShell, which is one of the easiest-to-learn things you will ever come across, and it can handle 99% of the tasks your average non-programmer user will ever want to do, from simple GUIs with scripted events, to Excel automation, to bulk administrative work. There's even an IDE for it built right into Windows.

I'm not an OS X guy, but I understand things are pretty similar over there with whatever OS X uses (AppleScript?), and I'm pretty sure most Linux distros come with Perl or Python (if not, they're a one-liner away).

If you're not finding those scripting languages accessible enough, you don't care enough about the project you want to do. Alternatively, maybe some people just don't naturally have a gift for the type of thought process required by programming -- and I don't think that needs to be a "problem".

Some people in life find an "unfair advantage". This is very evident in professional athletes. They must start with natural athletic ability and then hone that through practice and training. And then a select few get paid huge dollars to essentially play a game.

People with natural problem solving and logic skills also have an "unfair advantage". It doesn't generate the quick wealth of the professional athlete but can lead to a promising professional career path. It still takes practice and learning to really take advantage of these skills much like the professional athlete learning their sport.

I will not apologize for taking advantage of my abilities any more than a professional athlete will give back the money they earned playing a game.

I say the professional athlete is luckier than you are. There are hundreds of thousands of kids every year working as hard as they can to become professional athletes, and that hard work combines with two big patches of luck - good genetics and the fortune to avoid a career-ending injury - to make success. The ones who get hurt can't do it, no amount of hard work offsets poor genetics, and the pool of available paid athlete positions is relatively small.

In our field, average talent, or at most slightly above average talent, plus a lot of hard work is all you need to succeed. You don't need to be born a genius; average intelligence and a willingness to learn is sufficient. And there is a huge pool of open positions, plus the possibility of creating your own niche. The only thing "elite" about most of us is that we learned not to be lazy, and in the modern world that appears to be less common than it was a century earlier.

Those are jobs that involve a vanishingly small percentage of the general population. Programming is not. I couldn't stop laughing after reading this gem - "programming has become an elite: a vocation requiring rare talents, grueling training, and total dedication."

Does this egotistical idiot actually believe that?

Programming is not something that requires grueling training or rare talents. Algebraic topology, cardiothoracic surgery, and competitive chess require those. If you're writing code that requires elite skills, you're doing it wrong - no one is going to be able to understand it, and you will never be able to troubleshoot it. Someone with an IQ of 100 can become a perfectly competent Java or C++ programmer with two years of intensive training. Programming is more akin to a trade skill like plumbing or electrical work than it is to engineering. And before everyone gets on my case that being a top-1% programmer is incredibly difficult: the same holds for a top-1% electrician.

As far as amateurs, the barrier to entry for programming is far less than for working with electricity. Which requires more training - writing an Apple Store app, or safely changing out the breaker box in your basement?

Changing out your breaker box. Hands down. There is no arcana, English as a language, no IDE, no security, no graphics to work out, no logic. You almost can't buy the wrong materials, and if you actually ask the guy at the store you won't and after that it is pretty much tab A, slot B, kin

The idea is to find your niche in life and exploit it. Not call the whaaambulance.

Sure. But lots of people participate in sports, just not at the Olympic level. Lots of people play Jeopardy, play the piano, dance, and vamp for photos, to the betterment of their own lives and for the entertainment of both themselves and others. How many people are 'casual programmers' in the sense that they can do a little bit of programming to enrich their own lives and those of others in their immediate circle?

I see this as being more about moving away from excessive specialization and exclusiveness, ra

And when I see words like "injustice" and "excluded" I see a typical liberal who views skill, talent, dedication, and mastery as bad things. Misapplying words like this, in ways exactly like this, cheapens real injustice and real exclusion. Normal humans are excluded from exceptionalism not because of some "injustice" but because that is what makes the exceptional so great (mostly hard work and dedication).

How about, instead of deriding the exceptional among us, we inspire others to exceptionalism?

I think he just doesn't see the world of 'regular' programmers. Has he heard of things like SAP or PeopleSoft or SharePoint?

All of these let pretty regular people write applications and web applications.

Next comes the point you make, which I will just reiterate: programming is a skilled job. I taught high school computer science. I don't know how long it's been since you were in high school, but most students can't even understand assigning a variable properly. If they can't

I think they're right about the problem, but wrong about the solution. They think the solution is to make it easier, but that's just not practical. Most people can drive, few can design engines. Most people can learn to use a blood pressure cuff, few create them. Most people can learn to use a spreadsheet, few know how to create one. Learning how to write software that's more than just user interface tweaks on something that somebody else built is inherently difficult.

But the real problem is this impression that you have to be born 80% as smart as Einstein to get into this field, and that the learning curve is impossible for regular people. That's totally wrong. Average intelligence plus persistence is all you need. You won't be Linus Torvalds tomorrow, you won't be Steve Wozniak next month. But put your time in, try things out, get used to being frustrated as you learn and keep learning anyway, and in a few years you'll understand what's going on and be able to do anything this side of the most advanced work as well as anyone.

That's the lesson we the progressives should be teaching people. And to be clear, it fits all of my original examples too. Few people walk into an automotive engineering program and instantly grasp all of the concepts involved - years of persistence matter more than raw talent if you want to design engines. Few people start building medical equipment and have an instant knack for getting it right - years of persistence matter more than raw talent again. If you were born with an 80 IQ, sorry there's only so far you can go. But the difference between a person with 110 IQ that contributes code to the Linux kernel and one that works at a gas station is their persistence, not raw intellect.

But the real problem is this impression that you have to be born 80% as smart as Einstein to get into this field, and that the learning curve is impossible for regular people. That's totally wrong. Average intelligence plus persistence is all you need.

What you really need is to deal with this anti-intellectualism that's so popular in the culture today, and replace it with genuine curiosity, a joy of discovery, and a delight at learning new things.

Do that, and the rest will naturally follow, and not just in software development.

Actually, yeah... I'm not sure if there's a STRONG link to being on the spectrum (not my area of experience, though I might even find myself on some mild end of it myself), BUT, these kind of things do require a very unusual amount of dedication to learning a thing... most good developers I know have been hacking away since they were between 5 and 10 years old, dedicating their entire lives to it. That is not normal, as defined by numerical analysis of the larger population. The part I have difficulty wit

Generally when people say autistic, they don't mean a mildly afflicted, high functioning person, but someone who never speaks, rocks in a corner, and screams if their normal routine is changed. You DON'T have to be autistic, or be anywhere on the autistic spectrum to be a great programmer.

Like becoming good, even excellent, at anything it requires hard work, dedication, and practice. Any normal person can do it. Programming, and I've been doing it since an assembler was a real cool tool, can be mastered by normal people. Sure, I've seen a few odd balls in the field, but no more so than in other fields.

As far as making programming easy for the masses: that is fine for little toy systems, but if you want a large system built, you want properly trained professionals working on it.

just about every field above burger flipping requires specialization. Are you going to ask Joe Blow about your corporate tax accounting needs? Or are you just going to drop by Intel and see if you can lend a hand with some microcode? Work is becoming increasingly specialized across all fronts, time to get used to it.

After 15 years at this I have met exactly ONE person who is both good at developing AND socially clueless. Every other socially clueless person I've met sucks at this. So sick of this stereotype... does anyone else look at The Big Bang Theory and think "wtf, WHEN are we gonna be done with this shit?"

I can't even imagine getting up at 7:00am every day, spending an hour putting myself together, running some shit through my hair, putting on a stuffy and uncomfortable suit and tie... only to walk in to some godawful nothing of a job where I'm expected to spend all day not only performing menial, make-work, soul-crushing bullshit work that could be done by robots, but navigate the minefield of social nuance. After all that, I'd be expected to piss away my evening on some social gathering to talk about some meaningless shit? Fuck all that noise.

I figured out a while ago that "normal" people manage to fulfill themselves in soul-sucking non-jobs by feeding their social needs throughout the day. I have little-to-no social needs and such work would leave me completely empty to the point of contemplating self-harm.

I love that I can roll out of bed at 9:00am, make my coffee and jump straight into work in my pyjamas, do my shit-shave-shower routine an hour later, grab another cup 'o joe and code till noon, have lunch, have my workout, then take my laptop out to the deck if it's a nice day and hack away until supper... all while not interacting with a single person, save for email and the occasional phone call. Maybe I'll have the energy for some light social interaction in the evening, but that's all I need for a week.

You have to be REALLY smart and good at pattern recognition and logic to be a programmer. And I mean extremely, unnaturally good. I completely disagree with the years of dedication and research, as I wrote an entire software suite that was pretty much flawless right out of college. Experience and training is not very important as long as you know how to write good code that's efficient and makes sense to others. The biggest determining factor is how smart you are. That's just how it is. I'm not a famous singer because I suck at singing and I'm not a famous artist because I suck at all forms of art. You don't see me writing a whiny article about it.

Experience and training is not very important as long as you know how to write good code that's efficient and makes sense to others.

And how did you learn to write good code that's efficient and makes sense to others? Maybe you're the rare case of a person who can just intuit what is good code and what isn't, but I think most developers (including myself) learn how to write good code by first writing lots of bad code, and then suffering the consequences until they learn from experience what works and what doesn't.

Has this guy seen "normal people" use a computer? There are some people so uninterested in the thing (even when it is their primary work tool) that they can't be bothered to learn such simple stuff as mouse dragging or keyboard shortcuts.

Hell, I've seen people use spreadsheet software for 10 years without learning how to use formulas. Don't even try to show them what all that HTML gibberish is.

And spreadsheet software is a pretty good introduction tool for programming.
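The spreadsheet point holds up: a formula is already a tiny program. As a hedged sketch, here is what `=SUM(B2:B4)` amounts to under the hood, written in Python (the cell names and values are invented for the example):

```python
# A spreadsheet formula like =SUM(B2:B4) is a small program:
# look up each referenced cell, then reduce the values to one number.
# The cell contents below are made up for illustration.
cells = {"B2": 10.0, "B3": 2.5, "B4": 7.5}

def spreadsheet_sum(cells, refs):
    """Evaluate the equivalent of =SUM(...) over the named cells."""
    return sum(cells[r] for r in refs)

total = spreadsheet_sum(cells, ["B2", "B3", "B4"])  # 20.0
```

Someone comfortable writing that formula has already grasped variables (cells), functions (SUM), and references -- the gap to "real" programming is smaller than they think.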

I present Exhibit A: the army of skinny-jean, unshaven "brogrammers" who use end-to-end, non-scalable, non-portable, all-in-one blackbox frameworks like AngularJS and a handful of selected NPMs or Gems commonly used among 90% or more of the existing Rails or Node.js based sites, while writing flat MongoDB collections because they totally don't get NoSQL but love to use it because it's the new hotness, and who refer to themselves as "elite hackers" while fist-bumping and drinking beer at their SOMA office in SF.

I flew to SF and interviewed at one of those companies. I interviewed with about 7 people. All of them were idiots, all of them were under 25, and all of them thought they were masters of the universe (well, the ruby-on-rails universe anyway). It was an eerie bizarro world experience I will never forget.

But "New technologies, once exciting for the sake of newness"? That phenomenon is known as being young and stupid. The new technologies are exciting because of the additional capabilities they give us. If you thought the technology was exciting just because it was new, then you've been misguided the entire time, and marketers must have loved you. They could slap "NEW" onto an old product and generate some more sales. A fresh coat of paint and it's a top seller again.

Ignore the paint. Cut through the bullshit. Does the new thing work better? If so, it's worth learning.

Or hey, you can stick to the stable and clear COBOL platform that you know so well. Since all your peers are dying off you can charge an arm and a leg for being a master at it. Hopefully you didn't gamble your decades on something like RPGII.

Seriously, does everyone think programming is a spatial relationships problem or something?

Let's put this on the table right now: Normal humans can build houses. Oh, you might not have any construction knowledge, and you'll build a horrendous little shitheap that falls over when the wind blows, but that's not the point. I can put construction knowledge in your head and, in a few months, you'll be able to properly select foundation for a site, properly frame a house, and properly build out the sheathing and siding and insulation and walls. You won't be a master craftsman, but you'll be able to do it right.

Humans are good with spatial things. Humans can look at a two-by-four and understand what a two-by-four is. The engineering concepts behind building a workable shed are a little different, but easily transferred. Given a little time and guidance, a human can learn to relate building materials spatially, measuring and cutting and nailing or screwing or gluing as needed, planning and building a proper structure.

Humans are terrible at numbers and algorithms.

Humans are so terrible at numbers and algorithms that they become *extremely* proficient at math if you teach them with a soroban--a machine that converts numerical problems into spatial procedures--and can't be taught algorithms without visual diagrams of trees and boxes and other shit to show sorting and transformation algorithms. Have you ever looked at textbooks or Wikipedia pages for stuff like PKI, red-black trees, or AES encryption? There's pictures of the simplest shit! Why? Because HUMANS CAN'T PROCESS ALGORITHMS!

The easiest process for a human programmer implementing an algorithm like a quick sort is to associate variables with objects in the visual diagram, associate their state changes with the movements in the visual diagram, and write code that carries out the analogous behavior. By comparison, BUBBLE SORT IS FUCKING HARD TO IMPLEMENT when your only guidance is: "iterate through each list element. Compare each element to the previous. If the previous element is larger, swap them." You actually have to think about how to do the comparison (greater than, less than? Wait, which am I comparing to which?), and how to swap them--usually with a temporary variable, although "A ^= B; B ^= A; A ^= B;" works. Most people visualize some kind of diagram while trying to understand the algorithm.
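For what it's worth, here is the bubble sort written from exactly that verbal guidance -- a minimal Python sketch, with the comparison direction and the swap spelled out, since those are the two spots the comment says people stumble on:

```python
def bubble_sort(items):
    """Sort a list in place by repeatedly swapping adjacent out-of-order pairs."""
    n = len(items)
    for _ in range(n - 1):      # n-1 passes is enough to bubble every element home
        for i in range(1, n):   # "iterate through each list element"
            # "Compare each element to the previous. If the previous is larger, swap."
            if items[i - 1] > items[i]:
                # Tuple assignment swaps without a temporary variable
                # (the XOR trick from the comment also works, but only on integers).
                items[i - 1], items[i] = items[i], items[i - 1]
    return items
```

Even in this short form, notice how much of the code is the spatial metaphor made explicit: "previous", "larger", "swap" all had to be pinned down to concrete index arithmetic before anything would run.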

The real world requires interaction with space, mainly to avoid hungry tigers, kill tasty deer, and avoid driving your car into trees like you're fucking drunk. It doesn't involve shift accumulator left and XOR with memory at address $FC. It doesn't involve explicit semaphore locking and deadlocks if you fail to unlock the semaphore in a loop with multiple function calls and thread branching during the loop. It requires things you can put your fist through if they don't work right, and then continue with successfully.

I've said that for years. You, however, seem to hold that against those with the rare gift and dedication to code. Kinda missing the point, dude.

a vocation requiring rare talents, grueling training, and total dedication. The way things are today if you want to be a programmer you had best be someone like me on the autism spectrum who has spent their entire life mastering vast realms of arcane knowledge — and enjoys it

Yes, yes, yes, kinda, yes, and yes. Again - Your point? You've described exactly why normal humans will never succeed as devs, and to a degree, why many devs tend to look down on those who can't even figure out Excel.

And you call that "injustice"? I have a rare combination of qualities that let me do seemingly amazing things with computers, and in return, I make a decent (but by no means incredible) salary. You want injustice? Some of those same morons who can't even figure out Excel (much less write their own override CSS) make millions of dollars per year telling me they want my latest app to use a different font color. Another group of those morons make millions of dollars per year because they can whack a ball with a stick better than I can. Yet another group of morons make millions of dollars per year doing absolutely nothing because Granddad worked a town of white trash (sometimes literally) to death.

And yet you would call me out for busting my ass to turn my one natural skill into a modestly decent living?

I think the difference between this and other professions is we are constantly stacking up abstractions and making development more accessible to less skilled people. I didn't read the whole article (it seemed pointless), but I'm going to guess he's advocating a drag-and-drop IDE for web sites or something equally stupid.

"The bigger injustice," Edwards writes, "is that being a doctor or lawyer has become an elite: a vocation requiring rare talents, grueling training, and total dedication. The way things are today if you want to be a lawyer or doctor you had best be someone like me on the autism spectrum who has spent their entire life mastering vast realms of arcane knowledge — and enjoys it. Normal humans are effectively excluded from performing surgery or arguing cases before a judge. The real injustice of legal or medical inequality is that it doesn't have to be this way." Edwards concludes with a call to action, "The web triumphalists love to talk about changing the world. Well if you really want to change the world, empower regular people to perform open heart surgery and argue cases before the supreme court. Disrupt specialist knowledge and training! Who's with me?"

The real injustice is that I'll never be able to fill that spot on my bucket list under "Perform open heart surgery in front of a judge while vigorously arguing a case on behalf of the guy who is having his heart operated on."

Things are wrong if a group of people are excluded from something by others for no particular reason, or a frivolous one such as sex, religion, skin colour... However, we are not equal in achievement. I will never be a swimming great -- the young lads at the pool power past me -- but I could probably write a better C program or shell script than they could. Then again, if they were willing to put in many years of work, they might manage that as well.

Life is not fair, different people have different abilities & achievements. What is important is that society provides equality of opportunity; it is up to the individual to exercise that opportunity based on the time that they are willing to put in and their innate abilities.

The article was called "The Real Software Crisis" (BYTE, January 1996); you can read the original text here [brucefwebster.com]. (BYTE's archives are no longer online.) I wrote a more extended discussion on the subject back in 2008; you can read it here [brucefwebster.com]. One might as well write that "normal humans are effectively excluded from composing and performing music"; if you've ever known a music major in college, you'll know just how true that is (I believe Music to be a harder major than Computer Science, having dated a Music major while getting my own degree in CS). ..bruce..

From TFA (the friendly article, or whatever other F-word you prefer):

> In the old days there was a respected profession of application programming.
> There was a minority of elite system programmers who built infrastructure and tools
> that empowered the majority of application programmers.

I think it is still that way. But now there is a third class who think that breaking into application programming is some kind of godlike elite skill because it requires you to actually know more

Yeah, why can't we empower most human beings to be programmers? Hey, why not empower them to be hedge fund managers or rocket scientists? If Joe Sixpack wakes up one day and feels like picking up a scalpel and performing a simple appendectomy, why shouldn't he be able to do it? Why are we stopping him?

Even with training, most people could not paint a simple landscape or compose new music or even come up with an original joke. So why should everyone be "empowered" to be programmers? Who is stopping them anyway? Heck, we don't even have the equivalent of the AMA that can sue people for programming without a license. In fact, that rant would have more validity against the legally chartered professional organizations that have a monopoly on issuing credentials and stopping people from practicing law, medicine, accounting, etc. without a license.

A normal person is a person who's good at some things and bad at others. 99.99% of programmers, including myself, are normal people who are good at the things required to be a programmer and bad at others (like social things, perhaps?).

TFA is some self-righteous bullshit. Imagine if a garbage collector wrote a blog about "the insufferable inequality in his profession because it takes somebody with rare talents such as muscle-power and the ability to withstand excruciating smells, excluding all the normal peo

I got my first computer in 1986; I was 13, and it was a ZX Spectrum with a built-in BASIC interpreter. When you switched it on, you could start programming right away. In fact, the computer came with a little book with programming examples and little games. I spent countless hours typing in listings that I found in newspapers. Even to load a simple game you had to enter a command.

Since then, I learned C, tcsh, C++, bash, Perl, much later also Python and R. It was a step by step process, and I would never have started it (and became what I am now, that is, computational biologist) if not for this one computer with the BASIC interpreter.

I have kids now, and they have Android tablets. Their sheer power and capabilities are overwhelming. I don't know how many instances of a ZX Spectrum emulator I could run on one of these -- a thousand?

But even though they run on a system that is related to the system I am using every day, I would not know how to write a program for them to save my life. In theory, I know how I would approach it; I even set up an Eclipse environment once, but I never got as far as a Hello World program. If I were 13, I would not even know that I could write a program myself.

It is amazing, but I think my kids will actually have a much harder time learning programming than I had, and they will get much less fun in return...

I would gladly have learned how to build web pages back in the 1980s when I went to school. Back then, it was the Apple ][ computer and Logo programming. Since I didn't have an Apple ][ computer at home (I came from a "poor" family because we didn't have cable TV to watch MTV), and there were no open lab hours at school, I flunked that programming class. Despite everyone telling me to get into computers, I wanted nothing to do with computers after that incident.
Building web pages was what got me interested in pr

In most of my jobs, HR hires normal humans. The vast majority of them don't particularly enjoy programming. Most of them got into it because they heard it was a good salary. Some of them are pretty good at it, some of them aren't. Maybe 5% of the programmers I've met will go home and write more code because they enjoy it and have their own projects they want to do. Seems to me that with a small bit of training, a normal human CAN do programming and do it reasonably well if they put their mind to it.

They also seem to have an above-average chance to push management to jump on some new framework bandwagon because they think that will solve all their problems. To be a really good programmer, you have to know how to program, understand the processes that you're automating with your code and realize that no silver bullet will allow you to NOT understand the processes that you're automating. If you don't understand what you're trying to do, you're not going to do it very well.

We live in the golden age of low-barrier-to-entry programming. I'm 31 (upper bounds of millennial). When I started, Java EE in its earlier stages or .NET were the only choices outside of C/C++ that a typical graduate could get. Now you have Node, Python, Ruby, PHP, Groovy, and all sorts of easy-to-use languages. FFS, JavaScript is now a serious career choice, where it was considered a skill no serious developer needed when I was in college (2001-2005).

"Normal humans are effectively excluded from developing software. The real injustice of developer inequality is that it doesn't have to be this way."

Yeah, it kinda does. Face it, computers are the most complex machines ever designed and implemented by mankind. There is no way to make them much simpler without losing functionality and breaking a lot of things we take for granted.

I'm excluded from practicing law and medicine.... OH THE INJUSTICE. I should be able to take a 2 week course and read some pictu

...with the Finkler part at the end. I've gotten to the same point, where I'm coming across "frameworks" that are supposed to be the be-all end-all of everything you could possibly want to develop on to make your enterprise applications. They are designed so generically, and so configurably, that they become useless, and you waste much more time trying to find the right combination of configuration to make things actually work, since they had to duct-tape 30 different other kinds of frameworks together into their fr

And that is even worse. While I can learn software development on my own. Make good money if I find someone who believes I am good enough.... I cannot become an autodidact brain surgeon. Heck, I am not even allowed to pull a simple tooth without a proper license. If that is not a real scandal....

Follow any one stack of learning, "the Ruby way" or "the Drupal way" or "the JSP way", and you can create wonderful small-scale things that, while they might get mocked by the tech-weenie chorus, serve their function and make people happy.

Every hip language/framework/DB/deployment tool/bundler/markup language/food processor is designed to make your day better. Virtually all of them actually do just that (okay, a few will piss you off, but most are not intentionally evil).

The problem is supporting a world with 65 different technologies. It is indeed superhuman to expect someone to be a Groovy/Perl/Node.js/SASS/Hadoop/Puppet/XSLT/AWS/PCI-DSS/Postgres-tweaking/network-routing/desktop-supporting "web guy". (My current job wants that and much more, and, sorry, they don't actually have it in me. I hate faking it. I fake it.)

And, yet, much of the suit-wearing world doesn't understand that, and willfully doesn't want to figure that out. In 1998, they hired "a web guy". If they got successful, they hired five "web guys". Or 20. Those business-people are still looking for "web guys". People who are extreme generalists in "the web" in 2014 are either savants or on the hardcore burnout track.

My tolerance for learning curves grows smaller every day. New technologies, once exciting for the sake of newness, now seem like hassles. I'm less and less tolerant of hokey marketing filled with superlatives. I value stability and clarity.