The president of one job leadership consultancy argues C and C++ coders will soon be as obsolete as Cobol programmers. "The entire world has gone to Java or .Net. You still find C++ coders in financial companies because their systems are built on that, but they're disappearing."

A data scientist at Stack Overflow says demand for PHP, WordPress, and LAMP skills "is seeing a steady decline, while newer frameworks and languages like React, Angular, and Scala are on the rise."

The CEO and co-founder of an anonymous virtual private network service says "The rise of Azure and the Linux takeover has put most Windows admins out of work. Many of my old colleagues have had to retrain for Linux or go into something else entirely."

In addition, "thanks to the massive migration to the cloud, listings for jobs that involve maintaining IT infrastructure, like network engineer or system administrator, are trending downward," notes Terence Chiu, vice president of careers site Indeed Prime.

The CTO of the job site Ladders adds that Smalltalk, Flex, and Pascal "quickly went from being popular to being only useful for maintaining older systems. Engineers and programmers need to continually learn new languages, or they'll find themselves maintaining systems instead of creating new products."

The president of Dice.com says "Right now, Java and Python are really hot. In five years they may not be... jobs are changing all the time, and that's a real pain point for tech professionals."

But the regional dean of Northeastern University-Silicon Valley has the glummest prediction of all. "If I were to look at a crystal ball, I don't think the world's going to need as many coders after 2020. Ninety percent of coding is taking some business specs and translating them into computer logic. That's really ripe for machine learning and low-end AI."

Yep. For example, I work on medical diagnostic software, and the amount of data you need to manage and render on screen smoothly is so huge that C++ is the most reasonable and common choice (even if not the only possible one). There are lots of fields where C++ is still king. And that's a shame, because it's a crock of a language.

The biggest problem is the lack of compiler/preprocessor support for legacy standards of the language.

Between changes in the standard headers, changes in keywords (with no provision to disable them for files written to older standards), and changes in API and ABI, there is a huge clusterfuck of underdocumented shortcomings in C/C++ that are mostly there because of standards-committee ego-stroking. Many of them have no excuse for having shown up in the past decade, given that most of them manifest in open source software that could have been tested against in an automated fashion to ensure that new changes to the standard didn't break older code.

I agree, for C++. Whenever I have breakages after upgrades, it's almost always C++. Programs have to be recompiled, because they've imported and extended templates that they themselves weren't in charge of. Even if the APIs remain the same, there are still breakages. For C, there are far fewer problems. Yes, someone might change an API, but the general consensus is not to do that, but to provide new functions instead. New standards happen, but they only affect the source, not whether binaries continue to work, as can be the case for C++.
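A minimal sketch of why C++ templates cause this: a class template shipped in a library header is instantiated inside each client's translation unit, so its logic and layout end up in the client's binary, not the library's. The `Box` type here is invented purely for illustration.

```cpp
#include <cassert>

// -- imagine this lives in a library's public header --
template <typename T>
struct Box {
    T value;                          // add a field here in a new release
    T get() const { return value; }   // and every client must recompile
};

// -- client code: the instantiation Box<int> is generated in the
// client's own object file, not inside the library's .so/.dll --
int use_box() {
    Box<int> b{41};
    return b.get() + 1;
}
```

Because the client rather than the library owns the generated code, even a "same API" change to the template can leave old client binaries out of sync with the library.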

C++ works well where you can control or dictate the runtime system, so it matches the developer toolchain. That's great for embedded-like systems where you can change the entire OS with upgrades, or long term stable systems like RHEL, where versions stay put for 10 years with only bugfix backports. But when binaries break after an OS update, they're almost always C++ ones. From big companies too.

So when minor point releases of C libraries break ABI much more often than C++ libraries, that just doesn't happen? libssl, libpng, libflac, libwebp, etc. have all broken binary compatibility in minor releases. Note, by the way, that templates changing definition doesn't really break much in C++ unless you export those templated classes over a module interface, which is a dangerous thing to do. Templates being in headers alone means they have no binary part whose compatibility they could break.
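The C-side failure mode mentioned here can be sketched in a few lines: if a "minor" release adds a field to a public struct, the struct's size changes, and callers compiled against the old header under-allocate. The struct names below are invented; this is not any real library's API.

```cpp
#include <cassert>
#include <cstddef>

// v1 of a hypothetical C library's public struct
struct img_info_v1 { int width; int height; };

// v2 adds a field in a "minor" point release
struct img_info_v2 { int width; int height; int bit_depth; };

// A caller built against v1 allocates sizeof(img_info_v1) bytes; a v2
// library that writes bit_depth then scribbles past that allocation.
constexpr bool layout_changed = sizeof(img_info_v1) != sizeof(img_info_v2);
```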

I wouldn't even say those are the big advantages of C and C++ any more.

It's a relatively rare application these days that needs the kind of raw speed you can't achieve with other mainstream languages yet which relies on C or C++ for its performance-critical logic rather than either dropping to assembly (or linking to someone else's library that probably does) or resorting to some form of parallelism. I'm certainly not saying that set is empty, but it's probably getting smaller by the year.

As for expressivity, if you mean how easy it is to express any particular idea in code, C and C++ are relatively weak compared to many other mainstream languages today. They lack many convenient language features widely available elsewhere, and their standard libraries aren't exactly broad and full-featured so you have to bring in additional dependencies to do almost anything useful.

The area where C and C++ still shine compared to almost anything else (and I realise this might have been what you meant by "expressivity" instead) is the low-level control. You can deal with memory and ports and interrupts and so on very transparently in these languages, and they have a very lightweight runtime with minimal dependencies that makes them suitable for use in things like systems programming and writing software for embedded devices with limited resources.
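As a sketch of that "transparent" low-level access: a volatile pointer lets you treat an arbitrary address as a device register, with the compiler forced to emit every load and store. The register address in the comment is hypothetical; real ones come from the chip's datasheet.

```cpp
#include <cassert>
#include <cstdint>

// Set bit `bit` in the 32-bit register mapped at `reg`. The volatile
// qualifier stops the compiler from caching or eliding the access.
void set_bit(volatile std::uint32_t* reg, unsigned bit) {
    *reg |= (1u << bit);
}

// On real hardware you would point this at a datasheet address, e.g.
//   set_bit(reinterpret_cast<volatile std::uint32_t*>(0x40021018), 5);
// (address made up). On a desktop you can exercise it on a plain word.
```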

The two problems with C++ are the fragile binary interface problem and the lack of memory management beyond 'new' and 'delete'. The latter can be resolved by creating a base class that provides reference counting and automatic deallocation. The former, however, cannot be easily resolved, since virtual methods in C++ are dispatched by going through a virtual method table with fixed offsets (meaning adding new virtual methods may alter the index at which a method's entry point is dispatched), and the size and offsets of a class's data members are compiled directly into client code as well.
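The "reference counting and automatic deallocation" base class mentioned here can be sketched in a few lines. The names (`RefCounted`, `retain`, `release`) are illustrative, not any standard API; modern C++ would normally reach for `std::shared_ptr` instead.

```cpp
#include <cassert>

class RefCounted {
    int refs_ = 1;                 // the creator starts with one reference
public:
    virtual ~RefCounted() = default;
    void retain() { ++refs_; }
    bool release() {               // true when the object destroyed itself
        if (--refs_ == 0) { delete this; return true; }
        return false;
    }
};

class Widget : public RefCounted {};
```

Every class in the hierarchy inherits the counting behaviour, which addresses the deallocation half of the complaint; it does nothing for the fragile-vtable half.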

C++ has an object system modeled after Simula, with virtual and non virtual methods and multiple inheritance. In CLOS you have to program your generic functions to dispatch method resolution yourself. I would not call that "better".

It is better if Simula is not what you want. Simula doesn't even have bog-standard multiple dispatch, not to mention any of the advanced features. Whether you need to add your own method selection code is another thing, but in C++ you can't even if you need to for your application, so it clearly loses. Repeating yourself in boilerplate snippets just because you can't offload it into language logic doesn't seem preferable for actual large-scale programs.

In computer science, a programming language is said to have first-class functions if it treats functions as first-class citizens. Specifically, this means the language supports passing functions as arguments to other functions, returning them as the values from other functions, and assigning them to variables or storing them in data structures.

You can emulate that treatment of functions using function pointers in basic cases. It's really not the same as using a language designed to support a more functional programming style from the start, though.
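A short sketch of the gap being described, in C++ for concreteness: a plain function pointer handles the stateless case, but returning a closure that captures a value needs `std::function` (or a template); C-style function pointers alone can't express it.

```cpp
#include <cassert>
#include <functional>

int twice(int x) { return 2 * x; }

// passing a function as an argument: a plain pointer suffices
int apply(int (*f)(int), int x) { return f(x); }

// returning a function that captures `n`: a bare pointer cannot hold
// the captured state, so std::function (or similar) is needed
std::function<int(int)> make_adder(int n) {
    return [n](int x) { return x + n; };
}
```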

The STL had the right sort of idea, but there's a big difference in expressive power between the standard library containers, algorithms, and iteration patterns offered in C++, even used with more recent language features like lambda expressions, and the folds, traversals, and combinators found in languages designed around functional programming.
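For reference, the standard library's closest thing to a fold is `std::accumulate`; the complaint above is about everything beyond this basic case (generic traversals, combinator pipelines), not about this:

```cpp
#include <cassert>
#include <numeric>
#include <vector>

// A left fold over a vector with an explicit combining function:
// ((((0 + 1) + 2) + 3) + 4)
int sum_fold() {
    const std::vector<int> v{1, 2, 3, 4};
    return std::accumulate(v.begin(), v.end(), 0,
                           [](int acc, int x) { return acc + x; });
}
```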

Seriously, whilst C++ (and Fortran) are great to do the heavy computational lifting, most of that heavy lifting that goes on in computational engines can be isolated in, and accessed from, a specialised library.

After that you really don't need C++ anymore.

In fact you'll realise big productivity (and reliability) gains by *not* coding e.g. business logic or HMIs in C++. Use a scripting language instead and call those C++ libraries when you know exactly what you want done. I daresay that this is why languages like Python are so popular.
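The usual shape of this pattern is to keep the number-crunching in C++ and expose a flat C ABI that a scripting language can bind to (Python's ctypes, Lua's FFI, and so on). A minimal sketch, with invented names:

```cpp
#include <cassert>
#include <numeric>
#include <vector>

namespace engine {
    // the "heavy computational lifting", written as ordinary C++
    double mean(const std::vector<double>& xs) {
        return std::accumulate(xs.begin(), xs.end(), 0.0) /
               static_cast<double>(xs.size());
    }
}

// flat, unmangled entry point that a script can call through an FFI
extern "C" double engine_mean(const double* xs, int n) {
    return engine::mean(std::vector<double>(xs, xs + n));
}
```

From Python this might be loaded with something like `ctypes.CDLL("libengine.so").engine_mean` (library name hypothetical); the script decides what to compute, and the C++ side does it fast.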

In most applications that business logic and HMI fiddling is 95% of the code once you put the heavy computations inside a library call.

The problem for C++ "coders" is that you don't want a load of mediocre C++ coders to build a library.

Instead you want computational scientists and domain specialists to specify the algorithms, supported by a software engineer for systems design plus one or two really good C++ programmers who can both understand the algorithms and what they do, and who just so happen to be able to implement the design plus algorithms in high-quality, robust, efficient, and elegant code.

Interpreted languages are fine so long as there isn't a lot of code sticking together the stuff in those libraries that are nicely compiled for you. If there is a lot of code to interpret, they suck just as much as they always have, hence some really sloooooow stuff out there.

There's some appallingly slow stuff running on fast hardware: things like GUIs that take a couple of seconds to respond to a mouse click and bring down a menu, despite being on a 4GHz machine that's not doing a lot other than waiting for input. That's the sort of thing that shows up a combination of lazy programming and using the wrong tool for the job (e.g. a massive lump of custom Java instead of handing over to a library).

Almost no mainstream programming language is truly interpreted any more, though, unless you're talking about something like a REPL interface. Even the JavaScript in your browser and that Python script you wrote the other day to parse a log file are actually JIT compiled behind the scenes these days, and will probably get within a factor of 2-3 of the performance of true compiled languages in most cases. The VM-hosted static languages like Java and C# have been in that position for a long time, and get even closer.

It's a relatively rare application these days that needs the kind of raw speed you can't achieve with other mainstream languages yet which relies on C or C++ for its performance-critical logic rather than either dropping to assembly (or linking to someone else's library that probably does) or resorting to some form of parallelism. I'm certainly not saying that set is empty, but it's probably getting smaller by the year.

Well, I tend to write those libraries. No one I know drops to assembly any more for performance.

Yes, I agree that C++ (and C) are still good choices for the kind of work you described. I've worked on some of those too, and there wouldn't have been many other viable choices for those jobs. I just think the number of jobs where this is the case is trending downwards.

In particular, I think it will trend down sharply if and when we reach the point that higher level languages with more expressive semantic models can be compiled to run more efficiently on the relevant hardware than C or C++ code that is written by hand.

Speed is becoming less and less of an issue, as computers are getting faster all the time, and every day more sectors reach "good enough" speed. They can then start focusing on other problems that languages like C and C++ may not handle as well. The biggest problem I have seen is deployment and change control: being able to safely test alpha/beta code on production data with real-world usage. A lot of this is governed by business processes, but there should be more tools and environments to support it.

Speed is becoming less and less of an issue, as computers are getting faster all the time.

People have always been saying that, and it was never true, and still isn't. Why? Because software, software stacks, and entire operating systems are becoming slower all the time, eating up all those resources that your newer computers are capable of. It has always been this way, and people like you have always been wrong.

Also, I would add that hoping that more powerful hardware is going to solve your speed issues tells a lot about your professionalism.

That's the Microsoft Way, dating back to when Microsoft and IBM were working together on OS/2. Gates insisted OS/2 be written entirely in assembler, while secretly doing Windows in C. He said that by the time it came out, hardware would be "fast enough." That has NEVER been true, if only because "fast enough" changes with time.

An acceptable wait time for a query at a terminal used to be 1 to 3 seconds. Now? Acceptable "lag" is measured in microseconds in many cases. Would anyone accept a game that took a full second to respond to input?

When I used C++ I found it to be an interesting but ultimately failed experiment in layering OO on top of C. The end result was that the traps of C (invalid pointers etc.) were still there, but now hidden under layers so that they were harder to detect and fix. For example, adding an element to one of the STL container classes can cause a reallocation, rendering any "references" you had into that container invalid. So unlike OO languages that truly have references that don't magically become invalid, C++ only gives you the illusion of them.
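The trap described here is easy to reproduce: grow a `std::vector` past its capacity and any earlier pointer or reference into it dangles. The sketch below records the buffer address (as an integer, to avoid comparing a dangling pointer) before and after the reallocation.

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>
#include <vector>

// Returns true if growing the vector moved its buffer, i.e. any
// reference taken before the growth would now be dangling.
bool old_references_dangle() {
    std::vector<int> v;
    v.push_back(10);
    const std::size_t cap = v.capacity();
    const auto before = reinterpret_cast<std::uintptr_t>(v.data());
    while (v.capacity() == cap) v.push_back(0);   // force a reallocation
    const auto after = reinterpret_cast<std::uintptr_t>(v.data());
    return before != after;   // old and new buffers coexist during the
}                             // copy, so their addresses must differ
```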

C++ programmers will be employable in the videogame industry for the foreseeable future, at least. I presume that they'll also be employable for working on any large-scale applications that require support or compatibility beyond what some of the newer, safer, high-performance compiled languages can provide.

People always talk about how terrible C++ is (and it's hard to argue with many of their points), but it continually shows up in the language rankings as a steady #3 to #7 or so, depending on how language "popularity" is figured. It benefits less from being "pure" and more from being incredibly pragmatic as a language, similar to C. R and Go are still lagging far behind, with D almost out of sight. Swift is moving up thanks to iOS, and maybe Kotlin will do the same thanks to Android (but we'll see - I'd literally never heard of it until recently), but those are almost pre-destined to be one-trick ponies due to their strong platform ties.

Ultimately, the big problem is that I don't see a real universal contender for high-performance native code taking over from C/C++. There are a lot of promising languages, but at the moment, nothing is really taking off. Simple inertia is pretty hard to overcome, as it turns out.

Final point:

But the regional dean of Northeastern University-Silicon Valley has the glummest prediction of all. "If I were to look at a crystal ball, I don't think the world's going to need as many coders after 2020. Ninety percent of coding is taking some business specs and translating them into computer logic. That's really ripe for machine learning and low-end AI."

Bwahahahahahaha! Oh damn, we can't even get our chat bots working reliably (we use them to auto-generate bugs and tasks). And in three years they're going to be replacing programmers? Fucking priceless!

Ultimately, the big problem is that I don't see a real universal contender for high-performance native code taking over from C/C++. There are a lot of promising languages, but at the moment, nothing is really taking off. Simple inertia is pretty hard to overcome, as it turns out.

The whole reason that people claim C/C++ are dying or going out of style is that they are entirely disconnected from this point. They explicitly overlook the fact that the languages they are always citing are written in C/C++ and rely to an extreme degree on libraries written in C/C++ even when they manage to self-host the languages. It's an ignorance of what the tools they are using actually are.

We wrote a language that is a subset of C, wrote a wrapper for all the APIs that are in C, wrote a wrapper for all the libraries that are in C/C++, then used a compiler written in C to compile the compiler, which is of course in C or C++, on an operating system that was written in C. We plan to make this a standard by making it mandatory or almost mandatory on this one platform we have control over, and that became popular by supporting C.

These days, everything is a computer. Your stove, your car, your cable modem, your TV, all are computers. They all have microcontrollers or microprocessors in them to handle various functions. It is cheaper and easier than doing discrete dedicated logic, even for simple things. Well, those need software of course, and it turns out C/C++ are what gets used a lot, because you have little memory and power to work with. Pennies count in mass production, and the smaller a CPU, RAM, flash, etc. you can get away with the better, but that means the code needs to be small. You aren't loading up Windows and running .NET on a microwave; you are getting a little PIC24 or something and putting on some highly efficient, directed code.

Because of all these embedded devices, there's a lot of market for this kind of thing, it just isn't the trendy shit you see on hip "Web 3.0" sites. It gets done by people with engineering backgrounds at big companies.

Also, speaking of small embedded computers, regular computers themselves have tons of computers in them. Crack open a desktop and you find a lot of chips in there, many of them computers in their own right. Your NIC is a computer. A simple one to be sure, but it is a processor that runs code, it is not all hard wired. Your SSD is a computer, it has a little chip (ARM usually) that runs the code it needs to do its job. Again, someone is writing the code for all that and that code is not being written in Java.

Even when you have a platform that at a high level runs Java/.NET/whatever, it has a bunch of lower level code on it.

This doesn't even get into the reality that 70% of all the "computers" are embedded beasties, all those "IoT" processors, and the bulk of them are programmed in C or C++. A Node.js or Python option is available, but neither of those is what you'd call "secure". You might be able to get Go to "go" onto those platforms, or Swift, but they're a bit largish and don't really target the small stuff.

The remark about .Net or Java means they're a real Headupassian. No clue whatsoever.

They explicitly overlook the fact that the languages they are always citing are written in C/C++ and rely to an extreme degree on libraries written in C/C++ even when they manage to self-host the languages. It's an ignorance of what the tools they are using actually are.
This is not ignorance; it's just irrelevant.

How many people are working right now in C++ on the Oracle Java VM?
And how many people are working in Java, Kotlin, Scala, Groovy with that VM?

What exactly are you trying to say? That C/C++ really is dying? I was told 20 years ago that C is dying. C is still around and I'm coding C as well as C++ for that matter. If anything my employer has more jobs coding C and C++ than there are qualified applicants. Now I'm being told that C++ is dying by some troop of clowns that call themselves a 'job leadership consultancy'. My only remaining question is: Does Netcraft confirm this startling revelation that C++ is dying?

Watch out, according to TFS you (and apparently me as well since I'm a C dev) will be having the same hard time finding jobs as Cobol programmers are having today. I wonder if that is why even the old retired folks with Cobol experience are coming back to the workplaces...

Most languages more or less work the same, the details and flaws are mostly in the libraries (see PHP) or in some niche corners of automatic type conversions (see JavaScript and also C).

Bottom line: it does not really matter whether you write Java or C++. A competent programmer should be able to learn the other language in a day or two and get good at it in a few weeks or months.

Of course there are edge cases. I don't expect everyone to become super fluent (quickly) in SQL, Smalltalk, Groovy or Lisp or Prolog or Haskell, OCaml or COBOL or Fortran.

However: even if you are not fluent in any of those languages, with a little bit of intelligence you should be able to fix simple bugs. Writing a new program from scratch is obviously more difficult. Look at COBOL, for example, with its "strange" PIC data layouts and so many "divisions". I fixed about 1M lines of COBOL for Y2K faults, yet I still couldn't really "write COBOL".

Kotlin has been around for about 5 or 6 years, I'm not sure. I don't see a big advantage over Scala or Java 8, though it's probably easier than Scala, as it is closer to Java. However, the company behind it, JetBrains, offers Kotlin-to-JavaScript and native code compilation. Kotlin-to-native could be interesting on Android; on other platforms I fear they don't have the cross-platform libraries (GUI, networking, etc.).

But the regional dean of Northeastern University-Silicon Valley has the glummest prediction of all. "If I were to look at a crystal ball, I don't think the world's going to need as many coders after 2020. Ninety percent of coding is taking some business specs and translating them into computer logic. That's really ripe for machine learning and low-end AI."

This is actually true. I wrote a "spec" (as in heavy formalized use case descriptions) to Java/Groovy source code "converter" about 10 years ago. It was super easy to make proof of concept prototypes.

For one thing, C is exceptionally well defined and has a very clean syntax

Well defined, I'll agree with. The ISO standards for both C and C++ are much more rigorous than the foundations for a lot of programming languages.

Whether it's a good definition is a different question. For example, C also has the dubious honour of being one of the only mainstream programming languages whose specification deliberately and explicitly makes the results of some actions undefined behaviour. There is really no need for a language, even one designed to support low-level systems programming, to leave quite so much undefined.
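A concrete example of the kind of thing the C and C++ standards leave undefined, for readers who haven't met it: signed integer overflow. The code compiles cleanly, yet the marked line has no defined meaning when it overflows, and optimizers are allowed to assume it never happens.

```cpp
#include <cassert>
#include <climits>

int increment(int x) {
    return x + 1;        // undefined behaviour when x == INT_MAX
}

// A conforming compiler may fold this entire function to `return true;`
// because the only input that could make it false triggers UB.
bool always_bigger(int x) {
    return x + 1 > x;
}
```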

Game engines (actually just a special case of HPC - but a commercially important one)

Embedded computing (think IOT)

Things that necessarily have to work low level, like operating systems and compilers

The kinds of big software projects you're citing, of which browsers will keep or increase their importance regardless of how much the IT world "cloudifies"

And they don't even have to take our word for it - just look at recent figures published by the likes of GitHub and StackOverflow, which show that, while C and C++ are not dominant, they're pretty stable, and very much alive and kicking.

So that leads to the hypothesis that the statement comes from someone who doesn't have a clue what they're talking about. And yes indeed: "the president of one job leadership consultancy" is basically a motivational speaker with a one-person company featuring a poorly chosen name and a not-all-too-professional-looking website. Seems to me that "one InfoWorld journalist" decided to give a friend of a family member a big, free publicity boost by interviewing her on something she doesn't know the first thing about.

Indeed. It may be true that low-skill coding jobs are often Java these days, but C is not vanishing at all. For example, I recently built a custom security filter component for a customer, and while in theory it is possible to do this in Java (I think), it would take a lot of additional effort, the component would not perform well, and maintenance would be a nightmare. Before that, I built a simulation tool, and again, nothing but C would do the job well for the core algorithms. Glue code should of course be written in something higher-level.

The problem is that a lot of People who work in "Enterprise" are really deluded and think "Enterprise" is like superior and stuff....

It's more like they think their "Enterprise" realm *is* the entirety of software development. Their entire view of software is the components that fit together inside their enterprise runtime containers.

Low-end AI? Translating user requirements into working software that actually meets their needs is in the same part of the AI difficulty list as cold fusion and solving world hunger.

If you can actually interpret the business specs without a human putting them into a formal language, you don't need to translate them into computer logic at all. By then the AI can just execute them anyway.

The moment you need that intermediary step involving a human and a formalised representation.. we call that programming.

Consider a company the scale of Google, with hundreds, if not thousands, of software projects ongoing simultaneously. Suppose you assigned an AI to observe which user stories go in, and what code comes out as a result. How many programs would you have to complete before the AI is able to take over a majority of the work involved in building an application? Maybe it can't directly convert user stories, but it could probably handle many of the components that a user story is made of.

Suppose you assigned an AI to observe which user stories go in, and what code comes out as a result. How many programs would you have to complete before the AI is able to take over a majority of the work involved in building an application? {...} I'd honestly be surprised if they aren't already doing something like this.

Yes, it's been done. Not by Google, but by others. The short answer is that the deep neural nets produce text that looks like code at first glance, but doesn't even compile. E.g.: the variables aren't even properly declared. It can write a formula (like "a = b + c") but isn't even able to make the link with the declaration of the variable (that the "int a;" ten lines above is linked to the "a").

The problem is the size and complexity of modern AI: the size of the context it can consider, the number of abstract models hidden behind the code, etc.

Currently, what AI has managed to recreate with deep neural nets is on the level of WW2's pigeon-guided bombs [wikipedia.org], i.e. leveraging an image recognition net and similar basic tasks, and stringing a few together.

The complexity required to write actual code is several orders of magnitude bigger. Even some humans can't do it reliably, and you hope to do it with what is currently the equivalent of the visual-cortex sub-part of a bird's brain. Good luck with that.

Before achieving that we need: more raw processing power (you'll need way more neurons than are used in today's deep neural nets), and advances in science to better understand how to combine tons of such "function-specific nets" to build a higher level of AI (the same way a brain is a sum of lots of small specific regions, each linked to a higher-level, more abstract associative layer).

I agree - a lot of embedded devices are C and even assembly, especially when you get down to small devices built around an 8051 core and the like, where every byte counts.

C is also one of the better languages to use if you want deterministic behavior from your code, as long as the coding is done right. Environments like Java and .Net aren't good enough in that area, since you have background threads doing housekeeping that cause interference.

Undefined behavior is the definition of the behavior. It is literally in the spec. It says: don't do this if you want it to be portable; the behavior cannot be guaranteed from compiler/target to compiler/target. Complaining about it broadcasts a lack of understanding of the standard on your part, not a fault with the language.

I guess you've never heard of the .NET Micro Framework [wikipedia.org]. Java also supports embedded systems [oracle.com]. Java is actually quite common on embedded devices, and they've actually made processors that interpret Java bytecode at the hardware level.

False, the idea was that Java was portable between embedded devices. It was never destined for small devices, actually quite the opposite. It was originally envisaged as an interface solution to screens. How big? Well... TVs. Far from "small embedded devices".

I hate the fact that "Java programmer" is considered by some people a different job than "C++ programmer". A good programmer should be able to learn a language in a month and become proficient in three months at most. Functional languages apart, all languages are more or less the same. It doesn't matter if your hammer has a red handle or a green one, as long as you know how to hammer.

And understanding the domain and real world requirements often matters more than knowing any particular algorithms or data structures. However, someone who understands those things and knows well the programming language being used is still going to get better results than someone who understands those things but isn't as familiar with the programming language.

I hate the fact that "Java programmer" is considered by some people a different job than "C++ programmer". A good programmer should be able to learn a language in a month and become proficient in three months at most.

Yes, but there's also a difference between proficient and expert. Becoming an expert in either takes much, much longer. For example, a friend of mine is an expert in Java. I can hack code in the language and do plenty of things. He seems to have committed half the standard library to memory and knows the JVM in depth too. There's all sorts of weird and wonderful stuff you can do if you know those manipulations.

It would take me many years to reach his level.

It doesn't matter if your hammer has a red handle or a green one, as long as you know how to hammer.

Unless the hammer has a wooden head and is used for knocking chisels. It takes a long time to learn how to do that effectively no matter how well you can smash rocks with a sledge.

I don't think so. Knowing a language does not only mean that you are able to write a syntactically correct program that compiles and does what it is supposed to do. It means that you have intimate knowledge of all the libraries and toolsets and coding environments that come with the language. And this is the real treasure of knowledge that makes the difference between a newbie to the language and the seasoned programmer. If you have enough experience, you know which things have already been invented, and how to use them.

That's like saying that any welder should be able to be a proficient carpenter in a short time span. Or anyone who speaks English should be able to speak Mandarin on a proficient level in three months. Programming languages are indeed tools, but being proficient at handling those tools is a skill that takes time to develop. Retraining after many years isn't a given for everyone.

It doesn't matter if your hammer has a red handle or a green one, as long as you know how to hammer.

These two statements show just how narrow your view is on a large variety of topics. There is a wildly different set of hammers out there which need different techniques to get what you need. Would you use a sledgehammer like a watchmaker's chasing hammer? Of course not. Both are completely different in every way, including grip, body movement, and the problem being solved.

Likewise I don't expect someone who grew up learning how to program brainfuck to understand the first thing about Java. For bonus points without googling

A good programmer should be able to learn a language in a month and become proficient in three months at most.

This isn't, and shouldn't be, the case. There is a huge demand for work in higher level languages that can be done by less skilled programmers. Most of them wouldn't be capable of programming in C, and that's ok. If ALL programming needed to be done by a programmer who could become proficient in any language in three months we'd be 50 years behind in our use of software as a species. We could be pedantic and call that "scripting" and not "programming" but that's obviously a specious distinction.

This is a popular conceit, but it's still a conceit. The skills and concepts and idioms you need to work with dynamically typed "scripty" languages to write a web app are quite different to the ones you need to work with high-performance systems programming code to write device drivers, and those are different again to the ones you need to implement a compiler in a functional programming language. Programming is a vast field, and experience in one part of it doesn't necessarily make someone any good at working in another part just because there's a reference manual.

I've been learning how to be a programmer for more than 30 years and doing it professionally for more than 20. I've worked on code from web front-ends to high performance number crunching, via databases and device drivers. I've written production code in my share of different languages and in a variety of programming styles.

However, I'd still say I'm only "initially getting my feet wet" when I start many new projects, compared to someone who really understands that particular area. I'm a competent programmer.

Well, I started off on a System/3 with a card reader, so I've had to keep my competencies up to date. In our industry more than most this is vital. I sometimes meet people bragging about having "20 years' experience in x"; too often it's more like 2 years' worth of experience x 10.

That said, I have some buddies still making decent coin on COBOL gigs...

But the regional dean of Northeastern University-Silicon Valley has the glummest prediction of all. "If I were to look at a crystal ball, I don't think the world's going to need as many coders after 2020. Ninety percent of coding is taking some business specs and translating them into computer logic. That's really ripe for machine learning and low-end AI."

Sounds like a fantastic opportunity to get rich: fleecing poor bastards who actually believe this dreck. Ninety percent of coding is indeed figuring out how to wedge some business wonk's harebrained idea into the machine, but does this clown have any idea how broad a phrase "business specs" is? That's everything. I mean e-v-e-r-y-t-h-i-n-g.

"Make my MRI machine work." Business spec. "Make my combine harvester work." Business spec. "Make my search engine work." Business spec. "Make my toy robot work." Business spec. "Present as many goddamned ad impressions as physically possible." Business spec. He's trying to claim that do-what-I-mean-not-what-I-say computers are just around the corner, readily (and cheaply) available. HA. No. You might, MIGHT be able to train a neural net to do a piece of one of those tasks. All of them? And all parts? Not even close. Not in three years.

I'm sure nVidia's new Titan Xp is a marvelous thing, with its dedicated tensor accelerator hardware, but it's not do-what-I-mean hardware. It was just released last month, which means nVidia's next card is a year away. Does anybody think it's going to be do-what-I-mean hardware? No. How about the generation after that? Maybe another node shrink? Still no. How about three generations from now? If historical Titan benchmarks are anything to go by, it'll be twice as fast as a Titan Xp. It takes nVidia about 36 months to double performance. Is it going to be able to do-what-you-mean? Mmm, no.

The world is going to need just as many coders in three years as it does now. It will probably need more. The coming wave of automation is not going to be self-programming, but it is coming. Somebody is going to have to write all that code. And baby all of those neural nets.

The president of one job leadership consultancy argues C and C++ coders will soon be as obsolete as Cobol programmers. "The entire world has gone to Java or .Net. You still find C++ coders in financial companies because their systems are built on that, but they're disappearing."

The entire world has done what now? I work in the computer vision/data processing world. It's all written in C++ on the back end, often with Python driving code on the front. Currently C++ is the only language with the expressiveness, speed and resource frugality required for the job.

I've also worked on deep embedded stuff. Hell, some of the compilers don't even do C++ (looking at YOU IAR C/C++), so I wrote it in C. Otherwise I'd use C++, because there aren't any other languages with the resource control which will do the job.

Lots of other stuff seems to run in the browser. All major browsers are implemented in C++ because... well, you get the idea. About the only thing which could potentially displace C and C++ is Rust, since it's basically the C and C++ model but with a syntax that excludes many common bugs. But it's still some way from being there yet.

A data scientist at Stack Overflow "says demand for PHP, WordPress, and LAMP skills are seeing a steady decline, while newer frameworks and languages like React, Angular, and Scala are on the rise."

There's a difference between decline and fall. The displacement is certainly happening, but you can't replace WordPress with Angular and Scala, because one is an entire CMS and the others are a framework and a language. That's not the same thing.

Science and engineering continue to move towards doing more simulations, everything from chemical simulations to flow simulations. The more accurate these simulations are, the more computationally intensive they get, but also the more money you can make: you have to do fewer real-world experiments to isolate the true running conditions, and the simulations can also be used as control systems, allowing you to operate closer to the true danger area.

In most chemical plants reactions are run FAR from the actual danger points in terms of product yield, purity, reaction speed etc because things like PID controllers just can't adapt to how chemical systems really work.

The problem is that for this kind of work Java and .NET are SLOW. They can easily be 100x to 1000x slower than a program written in C, C++ or Fortran. The tooling to support High Performance Computing type applications really doesn't exist outside of C, C++ and Fortran: they have the most advanced optimizing compilers, profilers, debuggers, libraries, etc. What I often see is something like MATLAB for visualization, Python for command and control, and C/C++/Fortran for the actual simulation running on clusters.

These newer microchips that have more cores per chip are only going to continue to push things in that direction. It is easy to gain a little scaling with threads but if you want to really get a program to run fast you need to either have direct memory control or you would need a far more efficient runtime than has ever been created so far.

This may come as a surprise, but almost no normal software uses more than about 1% of a CPU's capabilities. Even most games are at 5%. You can see this when you run them under a good profiler like VTune. Sure, the CPU is technically busy running the software, but it is mostly just waiting for data and working with unoptimized data structures. To get over this barrier you need to make thousands of small changes to your program.

If you need a program to run FAST you need to eliminate false sharing. If two threads write to different indexes in an array but the items are too close to each other in memory, they could be sitting on the same cache line, and this forces the cores to resync and retry calculations based on which one committed first. The more cores you add, the worse this problem gets. I have worked on a program that went from 30 seconds on 128 cores to 0.03 seconds on 128 cores by removing all the false sharing.

You also need fine-grained control over parallelization. You need to be able to decide whether a function should be parallelized, and to what degree, based on the amount of data being handed into that function. That is why things like TBB and OpenMP allow those to be controlled at runtime. If you make a parallel version of quicksort and run each division in parallel recursively, you reach a point where you are creating parallel tasks that are far too small and have too much overhead. This means you need to understand how many CPU cycles an operation normally takes and parallelize based on this information.

At this point I don't see any other languages really moving in to compete with C and C++. Sure, there are languages that do a lot of the high-level stuff that used to be done with C and C++, but the world has also moved on to harder problems, and C and C++ have moved onto those harder problems too. This is a problem you can't just buy more hardware to fix. Many of these simulations take days to run in highly optimized C and C++ code, and the Java/.NET versions would take a year to run. The time alone would kill the program's usefulness, and forget ever optimizing your system using the simulation.

I am not talking about regular programs. I am talking about the types of high performance computing software that is usually run on large clusters or supercomputers.

For the simplest possible case, let's say you have to add up an array of tens of millions of doubles. On an x64 arch, cache lines are normally 64 bytes, which allows you to store 8 doubles at 8 bytes each. You can also use vectorization in a modern CPU, and Haswell and above can do two vector operations per cycle. If you break the work up across something like 128 threads, you need to divide the whole thing into very large chunks of memory, such that each chip gets a large contiguous memory region and each core in that chip also has a large contiguous memory region. You also need to allocate the memory holding each core's temporary result such that none of them are on the same cache line, or you will cause invalidations on every summation operation and take a large hit on performance.

If you do all of this completely correctly then you are doing linear memory with unit 1 stride and that will allow the memory controller to optimally load in data while you are processing. You also use every entry in the cache line and a cache line never has to be fetched again. At the very last step you would need to read from 128 memory locations on different cache lines to do a final add but you did eliminate all false sharing.

This is of course a trivial example meant to illustrate a point. Even just adding up lots of numbers can be quite complicated to make high performance. Sure, a Java program can do this, but it won't be anywhere near the same ballpark speed-wise. In a more realistic example you have to do this kind of optimization work, but the problem is not so simple. Just designing a data structure that works correctly across all the different usage and ordering cases can take weeks of work and profiling.

More commonly you have parallel processing that happens in phases: large parallel areas followed by sequential areas, then parallel areas again that operate on the same pieces of data. If you are VERY careful you can keep the last-used information in cache, and as long as you assign all the work to the same cores it was assigned to the first time, all the memory access will be local. If you screw it up, performance is often 10x to 100x worse on a ccNUMA system with 128 cores.

This type of programming is HARD, but it is an area where C/C++ and Fortran completely dominate with no competitors around: everything from molecular dynamics and quantum simulations to chemical simulations and machine learning. This is an area Java does not play in, and Oracle is not even trying to push it there. When Oracle does high performance, they usually do it with C/C++ and OpenMP. They have given some very interesting talks on OpenMP optimization.

I actually program in the field, and I admit .NET is pretty awesome if you program specifically on Windows and you aren't looking for performance! It isn't meant for the high-performance edge applications that C and C++ are able to handle. Besides, all four languages (.NET, C, C++ and Java) have some similarities with each other. It's about as stupid as saying that because I drive an SUV I can't drive a truck. Sure there are differences, but anyone who calls themselves a programmer should be able to switch fairly easily. Java itself is a horrible use of resources; the few games I've seen written in it run like molasses, and that problem hasn't really changed over the years.

As far as AI taking programming jobs, don't make me laugh. AI is nowhere near that level yet. An AI that can program is an example of self-aware AI that can program itself. Even if it could, I'm not sure if anyone would really want to risk it. Think about it, let's fire all the programmers and get the computer to program itself to help us. Eh, what if something goes wrong? What if it decides to go the way of skynet?

While the cloud may have its benefits, not everything will go there. Communication isn't that fast, and some information is best kept off the web.

Written by a business type who has no idea how programming works beyond the next big thing, and who doesn't even realize how much stuff is based on legacy architecture.

I interviewed a guy recently for a C/C++ role, and while he had C down as a skill, he couldn't tell me the most basic things about multi-threading, GUI programming, or even C++ fundamentals. He had spent the last 15 years working on some kind of terminal-based software and simply didn't have a clue. Yeah, I get that his day job probably didn't require those things, but it says much about his temperament and inclination that he couldn't be bothered to learn in his own time either. He didn't get the job.

...clearly these are people who know absolute f*** all about creating software.

C supposedly died long ago, and yet I find myself using it in critical situations, with libraries such as xmlsec, to build the underpinnings of an IdP mechanism. Now, those underpinnings are used by golang, but that's been the way of software since the early 90's.

I don't dream of writing C, and I think golang, .NET, Rust, et cetera, are great and useful languages; but just because I've got some sexy new impact wrench, I still find myself reaching for my 30-year-old adjustable wrench on occasion...

I have been doing Perl development for a long time, and in the last two years it has straight out disappeared. You can still find Perl as a job requirement, usually as part of DevOps positions, but jobs actually writing apps in it are gone.

I have noticed that the new fad for LAMP is Python; it has shown up everywhere, just as PHP did years before it. Perl has been relegated to being a systems administration tool.

I get a lot of crap for stating this on Slashdot... but I thought Perl disappeared years ago. References to the LAMP stack were always to PHP or Python. Perl isn't being used to administer the Windows systems at my current government IT job, and I haven't run into Perl in any of my private sector jobs in the last 20+ years.

It was always around, but it was not bringing in younger programmers. Instead, those guys were being pulled into the next fad. I cannot blame them, they needed a job and a lot of the problems they were supposedly solving were more BS than real issues.

It kept getting displaced, by Java and Ruby, then PHP, and now Python (this is over the course of the last 13 years or so).

It has always existed in the *NIX administration space; it is installed by default and it does things that BASH simply cannot.

For the record, I don't like you. However, I am pretty sure that Perl is dying and that its niche as a scripting language has mostly evaporated or been filled with other things. Also, that particular AC is a fuckwit, and you responded well.

Whether or not technologies are still in wide use isn't necessarily the best measure of a technology's health. WordPress is widely used, but technologically it's horrifically obsolete. It's written in a procedural style, and can't be converted to an object-oriented codebase.

* The CEO and co-founder of an anonymous virtual private network service says "The rise of Azure and the Linux takeover has put most Windows admins out of work. Many of my old colleagues have had to retrain for Linux or go into something else entirely."

* In addition, "Thanks to the massive migration to the cloud, listings for jobs that involve maintaining IT infrastructure, like network engineer or system administrator, are trending downward, notes Terence Chiu, vice president of careers site Indeed Prime."

Everyone (including half the people quoted in the OP) talks about programming, not IT like in the question.

The IT dept. worries about desktops, data management (NAS/backups), security, connectivity from the desktop to the rest of the company/world, remote access, email, and other business apps (including databases). I think that kind of IT will be around for a while. The apps/email might move to being outsourced. The desktop will probably be Windows in most cases for a long time, unless MS really makes it unusable for most users.

We're already seeing some examples of traditional IT moving away from Windows. Look at my kid's school: all Chromebooks and cloud. The school IT needs to do networking/WiFi and account management. I expect that data management and software upgrades are minimized. There would be security in network configuration, and policies for teachers/parents/students. Probably some internal applications (building management and phones?) that can't be outsourced to a web app. Everything else is outsourced to Google. They save lots on IT compared to the Windows/iPads I've seen at other schools.

"Thanks to the massive migration to the cloud, listings for jobs that involve maintaining IT infrastructure, like network engineer or system administrator, are trending downward"

There are many businesses that will never be in the cloud. Many companies are not comfortable putting their data on someone else's computer. Also some companies are legally required to keep their data local.

Finally, even businesses that have embraced the cloud (my organization is one of those) still have local infrastructure that needs support - switches, firewalls, telephones, security systems, building access systems - etc. Those simply can not be put in the cloud - the devices need to be local - and those devices still need to be managed.

So I no longer need to struggle with C/C++ because I need consistently reliable sub-2ms response times and any auto-vectorization I can get. That's awesome! But... what exactly is the alternative supposed to be?

Also, all these people are talking about languages, not jobs. They appear not to understand that programmers can switch to other languages relatively easily and probably already use many languages.

Until I got to sysadmin. This is so mindbogglingly stupid that I'm amazed this guy can tie his own shoes in the morning. I didn't even bother looking at the rest of the list.

If you think your company doesn't need sysadmins anymore just because your infrastructure is 'in the cloud', I REALLY REALLY want to see you do that. Just so I can laugh as your entire company collapses.

The president of Dice.com says "Right now, Java and Python are really hot. In five years they may not be... jobs are changing all the time, and that's a real pain point for tech professionals."

I think back to situations like steel workers or coal miners whose jobs disappear...and to the combination of where these people live, the lack of variety of the local economy, and the difficulty translating their skills to other industries. These things combine to make it nearly impossible for them to maintain their livelihoods. Conversely, in the tech field, that constant rate of change makes it not only relatively easy to change specialties, it eliminates any stigma that comes from having done so.

Yes, this means that fields and skills sometimes go out of favor...but at least you're not stranded when they do. You have options. Whether or not you exercise those options...that's another thing. I'd rather have options, and have it left up to me whether I fail or succeed.

The Swift compiler and runtime are Open Source. They've been ported to Linux and Windows (I believe the Windows port is unofficial). IBM has picked it up as a server language. It's no more controlled by a single company than Java is.

Why are we teaching kids to write when there won't be as many jobs for scribes in 3 years? Because there are very few jobs that don't benefit from some level of automation and it's increasingly essential that you are able to formulate solutions to problems as programs.

Why are we teaching kids to write when there won't be as many jobs for scribes in 3 years?

In at least one state, we don't teach kids to write anymore, at least not with a pen or pencil. Handwriting has become an optional part of the curriculum.

I thought it was bad when students no longer were taught how to read old cursive or blackletter, so they no longer understood a letter from grandma nor could read old books. But now they don't have to be able to read anything except sans serif, nor write anything that isn't typed on a computer. It does not bode well for when emergencies occur. I also

They are looking at their narrow market, their company, and thinking it is everyone. The best example is the VPN guy saying that Windows has gone away. Ummmm..... no. The massive WannaCrypt outbreak at companies is prima facie evidence that this is wrong. There is lots of Windows all over the place at companies, from tiny mom n' pop shops up to the biggest in the world. It is on desktops, servers, controlling equipment, etc., and people are still needed to run it.

I'm sure in his little world, there are no Windows admins. A VPN service likely uses Linux for its server OS, and he just rents VPS's from places like Azure. So in their little company they are all Linux all the time. That's nice, but not at all representative of what is going on in the larger world and if he had any amount of perspective he'd know that.

Anyone who thinks a trend seen at a single company, even a big one, can be generalized to the whole world is silly.

the best performance (C) or with a reasonable compromise between abstraction and performance (C++).

No. C++ is used where you need performance and expressiveness and compile-time safety. C is used where you either don't have a decent compiler for your system, or you can't handle the compile times, or you're no good at writing C++.

Not so, computers still are stupid, stubborn machines that need to be forced to behave. And for this, low level languages still reign supreme.