The only reason I haven't quit the field long ago is that, even with all of the broken software out there, open source software (mainly Linux) lets me look the broken stuff right in the face and actually gives me a chance in hell of correcting it.

If I had had to spend the last 20 years being an OK-or-Cancel monkey in a Microsoft-centric industry, I would have quit long ago.

So in your 20+ years, what have you actually fixed? I'm curious, because most people I observe who make similar comments don't have any track record of fixes or merged patches to the Linux kernel, its subsystems, drivers, or anything else.

By the way, implying that people who use Microsoft software are just OK/Cancel-clicking monkeys is pretty pathetic.

Most recently, when I first purchased my P570W, I had to turn off the Intel sound support's channel-selector code because it didn't work and blew the machine up whenever an attempt was made to play sound.

I didn't fix it, but having the source code allowed me to turn the code off (a simple /* */ comment around the block causing the problem did the trick), and I used a different USB sound source until it could be fixed properly.

I then posted the code block I had commented out to bugzilla.redhat.com so the developers could look at it.

Eventually it was fixed.

But many things become possible when you have the source code: the customer has a chance to participate in the engineering process and improve the product.

Now, many of you probably don't, and the typical Linux admin I see has no clue about source code. However, it doesn't take a lot of engineers to make a difference... in this case, just one with access to the source who can identify the code block causing problems and comment it out.

Notice, though, that I helped the guys at Red Hat isolate the problem and saved them debugging time, since I had already done that work in my bug-report write-up, where I pointed out which source file, which lines of code, and what values certain variables held when causing the problem.

The whole point, though, is that not everyone has the skill set to look at operating system driver code and debug it. Many Linux admins don't care. But when things don't work right and you do have the source code, it doesn't take an army of people to fix it, just the possibility that the problem can be identified and pointed out.

By little ole me.

The way proprietary operating systems work, this sort of process can't happen.

It is a BIG reason why I even bother with computers: the day I can't have an open source/hardware combination to build solutions with for my customers is the day I throw in the towel and open an Indian restaurant chain.

The only reason I haven't quit the field long ago is that, even with all of the broken software out there, open source software (mainly Linux) lets me look the broken stuff right in the face and actually gives me a chance in hell of correcting it.

Here, I disagree with you: a chance to correct it???
Really, to actually fix things we would have to change the languages they have been written in!
For example, take C: I love C, but its default behaviour (performance above everything else) only made sense when CPU performance was a big deal (computers were much, much less powerful than a current low-end smartphone) and cracking wasn't an issue. Now it is the complete opposite: everything is connected, so cracking is a big issue, while CPU performance isn't.
So how do you plan to correct this?
Fix every bug there is in the huge amount of C software? Good luck!
I think what is needed is twofold: a change of compilers to "harden" C, and a change of language to eventually replace what we have. But Ada has had a free compiler for a long time, yet we're still using C/C++, not Ada, so I'm not especially optimistic that the situation will improve.
Rust *sounds good* until you actually try to use it: http://www.reddit.com/r/rust/comments/269t6i/cxx2rust_the_pains_of_...

The C base dialect is very minimalistic, making the act of crafting a bootstrap compiler for a new platform very easy. This is a really important part of OS development, because it enables portability.

Also, using C in the early stages of OS development releases the developer from using another, even more harmful language: assembly.

So we have a positive feedback loop in place: the OS is in C, the base API is in C, so it is all too natural that the base system software and userland be in C too.

Crafting an OS from nothing in a safe language is hard. All the efforts I know of failed because compiler complexity hindered portability or made hardware abstraction cumbersome. And sometimes the hardware itself is buggy or badly designed.

We have the option of restricting all userland software development to a manageable language; this is the approach Android takes. But crafting an OS from scratch this way is a huge effort that only big companies can afford: it requires crafting the kernel, the bootstrap compiler, the drivers, the new language itself, the whole high-level API for it, and finally the userland in that new language.

Yeah, not that he is wrong, but he needs to take his medicinal alcohol.

And more seriously, somebody should point out to him that the same thing applies to everything outside software too. Laws are released without even being tested, and often after being disproven logically. Chemicals are added to our food as soon as we see one benefit, though they may interact poorly with other chemicals. Cars break down as soon as one mechanical failure happens (which most software wouldn't). It would be funny to see his world shatter even more. Everything is broken, but somehow things still work. Living in an imperfect world is living.

Software sucks. It really, really sucks. I have yet to meet a piece of software that didn't make me go "...eh." several times per hour - whether it be a videogame, a browser, or an operating system.

Fair warning: I haven't read the Medium piece; I am just responding to Thom's statement.

Really? Because I, and most of the people I know, get up, start working, and just get stuff done. There is a mildly annoying Chrome issue that means I have to close and restart it every few days, but that doesn't mean it sucks.

The vast majority of the hardware and software I use does its job, does it well, and doesn't really cause any problems.

I genuinely don't mean this as an attack, but your comment about everything sucking sounds more like a personal outlook issue than a real world problem.

And it was Ada that crashed the Ariane 5 because of an integer overflow. OK, maybe it was incorrect use of Ada, but Ada did not prevent the programmers from using it in that wrong way.

It was most definitely a failure to understand Ada. Ada has one of the most sophisticated type systems around. It was the programmers' failure to make use of it that resulted in the problem not being detected.

No tool prevents problems if the user doesn't know how to use it. There are no magic bullets.

Everything can be screwed up, but that doesn't mean that everything sucks (to link back to the premise)

It was not the language; it was the as-is reuse of some of Ariane 4's calibration tables that drove Ariane 5's embedded computers completely berserk, causing successive computation overflows in the redundant flight systems, one after the other. Until the final back flip.

It was most definitely a failure to understand Ada. Ada has one of the most sophisticated type systems around. It was the programmers' failure to make use of it that resulted in the problem not being detected.

I disagree with you here: no type system can tell you what the maximum acceleration will be...

At the risk of sounding jingoistic, the VERY first thing I would do if I ever set up a website or public-facing server is to blacklist EVERY IP address from China and Russia (and probably Nigeria as well) that I could. I'm sure that wouldn't solve all of my security problems, but I bet it would put a big dent in it, and might get a lot less spam in the process.

The problem with your statement is if you lose 20% of the readers out there but block 80% of the internet borne problems, then you still come out ahead.

Yup, and I would encourage others to do the same. If the citizens of these countries get tired of being blocked everywhere, perhaps they will start pressuring their governments to do something about the vermin that are making life miserable for the rest of us.

Yup, and I would encourage others to do the same. If the citizens of these countries get tired of being blocked everywhere, perhaps they will start pressuring their governments to do something about the vermin that are making life miserable for the rest of us.

Haha, I doubt their governments would listen to them any more than ours do to us.

You are nonchalant about punishing the innocent here. Of course you can do whatever you want when you own a site, but do you have any evidence that China is the source of more hackers than other countries per capita? I'd genuinely like to know, because mathematically speaking China should have 4.3 times the hackers of the US due to population ratios alone (all else being equal).

> You are nonchalant about punishing the innocent here. Of course you can do whatever you want when you own a site, but do you have any evidence that China is the source of more hackers than other countries per capita?

Before I set up fail2ban on my home router, my /var/log/auth.log would routinely fill up to a few megs full of failed access attempts from Chinese and Russian IPs.

SSH? Using a non-standard port always solves this problem for me. It's the easiest way to decrease attacks, and thereby the chances of someone breaking in. Port 443 works great for me, and it's always open no matter where you are. (I run an HTTP server, but not HTTPS, so the port is free.)

At the risk of sounding jingoistic, the VERY first thing I would do if I ever set up a website or public-facing server is to blacklist EVERY IP address from China and Russia (and probably Nigeria as well) that I could. I'm sure that wouldn't solve all of my security problems, but I bet it would put a big dent in it, and might get a lot less spam in the process.

And then these people will start to spoof IP addresses to defeat the blacklisting.

Yeah, good luck with that. The people you actually have to worry about, unless your site/server is a royal clusterfsck of ineptitude, aren't going to be stupid enough to use an IP address from those countries.

Technology is a reflection of people. Software sucks because people suck. People can't decide how they want it to work. They want everything (intuitiveness, efficiency, aesthetics) but they don't demand anything of themselves, like reading instructions, spending effort learning something, or learning to ignore things that don't matter.

Well, mostly true. I have 20+ years of coding behind me, and as a lazy developer I try to keep my code as concise and readable as possible, using good practices, naming conventions, etc.

While I can say my code is neat, well structured, and tested before release, I have to deal on a daily basis with junior (or not-so-junior) engineers who just code for the income, and obviously their only metric is LOC!

When copy/paste goes bad and you have to clean up the mess and maintain the code afterward, you cannot imagine the pain involved. Watching Hellraiser is painless by comparison. And those dudes get 20% higher wages than me because, you know, they are EN-GI-NEERS, and I'm not.

I'm only working off your example, not the one you had in mind but simplified for confidentiality reasons. For a very small function like that, in general programming terms, not just embedded, it doesn't make a difference in readability.

Especially in such a small function, I prefer functions that exit as early as possible, provided all resources are easily freed, rather than having to read to the end of the function.

If the function was more complex, I'd do it differently and do what you said.

And when you are coding in embedded, where you have only a few breakpoints available, it's easier to just put one on the final 'return' than on each one.

What's wrong with putting a breakpoint after where the function is called and the value is returned? Same result.

"I'm only working off your example, not the one you had in mind but simplified for confidentiality reasons. For a very small function like that, in general programming terms, not just embedded, it doesn't make a difference in readability."

This was an example out of a function that features 80+ cases...

"And when you are coding in embedded, where you have only a few breakpoints available, it's easier to just put one on the final 'return' than on each one.

What's wrong with putting a breakpoint after where the function is called and the value is returned? Same result. "

Nope, because there could be several functions calling this one, and you would still have to set several breakpoints to catch all the parent calls. With just one 'return' at the end, you can catch the faulty 'error_code' there, step out, and see which calling function broke everything.

Nope, because there could be several functions calling this one, and you would still have to set several breakpoints to catch all the parent calls. With just one 'return' at the end, you can catch the faulty 'error_code' there, step out, and see which calling function broke everything.

Geeez, I'm teaching you your job, I should quit that...

Kochise

As you can tell, my debugger-fu is not strong. I personally don't like using debuggers. I prefer to design my code so that debuggers aren't necessary.

I understand your point, and in C it is totally valid (though the 'goto' keyword does its job there too). But in C++, where RAII is in charge of releasing all the resources used by instances in a block, or in languages where an exception can be thrown anywhere, the idea of having just one return point no longer makes any sense to me.

What if you put your breakpoint on the final return of your function and an exception is thrown, rendering your breakpoint unreachable?

His objection was that the returns inside the cases should be changed to breaks, so that the return at the end of the function is the only one. It makes sense for what he's doing, which is on an ARM system with only two hardware breakpoints.

Like I said, the real-world code he's basing his example on is probably more complex than the example. I would say to him that different contexts for a switch statement make the whole "only return at one point" idea not a hard and fast rule.

Long ago I ran the program IEFBR14. Ran it many times. This program did absolutely NOTHING; its sole purpose was to return control to the operating system.

The reason it existed was to satisfy the requirements of valid JCL: every job deck had to have at least one EXEC statement to be considered valid (JOB card, EXEC card, optional // card). IEFBR14 was there to provide a program that could be EXECuted. A nice side effect was that you could use it to issue file (correctly: "dataset") dispositions without actually needing any "processing" program that did things with them, to manipulate the file catalog, or to build libraries (later obsoleted by dedicated programs). It could also be used to alter the job flow, because its return code could be evaluated in later job steps with the COND= parameter. It could serve to "tidy up" after the last job step, eliminating unused or "accidentally" kept files. Finally, when used as the first job step, it could make sure that following steps would be able to allocate sufficient storage capacity (and if not, let the job fail early).

This is just another illustration of how broken things were - when the null program actually had several useful purposes. Having written this, I feel old now. I should run IEFBR15. ;-)

- you need an internationally recognised software engineering qualification to release software to the public (except for testing purposes).
- all code needs to be fully documented
- all software needs a very stringent and formal testing process before being RTM
- all public domain code needs to undergo a comprehensive design process before the first line is written
- legal liability for any damage caused by defective software
- professional indemnity insurance

The result would be:

- 99% of software authors would disappear overnight.
- Salaries of competent software designers would rise dramatically.
- The rate of new software releases would drop dramatically.
- The quality of software would improve exponentially within a few years.

No, the answer isn't more regulation; the answer is more cooperation. Restricting software development would only slow things down and give entrenched organizations total control over our technological future. If we work together, we can make some damn good software.

Do you really think people in software are so special that they should ignore the lessons learned by every other profession in history? It is rather amazingly arrogant of most software engineers to think so, all the while complaining about everything (being treated like crap by business people, poor quality, poor working conditions...).

I hate to break it to you, but most people aren't that special. Power and organization matter a heck of a lot more in influencing the world.

Not to mention, we don't live in some free world.
This has massive issues: you can't simply say that freedom will result in more innovation. Good people can and will choose other career streams that offer a better life and higher professional standards. Without long-term careers in the field, we could face a lack of deep innovation, as few would want to invest their time. Social web-app innovation is good, but it doesn't need years of deep study.

One point: as far as I recall, all of those professions you mentioned were around a long time before they learned the right way to formalize. For centuries (or in some cases millennia) people learned by being apprenticed, or by reading the classic works and then putting up their shingle.

Software engineering is still in its infancy; we have barely learned to stand compared to those other fields. It isn't clear we know the right way to formalize yet. I don't know that we could come up with an equivalent of the bar exam that was worth anything. Lord knows the MCSEs are no predictor.

We call ourselves engineers, but that was just a term to distinguish ourselves from the older, more clerical definition of programmer. We are really more like craftsmen, sitting at the intersection of art and engineering.

I am not saying we shouldn't formalize, but I am saying that it is going to be a lot harder than most people here seem to be asserting.

"I hate to break it to you, but most people aren't that special. Power and organization matter a heck of a lot more in influencing the world."

You don't like to read history books, do you?

I stood and watched in disbelief at the height of the SCO trial as McBride stood in front of a jury and said it was impossible for Linux to have come about without a PhD and the resources of an institution, therefore Linux must have been copied.

Digging through more ancient history, we have Exhibit B: one George Boole.

Boole was told by the professional organizations of the time at key universities to go back to digging ditches, because his idea of two-state logic was HOGWASH.

Not to mention Mr. Boole didn't have the cash to buy his way into the ELITE CLUB called a university education.

I hope I do not have to point out how important Boole's work is.

Exhibit C: one Albert Einstein, sent home by the organized institutions of professional "learning" of the time as unteachable, who spent his "unorganized" time as a patent clerk working on theories that have changed our world forever.

Exhibit D: one Nikola Tesla, who after spending a year at an "organized, professional" university couldn't stand it anymore and left forever, declaring he would never return, and then built the entire foundations of energy science by his little ole self.

AC current, which ironically allows you to type such HALF-ASSED COMMENTS as

"I hate to break it to you, but most people aren't that special. Power and organization matter a heck of a lot more in influencing the world."

All of the advances of what I would call the modern age came from people who rejected professionalism, or "organization" as you call it.

This fundamental rejection by these and many more people is what drove the huge advances in technology and REAL science, not the crap they do now at universities for grant-money roller-coaster rides.

The individual built our modern age. Professional organizations and universities did everything they could against these individuals to kill it.

- 99% of software authors would disappear overnight.
- Salaries of competent software designers would rise dramatically.
- The rate of new software releases would drop dramatically.
- The quality of software would improve exponentially within a few years.

- Innovation dries up, as some of the most interesting individuals in tech (Steve Jobs, Bill Gates, Woz, Zuckerberg, and a host of others) would never have met your criteria.

- IE 6 shows that software that doesn't release, stagnates. If the pace of development slows, innovation follows.

- A lot of OSS and Free software got its start from some guy scratching a particular itch, not a comprehensive design process. Unix is a good example: it was written by Ken Thompson and Dennis Ritchie on a scavenged minicomputer because Multics had failed.

- Businesses couldn't afford to have in-house software written if all their developers had to follow strict engineering guidelines.

- Innovation dries up, as some of the most interesting individuals in tech (Steve Jobs, Bill Gates, Woz, Zuckerberg, and a host of others) would never have met your criteria.

Henry Ford, Thomas Edison and Isambard Kingdom Brunel wouldn't be hired now either. Times have changed.

- IE 6 shows that software that doesn't release, stagnates. If the pace of development slows, innovation follows.

You don't need constant major changes to improve software. You can continually polish and improve existing software. Many highly competent physical engineers spend their entire careers making tiny improvements to existing designs and processes.

- A lot of OSS and Free software got its start from some guy scratching a particular itch, not a comprehensive design process. Unix is a good example: it was written by Ken Thompson and Dennis Ritchie on a scavenged minicomputer because Multics had failed.

Ritchie was a Harvard PhD. Thompson was a UC Berkeley MS. Both were extremely competent professional programmers; they weren't self-taught amateurs.

- Businesses couldn't afford to have in-house software written if all their developers had to follow strict engineering guidelines.

In every other profession you have to follow strict guidelines even for in house work. Airlines can't train pilots to some arbitrary in house standard to save money. Dentists can't use unapproved methods to sterilise their instruments. Accountants can't use non standard audit processes.

Physical engineers are fallible. That is why they are so conservative. They thoroughly plan everything (even very minor 'hobby' projects) before starting. They use proven technologies, allow very large safety margins, and build in multiple redundancies.

Large road bridges collapsing due to design faults are so rare that they are used as engineering case studies. This has occurred a handful of times in the last 100 years.

The vast majority of car recalls are to repair very minor problems that have a low probability of causing a serious hazard. If the same consumer protection laws applied to consumer software, Apple, MS, and every other major consumer vendor would have been bankrupted years ago.

- Innovation dries up:
- IE 6 shows that software that doesn't release, stagnates. If the pace of development slows, innovation follows.

Since when is innovation the trump-all value? How about stable work, good wages, deep knowledge, time with your kids...

- A lot of OSS and Free software got its start from some guy scratching a particular itch, not a comprehensive design process. Unix is a good example: it was written by Ken Thompson and Dennis Ritchie on a scavenged minicomputer because Multics had failed.

A lot of OSS got its start from big, powerful entities: monopolies like the old AT&T, the Bells, universities...

- Businesses couldn't afford to have in-house software written if all their developers had to follow strict engineering guidelines.

So? What's wrong with that? There's nothing wrong with being a service provider. Most companies don't provide their own legal or power-utility services...

- Here's a list of bridges that have fallen:
Sure, things go bad. But in a completely open system things go worse than in places with high professionalism and engineering standards.

- you need an internationally recognised software engineering qualification to release software to the public (except for testing purposes).
- all code needs to be fully documented
- all software needs a very stringent and formal testing process before being RTM
- legal liability for any damage caused by defective software
- professional indemnity insurance

That would just make the field more bureaucratic IMHO. If you really want software developers to have to buy insurance policies like doctors, then be prepared for a shock when it comes time to pay for professional insurance.

Your physician spends 10 cents of every dollar you pay for health care on malpractice insurance, according to Diana Furchtgott-Roth, a senior fellow at the Manhattan Institute. Furchtgott-Roth notes that premiums vary from $20,000 annually in low-cost states to $200,000 annually in high-cost states. According to a survey published in November 2011 in "Modern Medicine," family and general practitioners paid premiums of $12,100, and pediatricians' premiums averaged $11,800. OB-GYNs paid an average of $46,400, and plastic surgeons reported median premiums averaging $30,000.

Of course doctors often have more at stake, but still. Just as with ambulance chasers, "software engineer chasers" would become a new thing. No thanks.

That's one theory, but many developers like myself would somehow have to absorb these costs while competing with offshored firms that don't have them.

- all public domain code needs to undergo a comprehensive design process before the first line is written

You are going to enforce standards on public domain software? Who's going to enforce that?

- The rate of new software releases would drop dramatically.
- The quality of software would improve exponentially within a few years.

I get that many developers are just no good and the result is a shoddy product. Yet the companies themselves are often just as responsible. There's usually a rush to get products done and little to nothing in terms of a budget for proactive security/bug fixes. We as software engineers end up doing the bare minimum because that's all that the company gave us time to do. I think every experienced software engineer will already know that it's not a one-sided problem.

I would argue that there is a use in distinguishing software engineering as a profession from software development. Software engineering (like other engineering professions) would be held accountable for failures.

I wouldn't propose that all software construction be regulated that way, only the cases where it makes sense, i.e., safety- and life-critical systems, shutdown systems for nuclear reactors, etc.

In that context it makes perfect sense to have some kind of standard and protocol to deal with failures and some way to ensure quality.

In Canada, a "professional software engineer" requires graduating from an accredited university and taking a second exam (after some years of experience) to qualify for the designation of "professional engineer." It's the same body that regulates the other types of engineering.

The point is that you wouldn't be competing with shoddy offshore operators. You would be competing with people who would be legally required to be competent.

Virtually every car, bridge, or skyscraper is designed by highly paid Western/Korean/Japanese engineers, not outsourced to lowly paid locals. Even in countries like India, salaries would rise dramatically because there would be a far smaller pool of qualified software engineers.

No American/European/Australian/NZ IT Project Manager is going to hire a cheap shoddy worker if the Project Manager can be sued for negligence.

The point is that you wouldn't be competing with shoddy offshore operators. You would be competing with people who would be legally required to be competent.

I'm saying that when a company looks at their options, they'll see domestic costs increasing and foreign development will look that much more attractive. Adding new overhead for domestic software engineering will make the imbalance even worse.

A lot of offshoring has been motivated by cost reduction, which often goes hand in hand with quality reduction (i.e., laying off the already qualified domestic engineers and replacing them with far cheaper offshore ones).

I realize you have good intentions here, but companies seeking to maximize profits just won't care.

- you need an internationally recognised software engineering qualification to release software to the public (except for testing purposes).
- all code needs to be fully documented
- all software needs a very stringent and formal testing process before being RTM
- all public domain code needs to undergo a comprehensive design process before the first line is written
- legal liability for any damage caused by defective software
- professional indemnity insurance

The result would be:

- 99% of software authors would disappear overnight.
- Salaries of competent software designers would rise dramatically.
- The rate of new software releases would drop dramatically.
- The quality of software would improve exponentially within a few years.

For me to agree with any of this I would 1) have to use drugs, and 2) be insanely high on them to think any of that is a good idea.

For me to agree with any of this I would 1) have to use drugs, and 2) be insanely high on them to think any of that is a good idea.

That's because you don't have a professional mindset. True professionals always want higher standards in their field. I've never heard surgeons say, "Surgical training takes way too long... how about we cut out the residency and medical school requirements?"

I care about developing the best software for our users with the resources I have at my disposal. It might not fit your list, although there is some overlap, but what it does do is help our users perform their jobs better and more easily, saving them money they can spend on better things. I am sure I would do things differently if I were designing the software for the next lunar lander, but I am not, and neither is the great majority of the IT industry. Instead we do our best to actually solve real problems, today, not in 10 years.

Sure we focus on code quality and try to fix bad stuff, but only when it makes sense and is worth the effort.

I might not be a true professional by any of your metrics. In fact, I am probably the opposite, as I am also the one who has to find the cost-saving shortcuts that prevent others on my team from being your true professional. But I will happily continue doing what makes everyone (important) happy, since we can't print our own money so I could hire 20 extra developers to accomplish basically the same thing we manage to do with 5 right now.

There are boatloads of contractors who do the work at prices and on timelines people are willing to pay, but you'd hardly call them professional.

Well, that's true of any field. Some of us are professionally behaved while others are not. That will never change whether we're certified or not, agreed?

Do you really think people in software are so special that they should ignore the lessons learned by every other profession in history? It is rather amazingly arrogant for most software engineers to think so, all the while complaining about everything (being treated like crap by business people, poor quality, poor working conditions...).

Software in many ways is special. There are plenty of analogies used to compare software to our physical world, but these always fall short because software really is different from everything else in the physical world.

This isn't to say we should not have standards, but any argument that software engineers should do X because other physical engineers do X is going to fall flat IMHO. You would need to make a convincing case that the software field would improve without overly relying on comparisons to other kinds of professionals.

Software in many ways is special. There are plenty of analogies used to compare software to our physical world, but these always fall short because software really is different from everything else in the physical world.

It would be like engineers having to deal with different laws of physics depending on whether they were building a bridge, a car, a skyscraper or a 787, and each at a different engineering firm.

No, that's not it. Your recommendations completely ignore the root problems, which have been pointed out by several other users already. It's a simple theory that looks good on paper only when you put blinders on. Regardless of any over-simplified analogy you think makes sense, your theory applied in the real world wouldn't have anywhere near the effect you're convinced it would. I would be willing to bet it would do the opposite.

I do however give you credit for acknowledging there is a lot of room for improvement. On that point we can agree at least.

Software engineering needs to be like physical engineering:
- all software needs a very stringent and formal testing process before being RTM

Pray tell, where is that formal testing process defined? I know we have unit testing and QA scripts and things like that, and I am one of those developers who tests his code before committing... but all of that is just one-eyed people in the land of the blind. No one has any idea how to fully test software for correctness.
In engineering, however, you have materials science: you can predict what kind of forces will be involved and how to build your structure to withstand them. And even then they will cut corners to save costs, liability be damned.

Also, you will pay 20x what you're paying now for commercial software. Maybe 100x. Including crap like office suites that aren't really life-support systems. Free software (as in beer) will simply disappear because no one will pay for liability insurance out of their own pocket. Only the rich will be able to afford software.
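The point above about unit tests being one-eyed can be made concrete with a minimal sketch (the `average` function here is hypothetical): a suite can pass in full while the code still breaks on an input nobody wrote a case for.

```python
# Minimal sketch of why a green test suite is not a proof of correctness:
# `average` is a hypothetical function whose tests all pass, yet it still
# fails on an input nobody thought to test.

def average(xs):
    """Intended: arithmetic mean of a list of numbers."""
    return sum(xs) / len(xs)   # divides by zero for the empty list

def test_average():
    # Every case we thought of passes...
    assert average([2, 4]) == 3
    assert average([1, 1, 1]) == 1

test_average()   # passes -- but average([]) still raises ZeroDivisionError
```

Tests only sample the input space; they demonstrate the presence of correct behavior on chosen examples, never its absence everywhere else.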

You do exactly what 19th century engineers did. They went from guesswork to testing every single material and component down to individual rivets until they failed. They used this knowledge to develop rigorous procedures and standards. Each new project was a minor iteration of an existing design using proven materials and techniques.
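A rough software analogue of "testing every rivet until it fails" might be randomized stress testing against a trusted oracle; a hedged sketch, where `my_max` is a made-up hand-rolled unit under test and Python's built-in `max` stands in for the proven reference:

```python
# Hedged sketch of "test components to destruction": hammer a unit with many
# generated inputs and check every result against a trusted oracle.
import random

def my_max(xs):
    # Hypothetical hand-rolled unit under test.
    best = xs[0]
    for x in xs[1:]:
        if x > best:
            best = x
    return best

def stress(trials=1000):
    random.seed(0)  # reproducible runs
    for _ in range(trials):
        xs = [random.randint(-999, 999) for _ in range(random.randint(1, 40))]
        assert my_max(xs) == max(xs), f"divergence on {xs}"
    return trials

stress()  # silence means every generated case agreed with the oracle
```

Like rivet testing, this only builds statistical confidence in the component; the rigor comes from running it relentlessly and widening the input distribution until failures stop appearing.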

Software engineers often try and build a Bugatti Veyron when they still have barely worked out how to build a reliable Model T Ford.

A car maker will spend about $2 billion developing a new model. It will invariably use proven existing technology and be a very minor variant of an existing car. They will make dozens of prototypes and literally test them to destruction under far harsher conditions than they are ever likely to encounter in the real world. Then, and only then, will the car go into production.

Software would be no more expensive. It would simply evolve more slowly with a far greater emphasis on quality rather than implementing unnecessary new features or style changes.

At the local train station, there has never been a week where an escalator isn't out of order. Sometimes two out of the four available.

Today, bridges and buildings are of increasingly poor quality at the end of construction. Even in the developed world, newer bridges collapse and pipes burst.

People don't even want to pay enough for hardware quality. They're not going to pay for software quality unless it's really important, like space technology, aircraft/air-traffic control systems and nuclear power stations.

Hell, not even the banks want to mess with their COBOL base even if the code ABENDS all the time.

In most cases the problem is cost cutting or lack of time for maintenance rather than genuine design flaws. My local shopping centre has worn-out travelators because trading hours were increased a decade after the centre was built. Originally there was a 36-hour window to perform deep maintenance. That was cut back to about 12 hours when Sunday trading commenced, meaning that only ad hoc repairs can be done.

AFAIK all Australian banks were legally obliged to completely rewrite their backend software to much higher standards many years ago. [In the 1980s the smaller retail banks were all consolidated to form four huge national banks.]

Yes, and no amount of accreditation can override the forces of cost cutting, which is also what plagues software development, rather than a lack of engineering standards.

As for your travelators, my escalator example is a bit different, because those escalators were supposed to be designed for heavier use than a shopping centre's and they still fail. They were newly built for the newly opened Mandurah Line, and they didn't even reach the decade of use your local shopping centre managed.

No doubt if software development took place in the 19th century, we may have had the luxury of "over" engineering everything.

It isn't the 19th century anymore. Every other manufactured product on the market is regulated for safety, quality and reliability. If you want to sell a car or a choc chip cookie you have to comply with the extremely onerous regulations.

If you want to release software you need to do what professional physical engineers do - design, develop and test to rigorous formal standards. If you fuck up you should pay the consequences by losing your job or getting sued. If you can't comply stick to being a hobbyist or get a new career.

Regulation actually increases innovation because it forces engineers to think laterally. Extremely strict international emissions control, fuel economy and safety standards for new cars have produced more innovation in the last decade than in the previous 40 years. Americans were (are) still building cars with crude pushrod engines and a separate chassis 50 years after they were obsolete. The US car makers spent decades and billions of dollars fighting new regulations while other countries went about designing and building better vehicles. [We know how that turned out.]

If you want to sell software you should have to pass an internationally accredited course to become a Licensed Software Engineer. Every physical engineer has had to do so for well over a century.

Regulation actually increases innovation because it forces engineers to think laterally. Extremely strict international emissions control, fuel economy and safety standards for new cars have produced more innovation in the last decade than in the previous 40 years. Americans were (are) still building cars with crude pushrod engines and a separate chassis 50 years after they were obsolete. The US car makers spent decades and billions of dollars fighting new regulations while other countries went about designing and building better vehicles. [We know how that turned out.]

If you want to sell software you should have to pass an internationally accredited course to become a Licensed Software Engineer. Every physical engineer has had to do so for well over a century.

Well, you're talking about how regulating the end product helps create better end products, but then you completely switch gears and say software engineers need to become accredited with some kind of license. The latter does not follow from the former. In your example, the reason cars got better emissions control and fuel economy has nothing whatsoever to do with engineers being licensed or not.

Same thing goes with energy efficiency requirements of refrigerators. The standards have made an enormous difference, but the fact of the matter is HVAC engineers were certified both before and after the energy standards. The problem in these cases obviously isn't certification, it's that companies in a free market tend to build things as cheaply as possible to maximize profits. Some industries actually have *negative* incentives to increase efficiency (ie who killed the electric car).

All physical engineers have been formally trained and licensed since the late 1800s. [The St. Francis Dam near Los Angeles failed in 1928, killing over 400 people, partly because it was designed by William Mulholland, a largely self-taught 'engineer' with no formal training.]

A formal qualification sets a minimum standard. Many physical engineering graduates never become more than technicians because they fail to develop more advanced skills.

A large percentage of software designers, particularly in India, are 'failed' civil, chemical or mechanical engineering graduates.

This is a very passionate topic, so it's good that you brought it up. I just feel that you are ignoring some of the counterpoints many of us have expressed based on our real world experience in the field.

An important fact is that the engineers usually aren't the ones running the show. I've seen poorly tested products go to production, and even though I voiced a strong objection, in the end it doesn't matter when management is determined to impose insufficient budget/time constraints. It seems entirely unfair to blame software engineers for decisions that (I believe in the majority of cases) they didn't make.

Frankly it doesn't take a software engineering license to see what's gone wrong most of the time. We already know it, but don't have the budget to do anything about it. I really wonder why you aren't taking a stronger position on regulating the end product instead of licensing the software engineers. Otherwise we will end up increasing overhead expenses for software engineering licenses while ending up in the same underfunded position as before with regard to product quality.

you need an internationally recognised software engineering qualification to release software to the public (except for testing purposes).

This would lead to the creation of more parasite businesses being happy to sell you a "well recognized certificate" if you pay, and the result would be exchanging knowledge, experience and qualification for money. This is already the situation in Germany, where you can easily get certified without actually needing to understand stuff: it's sufficient that you (or your current employer) are willing to pay for the "certification" so you become "more valuable" to the "free market". I've seen that system in action so many times that I can assure you: it doesn't work. It only puts stupid and wealthy people into important and responsible jobs which they aren't qualified for. The results can be seen in reality. It's more scary than it sounds.

This would lead to the creation of more parasite businesses being happy to sell you a "well recognized certificate" if you pay, and the result would be exchanging knowledge, experience and qualification for money.

No it wouldn't. It would require a standardised software engineering degree from a reputable university, supervised practical work experience, written exams (or submission of code) and continuing education modules. This is how all physical engineers are certified.

"This would lead to the creation of more parasite businesses being happy to sell you a "well recognized certificate" if you pay, and the result would be exchanging knowledge, experience and qualification for money.

No it wouldn't. It would requite a standardised software engineering degree from a reputable university, "

First problem here: Not everyone capable of actually delivering all it takes to become a good engineer will be able to afford attending university. It will probably be like "education for the rich ones".

You can already see this today (even though it's not 100% related to "engineering"): the most capable, most skilled, most advanced and most experienced IT professionals are usually those who do not bother "sitting there" and getting a degree (and paying money for it), but instead educate themselves, learn, train, try, do stuff and become better every day. They are often called "master hackers" or "rockstar programmers". They can outperform almost every university graduate in a comparable field.

supervised practical work experience, written exams (or submission of code) and continuing education modules. This is how all physical engineers are certified.

I would actually really like to see this become reality, as it would eliminate the "he has a degree, but does he really know stuff?" question which we currently have here in Germany with university graduates and "professionals" with shiny certificates: the reality is that only a small subset of them are actually qualified in the manner we would expect. Today, you can count yourself lucky if a "Dipl.-Inf." (Diplominformatiker, an IT degree) is actually able to use everyday software or has a basic clue about programming.

Sadly, industry assumes that those who show up with shiny papers, curly signatures and coloured stamps are experienced, qualified, motivated and good programmers (because the job description says "programmer, university degree required"), but what they get is usually hit or miss, and in many cases it's just miss. Strangely, unqualified people still get the jobs here as long as they are certified, and as a result you see the creation of non-working software, hostile work environments, unusable tools and buggy processes. Still nobody cares, because they are certified! And of course they are much better paid than those who can actually do the work that is required, usually skilled engineers without that many degrees, people who simply can do "all the stuff". There's additional confusion created in HR, where "engineer" and "programmer" are not understood: the hiring process starts with simple pattern matching, and qualified candidates who lack a degree or a previous job title get excluded quickly.

I know that a reliable certification would solve lots of problems, but as you can see, it introduces many others. This is inherent to the system: the system allows abuse, and therefore abuse will happen. This is where the "parasite services" enter the stage. They offer what industry requires, in a market manner: want a degree? Pay money, get a degree. Problem solved.

All the things you've mentioned (supervised practical work experience, written exams or submission of code, and continuing education modules) are positive and useful things. But they cost time and money. Industry is not willing to invest that, and the individuals subject to that education probably won't either. A common argument is "I've been to university, I don't have to learn anything new!" Having a degree is more important than being able to do a good job, it seems.

But as I said, that is mostly the situation here in Germany as I have experienced it. It's a very sad description, I know. It may be totally different in other countries where a university degree or certification actually means something, and is worth more than the paper it's printed on. :-)

Now, hold on a second. I'm a "physical" engineer (mechatronic medical devices engineer to be specific), and I have to say much of what you're saying is rubbish, mainly because most engineers don't work alone. It's not the engineer that is responsible, it's the company the engineer works for, and therefore it's up to the company to ensure that a) the engineer is competent, and b) the product released to the public is fit for purpose.

If it isn't, it's the company's fault for releasing it, not the engineer's fault. I have developed many things like aortic stents, optical blood analysis systems and other medical-level products, and have developed software and firmware for some blood analysis equipment too. This is serious stuff, yet:

- I don't have a software development accreditation other than my Mechatronics degree
- My commenting and documentation are decent but I've no idea if they meet *your* standard
- I've no public indemnity insurance
- I've no say on whether my products (software or hardware) get released or not

Are you saying that I shouldn't be allowed to develop these products? What absolute rubbish.

If you want software quality to improve, simply tighten up the QC and QA procedures applied. Software in the medical devices field is buggy, as it is in every single field, but it has gone through years of validation and verification until the company is satisfied that it's not going to kill someone. Only then is it released to market. Of course most applications could be put through the same development cycle, but then the majority of people would be complaining that there hasn't been a new version of Word in 8 years, and that the existing version cost them £5,000 per licence. In reality the market dictates that software needs to be cheap and frequently updated, and since that's what companies want, they enforce it. It's nothing to do with bad programmers; it's to do with enforced tight deadlines and minimal QA and QC processes. After all, a silly bug in Word (and yes, there are many) might piss a few people off, but it's hardly likely to kill them, is it?

Are you a graduate of an accredited college with an engineering degree? Is it a Master's or just a BS? If it is a Master's, do you hold a State License as an engineer? Did you pass the required exams? If not, do you therefore report at work to those men and women who do?

Your field is an interdisciplinary one that includes aspects of software engineering. And yes, you as a member of your company, would be liable for any issues especially with a medical product, which by the way must meet certain professional level qualifications in order to be approved by the FDA, right?

I work in two fields - psychology and economics. In both, I have had to follow the same professional requirements as UncleFester rightly suggests software engineering and development should now follow. I, too, find myself programming quite a bit given the nature of my work and research. I gladly would welcome greater requirements of professionalism in that field. I would take the time to achieve those added requirements. It would benefit me, my research, my university, and those whom my work helps.

The resistance to this is laughable. Everyone who disagrees dreams of becoming Gates or Jobs while no one wants to aspire to being Ritchie or Knuth. With recent severe issues like Heartbleed and eBay's security breach, with Apple wanting home automation as its next big thing, and with Microsoft wanting inside of automobiles, it is far past time for the computing field to grow up, past the adolescent 'freedom' phase and into the adult professionalism phase of its evolution. Only teenagers living at home with mommy and daddy footing the bill believe what they have is 'real' freedom.

Well, first off I'm not in the US, so the terminology is different in some cases. We do adhere to FDA requirements of course, but our main concern is ISO regulation.

I don't have a Master's, it's "just" a Bachelor's degree - BEng actually. I don't have a state licence or any equivalent here, nor does any other member of my team.

How the medical field works is that someone ultimately approves the work that I do, and since I'm a senior engineer, that somebody is a combination of management, QA and Regulatory Affairs, none of whom have software engineering qualifications. If I do a half-assed job and produce buggy software which causes someone to be overprescribed Warfarin or something like that, the responsibility is with management for allowing such shoddy products to reach the market without adequate testing, and it would have to be proven that I deliberately and maliciously circumvented the quality systems for it to come down to me personally. They don't need to have software degrees to approve my work or the work of my team, they need to have the correct controls in place to ensure that any issues get caught. That's their profession and they're good at it, be it hardware, software or implant.

I don't know where you got the idea that I want to be the next Jobs or Gates. I most certainly don't. What I do believe, however, is that professional products, regardless of the field they originate from, need all steps of the development and manufacturing process to achieve a standard appropriate for their use. That means planning, design, development, verification, manufacture, validation, QA and post market surveillance. Manufacture alone does not make a quality product; the quality control steps (verification, validation, QA and post market surveillance) are every bit as important, and together have the single biggest impact on end product quality. Mission-critical stuff always needs far more rigorous testing, and such testing should have caught things like Heartbleed etc., but I think a lot of people are expecting mission-critical quality from products which are obviously not mission-critical, such as MS Office or games. I wouldn't expect my washing machine to be built to the same quality standards as an aircraft engine, for example, and therefore would expect that lower standards are applied across the board at every level of the washing machine's manufacture. And I'm OK with that; I only paid €400 for the washing machine, and if it breaks down, chances are it won't kill me or steal my bank details.

Perhaps some more experience of the manufacturing industry would make the concept of quality products a little clearer to people who live exclusively in the field of software or academia.

I don't think anyone is arguing against QA steps. In fact, I would agree that those need beefing up as well as the professionalism of the earlier stages of development.

I am quite aware of the concepts you are putting forth despite my professional background. Trust me, there really aren't as many differences as you imagine.

Planning, conducting and publishing research for review is not that different. It follows similar QA and specific cross-profession standards.

It isn't just mission-critical needs, it is reliability. If the washing machine in your example has poor QA and buggy software, it will be recalled or its manufacture fixed. If it isn't, then liability issues arise. If those aren't addressed, the company in question runs the risk of lawsuits and even of going out of business. The same kinds of standards and professionalism just don't exist currently in OS, application, and game design.

Hello I am Warner Bros, and I could give a shit about fixing the bugs in Batman, all I care about is taking your money from DLC purchases. That's the kind of bullshit that needs to stop, and is not allowed in other regulated professions.

I think many more examples including Heartbleed and others could be found. There is a problem. There are solutions. Many software developers unfortunately do live in a fantasy world of specialness and freedom at all costs.

It isn't just mission-critical needs, it is reliability. If the washing machine in your example has poor QA and buggy software, it will be recalled or its manufacture fixed. If it isn't, then liability issues arise. If those aren't addressed, the company in question runs the risk of lawsuits and even of going out of business. The same kinds of standards and professionalism just don't exist currently in OS, application, and game design.

Hello I am Warner Bros, and I could give a shit about fixing the bugs in Batman, all I care about is taking your money from DLC purchases. That's the kind of bullshit that needs to stop, and is not allowed in other regulated professions.

I agree that these are bad outcomes. Yet aren't you forced to concede that it's the company's decision not to fix the bugs for financial/resource reasons? It's not like the software developers themselves are unwilling/unable to fix them. The question of funding keeps on getting overlooked.

I think many more examples including Heartbleed and others could be found. There is a problem. There are solutions.

Heartbleed was serious, and there are plenty more examples of exploitable vulnerabilities in both proprietary and open source domains. Yet you realize that OpenSSL was/is free software? You did not pay its developers for it, right? I'm willing to bet that you didn't have any kind of support contract with any of its developers, right? Did you donate anything to help fund QA, code reviews, or even developer training/certification? If not, then it seems completely hypocritical to tell software developers how to do our work, and then not come through with ways to fund these things (even commercial projects can face internal resource problems, as highlighted by your links).

Many software developers unfortunately do live in a fantasy world of specialness and freedom at all costs.

With all due respect, maybe the fantasy is that state licensing requirements would somehow give software developers the resources we need to improve software from its present state. That is the actual problem we could use a solution for!

All software sucks... Yet people panic when they don't have access to it.

All computers suck... Yet they're hopelessly lost without them.

Technology reflects people and that's why it sucks... Only technology doesn't suck, and neither do people.

The sky is falling... Except that it isn't.

My advice to anyone who thinks computers and technology are so horrible is to stop using them. There are plenty of places in the world where computers and technology aren't a part of daily life. If you think the grass is greener, by all means go find out. Just prepare yourself for the anxiety you'll likely have to live with.

The guy basically says that because things can break or be used in a wrong way it means they suck and, well, that's just fucking moronic. Does a hammer suck just because you can use it to bash yourself on the fingers? Does a calculator suck just because you enter wrong operands in it and end up with an answer you didn't intend to get? Do other humans suck just because they value things differently than you and therefore do things differently?

Well, no. If everything were designed in such a way that it could never be used in the wrong way, or could never, ever break, it would be useless; it would be so locked down that it wouldn't be able to do anything, and it wouldn't allow humans to use it in the first place. And no, the fact that something can be used in a wrong way, or by the wrong person, or whatever, doesn't automatically mean it sucks.

Having room for improvement doesn't mean something is broken. Not being perfect doesn't mean something sucks. The author isn't the first to have a melodramatic outburst, and surely won't be the last. One thing is certain -- the world, and technology, will keep moving right along through all of it. Maybe some day technology will allow for time travel and he can then go back to live during the Middle Ages, where he wouldn't have to deal with the horror that is computers & technology.

The guy basically says that because things can break or be used in a wrong way it means they suck

No, he says that no matter how you use your software, and no matter what software you are using, it's flawed and can be attacked and abused by others without you knowing it. He's right.

Now, if I used the hammer to hit nails and occasionally hit my finger, but then at night the hammer was remotely controlled by someone else to bash in someone else's skull (or mine, and then steal my wallet), *that* would be a better analogy...

Well, software DOES suck. A lot. Software today is still rather primitive, especially things in the web (html/css/javascript) space.
The number of times per day I get frustrated with some bug or design deficit is pretty damn high.
But then I relax and realize that it's also pretty awesome at the same time and I *don't* write a whiny blog about it.

The main reason why software is broken is because there is a lack of any kind of professional association.
It would enforce standards, a certain skill level, and probably some kind of liability.

Yes, it is a trade off. Things would probably move slower if we had some kind of organization.

Consider Facebook. Zuckerberg was some kid in college. He wrote up a webpage and threw it out there. People used it.

Now imagine a world with a professional association controlling software. First, Zuckerberg would have had to finish school and obtain his accreditation. Then, due to the standards of the profession, he would have had to write proper documentation and code to an acceptable standard. He would have had to have his code audited by a security professional, as it would be dealing with personal information. The staff he hired would be expensive, as they would also be part of the association, and certain staffing/skills/training levels would have to be maintained.

And we'd probably get Facebook in like 10 years time.

I actually don't say that as a negative. The reality is we can dismiss Facebook thinking it is just some silly app. But if you have been in software, you know that is almost how the majority of software is written. From banking to networking.

Perhaps we could separate these domains or something. I am not sure.

In the end though, it seems the world has not come to an end. Software gets built. The world moves on very quickly. So maybe it has all been for the best.

Perhaps any such system is simply too complex for anyone to fully grasp, and the only way through is to just grind our way through it.

Of course, maybe if we had a professional association, we would have had better standards so it would be less complex. But who knows.

Facebook started as nothing more than an online yearbook. The degree of real innovation was close to zero.

The real 'value' of Facebook is that it allows intelligence agencies open access to the activities and thoughts of potential 'subversives'. [In fact, Facebook's earliest angel investors were closely associated with the intelligence community.]

The software problem is more like some kid making a jetpack in his garage and selling it to all and sundry without testing or certification.

- Perfection is an invention of human minds; there is no such thing unless you make some presumptions about what it should look like. The Greeks did it with the ratio of a circle's circumference to twice its radius, assuming every quantity was a ratio between natural numbers, and it did not end well. Similar mistakes abound in science (see the development of modern physics, or the attempt to conceive a tautological model for all of math, for example);

- As already said, there is a relation between time to develop and time to deployment, translated into money expended. Blow it and you end up with no software delivery;

- Engineering (civil, mechanical, etc.) is far from what people think. It works because we test over and over again and we keep improving our models/analyses, our manufacturing and its controls. Oh, and there are safety factors to account for the big "unknown/unpredictable" (and believe me, in civil engineering it may be a factor of 10 when dealing with soil).

Even though we see the same process described in the last item in software as well, we also see two serious drawbacks that are rare occurrences in engineering (though there are examples of them):

- We keep changing the foundations (i.e., APIs and languages) and deploy the new ones at a very fast pace, even when they are clearly not ready yet;

- We frequently brush aside or ignore failure reports, or fail to follow a strict procedure to mitigate unavoidable mistakes, instead of using the incremental process engineers learn to follow.

So, instead of improving our tools with care, we prefer to throw them away and (almost) start over, and for some reason we do not investigate properly when something goes wrong. I guess this is what gives us the troubles we have.

Luckily, it does not happen this way across the whole software stack (or we would have an unsustainable situation). Compilers and the basic functionality of OSes follow a stricter model. The problem is that we interact with the top of the stack, where things are not that great.

The claim about the state of human computing machinery is one level of abstraction above the actuality on the ground. It is therefore on a level with other, similar claims about concrete realities. So I would like to quote Noam Chomsky concerning a similar claim about not only language, but about reality in general.

"The spine is extremely badly designed. From an engineering point of view it's a wreck... This is true of just about every system of the body. We'd have been able to save ourselves from a lot of Saber Tooth Tigers if we had a eye in the back of the head... if we'd had wheels. Organisms just do the things they can do, and it's usually a mess.

"The inorganic world turns out to be completely different. The driving intuition of physicists and chemists is that somehow it's all gonna come out extremely simple... and its been remarkably successful. There's probably something to it, but why that is nobody knows...

"Language is very much like that kind of thing we find in the inorganic world... That's kind of curious because langage is the most complicated thing the organic world could do. So why does language look like its analysis should be discovered to be simple? In fact right at its core you already have this property; language is a system which is technically called a 'system of discrete infinity'...

"Every aspect of language is discrete. Its a strange fact because there is essentially nothing in the organic world that meets the condition of discrete infinity.... Systems which are both discrete and infinite are very rare.

"I think the only place you find anything like a discrete and infinite system in the biological world is when you get down to the level of big molecules molecules; and then you're back in the inorganic world. Furthermore language has other properties that are rare (but not non-existent) in the biological world: if the economy conditions are real, and they seem to be, those of you who know anything about computability will recognize right off that if you introduce economy considerations, if you say that one computation is blocked if there's another more economical one, you introduce huge computability problems -- unsolvable computability problems. So the question is How does anybody know it if such problems are unsolvable? Like how do you know if some derivation is going to block another one [economically]. It turns out very quickly to be a problem of a very high level of computation and complexity way beyond what's solvable. If that's true, what it means is that large parts of language must simply be unusable because they will involve computability problems that can't be solved.

"In itself that is not a surprising conclusion because we know its true -- we know that large parts of language are just unusable. In fact the good part of psycholinguistics (the study of how language is processed) picks as its data things that are unusable, things like a "garden path sentence". Those are all selections from parts of language that are unusable, and they're interesting to to test for that reason; they give you interesting things about processing that are learned from the part; these can be quite simple sentences, they just can't be understood. So we know that large and scattered parts of language are completely unusable. That shouldn't bother anyone who things about language from the point of view of real biology, not pop-biology. In pop-biology everything is supposed to be usable and well adapted, remember? But in the real world everything is a mess and nothing works at all.

"It just works well enough so you don't die. Its got to work well enough so that you can reproduce, so you can go on; but nothing has to be usable, and most things aren't. Its true of the study of vision. Its true of the study of motor mechanisms. Its true of every aspect of psychology and physiology.

"There is that mysterious core that are just the questions that we are humanly interested in (for the most part) in which you can only stare and puzzle. These may reflect other aspects of human cognitive capacity. It may mean they are the kinds of questions that we are not designed to answer. It could be. Or it could be some other reason. But that's where things stand at the moment."