A long article but very worth reading. Pilots don’t directly operate the controls of their planes. YOU don’t directly operate the controls of your car. In both cases you are telling software to operate the controls, and software has a habit of growing and becoming overcomplex: spaghetti code. Then, suddenly, your brakes don’t work.

This affects everyone.

I did one major piece of coding: I wrote an order entry and invoicing program in BASIC on my TRS-80. We used it for years, until the TRS-80 just wasn’t enough computer as the business grew and we bought one of the early Macs. BASIC (Beginner’s All-purpose Symbolic Instruction Code) is prone to spaghetti code. I fought to keep it clean. When it was finished I had to buy a compiler; otherwise the print lines looked neat and orderly only until the 64th character, when everything ran together with no spaces.

It was pretty neat and it worked for years with only a few minor changes needed. More reliable than the software we bought for the Mac! It produced invoices with the printing very nicely balanced, with space for auto-generated messages (“Your subscription ends in XX months,” “Your subscription has expired, renew now,” etc.) and space for a custom message, though I don’t think we ever used it. It printed a stub to be returned with payment, and so on.

But it took forever; BASIC is crap for writing real programs in. My business partner couldn’t understand why it took so long. These days you could set something up over a weekend in Excel. We did have VisiCalc, the precursor to Excel, but you couldn’t use it the way you can use Excel; it was just a spreadsheet. Same with Word: you could make a letter that serves as the invoice and merge in a text file of client names, orders and so on. None of this existed in 1985.

Abbott & Co are going to cause the mother and father of all recessions—be prepared!

It’s been said that software is “eating the world.” More and more, critical systems that were once controlled mechanically, or by people, are coming to depend on code. This was perhaps never clearer than in the summer of 2015, when on a single day, United Airlines grounded its fleet because of a problem with its departure-management system; trading was suspended on the New York Stock Exchange after an upgrade; the front page of The Wall Street Journal’s website crashed; and Seattle’s 911 system went down again, this time because a different router failed. The simultaneous failure of so many software systems smelled at first of a coordinated cyberattack. Almost more frightening was the realization, late in the day, that it was just a coincidence.

Our standard framework for thinking about engineering failures — reflected, for instance, in regulations for medical devices — was developed shortly after World War II, before the advent of software, for electromechanical systems. The idea was that you make something reliable by making its parts reliable (say, you build your engine to withstand 40,000 takeoff-and-landing cycles) and by planning for the breakdown of those parts (you have two engines). But software doesn’t break. Intrado’s faulty threshold is not like the faulty rivet that leads to the crash of an airliner. The software did exactly what it was told to do. In fact it did it perfectly. The reason it failed is that it was told to do the wrong thing. Software failures are failures of understanding, and of imagination. Intrado actually had a backup router, which, had it been switched to automatically, would have restored 911 service almost immediately. But, as described in a report to the FCC, “the situation occurred at a point in the application logic that was not designed to perform any automated corrective actions.”
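To make the Intrado description concrete, here is a hedged, hypothetical Python sketch of that class of bug: a counter compared against an arbitrary threshold, with a working backup that is reachable only through a failover step the application logic was never designed to perform automatically. All names and numbers are invented for illustration; they are not Intrado’s actual code.

```python
class Router:
    def __init__(self, name, call_limit):
        self.name = name
        self.call_limit = call_limit  # a fixed cap chosen when the system was deployed
        self.calls_routed = 0

    def route(self, call):
        if self.calls_routed >= self.call_limit:
            # The software does exactly what it was told to do: once the
            # running counter passes the threshold, it stops assigning calls.
            # Nothing is "broken" in the mechanical sense; the instruction
            # itself was wrong.
            raise RuntimeError("call counter threshold exceeded")
        self.calls_routed += 1
        return f"{self.name} routed {call}"


def dispatch(call, primary, backup):
    # In the real incident the backup router existed, but switching to it
    # was not automated. An automated corrective action can be this simple:
    try:
        return primary.route(call)
    except RuntimeError:
        return backup.route(call)
```

With a toy limit of two calls, the third call raises inside `route` and reaches the backup only because `dispatch` catches the error; delete the `try`/`except` and you have the outage.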

Technological progress used to change the way the world looked — you could watch the roads getting paved; you could see the skylines rise. Today you can hardly tell when something is remade, because so often it is remade by code. When you press your foot down on your car’s accelerator, for instance, you’re no longer controlling anything directly; there’s no mechanical link from the pedal to the throttle. Instead, you’re issuing a command to a piece of software that decides how much air to give the engine. The car is a computer you can sit inside of. The steering wheel and pedals might as well be keyboard keys.
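The pedal-to-software relationship the article describes can be sketched in a few lines. This is a deliberately simplified illustration, not any manufacturer’s algorithm: the function names, the response curve, and the rev limit are all made up.

```python
def throttle_command(pedal_position, engine_rpm, rpm_limit=6500):
    """Map a pedal sensor reading (0.0-1.0) to a throttle opening (0.0-1.0).

    The pedal no longer moves the throttle plate; it produces a number,
    and software decides how much air the engine gets.
    """
    if not 0.0 <= pedal_position <= 1.0:
        raise ValueError("pedal sensor reading out of range")
    # A non-linear map: gentler response near the top of pedal travel.
    opening = pedal_position ** 1.5
    # The software, not the driver, enforces the rev limiter.
    if engine_rpm >= rpm_limit:
        opening = 0.0
    return opening
```

Even in this toy version, the driver’s input is only one of several inputs the code weighs, which is exactly why the pedal “might as well be a keyboard key.”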

Like everything else, the car has been computerized to enable new features. When a program is in charge of the throttle and brakes, it can slow you down when you’re too close to another car, or precisely control the fuel injection to help you save on gas. When it controls the steering, it can keep you in your lane as you start to drift, or guide you into a parking space. You couldn’t build these features without code. If you tried, a car might weigh 40,000 pounds, an immovable mass of clockwork.
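One of the features mentioned above, slowing the car when it gets too close to the one ahead, can be illustrated with a toy two-second-rule check. Real adaptive cruise control is far more involved; the headway target and the braking curve here are assumptions chosen for clarity.

```python
def brake_request(gap_m, speed_mps, headway_s=2.0):
    """Return a braking level in [0, 1] from the time gap to the car ahead."""
    if speed_mps <= 0:
        return 0.0  # stationary: nothing to do
    time_gap = gap_m / speed_mps
    if time_gap >= headway_s:
        return 0.0  # following distance is safe; no intervention
    # Brake harder the further below the target headway we are.
    return min(1.0, (headway_s - time_gap) / headway_s)
```

At 30 m/s (about 108 km/h), a 60 m gap is exactly the two-second headway and yields no braking; halve the gap and the controller requests half braking. The point of the sketch is the article’s: you could not build this behaviour out of clockwork.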

Software has enabled us to make the most intricate machines that have ever existed. And yet we have hardly noticed, because all of that complexity is packed into tiny silicon chips as millions and millions of lines of code. But just because we can’t see the complexity doesn’t mean that it has gone away.

The programmer, the renowned Dutch computer scientist Edsger Dijkstra wrote in 1988, “has to be able to think in terms of conceptual hierarchies that are much deeper than a single mind ever needed to face before.” Dijkstra meant this as a warning. As programmers eagerly poured software into critical systems, they became, more and more, the linchpins of the built world — and Dijkstra thought they had perhaps overestimated themselves.

There will be more bad days for software. It’s important that we get better at making it, because if we don’t, and as software becomes more sophisticated and connected — as it takes control of more critical functions — those days could get worse.

The problem is that programmers are having a hard time keeping up with their own creations. Since the 1980s, the way programmers work and the tools they use have changed remarkably little. There is a small but growing chorus that worries the status quo is unsustainable. “Even very good programmers are struggling to make sense of the systems that they are working with,” says Chris Granger, a software developer who worked as a lead at Microsoft on Visual Studio, an IDE that costs $1,199 a year and is used by nearly a third of all professional programmers. He told me that while he was at Microsoft, he arranged an end-to-end study of Visual Studio, the only one that had ever been done. For a month and a half, he watched behind a one-way mirror as people wrote code. “How do they use tools? How do they think?” he said. “How do they sit at the computer, do they touch the mouse, do they not touch the mouse? All these things that we have dogma around that we haven’t actually tested empirically.”


I’ve been told a few personal anecdotes by people who said they were going along the highway at 100 km/h at night when suddenly everything died from a computer glitch. No headlights, no brakes, nothing. Luckily it switched back on before there was a crash.

I’m not an engineer, but I’m not sure I see the need for computer-controlled brakes. The physical, mechanical connection worked fine. Is there some supposed improvement to be had by doing away with the physical connections and making it all electronic? Toe the brakes of your car and then see if you think electronic ones would work any faster; the physical ones respond virtually instantly as it is.

The only need I see for electronic brakes is if the vehicle is to be fully self-driving and autonomous.