Anyway, back to Computer Chronicles. In an episode about word processors, the man they were interviewing said, "There's a lot of talk about making people more computer literate. I'd rather make computers more people literate." That phrase resonated with me in a big way.

It sounds like the kind of semantic buzzword shuffling so common in standard corporate speak, but I got the impression that the guy who said it believed it. He believed that computers had gotten powerful enough that they no longer had to be inscrutable.

There were others working around the same time on similar ideas, or at least from a similar philosophy: working to make computers, if not intuitive, at least comprehensible. I think this is a noble goal.

The computer is often thought of as a tool, but it is more like a tool shed, in which we store a collection of tools, a source of power, and a workspace.

The tools of the 60s and 70s were primitive, partially because of the limited space and limited power our tool shed could provide for them, but also because our ideas and understanding of how these tools should work were limited by the audience who was using them.

That is to say, in the 60s and 70s, computers were weak and slow, and computer users were also computer programmers. A small, tight-knit circle of developers and computer scientists was responsible for the bulk of the progress made in that time, and the idea of designing tools for non-technical users was never considered.

Computer culture had, by and large, a kind of elitism about it, a result of the expense and education required to really spend much time with a computer. This changed, slowly, starting in the mid-70s with the development of the microcomputer market and CP/M.

Computers became more affordable, slowly. Affordable computers became more powerful, quickly. Within 10 years, non-technical users were interacting with computers on a daily basis. It was against this emerging backdrop that the phrase I mentioned earlier was coined: "Human Literate Computers," or "Human Centered Computing."

Ease of Use was the holy grail for a lot of computer companies: a computer so easy to use that they could sell it to grandma. But, to me at least, Human Literate and Easy to Use are distinct ideas. Many modern applications are Easy to Use. Netflix is Easy to Use. Facebook is, for all its faults, pretty easy to use. The iPhone, the iPad, and ChromeOS are super easy to use.

Well, they are easy to use as long as you use them in the prescribed way. As long as you let them tell you what you want to do, instead of the other way around.

That, IMO, is the distinction.

I think that many of the efforts of the 80s and 90s to demystify the computer did good work, but ultimately the computer industry left the whole idea behind, in favor of making some tasks Very Easy while making other tasks Practically Impossible, and turning everything into a surveillance device.

It was because Jobs wanted the Computer to be an Appliance. A thing only used in prescribed ways.

Letting people build their own tools means letting people control their own destiny.

If I can make what I want, or if someone else can make what they want, and then I can take it apart and improve it, why would I pay for an upgrade? Why would I pay you to build something that doesn't meet my needs?

The last 10 years of development in computers were a mistake. Maybe longer.

Instead of making computers Do More, or making them Feel Faster, we've chased benchmarks, made them more reliant on remote servers, and made them less generally useful. We brought back the digital serfdom of the mainframe.

@ajroach42 I want to respond, elaborate, & discuss at length here. I spent about 10 months some years ago immersed in the computing literature around the history of debuggers, during which I went from EDSAC to Visual Studio, but also through all the other half-dead ends of computing history, such as Lisp machines.

Naturally, I came out of it a Common Lisper, and also naturally, with Opinions about modern computing.

@ajroach42 @ciaby This was the Great Debate that was largely won by Microsoft. "Everyone can 'use' a computer." That is to say, everyone can operate the appliance with preinstalled software. *Everyone*. Apple pioneered the notion, but it turns out to be the preferred mode for businesses, who would really rather not depend on specialized experts.

@Shamar @ajroach42 @ciaby ... continuing. Next year, maybe you want a different model, so you break off and redo it a bit. Now you have to figure out how to juggle two incompatible models in your code, and you're on your way to inventing an abstract interface system.
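A minimal sketch of how that plays out, in hypothetical Python (the class and function names here are invented for illustration, not taken from the original discussion): the moment two incompatible home-grown "models" have to coexist, you end up wrapping them behind a common interface.

```python
# Hypothetical example: two versions of a home-grown "model", built a year
# apart, with incompatible shapes.

class OldModel:
    # Last year's version: takes a record dict, returns a dict.
    def run(self, record):
        return {"total": record["qty"] * record["price"]}

class NewModel:
    # This year's redesign: takes separate arguments, returns a bare number.
    def evaluate(self, qty, price, discount=0.0):
        return qty * price * (1.0 - discount)

class ModelAdapter:
    # Juggling both forces an abstraction layer: a single interface that
    # hides which model is actually doing the work.
    def __init__(self, model):
        self.model = model

    def total(self, record):
        if isinstance(self.model, OldModel):
            return self.model.run(record)["total"]
        return self.model.evaluate(record["qty"], record["price"],
                                   record.get("discount", 0.0))

if __name__ == "__main__":
    record = {"qty": 3, "price": 9.99, "discount": 0.1}
    for m in (OldModel(), NewModel()):
        print(type(m).__name__, ModelAdapter(m).total(record))
```

That adapter is the seed of the "abstract interface system" being described here: nothing profound, just the tax you pay for letting two generations of your own tool coexist.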

Here's my claim: software is crystallized thought, with all the complexity, ambiguity, and change over time of thought. We can gut the whole shaky tower of modern computing, and we'll still be confronted with that core problem (even assuming a workable, standard piece of hardware for the engineering problems, which are themselves sometimes non-trivial).