The whole video of Engelbart's oN-Line System (NLS) demo is available on YouTube. Some of it is *really* interesting. Most of it is unfortunately dry. It's easy to forget that this was 50 years ago, and also mind-blowing that it was only 50 years ago.

Anyway, back to Computer Chronicles. In an episode about Word Processors, the man they were interviewing said "There's a lot of talk about making people more computer literate. I'd rather make computers more people literate." That's a phrase that resonated with me in a big way.

It sounds like the kind of semantic buzzword shuffling so common in standard corporate speak, but I got the impression that the guy who said it believed it. He believed that computers had gotten powerful enough that they no longer had to be inscrutable.

There were others working around the same time on similar ideas, or at least from a similar philosophy. Working to make computers, if not intuitive, at least comprehensible. I think this is a noble goal.

The computer is often thought of as a tool, but it is more like a tool shed, in which we store a collection of tools, a source of power, and a workspace.

The tools of the 60s and 70s were primitive, partially because of the limited space and limited power our tool shed could provide for them, but also because our ideas and understanding of how these tools should work were limited by the audience using them.

That is to say, in the 60s and 70s, computers were weak and slow, and computer users were also computer programmers. A small, tight-knit circle of developers and computer scientists were responsible for the bulk of the progress made in that time, and the idea of designing tools for non-technical users was never considered.

Computer culture had, by and large, a kind of elitism about it as a result of the expense and education required to really spend much time with a computer. This changed, slowly, starting in the mid-70s with the development of the microcomputer market and CP/M.

Computers became more affordable, slowly. Affordable computers became more powerful, quickly. Within 10 years, non-technical users were interacting with computers on a daily basis. It was against this emerging backdrop that the phrase I mentioned earlier was coined: "Human Literate Computers," or "Human Centered Computing."

Ease of Use was the holy grail for a lot of computer companies: a computer so easy to use that they could sell it to grandma. But, to me at least, Human Literate and Easy to Use are distinct ideas. Many modern applications are Easy to Use. Netflix is Easy to Use. Facebook is, for all its faults, pretty easy to use. The iPhone, the iPad, and ChromeOS are super easy to use.

Well, they are easy to use as long as you use them in the prescribed way. As long as you let them tell you what you want to do, instead of the other way around.

That, IMO, is the distinction.

I think that many of the efforts to demystify the computer in the 80s and 90s did good work, but ultimately the computer industry left the whole idea behind, in favor of making some tasks Very Easy while making other tasks Practically Impossible, and turning everything into a surveillance device.

That was, in part, because Jobs wanted the Computer to be an Appliance. A thing only used in prescribed ways.

Letting people build their own tools means letting people control their own destiny.

If I can make what I want, or if someone else can make what they want, and then I can take it apart and improve it, why would I pay for an upgrade? Why would I pay you to build something that doesn't meet my needs?

The last 10 years of development in computers were a mistake. Maybe longer.

Instead of making computers Do More, or making them Feel Faster, we've chased benchmarks, made them more reliant on remote servers, and made them less generally useful. We brought back the digital serfdom of the mainframe.

@ajroach42 I want to respond, elaborate, & discuss at length here. I spent about 10 months some years ago immersed in the computing literature around the history of debuggers, during which I went from EDSAC to Visual Studio, but also through all the other half-dead ends of computing history such as, e.g., Lisp machines.

Naturally, I came out of it a Common Lisper, and also naturally, with Opinions about modern computing.

@ajroach42 @ciaby This was the Great Debate that was largely won by Microsoft. "Everyone can 'use' a computer." That is to say, everyone can operate the appliance with preinstalled software. *everyone*. Apple pioneered the notion, but it turns out to be the preferred mode for businesses, who would really rather not rely on specialized experts.

@Shamar @pnathan @ciaby I feel like you think this was a clever point, but I don't understand what you mean.

Programming is a specialty, and some people have other specialties. Expecting them to also become expert programmers because our current expert programmers can't be arsed to make extensible and understandable tools is unreasonable.

@Shamar I feel that it's not only a matter of research, but also a matter of throwing away some tech that we take for granted (x86, for example) and rebuilding from scratch with different assumptions in mind. In the current economic system, I find that quite hard to do... @pnathan @ajroach42

@Shamar @ajroach42 @pnathan That's possible, and to some extent it's already happening. What I'm talking about, however, goes much deeper than that. I'm talking about open hardware infrastructures, where every component is documented and there are no binary blobs or proprietary firmware. The instruction set is also very important, because what we have now (x86/amd64) is incredibly bloated and full of backward-compatibility shit. RISC-V is a step in the right direction. If only the hardware weren't so expensive... ;)

the mail servers fail. the administration is confusing because docs aren't perfect, so it gets misconfigured. the network goes down. baby pukes on server and it fails to boot. server is overloaded by volume of spam.

then the task is outsourced to a guy interested in managing the emails... whoop whoop, we're recentralizing.

@Shamar @ciaby @ajroach42 my Inner Young Geek wants to argue that actually configurable systems aren't really used in the home, and that mail servers cross that barrier between appliance and administration-needing machine.

but let's not rabbit trail onto that. ;-)

more my contention and question is: should we expect a member of cyberspace to be knowledgeable in minor sysadmin?

I argue yes! we expect people to be able to refill the oil in their cars, right?

a better metaphor is cooking. everybody is expected to know enough about cooking to feed themselves. some people cook at a much more expert level, and people who are capable of feeding themselves pay those experts to feed them occasionally. cooking for yourself has benefits over eating out even if you aren't very good, because you can cater to unusual preferences.

@enkiv2 @pnathan @ajroach42 cooking for yourself also keeps the cost of eating out down, because professionals are competing with free. if all professional chefs started doing something (like cooking 'rare' burgers as well-done to avoid liability), home cooking isn't subject to those rules.

This is possible because cookbooks are mostly written for the intermediate, talented-amateur cook.

@pnathan At scale, yes. Although I feel that software is not evolving because the effort to develop a new OS is too great, given the amount and complexity of modern hardware (and closed specs). Without a new OS, you can't develop new paradigms, and so we're stuck with ideas from the 70s (mostly Unix, plus the VMS-influenced Windows). Programming languages have to build on the OS, and that's why we're not really progressing... My proposal: simpler hardware, open and documented. Build on top of that. No backward compatibility. :) @ajroach42 @Shamar

@pnathan Or, why not just use SFTP, with a FUSE filesystem on the server? You can use a file manager and text editor, and you can have interaction, as well as authentication through SSH. I think it even supports FIFOs...
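To make that suggestion concrete, here's a minimal sketch of scripted access over SFTP using Python's paramiko library. The hostname, username, and paths are hypothetical placeholders, and this assumes paramiko is installed and SSH keys are already set up; a FUSE mount via sshfs would expose the same files to any local file manager or editor.

```python
# Minimal sketch: browsing and reading remote files over SFTP.
# Assumes paramiko is installed (pip install paramiko); the host,
# username, and paths below are placeholders, not real endpoints.
import paramiko

client = paramiko.SSHClient()
client.load_system_host_keys()  # reuse ~/.ssh/known_hosts for host verification
client.connect("example.org", username="alice")  # auth via SSH agent / default keys

sftp = client.open_sftp()
print(sftp.listdir("/home/alice"))        # a file manager is just listdir + stat

with sftp.open("/home/alice/notes.txt", "r") as f:
    print(f.read().decode("utf-8"))       # a text editor is just open/read/write

sftp.close()
client.close()
```

The point being: SSH already provides the transport and the authentication, so the server side needs nothing beyond sshd with the SFTP subsystem enabled, which most installations ship by default.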