
Python is a language that does not show up on my resume, because my last professional exposure to it (at Panta, in 2004-5) was very negative, and I’ve found that if you have something on your resume — no matter how obscure or brief the mention — someone is going to try to hire you for it. That said, I’ve been doing a modest bit of it lately for a project at my present employer, and between support for it in IntelliJ and ~13 years of improvement, I can tolerate it.

On a second round of use, I’ve found I really appreciate its expressive power, and I can… just barely… tolerate the dynamic typing.

On the down side, I have strongly confirmed my initial impression that it has some of the most unpleasant aesthetic syntax choices of any modern programming language: possibly the worst I’ve come across, barring esolangs and surviving pre-C languages. It literally manages to make Perl seem like a pleasant alternative.
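To make both halves of that verdict concrete, here is a tiny sketch (my own illustration, not anything from the original project) of the expressiveness I appreciate and the dynamic-typing hazard I can only just tolerate:

```python
# Expressive: one comprehension replaces a loop, a filter, and an accumulator.
totals = {name: sum(xs) for name, xs in [("a", [1, 2]), ("b", [3, 4])]}
assert totals == {"a": 3, "b": 7}

# Dynamic typing: the same function happily accepts the "wrong" type,
# and the mistake only surfaces at runtime, far from the call site.
def double(x):
    return x * 2

assert double(21) == 42        # fine
assert double("21") == "2121"  # also "fine" -- silent string repetition
```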


Nightly auto-upgrades are great, except when they aren’t. In this case, going from app-misc/screen-4.0.3-r7 to app-misc/screen-4.2.1-r2 changed the default SCREENDIR from /run/screen to /tmp/screen. Normally I wouldn’t care, but I had running screen sessions and couldn’t reattach to them; oops!

I only found out about the SCREENDIR variable after I’d already downgraded back to 4.0.3-r7.
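For the record, the fix I missed at the time is just to point the upgraded screen back at the old socket directory (this assumes the stranded sessions left their sockets in /run/screen; the session name below is a placeholder):

```shell
# List the stranded sessions under the old socket directory
SCREENDIR=/run/screen screen -ls

# Reattach to one of them ("sessionname" is a placeholder)
SCREENDIR=/run/screen screen -r sessionname

# Or export it in your shell profile so old and new screens agree
export SCREENDIR=/run/screen
```

No test here, since it only does anything on a box with live screen sessions.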


I recently realized that a very important aspect of what I’ve done for the past year in my day job echoes how I got started in my career. That is, I spent a bunch of time last year and this year justifying a large technical project (in writing, for a less-technical audience) and then working with other people to get it organized and deployed. More recently I’ve gone back to document that project, so that other folks can finish it and a non-IT PM can manage it, freeing me to get back to programming.

That realization inspired me to see if I still had the documents I’d written for that original project. It turns out I did: both the original proposal, and a mid-year budget for the actual ordering once we got the project approved.

So, what was this project? Getting my high school computer lab on a LAN, and on the internet. The latter isn’t mentioned in the original proposal, so I guess it was scope creep, but it was awesome. The project led to my first full-time summer job and to my first full-time job when I took my break from Dartmouth (both doing Novell server admin work plus some desktop support), and I’m pretty sure the project itself, still underway at the time, made a difference in my college applications.

Finally inventoried the parts after years in storage. Lots of chips, though nothing that rare or interesting. Somewhere I have the schematics I designed; I’ll scan them when I find them. I doubt the design really would have worked, and I never got an EPROM programmer working. Very tempted to figure out how to reuse some of this in building something like the N8VEM.

Now I know that there is already plenty of old-fart nostalgia on this blog — a lot of people have interpreted Whatever happened to programming? as a yearning for the days when you had to do everything from first principles, which wasn’t really what I meant. But I do, I really do, miss the days when it was possible to understand the whole computer.

I’m not claiming that I ever had the level of mastery that people like West and Butterfield had. I was never really a big machine-code programmer, for one thing — I wrote routines to be called from BASIC, but no complete programs in 6502/6510. And I’ve never been a hardware hacker at more than the most trivial swap-the-video-cards level. But I did have a huge amount of Commodore 64 lore internalised. I knew which memory location always held the scan-code of the key that was pressed at that time; and where the screen memory-map began; and which locations to POKE to play different pitches on the various channels of sound, or to change the screen or border colours. I knew hundreds or thousands of those little bits of information, as well as being completely familiar with the horrible BASIC dialect that was in one of the ROMs. In short, I knew the machine in a way that I don’t even begin to hope to know any of my computers today; in a way that, I am guessing, no-one anywhere in the world knows their machines.


A couple of years ago, a link to this article went round the office as an example of how bad Vista was, and how not to design software. It didn’t sit well with me.

Anyway, I ran into it today and re-read it, and this particular point stood out:

So now we’ve got exactly one log off button left. Call it “b’bye”. When you click b’bye, the screen is locked and any RAM that hasn’t already been copied out to flash is written. You can log back on, or anyone else can log on and get their own session, or you can unplug the whole computer.

…as the culmination of his whole argument. To me, in context, it just reads as a “reductio ad absurdum” against the very case he’s making, showing why Microsoft did the RIGHT thing in making a flexible UI.

Then again, I am at least one sigma, and maybe two, into the “control, customizability and flexibility freak” side of things when it comes to computers. I run Gentoo Linux on my server, and if I ditched Windows on my day-to-day machines, it would be for Gentoo (or some other very customizable Linux distro) and not for something more out-of-the-box like Fedora or SuSE (let alone the MacOS!)

The REALLY interesting question, to my mind, is how you design an interface that scales in depth: accessible enough for someone newly sitting down at a system to use it, while at the same time allowing an experienced user to optimize his or her own processes. For one trivial example, I don’t want to have to pull the battery to get a “real” shutdown or hibernate of my laptop before a flight or a long day away from it. How long it will be before I need the machine again is something the software can’t know, but I’ll usually have a pretty good guess when I shut down.

As these things go, I’ve found Microsoft’s “big” products (Windows and Office) to be some of the better software out there in that respect, although I haven’t needed to play around nearly as much with Office customization since I moved to Office 2007. Vista and Windows 7 were, in my view, HUGE UI improvements over 2000/XP.


I doubt many folks are still reading this blog, as I know my handful of old friends who read regularly are all on Facebook now and see my updates there. Nevertheless: my file/web server has been down for a week or two and is only now back up. It’s pretty much been totally replaced, hardware-wise.


Online retailing can be a rough sport. The competition is rabid, customer loyalty is fickle, and IT expenses can go through the roof.

That’s why The Register can appreciate an e-tailer with a unique business model.

A hawk-eyed El Reg reader points out that UK online retailer Ebuyer.com appears to be cutting costs by running its site on servers dating back to the late Cretaceous period – roughly speaking. According to internet monitoring company Netcraft, the e-tailer has bypassed run-of-the-mill legacy servers for some serious heirlooms.

[…]

Next, you’ll find not one but two Commodore 64s. The Commodore debuted in 1982 with 64KB RAM, a 1.02MHz MOS Technology 6510 processor, and a 16-color, 320×200 resolution monitor. Not to mention a creamy BASIC 2.0 operating system.