Saturday, October 29, 2011

I've seen this point raised before, and it's a good one. My job is developing software, and programming - actually sitting down at the computer and typing out lines of code - is probably 25% of that job.

Lots of other excellent advice in that article too, such as:

You are not defined by your chosen software stack: I recently asked via Twitter what young engineers wanted to know about careers. Many asked how to know what programming language or stack to study. It doesn’t matter. There you go.

Do Java programmers make more money than .NET programmers? Anyone describing themselves as either a Java programmer or .NET programmer has already lost, because a) they’re a programmer (you’re not, see above) and b) they’re making themselves non-hireable for most programming jobs. In the real world, picking up a new language takes a few weeks of effort and after 6 to 12 months nobody will ever notice you haven’t been doing that one for your entire career.

This is one of those articles I've saved and will hand out to anybody who ever asks me about this career that I love so much. (So that's… what, three or four people over the next fifty years? Haha…)

Tuesday, October 25, 2011

Just spent several hours troubleshooting Adobe's Creative Suite 5.5 installer. It kept erroring out for me with the following unhelpful message:

"please insert disk AdobeDesignPremium5.5-English to continue"

Huh? Well, it turns out that their installer gets really confused if you run it from anywhere but the C: drive. Note that I'm doing a standard install to the C: drive either way — the problem is where the installer files live. You simply can't install from, say, a USB flash drive.

What I love is that I found the answer on The Pirate Bay. Mind you, I'm installing a legit licensed copy of Creative Suite. But Adobe's own resources had no information for me. Apparently nobody at Adobe owns a USB drive.

Maybe Adobe is confused -- somebody should tell them that just because Flash is dying, that doesn't mean that USB flash drives are dying.

A coworker bought a new Acer laptop at Best Buy. I'm setting it up and installing Office and Creative Suite for her. Now, is this a perfectly usable piece of hardware? Yes, mostly, if you don't count the trackpad.

But boy, do they cut just about every possible corner on a $500 PC laptop. Uneven screen lighting, unusable trackpad, bloated with crapware, crappy keyboard, ugh. I'm not that much of a snob, though. This PC will, basically, get you from Point A to Point B.

Why Do Automated Updates Require Me To Click "Okay" For Three Hours? Downloading all of the required Windows updates took two or three hours. The current state of Windows system updates is downright magical compared to the way they were, say, ten years ago. Remember the days of Windows patches and hot fixes that had to be applied in exactly the right order, lest you render your OS more or less ruined?

The current system is commendably foolproof. But why do I have to hit "okay" to authorize a Windows Update reboot so many times? The procedure seems to be:

1. Launch Windows Update.
2. Pull down and install all available updates.
3. Wait between 5 and 30 minutes.
4. Click "Yes" to allow a reboot (or wait for the timer).
5. System reboots.
6. Oh, hey, a bunch more updates are now available, presumably because the last batch fulfilled a bunch of prerequisites/dependencies. Windows Update will find these in the background, eventually, or you can launch it manually in case you actually want your system updated, you know, now. Return to Step 1. Repeat four or five times.

What's aggravating is that this could be totally automated. There is no reason to even involve me, and no technical reason why there couldn't be a "Download and install every possible update and reboot as many goddamn times as you need to, and let me know when you're finished" button.
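The whole ritual is really just a fixpoint loop: keep installing batches until a pass turns up nothing new. Here's a minimal sketch in Python of what that "just finish it yourself" button would do, using a simulated update catalog in place of the real Windows Update machinery (all names here are hypothetical, purely for illustration):

```python
# Sketch of the "update until nothing's left" fixpoint loop.
# The catalog simulates Windows Update: some updates only become
# visible once their prerequisites are already installed.

def available_updates(installed, catalog):
    """Updates not yet installed whose prerequisites are all met."""
    return {
        name for name, prereqs in catalog.items()
        if name not in installed and prereqs <= installed
    }

def update_until_done(catalog):
    """Install batch after batch (rebooting between each) until a
    pass finds nothing new -- the button Microsoft never shipped."""
    installed = set()
    passes = 0
    while True:
        batch = available_updates(installed, catalog)
        if not batch:
            break
        installed |= batch   # install the batch, then "reboot"
        passes += 1
    return installed, passes

# Service packs unlock hotfixes, which unlock still more fixes.
catalog = {
    "SP1": set(),
    "SP2": {"SP1"},
    "KB-A": {"SP1"},
    "KB-B": {"SP2", "KB-A"},
}
installed, passes = update_until_done(catalog)
print(passes)  # 3 passes (and 3 reboots), with zero human clicks
```

The point of the sketch: each pass can expose updates the previous pass made eligible, which is exactly why one manual "check for updates" is never enough — but a loop like this could absorb all of it without ever asking me to click "okay."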

Jon Stokes, formerly of Ars Technica, has a new article up: Meet ARM’s Cortex A15: The Future of the iPad, and Possibly the Macbook Air.
It's a typical Jon Stokes article about CPU architecture - and that's a good thing. Lots of talk about pipeline depths and so forth. His conclusion, based on "paper" evaluations of the A15 and Intel's upcoming products, is that the A15 is going to be great but it won't be compelling enough for Apple to consider switching some of their notebooks to that architecture.

Why RISC? The question of a switch to an ARM-based RISC architecture for Macs is tantalizing. The short version of a long story is that, all other things being equal, RISC architectures are always going to offer more performance-per-watt than CISC architectures like Intel and AMD's x86 chips. That's why you don't see x86 chips in cellphones and you very rarely see them in consumer electronics.

Why Didn't RISC Win The First Time? In the 80s and 90s, everybody was going RISC - except Intel, who needed to keep their desktop x86 CPUs backwards-compatible. Intel was able to defeat RISC desktop chips like PowerPC with their economic might. Even though their x86 chips needed more transistors and more power to match RISC competitors, Intel's massive profits, economies of scale, and industry-best chip manufacturing processes allowed them to maintain a performance lead and sell those x86 chips more cheaply than their RISC competition.

Why Might ARM Win This Time? The increased power consumption of Intel CPUs is no big deal on the desktop, where a few extra watts don't matter much. But with everything going mobile, it matters again. And unlike Microsoft, Apple already has an operating system that's poised to run on multiple architectures.

So it's a little disappointing to read that Stokes doesn't feel ARM's latest CPUs are enough to start nibbling away at Intel's presence in mainstream notebooks. The thought of dropping the "CISC tax" is awfully appealing.

Friday, October 21, 2011

A techie -- or worse, a gadget hound -- is somebody who's in love with buying, using, and owning electronic things as an end and not as a means.

Let's not ever let ourselves be "techies."

Let's be people who solve problems and create new works with the best technology possible… and the least technology possible. Technology doesn't have to be electronic: a hatchet is the best technology possible for chopping small amounts of wood.

"Sell – even give away– anything you never use. Fancy ball gowns, tuxedos, beautiful shoes wrapped in bubblepak that you never wear, useless Christmas gifts from well-meaning relatives, junk that you inherited. Sell that stuff. Take the money, get a real bed. Get radically improved everyday things.

The same goes for a working chair. Notice it. Take action. Bad chairs can seriously injure you from repetitive stresses. Get a decent ergonomic chair. Someone may accuse you of 'indulging yourself' because you possess a chair that functions properly. This guy is a reactionary. He is useless to futurity. Listen carefully to whatever else he says, and do the opposite. You will benefit greatly.

Expensive clothing is generally designed to make you look like an aristocrat who can afford couture. Unless you are a celebrity on professional display, forget this consumer theatricality. You should buy relatively-expensive clothing that is ergonomic, high-performance and sturdy.

Anything placed next to your skin for long periods is of high priority. Shoes are notorious sources of pain and stress and subjected to great mechanical wear. You really need to work on selecting these – yes, on 'shopping for shoes.' You should spend more time on shoes than you do on cars, unless you're in a car during pretty much every waking moment. In which case, God help you."

This is bad. The difference between a good voice recognition system and a bad voice recognition system is huge. Same with the gap between lots of other features: GUIs, touch interfaces, and so forth. They're not things you can check off on a checklist.
It's like comparing two restaurants by treating bread as a checklist-able thing.

"Wow. The fresh-baked bread at Restaurant XYZ is amazing. Light, flaky, still steaming hot when they serve it."
"What's the big deal? They have six-packs of hotdog rolls at 7-11. Why pay more?"

What's depressing is that a lot of smart, knowledgeable people treat technology that way. I feel like it's almost the dominant way of thinking, even among the technology-savvy.
How do we change this?

Friday, October 14, 2011

You could argue that the new features are mostly things that ought to have been there ages ago, and you'd probably be right, but none of that takes away from the fact that this is an excellent release.

Wednesday, October 12, 2011

In the 1970s, the idea of home computing seemed ridiculous. Even hobbyists like Woz who wanted computers in their homes weren't thinking much beyond their hobby and other hobbyists; Jobs was the one who saw that these could be people-centric devices and that everybody should have one in their living room.

Even if all Jobs did was market the darn things better than anybody else, that's still indispensable. Without everyday consumers buying computers and pumping billions of dollars into the industry, very little of the last several decades of innovation would have happened.

But he went way beyond that, being intimately involved in every project he ever participated in, bringing a healthy dose of the liberal arts with him - music! typography! color!

Without Jobs… eventually, yes, computers would have probably gotten small and affordable enough to be in people's homes. But would anybody have wanted them? Without Jobs, that prospect would have been about as appealing as having your own forklift, cash register, or timecard punch at home. Even if you could, why would you? Look at how terrible the IBM PCjr was -- and they made that thing after they had Apple's people-oriented computers to ape.

1. We are human beings, and computers are here for us. Software and hardware should make our lives better. Everything will be evaluated within that context.

2. We practice pragmatism around here. All other things being equal, free (libre) software is better than a closed alternative - but only for pragmatic reasons: open-source software is more likely to receive iterative improvements, feedback, and bug-testing. Closed-source software that makes our lives better (see #1) is to be admired.

3. Operating systems and programming environments are like martial arts. They have strengths and weaknesses relative to one another but whichever one(s) you pick, you're going to be a badass if you pour enough effort into it.

4. Design is important. Design is how a thing works, not how it looks. A command-line environment can be well-designed and elegant.

5. Performance is fun but it is a means, not an end. More megahertz and megabits per second are good, but only if they help you do cool stuff better.