Posts from 2006

For whatever reason, I actually prefer Pringles “Right Crisps”, which is the lower-fat version of the chip, to the regulars. Still, Kat tries on occasion to introduce me to new things, and one recent attempt was a purchase of Pringles Prints.

Now, I think the idea of printing text on a potato chip is kind of cool, even if it’s in this kind of odd light Windex blue; I have to wonder what’s in the ‘ink’. Disappointingly, the facts printed on the chips were pretty basic, not to mention focused almost solely on dinosaurs and elephants for the first half of the can. I found this kind of funny, but admit it makes sense as I assume the target demographic for these chips is the kids, who love big animals. But I don’t know how the folks at Procter & Gamble can live with themselves when they’re pushing facts like, “Did you know? Elephants do not live alone — they travel in herds.”

Maybe that’s a major revelation when you’re six, but I have to figure that if you know much of anything about elephants, you know that much. I smell filler text. Somebody was on a deadline to come up with a certain number of facts, and got desperate.

Anyway, I brought all this up because right near the middle of the can, I found out that even potato chip makers have to calibrate their printers.

As someone who obtained a minor in Astronomy in college, and one of the only people I know who can consistently name the planets in order without having to resort to mnemonics, I’d like to take a moment out from the whole W3C thing to comment on the de-planetization of Pluto.

It’s about time.

Its classification as a planet was never really justifiable, and recent discoveries like 2003 UB313 (Xena) have only served to underscore that fact.

Now, that said, I’m no fan of the “dwarf planet” compromise. That just smells of committee-think, and it’s got to go. For that matter, the newly adopted definition for “planet” is pretty terrible. If it were up to me, I’d go with a definition that was based on orbital characteristics and a minimum surface gravitational acceleration threshold—maybe size and density, too. But none of this “cleared its orbital path” crap.

Furthermore, I think all this is a great illustration of how science works. Although it’s quite the fashion to talk about “scientific dogma”, there is no inflexible dogma here. As new evidence emerges and is incorporated into the general body of knowledge, the “orthodoxy” changes. There are no absolute truths in science—only the best available information. Once we thought meat transformed directly into maggots; now we know otherwise. Today we think that no physical object can move faster than the speed of light in a vacuum, but tomorrow (or a hundred years from now, or a thousand) we may find we were wrong. That doesn’t mean anyone was wrong in their previous understanding. It means simply that their previous understanding was incomplete.

And that’s fine. In fact, it’s better than fine: it’s expected and, by and large, welcomed. I often wonder if the real conflict between religion and science isn’t that science stands in opposition to religion, which it does not, but that science embodies a way of approaching the world that could not be more different from that taught by most religions. There are no absolutes in science, no final immutable truths, nothing that cannot be supplanted by some new understanding. Change may happen slowly, and it always happens after there is clear and convincing evidence, but it does happen.

As with Pluto. At one time, it seemed like it could qualify as a planet. Now it does not. As we understand more about the universe, we will be able to formulate better definitions of what is a planet and what is not. Maybe that will mean one day re-planetizing Pluto, and if so, then fine. It’s all part of the process—excuse me, the method.

Maybe that’s a lot to hang on a change of classification for a tiny, frozen pile of rock, but it’s true nonetheless. Or at least it will remain true until someone can convincingly show otherwise.

That’s great to hear, but what’s perversely fascinating to me is that in that very same post, Molly herself lists the reasons why Jeffrey’s anger is in no way misplaced:

Am I defending the W3C’s slow-to-move process or its over-bureaucratized administration? Its lack of attention and sensitivity to gender (count the women, go ahead, dare you) and racial diversity, its frightening disregard for the real needs of the workaday Web world? Oh no, nor would I want to.

It’s that last point that lends the greatest support to Jeffrey’s argument: “…frightening disregard for the real needs of the workaday Web world”.

What more really needs to be said? It’s the most concise indictment possible that the first part of the W3C’s mission statement, the fragment they put right on their home page, “Leading the Web to Its Full Potential…”, has been betrayed.

Believe me, I’d prefer things to be otherwise. I’m still a strong believer in standards, and for seven years (1997 – 2004) put my time and energy into supporting and advancing them as a member of the CSS Working Group. I left because I no longer had the time and energy to contribute, and didn’t want to remain a deadwood listing on the group’s roster. But most of the reason I couldn’t come up with the time and energy was precisely what Molly articulated: I no longer believed in the W3C’s ability to do what it promised, and what I wanted.

But the worst part? None of this is new. Look back two years, when David Baron and Brendan Eich walked away from a W3C Workshop in disgust. To a large degree, both men walked away from the W3C itself at that point—and if you’ve spurred David Baron to turn his back on the web’s central standards body, then boyo, you’ve got some deeply serious problems.

Let’s be frank: a whole lot of people who believe passionately in the web’s potential and want to see it advance fought for years to make that happen through the W3C, and finally decided they’d had enough. One by one, I saw some of the best minds of my generation soured by the W3C; one by one, the embittered generals marched forward, determined to make some sort of progress.

Perhaps my eyes have become a touch too jaundiced over the last decade, but I’m not sure I could disagree more with what Molly claims near the end of her post:

Jeffrey is wrong in his current assessment of the W3C.

If only that were so.

If the folks at the WaSP believe the Good Ship Consortium is beginning to change course, then I’m happy for them, really; I’ll be even more happy if they’re right. But when the ship is moving so slowly and has drifted so far out to sea, how much relevance can a change of heading really have?

As I ambled up Concourse C this afternoon, I spotted someone who looked an awful lot like Dennis Kucinich coming the other way. I thought for a moment about stopping him for a bit of congratulatory chat—he’s pretty far to the left of even me, but I admire his staunch refusal to compromise his principles no matter how unpopular they may be—but he didn’t have a welcoming air about him. Maybe he was having an off day, or maybe he’s always like that, but I’d have figured a politician would always be open to meeting “the public”. It seemed like something that would go with the career choice, but perhaps not.

About ten minutes after I saw him, there was an announcement over the public address system calling for Dennis Kucinich to return to gate C-24 for a lost item.

So I guess that even if he wasn’t having an off day when I saw him, he did later. Based on where I saw him and the timing of the announcement, he was very likely beyond the secure area when it was made. I’m not sure it’s possible to get through security on a used ticket; it seems like too much of a security risk to do so. Then again, how would we know?

I wonder what it was he left on the plane. (Let the political jokes take flight!)

P.S. “Search all bags for liquids etc. at the gate” has become “search the bags of occasional random passengers at the gate”, at least in Cleveland. So either the rules are already relaxing, or they’re still firming up. I kind of hope it’s the latter, though neither one really appeals.

Last night, I returned from a week in Ojai, CA. The rules for my return were just a touch different than when I left.

For a moment on Thursday, I was seriously concerned, because the news reports made it seem like no books, iPods, laptops, or other time-fillers would be allowed on any flights in the U.S., and I was facing a flight home of four or more hours. Even worse, that meant I’d have to send my laptop through the baggage handling system. I was frankly far more concerned at the potential for damage or loss there than I was over the possibility that someone might blow up my plane.

Fortunately, things settled down and the truth emerged: no gels, liquids, or creams. Everything else is still permitted.

That isn’t true, though, if you’re flying from the U.K. to the U.S. I was planning to be in London this November, but faced with the prospect of eight hours in a metal tube with nothing but the in-flight movies to occupy my attention, I’m starting to reconsider. I mean, come on: for my flight out to LAX, the movie was direct-to-video Dr. Dolittle 3. In comparison, their showing She’s the Man on the return flight almost seemed like a blessing. At least it was based on Shakespeare.

So anyway, the new security rules do actually improve a couple of things. For one, getting through the security checkpoint at LAX (terminal 6) in the middle of a Friday afternoon was a breeze, because the most anyone had was a briefcase, so there was a lot less struggling with bags and such. Also, the sudden lack of competition for overhead luggage space meant that boarding was quite smooth, with few if any aisle backups.

The downside, though, is that there is a final complete search of travelers’ bags at the gate (at least in LAX), and that part needs a lot of work. Instead of feeding people through the screening by rows, the way planes are usually boarded, they just told everyone to line up for screening. But they weren’t actually ready to let anyone on the plane, so the screening area was immediately clogged with already-screened passengers (with no real tracking of who’d actually been screened), which brought everything to a halt. It was a good ten minutes before the plane was open for boarding and the process unclogged.

Don’t get me wrong: if you’re going to search everyone for gels and such, doing it at the gate makes a lot more sense than doing it at the main security checkpoint. All I’m saying is that it needs to be done with a little bit of thought. As it was, the screening process at my gate was marginally less organized than an Easter Egg hunt conducted by a crowd of severely ADHD pre-schoolers. It’d be nice to see that improved before I get back on a plane. (That would be tomorrow, as it happens, so I’m not terribly hopeful.)

All this leaves aside the basic lack of common sense the whole situation evinces. Even if there were no more airport security than existed on 10 September 2001, the odds of my dying on a plane, whether by accident or design, would be several orders of magnitude smaller than the chances I’ll be killed driving to the airport. (This was triply true in my case, as I had to drive from outside Los Angeles to LAX in the middle of the day.) With the security that existed before this past week, my survival odds on the plane were greater still. I’m not saying we should just take away all the security, but personally, since Thursday I’ve thought of at least two ways to take down a plane that the current system would be highly unlikely to catch.

At least, I think that’s so. It’s hard to be sure, because airport security is like the ultimate closed-source application. I can’t just say, “Hey, here’s a way to get a bomb past airport security using a medium-size ball of twine and 17 Hello Kitty stickers; how can we address this?” because then maybe I’ve given an idea to the Bad Guys, as though the Bad Guys haven’t been thinking about this a lot longer and harder than I have. The black hats know all about the system’s weaknesses, but we common users have no way to check for bugs without being hauled off to jail—or, if we simply speculate aloud on possible weaknesses and ways to patch them, get accused of giving aid and comfort to the enemy, whatever the hell that means. (Oh, that’s right: it means doing anything the current administration doesn’t like, including criticism of their decisions and actions. Sorry, I just forgot for a moment.)

Anyway, ze frank and New Scientist said it better than I can, so I’ll just shut up now and let you check them out. Just make sure neither has any liquids or gels on them.

A couple of weeks back, I was hanging out in a New York hotel lobby with Tantek, who was either working on his AEA slides or enhancing the overall usefulness of the web in his spare time; I’m not sure which. On the far wall, a plasma display ran CNN continually, softly, offering up such choice crawl text as “N. Korea Missile Test Fallout”. One of the stories running was about alleged plagiarism on the part of Ann Coulter.

We got into a brief discussion over whether such people should be rebutted or ignored. Tantek took the former position, whereas I took the latter. My stance is probably a holdover from my long years of Usenet and mailing-list participation, where one of my most iron-clad rules is “Don’t feed the trolls”. Better they starve for lack of attention, that’s how I see it. Perhaps this is a defensible strategy in the “real world”, and perhaps not, but I will freely admit that it’s one of my default behaviors.

Thus, my first instinct was to completely ignore John Dvorak’s screed about CSS. Mr. Dvorak is an admitted troll, and so my default tendency is to simply ignore him. But “troll” is, in my world, an alternate spelling for “fool”, and as Winston Churchill reminded us, one of the great lessons of life is to know that even fools are sometimes right.

So is Mr. Dvorak right? Not in what he has to say, no, but there is still something there worth hearing.

It turns out that none of his complaints about CSS are really valid, even when you consider only the ones that have a factual basis. Sure, he can complain about the cascade being confusing, but that’s like criticizing Windows because of all those stupid windows that open up everywhere and get in the way of the desktop wallpaper. It’s an inherent feature of the system: either accept it and move on, or reject it and walk away, but don’t waste your time complaining about it. The best part, of course, is where he blames CSS for inconsistent browser implementations, which is rather like criticizing Microsoft because Windows doesn’t run properly on a computer whose processor isn’t compatible with Intel’s architecture.
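To be fair to the newcomers, the cascade does take some getting used to. Here’s a minimal sketch of how it resolves conflicting rules (the selectors and colors are my own invention for illustration, not anything from Mr. Dvorak’s article):

```css
/* Three rules all try to color the same paragraph. */
p { color: black; }          /* lowest specificity: one element */
.note { color: green; }      /* higher: a class outweighs an element */
p.note { color: navy; }      /* higher still: element plus class */

/* A <p class="note"> renders navy: the most specific selector
   wins, regardless of source order. Only when two rules tie on
   specificity does the later one in the stylesheet prevail. */
```

Confusing at first glance, maybe, but it’s also exactly what lets site-wide defaults coexist with local overrides.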

But step back and let your eyesight blur a bit, and the shape of a worthwhile point begins to emerge. The closest Mr. Dvorak gets to expressing it, possibly by accident, is this sentence: “Can someone explain to me exactly what kind of ‘standard’ CSS is, anyway?”

I could do so, of course, as could most of you, but that’s not the issue. What we’re seeing here is the initial reaction of a CSS newbie, not too different from many others when they first begin to style, and all brought closer to home by the high-profile nature of the newbie. (Whatever you may think of Mr. Dvorak, he has prominence in the industry.) CSS is not as hard as some make it out to be, but it isn’t exactly easy as pie, either.

A good part of that problem is the natural expectation that all browsers should act the same. It’s a strange thing to expect if you’ve been in the field long enough, since browsers have never really been consistent on anything, from HTML error handling to PNG support. But someone who’s coming in fresh is almost certainly going to expect that if they do things a certain way, the result just works. Why would one expect anything less?

That’s why the Web Standards Project was founded, of course; and its existence, history, and current efforts put paid to Mr. Dvorak’s assertion that nothing is being done. As I’ve said, none of his individual points are on target. What his outburst does is remind us of the problem to which so many have grown numb, and which we still—for all the progress that has been made—face on a daily basis. Consequently, it reminds us to keep advocating for greater consistency between browsers, to praise the efforts of browser makers in that direction, and to help them correct their course when they move in the wrong direction—and to do so constructively, not destructively. For while we may gain insights from the rantings of trolls, we should never be so foolish as to adopt their tactics.

I’ve been largely offline for the last couple of days due to an inexplicable failure of my DSL modem. I was certain that it was another case of the DSLAM dying on me—it’s happened a few times in the past—and when the Covad techs claimed it had to be a modem failure, I was deeply skeptical. Score one for the topical experts: they were right, and I was not.

While I waited for the replacement modem that I was sure wouldn’t change anything, I was using dialup. Man, I never want to do that again. Talk about sipping the Internet through a cocktail straw. To make it even worse, I was tethered. To a phone jack. There was no wifi infusing the house, letting me work anywhere. It was like having lost a perceptual sense. It was wrong and confining and I didn’t like it. No more of that, thanks. If the Republicans are so hot to amend the Constitution, how about they be useful for a change and add “the Right to Unfetter’d Bandwidth”?

So. Nothing much happened CSS-wise while I was gone, did it? No controversies or anything? Good.

As design migrates from the web to mobile devices, our approach must also shift. Learn how companies are using ethnographic research to design smarter interfaces.

I’ve seen Kelly speak in the past, and she’s always funny, smart, and relevant. I’m really looking forward to hearing what she has to say about ethnography and design.

I’ll be offering updated versions of my highest-rated talks in New York, “Hard-Core CSS” and “One True Layout”, and Jeffrey will be talking about selling standards to difficult clients (especially when the client is a boss) and the importance of writing to good design. All this and Stan too! If you’re fixin’ to come see us, the early bird deadline is still a ways off, but don’t wait too long.

Over the past year and a half, S5 has grown from a small hack of a compact slide show script written by Tantek Çelik into a relatively complex bit of work. In the beginning, there was simply a way to take a single document and turn it into a series of slides. I added basic keyboard controls, a navigation menu, and the ability to have the navigation controls show and hide, and then threw it out into the public eye. People loved it, and with a lot of help from a lot of people, all manner of features were added: slide bookmarks, much better keyboard controls, incremental progress, a notes view, and more.

Despite all this community involvement, though, the code base was in a single set of hands: mine. Anything that was added to the “official” S5 code was done by me, as time and understanding allowed. As anyone could have predicted, this has slowed the advancement of S5 over time, and of late it’s brought advancement to a near standstill as I’ve struggled to keep up with other demands. The only thing I’ve added since 1.2a2 is the ability to blank the screen by hitting the “B” key, and that change has yet to become public.

Of course, the code is explicitly in the public domain, so anyone can add to S5—and many have. ZohoShow, for example, outputs S5 1.1 code. I’ve seen S5 used for product tours of medical software and board games. Jonathan Snook added a “live preview” version of the notes view, which I totally want to see in the primary code base. David Goodger made a bunch of useful Docutils-compatibility additions that I never managed to fold in. I also know of four different implementations of remote-control functionality, where one person runs a slide show and changes are reflected in remote copies. This is a feature perfect for distance learning, corporate netconferences, and other situations.

And all this time, there was still no way to have those enhancements, or any others, “come home” to the source of S5 unless I did it myself. Until now.

Thanks to Ryan King, we now have S5 Project, which will be the official home of S5. Besides the blog and mailing list S5-discuss, there will be a wiki, a source code repository, and a bug-and-feature-request tracking system. If you’re an S5 hacker, or even a frequent user, please do join the mailing list (I know, I know—another one?) or at least subscribe to the S5 Project RSS feed to keep track of what’s going on. I expect the mailing list to become the place for coders to talk about additions they want to make and bugs they’re trying to squash, even after the bug-tracking software gets set up, and it will be a primary source of content for the wiki-to-come.

While anyone has always been able to add to S5 in their own way, for whatever purpose they see fit, now there will truly be community access to what’s always been a community project. I hope you’ll join us there!