This article will discuss what happened, and why Blogger's policy (and therefore Google's policy, since they own Blogger) is so terrible.

I've got an article coming up about the new iMacs and related announcements, as well as an article from an HCI (Human-Computer Interaction) perspective on Copying and Pasting on the iPhone. The latter has been 90% finished for weeks (almost a month, actually, now that I look at the date), and I really hope to have it finished and up soon. Really I do.

--

So what happened to this blog?

I got an email on Tuesday from someone saying they couldn't reach my blog. More specifically, when they tried http://jdeber.blogspot.com/, they got a "404 Not Found" error, which is a webserver's way of telling you that the webpage you tried to access doesn't exist.

Well, since my blog should indeed exist, I went and took a look at it to see what was up. And, sure enough, I got a 404 error.

Hmm. Well, I hadn't changed anything recently (in fact, I hadn't even posted recently), so I knew it wasn't something that I had done. And, even if I had made any recent changes, I shouldn't have been able to muck things up in such a way that I would end up with a 404 error1. So, the only logical explanation was that something was wrong on Blogger's end of things. A bit of poking around Blogger's help pages turned up the following entry:

Why is my blog disabled?
If your blog is disabled, it will be listed on your Dashboard, but you
will not be able to click on it to access it. If this is the case,
there will be a grace period during which you can request that it be
reviewed and recovered. The disabling is a result of our automated
classification system marking it as spam. Because this system is
automated there will necessarily be some false positives, though we're
continually working on improving our algorithms to avoid these. If
your blog is not a spam blog, then it was one of the false positives,
and we apologize.

I see.

Clicking on the provided link takes us to a page with the following:

Blogger's spam-prevention robots have detected that your blog has
characteristics of a spam blog. (What's a spam blog?) Since you're an
actual person reading this, your blog is probably not a spam blog.
Automated spam detection is inherently fuzzy, and we sincerely
apologize for this false positive.
You won't be able to access your blog until one of our humans reviews
it and verifies that it is not a spam blog. Please fill out the form
below to get a review. We'll take a look at your blog and restore it
in less than a business day.
If we don't hear from you within the next 20 days, your blog will be
permanently deleted.

I submitted an appeal Wednesday at 7:48 PM, and the blog was reactivated Thursday at 6:29 PM. Which isn't that bad a turnaround, I suppose.

--

A brief aside: What's a Spam Blog?

Basically, it's a fake blog whose sole purpose in life is to promote the shady websites associated with spammers and scammers. They either copy text from other blogs (or some site like Wikipedia), or simply make up random gibberish. They then link to their own shady site, with the goal of increasing the number of pages that link to it. The motivation is that Google (and other search engines) assign greater relevance to sites that are linked to with greater frequency, since the (usually valid) assumption is that a site with more links to it is more popular, and therefore more significant.
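The link-counting incentive can be sketched in a few lines of Python. This is a toy model with made-up site names; real search engines weight links by the linking page's own importance (among many other signals), but the spammer's motivation is the same:

```python
from collections import Counter

# Each pair is (linking page, linked-to page). The splogs exist only
# to add inbound links to the spammer's site.
links = [
    ("real-blog", "wikipedia.org"),
    ("real-blog", "shady-site.example"),
    ("splog-1", "shady-site.example"),
    ("splog-2", "shady-site.example"),
]

# Naive relevance: count how many pages link to each site.
# Every extra splog bumps the shady site's apparent popularity.
in_links = Counter(target for _source, target in links)
print(in_links.most_common(1))  # -> [('shady-site.example', 3)]
```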

Blogger's help page on Spam Blogs defines them as follows:

As with many powerful tools, blogging services can be both used and
abused. The ease of creating and updating webpages with Blogger has
made it particularly prone to a form of behavior known as link
spamming. Blogs engaged in this behavior are called spam blogs, and
can be recognized by their irrelevant, repetitive, or nonsensical
text, along with a large number of links, usually all pointing to a
single site.

OK, OK, I get it. Sure, my writing is irrelevant and/or nonsensical. But repetitive? I didn't think that it was repetitive. I mean, do I really repeat things over and over? Am I really redundantly repetitive, with continued superfluous uses of identical phrases?

Bart: You're right, Mom. I shouldn't let this bother me. I'm in
television now. It's my job to be repetitive. My job. My job.
Repetitiveness is my job. I am going to go out there tonight and
give the best performance of my life.
Marge: The best performance of your life?
Bart: The best performance of my life!
(Episode 1F11, "Bart Gets Famous")

--

Anyway, the short version is that Google's anti-spam analyzer flagged my blog as spam. In and of itself, this is actually kinda funny2, and I understand that these things happen. Algorithms make mistakes, particularly when it comes to doing something as subtle and subjective as analyzing the motivation behind a blog. It's the same way that everyone's had at least one real email message get marked as spam3.

But, what's not funny, and what really gets me, is the way in which this was handled by Google.

They never emailed me to inform me that my blog was flagged as spam.

Basically, if Google decides that your blog is spam, they disable it without telling you. You're guilty until proven innocent, since you have to petition to get it reinstated. And, if you don't file a petition within 20 days, they delete your blog. However, the only way you know that you need to file this petition is if you log in and look at your blog. They don't email you. They don't even post anything on the blog itself (they just give the aforementioned generic 404 error). They just disable it.

The obvious solution is to change the notification system so that if a blog gets flagged as spam, an email goes out to the blog's owner. This email tells you that you've been flagged, and gives you a link to appeal the decision. If you don't appeal within a few days (say, a week), then they can feel free to disable the blog, and the "20 days to deletion" policy can kick in. If you do appeal, your blog gets "recertified" without ever having been taken down.
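The proposed policy is simple enough to sketch in a few lines of Python. Everything here is hypothetical (Blogger's internals are obviously not public); the only numbers taken from the real system are the "20 days to deletion" window, with a one-week appeal window added as my suggestion:

```python
from datetime import date, timedelta

# Hypothetical windows: a week to appeal before anything is disabled,
# then the existing "20 days to deletion" clock starts.
APPEAL_WINDOW = timedelta(days=7)
DELETION_WINDOW = timedelta(days=20)

def blog_status(flagged_on, appealed, today):
    """What happens to a spam-flagged blog under the proposed policy."""
    if appealed:
        return "active"  # recertified without ever being taken down
    age = today - flagged_on
    if age <= APPEAL_WINDOW:
        return "active"  # still live; owner has been emailed a warning
    if age <= APPEAL_WINDOW + DELETION_WINDOW:
        return "disabled"
    return "deleted"

print(blog_status(date(2007, 8, 1), False, date(2007, 8, 5)))   # -> active
print(blog_status(date(2007, 8, 1), False, date(2007, 8, 15)))  # -> disabled
print(blog_status(date(2007, 8, 1), True, date(2007, 8, 15)))   # -> active
```

The key difference from the current behavior is that "disabled" is only ever reached after the owner has had a chance to respond.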

I've sent an email to Blogger saying as much, and I'll let you know if I get any response. I somehow doubt I will.

I'm also thinking of moving off of Blogger to another blogging platform. I had chosen Blogger solely due to its connection to Google (a company that I respect), but this experience has left me a bit soured on the platform. If anyone has any thoughts about alternatives, I'd appreciate hearing them.

As it turns out, I'm in good company: Blogger's spam filter also flagged one of Google's own blogs. From a news article covering the incident:

Readers of Google Inc.'s Custom Search Blog were handed a bit of a
surprise Tuesday when the Web site was temporarily removed from the
blogosphere and hijacked by someone unaffiliated with the company.
The problem? Google had mistakenly identified its own blog as a
spammer's site and handed it over to another person.
...
"Blogger's spam classifier misidentified the Custom Search Blog as
spam," [a Google spokesperson] said via e-mail on Wednesday. Typically
Google notifies blog owners when it has spotted content associated
with spam on their Web sites to give them a chance to clear up any
misunderstandings.
However, that didn't work out in this case. "The Custom Search Blog
bloggers overlooked their notification, and after a period of time
passed, the blog was disabled."
When blogs are disabled like this, their URL becomes available to the
general public. That's when Srikanth [the person who hijacked it]
swooped in and wrote the joke post.

Here's another article about it, which includes a different official Google statement:

Whoops! We accidentally classified ourselves as spam, and our
ever-perceptive Blogger settings caught us. The Custom Search Blog has
since been restored, and we’re taking steps to ensure this doesn’t
happen with other Google blogs in the future. Other Blogger users can
make sure this doesn’t happen to them by reporting any problems to the
Blogger support team via the Blogger Help Center at
http://www.blogger.com/problem.g. We can then investigate.

I honestly don't know what they mean here. "Other Blogger users can make sure this doesn’t happen to them by reporting any problems to the Blogger support team...". Was I supposed to send a message in advance asking them to not randomly delete my blog?

Furthermore, it appears that a flagged blog gets opened up for anyone to hijack (by allowing them to register a new blog with the now-vacant name). This doesn't make any sense, and in fact makes matters even worse. I could (maybe) understand if that happened after the 20 days have elapsed, but in this case that didn't seem to happen -- the blog was hijacked immediately after being flagged as spam4. Now, I didn't try to hijack my own blog by starting a new one with the same name from a different account (it hadn't occurred to me to try), but I would be curious to know whether it would have worked.

And lastly, as I already mentioned, I didn't receive any notification by email (or by any other means). I checked my spam folders to be sure (although I wouldn't have thought that Gmail would classify an urgent Blogger message as spam), and there was nothing. This flatly contradicts the official statements in the aforementioned articles. So, either the spokesperson is bullshitting, or there was some bug that impacted a bunch5 of blogs. I'm guessing it's the latter, and a bug in Google's anti-spam system caused a bunch of non-spam blogs to get flagged and deleted immediately.

The fact that my blog's deletion seems to stem from a bug, rather than from a brain-dead policy, makes things a bit better, but I'm still not too impressed by the situation. I suppose that I can't complain too heavily about a free service not working correctly (that whole "gift horse" thing), but still, I remain extraordinarily unimpressed.

Friday, June 15, 2007

Onward we go, with my third post about WWDC 2007. In this article, I'm going to finish up my random tidbits and observations about the keynote. The next one will discuss each of the ten Leopard features covered in the keynote.

For anyone tuning in late, my first WWDC 2007 article talked about the gigapixel image of the Library of Congress used during the 64-bit demo, which turns out to have been created on Windows. My second article covered some other miscellaneous observations, including some thoughts on the iPhone as a Blackberry competitor.

--

Steve mentioned Apple's "famous column view" during his discussion of the "New" Finder (24:42 or so into the keynote). Famous? Um, OK... Did I miss Apple's press release about the coveted File Browser View of the Year Award?

Remember the "I'm a Mac / I'm a PC" ad from a year or so ago, in which PC makes up an award after hearing about Mac's glowing review in the Wall Street Journal? It's not up on Apple's site anymore, but here's a YouTube version. Steve's comment somehow reminds me of that.

I can see the new ad now:

[Upbeat music plays. We see Mac looking at a bunch of classy Romanesque columns]

3) In the "Easy Bookmarks" bullet point on that page. (I'm not sure if that really counts, since it's just showing the default Yahoo! link in the Bookmarks Bar, but I'm including it for completeness sake.)

7) During the keynote, it's the first web page he goes to in the Windows version of Safari (at about 1:10:25).

I really don't get why this is. Don't get me wrong - I have nothing against Yahoo!. However, Google would seem to be a more natural fit with Apple, and Yahoo! is one of Google's direct competitors. Aside from the fact that Apple is heavily promoting the Google Maps application on the iPhone, Dr. Eric Schmidt, Google's CEO, sits on Apple's Board of Directors.

I also talked about this strange partnership with Yahoo! a bit in my write-up of the Macworld 2007 keynote, when Apple announced that "push email" (the kind of email that notifies you as soon as it comes in, aka the thing that makes the Blackberry so popular) would only be supported with Yahoo! Mail, and not Apple's own .Mac service or Google's Gmail.

--

Why did Steve have World of Warcraft running on his demo machine? He didn't use it at all (except in the Spaces demo, which didn't explicitly mention the game but showed it running in its own Space). I would have thought that the less a beta release of an OS was asked to do during a big demo, the better (i.e., the less you're doing with it, the less likely something is to go wrong and have it crash). And in fact World of Warcraft did crash during the demo. If you blinked, you might have missed it, since Steve dismissed the crash notification dialog very, very quickly.

--

You get a good look at Steve's notes at several points during the keynote (I'm referring to the physical paper notes he uses during his demos). Other demoers1 could learn a lot from them. They're quite large, and appear to be on a hefty, laminated stock. But, more to the point, they're in a spiral-bound notebook that opens vertically. This makes it easy to flip through them, and the cards you're finished with don't get in your way. They also have a tab for each section of the demo, so it's easy to jump to any specific part of the notes.

--

During his intro to Core Animation (starting at around 39:15), Steve says:

I find the phrase "completes that suite" to be interesting. To be honest, I'm not sure what else Apple could provide (sound, pictures, movies, and animation seem to cover just about every type of multimedia), but I found it interesting nonetheless.

--

At 44:10 or so, there's a rare bad edit in the webcast where they don't show the screen while Steve is talking about something on it that needs to be seen.

--

At 1:05:05 or so, everyone applauds when they're told that they'll be getting a copy of Leopard that day, right after the keynote. I don't understand why they seemed to be so surprised by that. Every year, we've gotten the new Developer Preview immediately after the keynote. Not to mention the fact that Apple has been explicitly advertising that you should come to WWDC to get your copy of Leopard.

--

Immediately after that, when Steve talked about having a "Basic" version for $129, he actually had me going for a few seconds. This part was very nicely delivered.

By the way, if you didn't get the joke here, it's a jab at Vista, which has a rather confusing myriad of versions that all differ from each other in some subtle (and not so subtle) ways.

Apple already poked fun at this in one of their "I'm a Mac" ads, Choose a Vista.

--

Steve's user name under Windows XP (for the Safari demo) was "Administrator", as opposed to something like "Steve Jobs". A reasonably subtle jab at the inelegance of the Windows security model, perhaps?

--

The webcast video didn't have a smooth ending. Rather than fading out to black (or to an Apple logo), it simply cut off abruptly. That's a bit uncharacteristic given Apple's normal production values.

--

Apple has finally updated the look of the top row of tabs on their web site. Tiger changed the default look of Aqua tabs more than two years ago, but the Apple site was never changed. This resulted in the somewhat silly situation of having the interface on the Apple website looking like an old copy of Mac OS X.

--

When Steve first demos Safari for Windows, he makes an off-hand comment that "I'm obviously going to have to change computers here".

Uh, Steve, you spent several minutes (and one of your ten Leopard features) talking about how you can run Windows on your Mac. Now, he wouldn't want to use Boot Camp for the demo, since rebooting would take too long, but he could have used Parallels or VMWare. Perhaps he didn't want to be seen as plugging one over the other?

At any rate, I found this a bit amusing.

--

I'm not sure why id software was presented as a "new Mac developer", since they've had titles on the Mac for years. I know that some of the older ones were ported by third parties (e.g., MacSoft), but Doom 3 and Quake 3 were both released on the Mac by id themselves.

That being said, John Carmack (the main tech guy behind id, and the guy who was on stage at WWDC) is a genius, and it is always a pleasure to see what he can do with the latest advances in hardware.

3) More Core Animation eye candy. Mostly gratuitous, but a few useful ones.

I'm going to say yes (the 3D effect for the Dock would be a gratuitous example).

4) Cinema Displays get built-in iSight cameras.

No.

5) No multi-touch on MacBooks or MacBook Pros. No touchscreens, no gestures on the trackpad (other than two fingered scrolling), no tablet Mac.

Yes.

6) No major revision in Aqua (nor a replacement for it). No "Illuminous".

Tough call. Brushed Metal was indeed taken down to the basement and shot, but from what I can tell, there are no revisions to the basic controls and widgets. So I'm going to say yes.

7) No iPhone SDK. Yet.

Yes. More on this in a future posting.

8) Adobe CS3 and Office 2008 Demos. A no brainer, to me.

This was one of the ones that I was almost certain about, and it turns out to be wrong. All we got was a not-so-subtle jab at Adobe and Microsoft for taking so long to get Universal binaries out.

9) .Mac Backup meets Time Machine. .Mac disk space usable as storage for Time Machine, or at least the .Mac Backup functionality gets merged into Time Machine.

Not really. The "Back to my Mac" file sharing leverages .Mac, but not as a backup. So that's a no.

10) Subnotebook (MacBook mini?). This one's more of a long shot, but I'm still expecting to see a replacement for the 12 inch PowerBook. I'm defining this one as a laptop that's notably smaller (in all dimensions) than an existing model.

I've got quite a bit to say about the WWDC Keynote, so I'm going to break up my commentary into several articles. The first one, A Gigapixel of Irony, was a bit of digging I did into the gigapixel image of the Library of Congress that Steve used during his 64-bit demo. As it turns out, the image was actually created by a photographer who does all of his work on Windows. As I concluded in that article, it seems a wee bit ironic to me that the image used to show off the power of 64-bit Macs wasn't actually created on one.

This article will cover some of my more general observations about the keynote (and related info gleaned from Apple's site). I'll follow that up with a few more random observations that aren't really all that relevant, but I thought were interesting nonetheless.

Next, I'll post a commentary that briefly touches on each of the ten Leopard features demonstrated in the keynote. Finally, I'll go into three areas in more depth: the "sweet" iPhone SDK, Safari for Windows, and the "New" Finder.

--

Overall, I thought that this keynote was a bit disappointing. We really didn't see much in the way of the "Top Secret" features that were promised last year. The few things that qualify as new are really more incremental improvements, as opposed to the genuinely new features that were presented last year (Time Machine and Core Animation, in particular). The biggest one is, in my mind, Stacks. I'll talk about this in my forthcoming article on the Leopard Finder.

But, what I think is even more surprising is that there does not appear to be support for "push" email in Mac OS X Server. If Apple wants business customers to replace their Blackberries with iPhones (not an unreasonable assumption, given that Steve directly contrasted the iPhone with a RIM smartphone, and given the iPhone's price), they need to allow those customers to integrate with their own mail servers (as they currently do using software from RIM). A law firm is not exactly going to be sending out their emails from a yahoo.com address (unless they're from Nigeria with untold millions, I suppose). Apple even has the foundations in place for this, inasmuch as they already have industry-class hardware and software (Xserves and Mac OS X Server) that can (and do) run enterprise mail services. All they need is a new module in Leopard Server, and they've got a platform to compete directly with RIM, and a great opportunity to get a Mac in the door of the corporate data center. A company that never considered Macs might be motivated to do so when a senior executive insists on replacing his "ancient" Blackberry with a shiny new iPhone.

So this one's officially on my prediction list for Leopard. You heard it here first, unless someone else has said it, at which point you heard it here sometime after that.

We didn't see any public announcement about it (it's times like this that I really miss being at WWDC and being able to ask Apple engineers things), but I did notice a little tidbit on one of the Leopard Server pages:

Mail services

For fast email access while on the go, Leopard Server supports IMAP IDLE to notify Leopard users immediately when new email arrives.

Hmmmmm....

--

Being the curious type, I took a look at the contents of the Mac Safari 3 installer before installing it (using the excellent utility Pacifist). Surprisingly, the beta of Safari 3 overwrites your existing version of Safari. That's right; if you install the beta, you lose the supported version of your browser.

Now, to be fair, the installer does archive your existing version of Safari prior to the new installation, and Apple does provide an uninstaller that restores your original Safari. However, while the beta is installed, you have no way to access the original version.

This is a terrible, terrible idea, and it can cause broader system problems. Because parts of Safari (particularly something called WebKit) are used in other applications (like Mail or Dashboard), installing the beta might adversely impact them as well.

MacFixIt (the premiere Mac troubleshooting site), is, rather predictably, filled with horror stories of the beta breaking all kinds of widgets and other apps.

Installing a beta version of an OS (i.e., doing what a bunch of developers at WWDC are doing to their laptops this week) is one thing, and users should expect things to not work correctly. After all, that's what a beta is. However, installing a beta version of a single application should not affect the entire system.

This is particularly frustrating because Safari already has a mechanism in place to install a standalone version that doesn't impact the system. It's part of WebKit, the aforementioned Open Source project that is the foundation of Safari. If you download a beta version of WebKit, you get a version to play with that is entirely self-contained.

I'm going to play around with the Safari 3 beta and see if I can make a standalone version, based on the WebKit install. I'll let you know if I get anything working.

Interesting aside: the archiving and restoring of your original Safari is handled in a rather ad-hoc manner by a series of shell and perl scripts. Most of it is written in shell, but there is also one perl script. However, the script is very straightforward (consisting of a single foreach loop), and I don't understand why they bothered writing it in perl. Or, put another way, I don't understand why they bothered writing the other scripts in shell; there isn't a reason to mix the two. The only reason I can come up with is that this installer was put together rather quickly by several different people, or at least was copied and pasted from work by various people.
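The archive-and-restore dance boils down to something like the following Python rendering. To be clear, this is illustrative only: the actual installer uses shell and perl, and the paths and file names here are made up:

```python
import shutil
import tempfile
from pathlib import Path

def archive_app(app, archive_dir):
    """Set the existing application aside so an uninstaller can restore it."""
    archive_dir.mkdir(parents=True, exist_ok=True)
    dest = archive_dir / app.name
    shutil.copytree(app, dest)  # a real Mac installer would use ditto to keep metadata
    return dest

def restore_app(app, archive_dir):
    """The uninstaller's half of the dance: put the archived copy back."""
    archived = archive_dir / app.name
    if app.exists():
        shutil.rmtree(app)       # remove the beta
    shutil.move(str(archived), str(app))

# Dry run with a fake .app bundle in a temp directory.
root = Path(tempfile.mkdtemp())
app = root / "Safari.app"
(app / "Contents").mkdir(parents=True)
(app / "Contents" / "version").write_text("2.0")

archive_app(app, root / "archive")
(app / "Contents" / "version").write_text("3.0")   # "install" the beta
restore_app(app, root / "archive")                 # uninstall: restore the original
restored = (app / "Contents" / "version").read_text()
print(restored)  # -> 2.0
```

The point is how little logic is actually involved; there's no obvious reason it needed two languages.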

This highlights one of the standing problems with Mac OS X, which is the lack of a proper package management system. I'll write an article in the future on what this means (something else for my todo list!), but in short a package manager is a program that tracks what files get installed by what application. Mac OS X doesn't have a very powerful package manager, so developers are forced to manually perform the kinds of housekeeping that the Safari beta installer does. There is a new PackageMaker (part of Apple's Installer application) coming in Leopard, and although I can't talk about it in detail, it does start to fix some of these shortcomings.

--

Back in September, Apple released a version of their Software Update application for Windows. At the time, it seemed a bit odd to me, since the only software they had on Windows was iTunes and QuickTime, and both already had their own built-in updating mechanisms. So I couldn't figure out why they were bothering to port Software Update.

However, now that Safari is out, it makes a bit more sense. iTunes has its own updating mechanism built into the Mac version, so the Windows port would have gotten it "for free", so to speak. QuickTime on Windows long predates iTunes, and it's had its own updating mechanism on Windows for years.

However, Safari doesn't have any updating mechanism built-in, since it makes use of Mac OS X's Software Update. So, if you are going to port Safari to Windows, you need to write a new updating mechanism for it. And if you need to do that, you might as well save some work and just port your entire Software Update application, and then use it for the rest of your Windows software as well.

So that's evidence, to me at least, that the Windows version of Safari has been in the works (or at least in the planning stages) since at least August of last year. (Software Update for Windows was released in September 2006.)

An aside, which started off as a footnote and became too long:

As a Computer Scientist, I dislike applications having their own built-in updating mechanism. From a technical point of view, it makes sense for there to be a central updating mechanism shared by the whole system. In fact, this is what many Linux systems do. However, this brings in all sorts of political issues. For example, how does the OS vendor (say Apple) decide which third parties get to show up in Software Update? The vendor has to make that decision, since they need to put the updates on their server. But, once they do that, is there any guarantee, implicit or explicit, about the quality of that update? Is the vendor then expected to test the third party software to ensure that it works correctly?

One solution to this is to have each company create their own software updating mechanism. So Microsoft has one program that updates all of its software, Adobe has one that does all of its software, etc. This is now what Apple is doing on Windows. However, they still have the built-in functionality in several of their applications. For instance, each of the iLife apps will independently tell you about updates. From a Computer Science point of view, this duplication of effort is a terrible idea, since it creates additional software that doesn't really serve any useful purpose. In addition to wasting man-hours doing the same thing over and over again, we have the problem of dealing with bugs.

Remember MoAB Madness Part 1? We talked about how all software has bugs, period. Death, taxes, and bugs in software. So the more software you have, the more bugs there are. And if you get two people to write different pieces of software that do the same thing (in this case update a program), you have more of an opportunity for bugs.

It's also aesthetically unpleasing. And that has to count for something, right?

--

Steve talked about how 67% of the 22 million Mac OS X users are using Tiger. I'd be curious how many of those users bought a new Mac that already had Tiger installed, versus how many of those actually went out and purchased Tiger to upgrade from Panther.

In some ways, this figure is the critical number when it comes to OS adoption rates. Think of it this way: it says less about how compelling your OS is if people "upgrade" by buying a new computer, rather than "upgrading" by actually going out and buying an upgrade. This is the classic problem that Microsoft has; comparatively few people actually purchase new versions of Windows to install on their old PCs. This means that it's very hard to grow the installed base of your new OS (i.e., Vista) at a higher rate than the rate of new computer sales.

So, my question boils down to this: how many of the 67% of users running Tiger are simply due to the fact that the Mac has become increasingly popular in the last 26 months (the amount of time that Tiger has been out), resulting in a higher rate of new Mac purchases, which just so happen to have Tiger installed?

This is significant because developers can't make use of the new features of an OS (e.g., Core Animation in Leopard) if there aren't a large number of people using it. And if a majority of the 22 million Mac users have shown a propensity to shell out $129 for an OS upgrade in the past, it makes it more likely that they will do so again in the future. And if that's the case, then we will likely see a faster adoption of Leopard-only features.

The other option is that Apple needs to continue to increase the rate of growth of the Mac, which is also fine by me.

--

A little more market share math:

At 1:07 into the keynote, Steve talks about Safari's market share, and he tells us that there are 18.6 million Safari users. That's presumably counting all versions of Safari. Let's see what kind of market share that is.

Now, the oldest release of Mac OS X capable of running Safari is Jaguar (10.2), which could run Safari 1.0.

Earlier in the keynote, Steve told us that 90% of the 22 million Macs are running either Tiger or Panther, with the remaining 10% running something older. He didn't break that 10% down, but let's be unrealistic for the sake of being generous to Safari's market share, and say that all of those computers are running 10.1 or earlier, so they can't run Safari at all.

90% of 22 million is 19.8 million. So, 18.6/19.8 = 94% of Mac users are using Safari.
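In code, the back-of-the-envelope math (using only the figures from the keynote, plus the generous assumption above):

```python
total_macs = 22_000_000      # Steve's installed-base figure
safari_users = 18_600_000    # Steve's Safari user count
capable = 0.90 * total_macs  # generously assume the other 10% can't run Safari at all

share = safari_users / capable
print(f"{share:.0%}")  # -> 94%
```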

This isn't too surprising given that Safari is the only browser that ships by default on Mac OS X, but I'm still a bit surprised by how high this figure is. Recall that Internet Explorer shares the same status on Windows, and yet Firefox has considerably more than 6% market share. Although I suppose that Firefox doesn't have the same security advantages over Safari on the Mac as it does against Internet Explorer on Windows, and thus there is less motivation to use it.

--

And a final thought on market share numbers:

When Steve talked about growing Safari's market share, he showed a pie chart with the percent shares of Internet Explorer, Firefox, and Safari. This starts at about 1:07:15 or so.

There is a 78% slice for IE, a 15% slice for Firefox, a 5% slice next to it for Safari, and then a 2% slice for "other".

Then, when he says "well, we dream big, we would love for Safari's market share to grow substantially", the chart changes to only show Safari and IE (and drops the text showing the numerical percentages). But here's the thing. The IE slice doesn't change in shape at all. All that happens is that the Safari slice subsumes the Firefox slice and the "other" slice.

In other words, in Steve's vision, he sees IE's market share remaining the same, and Safari replacing Firefox and other assorted minor browsers.

Now, this can't actually be what Steve wants, but I found it rather strange that he would choose to present it in this way.

--

It's interesting how Apple is breaking down the versions of Safari 3.

Steve explicitly said (at 1:12:45 into the keynote) that there will be "three different versions of Safari 3. One that runs on Leopard, one that runs on XP, and Vista".

Counting the Vista version and the XP version as separate releases is a bit odd. By this logic, there should be four releases, since Tiger and Leopard should be counted separately.

Or is this a subtle hint that the final version of Safari 3 won't be available for Tiger?

[2 minutes later]

As it turns out, the Safari download page only lists two versions, not three1: one for the Mac, and one for Windows (XP or Vista).

Not sure what to make of this.

--

Although I'm not much of a gamer myself, I'm always happy to see more games on the Mac. So it was welcome news to see that EA is porting several of their A-list titles, even if Bing Gordon's (the exec from EA with the pretty cool title of Chief Creative Officer) jokes fell a bit flat.

I assume that EA's games will be Intel only, since doing a port to Intel Macs is trivial compared to a complete Mac port. And given the fact that Apple is now only selling Intel machines, I can't see EA doing that extra work. There isn't any specific mention of system requirements on the EA site, so we don't know for sure, but I won't be surprised when they do announce it.

Another nail in the coffin for us PPC lovers.

Death lies on her like an untimely frost
[u]pon the sweetest flower of all the field.2

Or something.

It's also not an auspicious sign that EA misspells the name of the platform in their press release. They call it "Apple MAC OS X", rather than "Mac OS X". Yes, this truly is a minor nit to pick, but I somehow think that it sums up the situation well.

--

That's it for this post. Next, I'll put up the remainder of my random thoughts on the keynote.

OK, technically there are three options, but only because one of them is a bundle of Safari and QuickTime for Windows. ↩

Wednesday, June 13, 2007

I've got quite a bit coming regarding Steve's Keynote. However, since this was a small, self-contained post, I thought I'd get it online now.

[Update: My second article is up. Among other things, it talks about the iPhone as a competitor to the Blackberry.]

[Update: My third article is up. Among other things, it proposes a new "I'm a Mac" ad to highlight what Steve called the Finder's "famous column view".]

--

The gigapixel image of the Library of Congress used in the 64-bit demo was referenced in the legal boilerplate at the beginning of the webcast.

Gigapixel image provided courtesy of Max Lyons

This seemed kind of odd to me, since I don't recall any previous Keynote featuring an image (or video, for that matter) credit in that particular location.

But, more to the point, it means that we can look up the creator of the image, Max Lyons. Google points us to his personal site, and to a site for his company, TawbaWare, which sells shareware digital photography applications. Both have links to several galleries of images, which are all quite stunning. All of the images are created through a process known as "stitching", which involves taking a series of overlapping pictures of a scene and combining them together to form a single image.

The Library of Congress image that Steve used appears to be one of Mr. Lyons' favorite pictures. It's on the front page of his personal site, and different views of it appear on his Technical and Bio pages as well.

But here's the funny part. Although Mr. Lyons does not discuss the computer hardware he uses, it is quite clear that all of his work is done on Windows.

The actual stitching is performed using an Open Source package called Panorama Tools, which is available for all three major platforms (Mac, Windows, and Linux). However, Mr. Lyons also uses some of his own software, including a helper tool called PTAssembler. All of this software is Windows only. In fact, PTAssembler is written in Visual Basic 6.

It seems a wee bit ironic to me that the image used to show off the power of 64-bit Macs wasn't actually created on one.

Monday, June 11, 2007

Before we begin, I'd like to apologize for the recent lack of postings. I've got several articles that are 80% done, and I hope to have them up sooner rather than later. That includes the next installment in the increasingly inaccurately named QoABA (Quarter of Apple Bugs Articles) series, MoAB Madness.

--

This WWDC1 marks the first time in a few years that I won't be there in person. Since I can't gossip with thousands of people waiting in line for the keynote to start, I figured I'd post a few of my keynote-related thoughts here.

However, as the hours wane and the keynote draws near, I come to the realization that I haven't finished writing this article. Part of the problem is that I've been trying to write a good chunk of background material on some of the predictions. For example, there are at least two of them that I've already written 500 words about. And, if I wait to finish up my long-winded (er, I mean "detailed") summary of each, I won't have this done before the keynote actually takes place. And that would seem to somehow defeat the point of a predictions article.

So, I've decided to make this a very short list. Or, more accurately, I've decided to make each entry rather short. I'll then follow up with a second article with all of the background content that didn't make the cut2. So, my apologies to my less technologically-oriented readers -- the stuff I'm writing for you will be up soon.

Several of these predictions are anti-predictions, in that I'm predicting that something won't happen. These are all things that I've seen others mention as possibilities.

1) Boot Camp Supports Virtualization in Leopard.

2) ZFS will not be the default file system in Leopard.

3) More Core Animation eye candy.
Mostly gratuitous, but a few useful ones.

4) Cinema Displays get built-in iSight cameras.

5) No multi-touch on MacBooks or MacBook Pros.
No touchscreens, no gestures on the trackpad (other than two fingered scrolling), no tablet Mac.

6) No major revision in Aqua (nor a replacement for it).
No "Illuminous".

7) No iPhone SDK.
Yet.

8) Adobe CS3 and Office 2008 Demoes.
A no brainer, to me.

9) .Mac Backup meets Time Machine.
.Mac disk space usable as storage for Time Machine, or at least the .Mac Backup functionality gets merged into Time Machine.

10) Subnotebook (MacBook mini?).
This one's more of a long shot, but I'm still expecting to see a replacement for the 12 inch PowerBook. I'm defining this one as a laptop that's notably smaller (in all dimensions) than an existing model.

And there you go, ten predictions for tomorrow's keynote.

I'll have a post-keynote commentary as well.

Apple's World Wide Developer Conference, the annual week-long show where Apple shows off the future of Mac OS X. ↩

Wednesday, February 28, 2007

January 2007 was home to a project known as the "Month of Apple Bugs", aka MoAB. This is the first article in MoAB Madness, a multi-part series about the project. I had intended to run some of these articles in January, during MoAB, rather than in the weeks following it. It was supposed to be a MoABA (Month of Apple Bugs Articles). But, note to self, I've discovered that it's a good idea to start writing something in January if you intend to post it in January. Since I'm not exactly on schedule, I think that this is going to have to be a QoABA (Quarter of Apple Bugs Articles).

At any rate, MoAB received a fair amount of press, albeit mostly in the tech community. So what am I hoping to add to this, especially at this late date? Well, I think that a lot of the coverage didn't cater to the audience that I hope to reach, namely the educated layperson. Many of the computer literate yet non-technical people I've spoken to hadn't heard about the project, or didn't understand what it was about. So, what I hope to do in MoAB Madness is provide both a big picture view of the computer security landscape, as well as a non-technical discussion of the bugs themselves.

We'll start MoAB Madness with this installment, which will talk about what the project is, and start to give some background on the principles at play. Part 2 will talk about the people who look for bugs, and how they disclose them once they find them.

Additional articles will look at the bugs themselves. Part 3 will give a brief overview (in non-technical terms) of each of the bugs revealed in January. Parts 4, 5, and 6 (and possibly more, if I need them) will look at some of the bugs in greater detail. From a technical perspective, many of the bugs can be grouped together. So, each of these articles will examine one class of bugs, and, in layperson's terms, discuss what makes them tick. Each of these classes is a textbook example of an extraordinarily common programming mistake, and I hope to use the MoAB bugs to explain some Computer Science principles.

Finally, the last article will look at the lessons learned from the project.

It should be noted that I'm not going to link to the actual MoAB website. Call me paranoid1, but I never visited the MoAB site in a normal web browser2, and I don't want my readers to either. My rationale was that I didn't want to visit a website of a project dedicated to (flamboyantly) pointing out security problems in Mac software using software on my Mac. As it turns out, my gut instinct was right, since on Day 29 the MoAB entry contained an image that locked up Safari. Anyway, if you really, really want to visit the actual site, it's a simple Google search away.

--

So what was the Month of Apple Bugs? I'll give a brief synopsis here, and leave most of the commentary and editorializing to future articles.

Back in December, before the project got off the ground, we had a fair bit of info from the MoAB website:

This initiative aims to serve as an effort to improve Mac OS X,
uncovering and finding security flaws in different Apple software and
third-party applications designed for this operating system. A positive
side-effect, probably, will be a more concerned (security-wise)
user-base and better practices from the management side of Apple. Also,
we want to develop and provide tools and documented techniques to aid
security research in this platform. If nothing else, we had fun working
on it and hope people out there will enjoy the results.
(LMH and Kevin Finisterre, 2006).

So what's the initiative they're referring to? It's pretty simple, really. Every day in January (that's 31 days, for those without access to a calendar), the MoAB project was going to release details about a previously undisclosed bug in Apple software. Well, it wasn't all going to be Apple software. From their FAQ:

Are Apple products the only one target of this initiative?

Not at all, but they are the main focus. We'll be looking
over popular OS X applications as well.

Why were they doing this? Well, they tell us that it isn't out of malice:

Is this an attack, revenge, conspiracy or some kind of evil plot
against Apple and the users of Apple products?

Not at all, some of us use OS X on a daily basis. Getting problems
solved makes that use a bit more safe each day, for everyone else. Flaws
exist, with and without people disclosing them. If we wanted to make
business out of this we would be selling the issues and the proper
exploit for each one. Thus, business-wise, we are wasting a good cake
with this project (although software by Apple isn't really of interest
in these terms, except iTunes and other high-profile applications).

A tiny bit of editorializing: I will grant them that if they were really "out for evil" they would have been selling the information about the security bugs to the highest bidder. However, their actions were still somewhat irresponsible. We will talk about this in detail in Part 2.

At the beginning of the month, a developer (and former Apple employee), Landon Fuller, launched a "Month of Apple Bugs Fixes" project where he hoped to provide unofficial patches to fix the previous day's bug. After a few days he set up a Google Group to coordinate his efforts with other volunteers. On Day 6 the MoAB organizers contacted Landon (note that this link contains a link to a MoAB page, which I don't recommend following) proposing that they give him early access to the bugs in order to expedite his repairs. After some deliberation, he declined due to a possible perception of conflict of interest. In the end, this group did indeed provide unofficial patches to many of the bugs.

Finally, it should be noted that Apple never made any formal comment on the MoAB project. I'll have a lot more to say about that in the final article of MoAB Madness. To date, they have released two security updates (Security Update 2007-001 and Security Update 2007-002) that give credit to the MoAB project for discovering the bugs, and fix a total of five bugs.

--

As mentioned in the introduction, before we get to the bugs themselves, we're going to talk about some basic principles of computer security. But before we do that, we need to figure out: what exactly do we mean when we say "computer bug"?

That one's easy, right? Ask someone on the street, and you'll likely get an answer along the lines of "a bug is when that stupid computer doesn't do what I told it to". Or maybe "a bug is a computer glitch". But, to a Computer Scientist, neither of those answers is quite correct.

In order to understand what a bug actually is, we need to look at what I've termed Deber's First Law of Computer Science:

Deber's First Law of Computer Science

Computers do exactly what you tell them. No more, no less.

At a glance, that might seem at odds with our intuitive definition of a bug. After all, a bug is when something you didn't intend to happen actually does, right? Well, the answer lies in the meaning of you. In most cases, you is the developer who wrote the computer software that you (the user) are using. In other words, the computer is doing exactly what the programmers told it to do; the problem is that the programmers screwed up.

It's important to recognize that all computer software has bugs3. Period. Full stop. Death, taxes, and bugs in software. Software is simply too complicated for us imperfect humans to write correctly; modern software can contain tens of millions of lines of computer code. But, more to the point, computers are literal entities, while we humans are not. We know how to interpret the world around us and extract meaning when our information (or instructions) is fuzzy or unclear. Computers do not. Take a household example:

Apply to hair, lather, rinse, repeat.

A person reading those instructions knows that the shampoo manufacturer intended that you use two applications of their product. A computer reading those instructions would keep applying the shampoo until the bottle ran out, and then crash since there was nothing left to "apply to hair". Furthermore, the instructions don't specify all sorts of details that we humans automatically interpret, but would need to be explicitly stated for a computer. Do you have to wet your hair first? How wet? What temperature water? How much shampoo? How do you "lather"? How long do you lather for? How do you know when you've "rinsed" enough? And on and on and on. If a programmer doesn't specify each of these things correctly, the program might crash.
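To make that literal-mindedness concrete, here's a quick sketch in Python. Everything here is made up for illustration: the bottle size, the deliberately empty lather and rinse steps, all of it.

```python
bottle = 20  # a hypothetical bottle holding 20 applications' worth

def apply_to_hair():
    global bottle
    if bottle == 0:
        raise RuntimeError("nothing left to apply")  # our "crash"
    bottle -= 1

def lather():
    pass  # how? for how long? the instructions don't say

def rinse():
    pass  # how do we know when we've rinsed enough?

# "Apply to hair, lather, rinse, repeat." Taken literally, "repeat"
# never says when to stop, so the loop runs until the bottle is empty.
washes = 0
try:
    while True:
        apply_to_hair()
        lather()
        rinse()
        washes += 1
except RuntimeError:
    pass  # the program "crashes" once the bottle runs out
```

A human stops after two washes; the computer above stops only when something breaks.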

Finally, it should be noted that working on software is a Sisyphean task. Every time you add a new feature, you add new bugs. In fact, sometimes when you fix one bug, you add several new ones.

--

Some bugs are minor and insignificant (maybe some text is displayed 1 mm too far to the right), and some are major and potentially disastrous (maybe all of your files get deleted). It's some of the bugs in this second category that we're concerned with today. Or, more specifically, it's one type of major bug that we're concerned with today: the security bug. Security bugs are bugs that, in plain English, let nefarious evildoers do bad things to your computer. Best case, the bug can simply let them crash your computer and cause you to lose whatever you were doing at the moment. Worst case, the bug can let them take full control of your computer, and do anything from stealing your credit card numbers to deleting all of your files.

Terminology wise, we say that a security bug causes a security flaw, also known as a vulnerability, or a security hole. Pieces of computer code crafted to exploit these vulnerabilities are called, unsurprisingly, exploits.

We can further divide security bugs into three categories, depending on the amount of user interaction necessary to enable an attacker to exploit the security hole.

--

The least severe are those that require what I like to call active user interaction. In other words, an evildoer sends you an application of some sort (or gets you to download an application of his choosing), and you have to do something explicit (like double clicking on it) in order for the bug to be exploited. The key here is that the file in question is of a type that is known to be potentially risky (e.g., an application, not a picture or music file).

Some would argue that these types of security bugs aren't an issue at all, since once an adversary convinces you to run a program he sent you, all bets are off. Remember Deber's First Law of Computer Science? If you are running an application, your computer is doing exactly what someone else (the developer of the application) told it to. Is it a bug if the things that this developer told your computer to do are evil? Consider an example where an evildoer sends you an email message saying "Please delete all of your files immediately!". Would it be considered a security flaw in your email program if you followed these instructions and deleted all of your files? Think of it this way: any time you run an application, you are placing your trust in the fact that the authors of the application did not have malicious intent. Most of the time that's true, but when it's not, you can be in real trouble.

In some cases, I feel that bugs in this category are actually security problems (particularly those involving "privilege escalation", as we will see later on in the series), but in many cases they are not. But, call them what you want, these types of security problems are the most common in the wild. Most of the "email viruses" (classics like ILoveYou and MyDoom) are in this category.

--

The second category consists of flaws that require passive user interaction. These are flaws where the user still needs to do something, but that something is an apparently innocuous action. In most cases, that action is visiting a web page. In other words, bugs in this class can cause something bad to happen to your computer if you go to an evildoer's webpage (or a webpage an evildoer has hacked and taken control of). Other paths of exploitation are viewing email messages (note that we're talking about viewing the email message, not opening any attachments) or opening "harmless" files such as Microsoft Word or Excel documents. Recent examples are the WMF image exploit and the spate of vulnerabilities in Microsoft Office products.

The astute reader might notice an ambiguous gray area between the aforementioned opening of a "harmless" file, and the opening of a "known risky" file mentioned in the first category. And that astute reader would be quite correct. The division isn't always clear, especially when security flaws are discovered in previously "harmless" formats, such as Microsoft Office documents. So yes, the division between these two categories can be hazy. But the point is that some actions (e.g., opening an application) clearly fall into the first category, while some (e.g., viewing a webpage) fall into the second.

That very same astute reader might also point out that they've been told that viewing a "cute picture" or a "funny video" that they get via email is a risky activity that might get them infected with a virus (an example was the Anna Kournikova virus that promised pictures of the tennis star). And doesn't that contradict what I've just said about "harmless" files? Well, not really. Almost every case of a virus appearing in a "cute picture" is actually a case of an application masquerading as a picture. In other words, the user might think that they are opening a picture, but they are actually opening an application. There is a security flaw in play here, but it's not a vulnerability in the way the image file is displayed. Instead, it's a poor (one might even say stupid) design decision in the Operating System that allows an application to masquerade as a "harmless" picture.

--

The final category is the worst. Those are bugs that require no user interaction. In other words, an evildoer can wreak havoc on your computer simply by virtue of it being turned on (and connected to the Internet). These are obviously the worst, since a user can behave perfectly and still be attacked. Evil programs that exploit these types of bugs are often called worms, and can spread themselves without user interaction (once a computer is infected, it automatically seeks out other computers to infect, thus perpetuating the infection). An example is the vulnerability exploited by the Code Red worm a few years back.

--

There is one final criterion that we have to consider, which is the idea of a default configuration. The meaning of the word "default" in this case is a relatively new usage coined by the Computer Science community, and not the more common "due to the exclusion of other candidates"4. In Computer Science parlance, it means the "standard way" or the "preset setting". So, a default configuration is a configuration that ships with the software from the manufacturer, and the configuration that will remain in place if a user doesn't explicitly change it. For example, the default configuration in many Word Processors is to use a 12 point Times New Roman font.

When it comes to computers, many users (especially less technical ones) will never change those default settings. So, a vulnerability that exists in a default configuration is far, far more significant than one that exists in a configuration that a user has to explicitly set up.
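Here's a small Python sketch of how defaults typically behave in software. The setting names are invented for illustration, not taken from any real Word Processor.

```python
# Settings as shipped "from the factory"; these stay in effect
# unless the user explicitly changes them.
DEFAULTS = {"font": "Times New Roman", "size_pt": 12, "autosave": True}

def effective_settings(user_settings):
    """Start from the defaults; anything the user explicitly set wins."""
    settings = dict(DEFAULTS)
    settings.update(user_settings)
    return settings

# A user who never opens the preferences gets the defaults as-is:
untouched = effective_settings({})

# A user who changed only the font still keeps every other default:
tweaked = effective_settings({"font": "Helvetica"})
```

This is exactly why a hole in a default configuration matters so much: the `untouched` case is what the vast majority of machines out there are actually running.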

--

With that, we're at the end of the probably-too-lengthy Part One of MoAB Madness. Part Two will talk about the people who look for security bugs, and the methods they use to disclose them once a bug is found.

I used a web browser called lynx, a text-only browser that runs on the command line. ↩

The etymology of the term "bug" isn't particularly clear. (Do I get bonus points for using both entomology and etymology in the same article?) It certainly predates WWII, although at the time it referred to problems with electronic gear (e.g., radio equipment), rather than computers (since computers didn't really exist yet). A famous story (and one that is often incorrectly cited as being the genesis of the word) relates to Grace Hopper, one of the early greats of Computer Science (and one of the very few women in the field in that era). While troubleshooting a problem in the vacuum tube equipment, she determined that it was caused by a moth that had gotten into the components and met an untimely demise. The log book entry read "First actual case of bug being found", and had the moth taped to the book. This page is now in the Smithsonian. More info about the history of the word can be found here, if you're interested. ↩

Friday, February 2, 2007

I've changed the style of footnotes that I'm using in this blog. I've also undertaken a bit of revisionist history, and updated the footnote formatting (but not the content) in all of the previous blog posts. I've also taken the opportunity to clean up a bit of other formatting.

Each footnote number in the main text is now a link to the footnote at the end of the document1. (Actually, I suppose that makes them endnotes, but in a single-page format (such as the web), the difference is tiny2.) Each footnote contains an arrow, ↩, which links back to the place in the article where the footnote came from.5

At least in theory it's an arrow. The arrow isn't a picture; it's actually a Unicode symbol (number 8617, to be exact). I'll write an article some time on what that means, but for now just think of Unicode as a way to describe characters that are more complicated than your run-of-the-mill "a" or "$". Most importantly, it allows for the characters necessary for non-English languages (e.g., Cyrillic characters, or those characters found in any number of Asian languages). However, it also provides all sorts of other cool stuff, ranging from musical notes to math symbols6. Now, this is all well and good assuming that your web browser is smart enough to display Unicode characters. If it's not, the arrow may appear as a box (which is the generic "I don't know how to display this character" character). That shouldn't happen on the Mac, nor with Firefox on either Windows or Linux. It may, however, occur with some versions of Internet Explorer on some versions of Windows. If you're using IE, might I take this opportunity to suggest trying out Firefox, a third-party web browser that is far more secure (and capable) than IE.
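If you're curious how that numeric reference maps to a character, here's a tiny demonstration (in Python, which happens to be handy for poking at Unicode):

```python
# Unicode code point 8617 (U+21A9, "leftwards arrow with hook") is the
# character that the HTML numeric reference &#8617; asks the browser for.
arrow = chr(8617)
print(arrow)               # the arrow itself (or a box, on fonts without it)
print(f"&#{ord(arrow)};")  # round-trips back to the HTML entity: &#8617;
```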

Anyway, this style of footnote is directly based on that of Daring Fireball. A discussion of these footnotes can be found in this article.

On a related note, I've started to use Markdown to prepare the text of this blog. It's a system that makes it easier to write the text of an article while still allowing easy access to features necessary for web publishing (like links). If you like using a text editor (as opposed to something like a Word Processor), it's worth a look. I've also hacked together a small script to expedite footnote processing, since Markdown does not provide "native" footnote support. This omission is particularly odd given that Markdown is written by John Gruber of Daring Fireball fame, whose footnote style is the direct inspiration for my script. When I get the code cleaned up a bit, I'll send it along to John.
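For the curious, a footnote preprocessor along these lines can be surprisingly small. The sketch below is my own reconstruction, not the actual script: the `[#1]` marker syntax is invented for illustration, and the real thing surely differs in the details.

```python
import re

def expand_footnotes(html):
    # Footnote definitions ("[#1]: note text") become paragraphs with a
    # &#8617; link pointing back at the in-text reference.
    def make_note(m):
        n, text = m.group(1), m.group(2)
        return (f'<p id="fn{n}">{n}. {text} '
                f'<a href="#fnref{n}">&#8617;</a></p>')
    html = re.sub(r"^\[#(\d+)\]:\s*(.+)$", make_note, html, flags=re.M)

    # Remaining in-text markers ("[#1]") become superscript links down
    # to the matching note.
    html = re.sub(
        r"\[#(\d+)\]",
        r'<sup id="fnref\1"><a href="#fn\1">\1</a></sup>',
        html,
    )
    return html
```

Running a post through this turns each marker into a numbered link down to its note, and each note gets the ↩ backlink; the order matters, since the definitions (with the trailing colon) have to be consumed before the bare in-text markers are linked.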

What's This Blog?

A Computer Scientist writes about the Mac, technology, Human-Computer Interaction, usability, and whatever else comes to mind, all while working at least one relevant Simpsons reference into every post.