Josh Rhoderick (https://joshrhoderick.wordpress.com)
I'm Josh. I like software and fiction and stuff.
Will iOSification kill Mac OS X?
Sun, 14 Aug 2011

With the recent release of Lion, it is clear that Apple is trending toward the “iOSification” of Mac OS X. And why shouldn’t they be? Sales of iPhones and iPads have made Apple a ton of money. iOS is obviously doing something right, so why shouldn’t Apple bring some iOS goodness to the Mac? Yet, after using Lion for a few weeks, I admit: I am not impressed.

Not only do Lion’s new iOSy features feel out of place, they make me worry about the future of Mac OS X. Is iOS going to overshadow Mac OS X so intensely that Apple is increasingly tempted to bring iOS “back to the Mac”? There is an old adage: “When all you have is a hammer, every problem looks like a nail.” For Apple, iOS is one hell of a hammer and I am afraid that the Mac platform is starting to look a lot like a nail.

“Natural” scrolling is not so natural

Take “natural” scrolling for example. (And, yes, the word “natural” must be in quotes at least once because this is entirely subjective.) Even if Apple convinced Mac users to alter their behavior, do you honestly believe Microsoft or the Linux community would join Apple in setting things “right”? Mac users are fairly open-minded about changes like this: they tend to trust Apple’s judgment. But Microsoft has a huge stake in not pissing people off (who cares if they are happy, just don’t make them angry), and Linux users are notoriously conservative about interfaces (just look at the tempests in a teacup over changes to KDE4 and GNOME 3).

I don’t care how natural it is, if you think Apple can single-handedly convince the world to adopt natural scrolling, I’d like to introduce you to a man named Dr. August Dvorak. There are legions of keyboard junkies who swear by the Dvorak layout, but most of us just don’t give a damn. Why should we? We are doing just fine, thanks. Using a Dvorak keyboard layout, we might be able to squeeze out an extra few words per minute but it doesn’t justify relearning what we already know. Besides, unless everyone is going to switch, we would be forced to deal with QWERTY anyway. Natural scrolling faces the same hurdles.

With a touchscreen, natural scrolling makes sense. As Ellis Hamburger explains, that’s because we feel like we are physically moving the object with our finger. Yet, with a trackpad or a mouse (especially a wheeled mouse), there is a disconnect between the onscreen object and our finger.

Because your computer screen is on a completely different three-dimensional axis as the surface you’re touching, “natural scrolling” is jarring. I feel like I’m in the movie Inception and I’m trying to walk up a wall in front of me.

I did an experiment by tilting my laptop screen as far back as possible. Once I did that, “natural scrolling” felt more natural. Once your screen is on the same plane as the surface you’re sliding your two fingers across, it works.

But because your screen is oriented almost perpendicularly to the trackpad, it doesn’t work. It works when you’re touching the actual content (like on an iPhone or iPad), but not if your hands are manipulating a space that doesn’t mentally signify what’s happening on another plane.

So before you say that natural scrolling will eventually dominate, I ask you this: what benefit does it provide? In a world full of touch devices, it may offer more consistency but, much like Dvorak, there is nothing to be gained by switching. We are doing just fine, thanks.

Launchpad, meh

The only highly visible feature brought over from iOS is Launchpad. To be fair, I really did give it a try. I spent a good half hour organizing it like I would my iPhone, but it just didn’t feel right. Launchpad is rife with problems, one of the most obvious being that applications often have multiple components on the desktop. Just about every Adobe application will install a half dozen Adobe “applications” into your Applications directory, often including uninstallers. Until all apps are delivered via the App Store, Launchpad is just going to feel funky. And since not all apps are ever going to be delivered via the App Store, Launchpad is doomed to always feel funky.

Launchpad also suffers from serious interface incompatibility. Unlike on iOS, Launchpad functions a bit like a modal overlay: it only pops up when the user invokes it. On iOS, it is the primary interface through which the user interacts with the device. On Mac OS X, however, Launchpad is more akin to Dashboard. Also, Launchpad’s folders are like iOS folders, not Mac OS X folders. The user can’t drag and drop folders into it, remove applications from it, or even organize it in any unambiguous way.

Besides, I already have a hundred ways to launch my applications. The Dock is a pretty damn good one, and there is the Applications folder (also in the Dock). Spotlight also works great. Or any one of the half dozen application launchers like Alfred and Quicksilver. Frankly, Launchpad serves no practical purpose on the desktop. The only realistic use-case I can imagine is a user who hides the Dock and sets Launchpad to fire on a hot corner, and I have little doubt that all four of those people are thrilled with Lion and Launchpad.

Full-screen… why?

Nitpicking aside about what exactly Apple’s full-screen implementation is intended to be, it is rather obvious that Apple is trying to imitate the interface of the iPhone and iPad: one application at a time. Yet, one of the cool things about having a 27-inch monitor is that I can easily do many things at once. Joshua Topolsky claims that multitasking is “not easy,” thus making Lion more attractive to users, but, on anything but a mobile device, this is just false. Multitasking on a desktop is fairly easy, and even computer novices can listen to music, chat, and browse the web at the same time. It isn’t the technology that makes multitasking problematic, it’s our own innate inability to juggle multiple tasks at once. In this respect, the iPad and iPhone keep us focused on one thing at a time because we have no choice.

We need full-screen applications on small devices because there simply isn’t enough room to do more than one task effectively. This just isn’t the case on most computers. Going full-screen in Google Chrome on my 27-inch 16:9 monitor is a hilarious experiment in blinding whitespace. I am willing to bet most users with screens above 13” would find the same to be true: full-screen doesn’t really work when most applications can only conceivably use 50% of the available screen real estate. Full-screen is great for keeping us focused on the task at hand, but it isn’t so great when it artificially limits our ability to multitask.

The good stuff

Lion isn’t all bad. Auto Save and Versions are good stuff. There are some use-cases in which autosaving wouldn’t be desired but, since this feature is implemented on an application-by-application basis, I have little doubt that applications focused on those niches will either opt out or offer the user the option to do so. Mission Control, too, is a clever way to integrate Exposé and Spaces, which reduces conceptual complexity for a lot of people. While I made heavy use of Exposé and Spaces in their former incarnations, I think more casual users will find Mission Control useful. And, of course, there are dozens of improvements under the hood that make Lion shine, but since these fall outside the domain of this article (and also because I couldn’t possibly explain them better than he does), I suggest you read John Siracusa’s excellent review of Lion.

I worry for the future of Mac OS X

While it may appear that the iOSification of Mac OS X is thus far limited to a few additional features in Lion, all of which are optional, I fear that it indicates a clear trend: Apple wants Macs to work more like iPads. There are a hundred little indicators that things are moving in this direction, including everything from the Mac App store, to Launchpad, to dropping Xserve, to quietly leaving the “Mac” out of “Mac OS X” on Apple’s Lion promotional page.

Granted, it is obvious that many computer users are trending toward smaller, more mobile devices. And it is also possible that, in the next decade, touch-based interfaces will become dominant. Apple is obviously banking on this by introducing more gesture support in Lion, along with devices like the Magic Mouse and the Magic Trackpad to support it. Much of Lion is obviously targeting MacBook users, while we desktop users — the “truck” people, as Jobs famously referred to us — are no longer being actively courted.

Yet there will always be a significant portion of the population who use a computer for more than simply browsing Facebook and watching movies. The iPad is great because it caters directly to everyone else: the casual users. For years, those users were forced to use large, complex machines that could do just about anything, from running an enterprise database and web server to reading e-mail. With such a wide variety of tasks available, the level of abstraction is necessarily low and, thus, casual users struggled with concepts in computing (such as the filesystem and networking) that they didn’t want to be exposed to. Developers, software companies, and hardware manufacturers were simply not satisfying these users’ needs.

For these people, iOS is a miracle. They can do everything they want to do without all of the fuss involved with a big, confusing computer. I already know a few people who, after buying an iPad, no longer use their home computers. That is awesome… for them. But now I see that their world is encroaching upon mine and I don’t like it. I don’t want my desktop computer being treated like a big iPad. As a developer, I need my computer to do work. I need to be able to run a web server. I need the complexity. With Mac OS X, I get the best of both worlds: I am able to do everything I used to do in Linux but I can also reliably hook up to peripherals like projectors and watch Netflix movies and have a quiet machine that instantly sleeps and resumes without blowing up. And even I, a confessed geek, hate fixing things that should just work, like the hours I wasted trying to get my damn mouse buttons to work properly on Linux.

As a professional geek, I know I am a niche user. But, like John Martellaro points out, “If Apple ever got to the point where UNIX professionals, serious influencers, were to publicly give up on OS X, then all of Apple’s prior work to establish the prestige of OS X would go down the drain. In an era of social networking, that influence can snowball out of control.”

I am just praying it doesn’t come to that.

Why I Love iTunes (and Why it Matters)
Fri, 03 Sep 2010

At Apple’s recent special music event, Steve Jobs announced the release of another iteration of Apple’s venerable media player, media store gateway, and device manager juggernaut, iTunes — iTunes 10, to be exact — with yet another feature shoehorned in: Ping, Apple’s music-oriented social network. I have little interest in Ping itself — okay, no interest at all — but this gives me the opportunity to discuss something that has been on my mind for a while: why I like — nay, love — iTunes.

For the record, when referring to “iTunes,” I am referring to the application itself, the media player, not necessarily the iTunes Store, although they are so tightly integrated as to be sometimes indistinguishable.

When I tell people that I love iTunes, I often get raised eyebrows or looks of incredulity. This might be because iTunes has become so common and well-known that actually liking it seems redundant. It would be a bit like admitting to your friends that, after much consideration, you do indeed find paved roads much more convenient for automobile travel. Even if it’s true, who cares? Surely the individual espousing such an opinion is either completely out-of-touch with the rest of the world or holds his own opinions in much higher regard than those of his peers.

Or maybe it is because it has become fashionable to complain about iTunes. When anything becomes ubiquitous, be it Twitter, Crocs, or iTunes, it is accepted by most with blasé indifference and the grumbling desire to be rid of it. Since a large number of people use it in some capacity, iTunes is no longer a choice for most consumers. It’s just the thing you need to make your iPod work. What’s to like?

When described as just a media player, it’s easy to understand why iTunes is sometimes berated. After all, if you are just trying to play a few MP3s or watch a video clip, using iTunes is like shoving your head into a vise to pop a zit. However, when you need to manage a large and varied digital media library, and you need to synchronize this library with one or more portable digital media devices (iPods or iPhones, naturally), nothing on the market comes close to competing with iTunes.

A lot of criticism regarding iTunes is actually deserved. For example, a common complaint is that iTunes is slow, which is definitely true on Windows, at least it was the last time I tried it a few years ago, and is sometimes true on a Mac. Others claim that iTunes is bloated, which it most certainly is (and the addition of Ping has exacerbated this yet again). Still others gripe because Apple refused to design iTunes as a native Windows application. Again, this is true.

Despite its flaws, I believe iTunes to be one of the best applications ever developed for any platform. iTunes occupies a difficult niche at the crossroads of the digital and physical worlds, a precarious place where many other pieces of software often fail miserably. But, before I can explain why I think iTunes performs so well, I have to explain why so many other media players perform so poorly.

The Early Days

In the late nineties, storage capacity was growing, bandwidth was increasing, and the MP3 format had become commonplace. It was in this environment that Winamp stormed onto the scene and the digital media player became an indispensable piece of software for many, myself included. Prior to MP3s, my experiences with digital music were limited. I had kept a small library of downloaded MIDI files mostly for their novelty value — “Hey, wanna hear Axel F?” — and I had spent countless hours composing (tracking?) my own music using Scream Tracker (good memories, that), but that was pretty much it. Thus, before MP3s, I didn’t have much use for a media player. There just wasn’t anything to play, much less manage.

But Winamp changed everything and, have no doubt, it really did “whip the llama’s ass.” In its early years, there was nothing like it. Even the venerable Windows Media Player, which had existed in one form or another for years as a simple video player, did not compare to Winamp’s unique ability to organize and play music. Winamp’s surging popularity drew the attention of AOL, which quickly snatched up Nullsoft, the company behind the media player. Like most Windows users, I relied on Winamp for many years, right up until I began using Linux in 2006.

By then, Winamp’s glory days had waned. It had long since become a bloated piece of nagware. Winamp’s UI had also suffered. Gone was the crude yet simple UI of yore. Instead, we now had the ability to “skin” the application. Unfortunately, all of the available skins made me want to retch. Even the default skin was ugly, cluttered, and confusing. I have always hated when UI designers use theming as an excuse not to provide a good default. I don’t care if I can make my media player look like the cockpit of a B-2 stealth bomber, just give me a simple, clean UI.

I switched to Linux in 2006 and, considering the sad state of Winamp, I was eagerly looking forward to finding a replacement for it. For a while, Amarok addressed most of my needs. It was a fine media player in the Winamp tradition. But, by late 2007, Amarok had been suffering from the same bloat and lack of polish that affected KDE as a whole, so I tried numerous GNOME-based alternatives, Rhythmbox and Exaile being the most memorable.

Unfortunately, I never found a media player I really liked. It wasn’t because they couldn’t play music. That was the easy part. It was their poor UIs, their lack of attention to detail, and their inability to effectively organize and manage large digital libraries that I found most annoying.

Finding iTunes

Interestingly, although iTunes had been available for Windows users since 2003, I had never once tried it. I didn’t have much reason to. I didn’t own an iPod, and I refused, on principle, to buy DRM-laden music, which was all iTunes carried. At first impression, iTunes seemed like junkware to me. Whenever I downloaded the QuickTime player to my Windows machines, Apple always tried to piggyback iTunes onto my download. In my experience, whenever a company is that eager to give away its software, you know it’s junk. It wasn’t until I switched to the Mac in 2008 that I really gave iTunes a fair shake.

The more I played with iTunes, the more I realized how amazing it really was. Here are just a few of iTunes’s major achievements over its competitors:

The UI was clean, simple, and elegant. The next best UI I have
encountered was Exaile on Linux, albeit with a severe reduction in
functionality.

Although I resisted it at first, I was finally freed from having to
manually manage my music. Instead of creating directories for artists
and albums and then pointing my media player at these directories, I
just let iTunes handle the organization from the start.

Once I imported a song into iTunes, I could delete the original file.
This was unsettling at first but, once I got used to it, I enjoyed the
sense of freedom that it gave me: “Don’t worry, iTunes will take it from
here.”

Changes made to a song’s metadata automatically updated the respective
file structure if necessary. For example, if I moved a song from one
album to another, iTunes moved the file automatically. I had nothing
more to do.
Furthermore, this underlying file structure meant that, if I wanted to,
I could easily locate the actual physical file.

Smart Playlists were a welcome addition. You would think that something
so obvious would have been common in all media players.

Cover Flow provided a unique, elegant, three-dimensional way for me
to view my albums. Since going digital, I hadn’t realized how much I
missed the feeling of “flipping through” my albums.

iTunes offered a ton of metadata options, allowing it to gracefully
handle albums by single artists, compilations, multi-disc albums,
podcasts, audiobooks, and more.

I could sort music on any number of fields, including “Album by Artist,”
which I am quite fond of and have rarely seen duplicated in other media
players.

iTunes was the first media player I ever used that correctly handled
album artwork. Every other media player claimed to but, in my
experience, not one succeeded.

Clever and useful library searching. I remember struggling with this
with Winamp. It was sometimes difficult to know which songs you had
because there was no reliable way to search your library’s metadata.

Once the DRM was gone, I could buy songs directly from the integrated
iTunes Store.

When ripping CDs, iTunes provided me simple options. Even as a geek, I
had a difficult time determining the minute differences in lossy
encoders, bitrates, and licenses presented to me by other applications.
Presenting these options as a simple “Low, Medium, High” list was a vast
improvement. I can easily understand that high quality requires more
storage space, but I can’t easily understand why I would choose LAME
over MP4/aacPlus at 128kbps VBR versus CBR, etc.

iTunes seamlessly provides two-way synchronization for music and
playlists to an iPod with just a few clicks. If I rate a song on my
iPod, the rating is carried over to iTunes on my computer the next time
I sync. This is important because, with Smart Playlists, your
preferences might change over time, and it’s annoying to have to make a
mental note to “Rate this song down a star when I get home.”

Library sharing. While this feature could definitely use some work, it is very easy to set up library sharing over Bonjour on a local network.

Uh oh. Was your sync interrupted? On many media players, this can be anything from a minor annoyance to a disaster. On my SanDisk Sansa, I once interrupted a sync and some metadata must have become corrupted, so I ended up scouring the web to learn how to reconfigure my MP3 player. It sucked. With iTunes and an iPod, this type of corruption is almost unheard of.
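The Smart Playlists praised above need surprisingly little machinery: a Smart Playlist is essentially a saved set of rules that gets re-evaluated against the library every time you view it. Here is a minimal sketch in Python; the track fields and rules are hypothetical illustrations, not iTunes's actual schema:

```python
from datetime import date, timedelta

# A tiny stand-in for a music library: each track is a dict of metadata.
library = [
    {"title": "Song A", "genre": "Rock", "rating": 5, "last_played": date(2010, 8, 30)},
    {"title": "Song B", "genre": "Rock", "rating": 2, "last_played": date(2010, 7, 1)},
    {"title": "Song C", "genre": "Jazz", "rating": 4, "last_played": date(2010, 8, 29)},
]

def smart_playlist(tracks, rules):
    """Return the tracks matching every rule. Because the rules are
    re-applied to the current library, the playlist updates itself as
    metadata (ratings, play dates) changes."""
    return [t for t in tracks if all(rule(t) for rule in rules)]

# "Rock tracks rated 4 stars or higher, played in the last 60 days"
rules = [
    lambda t: t["genre"] == "Rock",
    lambda t: t["rating"] >= 4,
    lambda t: t["last_played"] >= date(2010, 9, 1) - timedelta(days=60),
]

print([t["title"] for t in smart_playlist(library, rules)])  # ['Song A']
```

This is also why synced ratings matter so much: bump a song's rating on the iPod and, after a sync, the rule set picks it up (or drops it) automatically.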

Frankly, after using it for a few weeks, I couldn’t believe I had ever used anything else. Of course, as with all software, there were drawbacks. Here are a few of them:

iTunes has never handled streaming radio very well. Unless your choice of radio station is already in Apple’s default list, it can be a real pain to add a new station.

iTunes is slow to start on all platforms. See “iTunes on Windows” below
for a more detailed explanation of the problem.

Limited encoder selection. As I mentioned above, this can also be
interpreted as a feature since limiting choices makes the process
simpler for most users. Power users should definitely go elsewhere for
advanced ripping and encoding options.

In true Apple style, iTunes will only play a handful of the most widely
used file formats. Apple doesn’t typically waste time, energy, or money
trying to please everyone. If you aren’t using the most popular formats,
namely MP3, AAC, or Apple Lossless for music, you’re out of luck.
Fortunately, it is possible to hack in support for other formats such as
FLAC and Ogg.

With a ton of metadata options on offer, one tends to want to fill them all out. If this hasn’t been done already, it can mean many painful hours of data entry.

Small changes in metadata can have weird effects. For example, if one song has the album date off by a single digit, that song will mysteriously disappear, transported into a doppelganger album.
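The doppelganger-album effect is easy to reproduce in miniature: any library view that groups tracks by an exact metadata key, such as the (album, year) pair, will split an album the moment one field is off by a digit. A hypothetical sketch of that grouping behavior, not iTunes's actual logic:

```python
from collections import defaultdict

# Three tracks meant to be one album; one has the year mistyped (2001 vs 2010).
tracks = [
    {"title": "Track 1", "album": "Example LP", "year": 2010},
    {"title": "Track 2", "album": "Example LP", "year": 2010},
    {"title": "Track 3", "album": "Example LP", "year": 2001},  # off by a digit
]

def group_albums(tracks):
    """Group tracks by the exact (album, year) pair, the way a naive
    library view might build its album list."""
    albums = defaultdict(list)
    for t in tracks:
        albums[(t["album"], t["year"])].append(t["title"])
    return dict(albums)

# Track 3 lands in a second, phantom "Example LP": two album entries, not one.
print(group_albums(tracks))
```

One mistyped digit and the library now shows two copies of "Example LP," with the orphaned track hiding in the phantom one.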

iTunes on Windows

To be fair, iTunes on the Mac is different from what most people experience on Windows. On the Mac, iTunes is largely snappy, responsive, and is perfectly integrated into the hardware and the operating system. On Windows, iTunes is a gross imitation of a Mac OS X application. Apple didn’t even bother to redesign iTunes’s UI to match the platform, a condescending slap in the face to Windows users if there ever was one (but, since Windows has no unified UI and has historically had relaxed UI guidelines, I don’t think most people notice unless you point it out to them).

The worst problem with iTunes on Windows, however, is speed. With everything hanging from its neck, from the notoriously bogged-down iTunes Store to library sharing via Bonjour (which can be slow on large networks), it is no wonder that iTunes takes a while to start. Like all Apple software, it is designed for a Mac where, once started, it remains running in the background. This is different from what Windows users expect. Windows users have been trained to kill every unused application and service to keep their system from slowing to a crawl. When Windows users see mDNSResponder.exe and the Bonjour Windows service active, they freak out and kill them without really understanding what they are. They then gripe that iTunes has screwed with their computer and complain when library sharing or other services no longer work.

While this might sound like an overgeneralization, I have heard this exact argument more than once when speaking to people about iTunes. This is part of why Windows itself annoys me: it forces end users to grapple with concepts that should be abstracted from their experience. No one should be required to know how a Windows service functions just to use Windows. Yet, in many cases, users are sometimes required to manually tweak their systems in a way that assumes this knowledge.

Because iTunes is designed differently than most media players, Windows users tend to use it differently from most Mac users. Generally, one shouldn’t use iTunes like Winamp or VLC. In other words, clicking on a filename just to play a song or video clip in iTunes is really kind of silly. Sure, iTunes can do this, but it is designed to manage large media libraries, to sync with mobile devices, and to act as a gateway to the iTunes Store.

When someone says to me that iTunes is slow to launch, I find myself thinking: “When was the last time I actually launched iTunes? Must have been the last time I rebooted. That was, ummm, a month ago?” For Windows users, this kind of application longevity is almost unheard of. Launching and closing iTunes are regular, painful experiences not to be discounted.

iTunes Bloat

With iTunes, Apple has earned a lot of money and a lot of mindshare. In the minds of most, iTunes the application and the iTunes Store are one and the same. From Apple’s perspective, this is probably for the best. Regardless of which head of the bicephalic monster we are talking about, virtually everyone has heard of iTunes. But this presents a problem for both Apple and for consumers.

Apple doesn’t want to complicate things. And neither do consumers. So, when Apple wants to promote a new feature, it is going to end up in iTunes. After all, we already have it installed, we already know what it is, and we’re already using it. Why introduce a new piece of software to do something that iTunes is already designed to handle? Take iBooks for example. If you were Apple — or, hell, if you were you — would you prefer to install another application to manage your digital books, or would you just rather add a tab to iTunes?

This means that, with each iteration of iTunes, it gets more bloated. Here are just a few examples of how iTunes has outgrown the boundaries of its original domain of music (and related features like playlists and streaming Internet radio) and iPod syncing:

the iTunes Store

Podcast support

iTunes LPs, iTunes Extras

iTunes U

Photo and video support

Library sharing via Bonjour

iPhone activation

Apps management

Ringtones

iBooks (non-audiobook book and PDF support)

Ping

For any software geek, bloat is a bad thing but, sometimes, the alternative is worse. The real key to understanding iTunes bloat is in the knowledge that iTunes is not just a media player; it is a bridge between the physical world — the one we and our iPods occupy — and the world of digital media, from which our music and movies and apps spring forth. By acting as a bridge for consumers, iTunes simplifies an insanely complex task: quickly and easily connecting our eyes and ears with intangible media stuff.

A Bridge Between Physical and Digital

Before buying an iPod, I tried a handful of different media players. I hated all of them. The hardware was usually fine, but the software was a nightmare. In fact, in one case, I had so much trouble with the software that the device was practically useless. And it is at this junction, where hardware meets software for the end user, that iTunes truly shines.

In the early days — the FairPlay days — iTunes was not so much a luxury as it was a necessity. Apple not only needed to provide a way for users to sync their iPods with their computers, it also needed a way to reassure the frightened record companies that users were not going to pirate their music. iTunes seamlessly achieved both.

Not only could iTunes manage a vast library of digital media with syncing so precise that even one’s place in an audiobook was recorded on sync, it also provided a means to manage the RIAA’s useless but complex DRM portfolio. No other vendor was able to assure the RIAA that its DRM requirements would be met while satisfying consumers’ demand for digital media. It was a juggling act truly worthy of praise, even if you opposed DRM (as I did, and do).

This is iTunes’s real achievement: it opened the door to digital audio in a way that no one had done before. Consumers were confident that, by plugging in their iPod, their music would magically appear on the device. There was no crapware to install (other than iTunes itself, of course), or procedures to follow. It just worked.

The time is fast approaching when the intersection between the physical and the digital will have grown so thin as to be barely noticeable, when wires will no longer be needed and digital synchronization will happen with no user intervention whatsoever. Soon, your music, movies, photos, books, and other digital media will either be streamed to the device directly from the cloud or be managed in such a way that no intermediary is necessary. I am willing to bet that, when Apple takes that step, iTunes will be behind the scenes making it happen.

Why it Matters

The world is becoming increasingly mobile, and software like iTunes is facilitating that change. I believe the lack of an iTunes competitor is one of the Android platform’s biggest disadvantages. Google is trying to leap over the need for iTunes by going directly to the cloud for everything, similar to what Microsoft did with its ill-fated Kin. But the infrastructure isn’t ready for this yet. With 3G technology, we simply don’t have enough bandwidth or throughput to push all of those songs, photos, and videos over the air. Even with WiFi, it isn’t simple to do right.

Just imagine Apple TV and Google TV duking it out. With Apple TV, at least, the user can stream his iTunes library from a Mac, PC, or iOS device. With Google TV, this is all supposedly coming from the cloud, but we have yet to see how well that will work in practice. So iTunes is not only Apple’s not-so-secret ace up the sleeve, it is becoming more relevant as our use of mobile devices shifts from casual entertainment to a necessity of modern life.

And that is why I love iTunes.

Why Dell Segregates Ubuntu
Mon, 02 Aug 2010

LinuxInsider has noted that Dell continues to play with its Ubuntu strategy in a way that confounds Linux advocates. For those closely watching the dramatic ups and downs of Dell’s relationship with Ubuntu, it can make one feel a little seasick.

…just days after the news broke that Dell had removed all Ubuntu-preloaded machines from its site, reports emerged that the company is actually *expanding* its desktop Ubuntu selection.

This, after Dell’s “love letter” to Linux went up — and then down — and after it crafted the daringly insightful “Windows vs. Ubuntu” comparison guide. All in the space of a few weeks!

I don’t really feel comfortable referring to this as Dell’s Linux “strategy” because it looks much more like an experiment than anything else. A strategy has focus, tactics, and a clearly defined goal. Dell seems to be doing something completely different. Regardless of what we call it, I think many Linux advocates don’t understand that Dell can’t offer Windows systems side-by-side with Ubuntu systems, and it isn’t because of Microsoft FUD or FTC G-Men. Take a look at these statements from Robert Pogson, who was quoted in the LinuxInsider article:

Dell is fragmented, blogger Robert Pogson told Linux Girl. “Their site is a mess, and they do not have a vision for GNU/Linux. I do not understand why a customer cannot dial up a machine, choose an OS and go to the checkout,” he said. “That’s what other OEMs do, but no, Dell fragments their site with different products available to different customers without rhyme or reason.”

[…]

The only motivation, as Pogson sees it, “is possibly they are trying to please M$ without offending the FTC, who might frown on exclusive dealing,” he suggested.

[…]

“Businesses used to have ‘token blacks’ or ‘token women,'” Pogson noted. “I think GNU/Linux may be Dell’s ‘token OS.’ Ubuntu definitely rides at the back of the bus on Dell’s site.”

It’s a good guess, but it’s wrong.

First of all — the annoying use of “M$” and “GNU/Linux” aside — I have no idea why Pogson thinks the FTC cares about Microsoft’s (former?) monopoly on operating systems. They didn’t care much about it in 1991, when the FTC first noticed that it might be a problem. It wasn’t until years later, when Netscape started complaining about Microsoft’s monopoly — which was, by then, practically blessed by the FTC — that the FTC bothered to do anything. And they did so then only because they had to. The effect of Internet Explorer on Netscape was blatantly obvious and could not be ignored. Of course, the web browser itself was never really important. It was just a symptom of a larger disease. Microsoft wasn’t making money from IE or earning a greater market share on PCs from it. Furthermore, competing browsers were never banned from Windows. The prosecution’s primary complaint was that it was too difficult to install competing browsers because Windows itself was hard to use, not that competing web browsers couldn’t be installed. Microsoft was actually correct in pointing out that IE was a “feature” of Windows, not a separate product. The lawyers within the Department of Justice failed to acknowledge — or, more likely, were afraid to acknowledge — that the real threat to competition was the operating system itself.

Keep in mind that Judge Jackson’s solution to Microsoft’s monopoly was to split Microsoft into two entities: one to build Windows while the other built software products, like web browsers, that ran on Windows. Although this might have negated both Netscape’s and the FTC’s complaints, it would not have solved the real problem. The more obvious solution — the one few people talked about, especially within the DoJ — was to split Microsoft into multiple entities, each building operating systems, à la the breakup of Ma Bell. Ultimately, it didn’t matter. The antitrust case was thrown out and Microsoft received a gentle, maybe even loving, slap on the wrist. Fortunately, despite the help of the United States government, Microsoft has been actively eroding its monopoly ever since.

So why would Dell care about a token OS? It isn’t like Sony or HP or Lenovo are offering a variety of operating systems. As far as I know, no PC vendor has ever been confronted by the FTC for concerns related to operating systems. This is largely because much of the world still doesn’t know how to deal with software. Some nations have laws that treat an operating system like hardware, others treat it like ethereal intellectual property, some do both (United States), while others try not to think of it at all (China). Because no one can agree on what software really is, allegations of monopoly are doomed to fail. That’s why the focus was always on Windows and Internet Explorer rather than just Windows itself. By redirecting the argument to Internet Explorer, a number of simple solutions were made clear, stockholders were largely unharmed, and Microsoft was only required to fess up to the lesser of many more devious evils. But questioning Windows itself would have required an actual definition of software, a complete rewrite of patent law, and would have done severe damage to a multibillion dollar corporation and its stockholders. No, it was much easier to pretend that IE was the problem.

The real reason Dell segregates Ubuntu is obvious: Linux is unpredictable. Think for a moment about Dell’s hardware channel. Their hardware comes in from a variety of vendors, many of whom are partners with Microsoft, each designing products almost exclusively for Windows. It’s a giant commodified assembly line, typified by cheap parts, cheap drivers, and minimal testing. Dell uses economies of scale to make a (small) profit. It’s pretty easy to slap Windows onto these machines, fill them with junkware, and then ship them out to customers as quickly as possible.

Now imagine adding Linux to this channel. There is no guarantee that what is coming from the hardware vendors is going to work with Linux. While it is quite possible that this hardware can be made to work, it becomes Dell’s sole responsibility to do so. The vendor no longer bears any responsibility. This raises Dell’s costs for each hardware component they have to “certify” with Ubuntu. There is no central company or organization — not even Canonical — who can accurately predict what hardware will work from release to release. Dell must essentially find the subset of hardware that works with the least amount of work and create a parallel assembly line for just those components. Furthermore, none of Dell’s stock junkware or marketing ploys can be used on these machines. Customers are largely ignorant of operating systems — and they have a right to be if they so choose — and Dell doesn’t want its customers, who are already very price conscious, accidentally choosing an Ubuntu system just because it’s cheaper. If Dell actually lowered the cost of Ubuntu systems to reflect the difference of the Windows license, it might increase sales, but it won’t necessarily increase customer satisfaction or retention.

Meanwhile, it is in Dell’s best interest to promote alternate operating systems because they are stuck in a tooth-and-nail battle with dozens of other commodity PC vendors. Anything Dell can do to differentiate their brand or build a loyal customer base is going to be a welcome idea. We also mustn’t forget that Apple has taught the electronics industry that, to catch the ball, you have to be where it is going to be one second ahead of time. My guess is that, after years of dwindling profit margins and bullying from Microsoft, Dell is simply trying to keep an open mind about future markets. It is possible that Ubuntu might become a mass market alternative to Windows in the future.

I do not agree with the many Linux aficionados who celebrate Dell’s Ubuntu offerings. I think Dell’s move may actually be detrimental to Linux in general. I would much rather Linux be associated with high quality machines built by caring people, targeted at customers who really want them and are willing to pay for them, such as the machines offered by system76 or ZaReason. Dell has no direct market with its Linux PCs and it will likely not be very profitable for anyone involved, including Dell and Canonical. Frankly, I think that, to succeed, Canonical needs to set up partnerships with smaller companies like system76 to offer Ubuntu on high quality machines with limited hardware sets. The Linux sales model should not mimic Microsoft’s. In fact, for wide adoption, Canonical’s strategy needs to be friendly with proprietary hardware and software vendors and much more vertical, much like Apple’s strategy with the Mac.

Are iPhone Owners Just Religious Zealots?
https://joshrhoderick.wordpress.com/2010/07/23/are-iphone-owners-just-religious-zealots/
Sat, 24 Jul 2010 00:38:07 +0000

Alexis Madrigal of The Atlantic can’t understand why Antennagate didn’t destroy Apple and send us all out to buy a new Droid X. Given no better explanation, Madrigal presumes that it must be due to our religious devotion to Apple. In other words, Apple sold 8.4 million iPhones in the third quarter of 2010 because we all believe Steve Jobs is Jesus and Bill Gates is Satan. I mean — hello? — it’s obvious.

According to Madrigal’s logic, our only interest in Apple’s products is as icons of our devotion. Similarly, we want nothing more than to prostrate ourselves and worship at the altar of Steve. We are adherents of the “religious narrative,” as Madrigal quotes, dictated to us by our congregation. Just look at all of those crazy people lining up outside Apple stores to buy overpriced calculators. I mean, really, who would do such a thing if he weren’t a delusional zealot suffering from product deification?

To support his assumptions, Madrigal cites two academics whose study, The Cult of Macintosh, was published in 2005, two years before the iPhone went on sale. In the early days of Apple’s comeback from the dead, Apple’s zany devotees were a proportionately larger group than they are today, and the study was not unique in pointing out some of the wackiest outliers from this group. But in a world in which Apple’s customers, who own more than just Macs, number in the hundreds of millions worldwide, these stereotypes just don’t apply anymore.

Madrigal quotes someone named “Thomspon,” whom he doesn’t identify — the quote appears to be misattributed from Heidi Campbell — as saying: “This resurrection myth, and the belief in the infallibility of Mac technologies [sic?] is going to keep people still invested.” The “resurrection myth” is referring to the return of Steve Jobs to save Apple. Sheesh. “Mac technologies”? That’s a clunky and inaccurate term, but I’ll skip it. At least she didn’t say “MAC technologies.”

So, since it isn’t religious devotion, let me share with you the secret that keeps people invested in Apple: good products. It really is that simple. It’s easy to look at today’s computer and smartphone offerings and wonder why anyone would buy an Apple product. I mean, look at Windows 7: it’s so sexy and functional and has desktop search and that new taskbar thing. Surely no “Mac technologies” can compare to that? And the new Droid X! It does e-mail. It does the web. It has those app things people keep talking about. It does… other things. What’s so great about a Mac or an iPhone?

Consider for a moment how Apple built its brand loyalty. Look at those same markets just three years ago: remember Windows Vista? Remember how annoying it was to find that your 32-bit drivers for your crappy webcam wouldn’t work with your 64-bit version of Vista Ultimate Extreme Super (Not For Commercial Use) Minus Plus Student Edition? Remember bloatware? Remember recalls prompted by melting power bricks? Cramped trackpads with shoddy gesture support? These were the products that had most consumers chanting “I hate computers” like a mantra.

In response to the rock-bottom-prices crapfest, Apple provided something that disrupted these low expectations and consumers loved it. Of course Apple didn’t always get it right, but they got it right 90% of the time, whereas most of their competitors got it right 30% of the time or less. Consumers weren’t buying Macs because of a resurrection myth with Steve Jobs as the messiah, they were buying them because they were fed up with the commodity junk market that was the PC. They wanted something nice that just worked.

Remember when the “smartphone” was the Blackberry or the Dash or the Palm Treo? Yeah, me too. It sucked, right? Then Apple introduced the iPhone, a product that everyone said had no market and that dozens of manufacturers have since scrambled to copy and emulate. Yes, there were smartphones before the iPhone and some people used them — mostly business customers — but they sucked. Few people even knew what a “smartphone” was. The definition then was a phone that could receive e-mail and do a few tricks. Consumers didn’t buy iPhones because they believed Bill Gates was the embodiment of Satan, they bought them because there was nothing else like it on the market.

Apple haters just love it when a crappy product reaches feature parity with an Apple icon. As far as they’re concerned, that’s it, game over. If Apple’s product has x number of megapixels, and Product Y also has x number of megapixels, then Apple has lost, and you’re just a fanboy idiot for buying the Apple product. These people never consider the numerous unquantifiable variables that go into product design and marketing. Consider such features as:

tight product integration with existing consumer tools

an almost religious zeal in regards to a refined user interface and a smooth user experience

an emphasis on simplicity

attention to detail

helpful customer support

elegant and innovative hardware design

a respect for the difficulties of human-computer interaction and how these affect the product once it is in the user’s hands

These esoteric, unglorified features are the real reason Apple so often dominates their competition, not the spec sheet, the logo, or the religious devotion.

Most of Apple’s customers are unable to explain why they love Apple products so much, but this isn’t because they are stupid, or because they are religious zealots, or even because Apple’s products are “magical,” it’s because these minor but critical details are difficult to explain without a language that most people don’t possess. If you ask an average customer how his iPhone functions with respect to Fitts’s Law, he is probably going to shrug and say, “I don’t know, but I like it.” That is the way it should be. And that is exactly why people like Madrigal are unable to understand why Apple is successful. They see people enjoying Apple products and are unable to quantify the reason in terms of a spec sheet, so they resort to asinine explanations like cult behavior.

And if you’re just chomping at the bit to tell me that some people buy Apple stuff just because it has the Apple logo on it, I’ll be the first to admit this is true. But this is the result of, not the cause of, Apple’s customer loyalty. Most people know that Apple has a reputation for excellent products and they trust Apple to deliver where many others fail. So, even without knowing anything about the product itself, consumers feel comfortable with the purchase. If Apple failed to deliver on its promise to customers, this reputation would deteriorate quickly.

Antennagate was perceived by some as Apple failing its customers, so it confuses people like Madrigal and Dan Lyons that consumers are still buying iPhones. The answer is simple: when you have a track record like Apple’s, it takes more than one humiliating, media-hyped failure to dislodge consumers’ perceptions. People trust Apple, even if they make a mistake every now and then. It’s like Darth Vader said to the Apple representative: what are you going to do? Buy an EVO? Most competing smartphones still regularly fail to deliver on many of the details that Apple has been getting right since 2007. That isn’t religion, it’s rational consumer behavior, and it is perfectly explainable for those willing to look beyond whiny accusations of “cult behavior.”

I am not religious about Apple products, I just don’t like wasting my time getting shit to work. I have a busy life. I don’t like headaches. I don’t want to scan through a thousand deeply embedded, poorly described menu items just to get my music player to do something as simple as remember where I was in my audiobook on sync. I just want to plug it in and go. When it comes to Apple products, I trust that they have gone through their stuff with a fine-toothed comb and worked out most of the kinks. I know that, for example, when I updated my iPod Touch to iOS 4, nothing was going to break. And, if something did break, I knew whom to call to get it fixed.

Apple does make mistakes. I don’t know anyone who believes Apple to be “infallible,” as Madrigal claims. I have mentioned Apple’s failures many times before, especially with regard to Apple’s horrible track record with mice. But, by and large, I trust Apple to make the right decision. When Apple fails repeatedly, I go elsewhere. For example, I don’t use any Apple-made mice. Logitech makes fine mice. I don’t worship Logitech. I don’t have a religious narrative with Logitech. They make good mice and I buy them. (But their drivers usually suck, so I use something else.)

Let’s look at another quote from Madrigal’s article:

…the iPhone does mean something, and it’s the type of meaning that transcends rational optimizing about features and raw performance. “Apple weathered the storm because there is such brand loyalty through the religious narrative,” Campbell maintained. “When you’re buying into Mac, you’re buying into an ideology. You’re buying into a community.”

When I first bought Apple products, I never bought into any ideology or community. I didn’t receive a membership card or get greeted at the door. I just bought a piece of metal, silicon, plastic, and glass. I took it home, I enjoyed it, and I decided to buy more of the stuff that makes me happy. If you call that “buying into an ideology,” maybe you’re right: I bought into the ideology that products should work for me, not against me. I bought into the ideology that I have the right to be lazy and stupid if I want to. I bought into the ideology that attention to detail paid off. Most importantly, I admitted to myself and to the market that I was willing to pay for high quality products.

This is the new model for most of Apple’s customers. Gone are the golden days described so lovingly in Cult of Mac. Seriously, it’s time to quit making excuses about Apple’s success. A few hundred die-hard cult fanboys does not a multi-billion dollar company make.

The Politics of Antennagate
https://joshrhoderick.wordpress.com/2010/07/20/the-politics-of-antennagate/
Tue, 20 Jul 2010 15:45:01 +0000

The iPhone 4 has antenna problems. But you knew that, right? I think the clamorous hordes of tech bloggers have beaten that into our heads by now, like Gallagher pounding a watermelon into juicy chunks while we in the audience cower under the plastic sheet to keep from ruining our favorite ThinkGeek T-shirt. A few weeks ago, just as the iPhone 4’s antenna problem was surfacing, it wasn’t clear if this was a design flaw, a manufacturing defect, or something else altogether. It turns out that the attenuating effect is either a minor design flaw or a design tradeoff, depending upon whom you’re speaking to.

The Apple haters said that the antenna was defective, that Apple was evil, and that you should just buy an Android phone instead. They were having a jolly time poking fun at Apple. Their joy was warranted. After all, Apple is known for its quality products, and a serious Apple fail is big news. On the flip side of the coin, Apple’s devotees said that it wasn’t a big deal, that all phones have this problem, and that the whole thing was being blown out of proportion.

Soon, however, the jovial anti-Apple crowd, emboldened by their self-righteousness and their insatiable schadenfreude, became more vociferous and acerbic. This wasn’t just a funny Apple goof anymore, it was Apple’s Waterloo. Once the fatwa had been issued by the partisans at Gizmodo and elsewhere, lawsuits commenced, threats were issued, the blogosphere erupted with angry Apple haters bleating insults at the “iSheep”, and craziness ensued.

Don’t get me wrong. I am pretty annoyed with Apple over the whole thing, not because Apple made a mistake — anyone who has used a mouse made by Apple knows that Apple makes mistakes — but because the official responses from Apple were so tactless and absurd that, at first, I honestly believed they were hoaxes. There was Jobs’s dismissive e-mail response, which spawned the “Don’t hold it that way” meme. Then there was the official letter from Apple which promised a “fix” that would make the bars bigger. Oh, and by the way, the iPhone’s signal strength display is “totally wrong” and has been since the beginning so, even if you weren’t being duped now, you were being duped for the past three years. Sorry! Eventually, Apple’s bumbling culminated in a hastily announced press conference during which they admitted to the issue, downplayed its significance, and offered free cases to iPhone buyers.

The whole thing sounds like something straight out of The Onion. Apple’s absurd handling of the issue was deservedly excoriated in the media. But what amazes me most about Antennagate is how closely it resembles American politics.

Take the name itself: “Antennagate.” The “gate” suffix is almost entirely exclusive to the realm of American political scandals. Aside from Antennagate, I can’t think of a single tech scandal in which the “gate” appellation was applied as liberally as it has been here. Even the SCO controversy or Microsoft’s monumental failures with Vista or the Xbox’s Red Ring of Death were spared the indignity. And surely no tech controversy has achieved such a level of partisan rancor as has this one. Even the most heated debates within the factional and highly opinionated open source community — Mono, for example — are weighted heavily to one side and quite civil by comparison.

In the United States, we don’t have the type of political discourse that Thucydides would recognize. I would describe that classical form of politics as “the tactful maneuvering against one’s political opponents using logical deduction, reason, and carefully crafted oratory.” Rather, we have a circus act orchestrated by two clowns, the Republicans and the Democrats, each of which is more interested in bonking the other over the head with socks filled with loose change than with solving, or even rationally discussing, real problems. In America, politics is about competition, not ideas. It’s a reality show, funded by taxpayers and lobbyists, scripted to keep us enraged and disgusted while America’s real enemies quietly pick our pockets.

Antennagate has had a similar effect. Instead of a discussion of the merits of Apple’s design choices, the mob quickly disintegrated into distinct parties, each shouting the other down. Like the Democrats did to Bush and the Republicans are now doing to Obama, the Apple haters were hoping that by throwing as much crap as possible at Apple, something would eventually stick. It doesn’t matter how extensive the problem is, how much consumers care about it, or even if Apple’s competitors face the same difficulties. The Apple haters’ goal was never justice for Apple’s customers, but victory for themselves (and probably for Android by proxy).

Also like in American politics, the cacophony eventually bubbled out of the cloistered inner circles and into the public, leaving people confused and cynical. In politics, voters often don’t know whom to believe, so they believe whoever is shouting the loudest, simplest, most oft-repeated phrases. The same has happened here with Antennagate: much of the public now believes that the iPhone 4’s antenna is severely crippled. Facts and nuances are irrelevant. In an attempt to point out the hypocrisy, John Gruber has recently been posting numerous examples of other phones showing the attenuation problem but, as with politics, it won’t matter. Perception is reality, and Apple has already lost the battle for consumers’ perception of the iPhone 4’s antenna.

I won’t be making any dire predictions about recalls or market failure for the iPhone 4 because that is a bunch of nonsense. The iPhone still represents the best smartphone on the market and most consumers know this, even as the Android mob insults its potential customer base and brags about its own superiority. My point isn’t that Apple has failed with the iPhone, it’s that the discussion has become so political that it is driven more by partisanship than by market forces. The market forces, which will likely continue to favor Apple, are largely irrelevant, just as law and policy are often irrelevant in American politics.

The iPhone antenna problem has become the political equivalent of gay marriage. Rather than open, honest debate about the issue, the public sphere descended into two camps, each promoting its own brand of lunacy. At first, this was a discussion about Apple, mobile phone design, and how this affects consumers in the real world. Some smart people chimed in on the problem using terms like “detuning” and “attenuation” but, over time, it turned into a pissing match in which factions like Gizmodo seemed interested only in turning it into a public defeat for Apple.

Had Jobs at least pretended to be concerned in the beginning, the whole thing may have remained a technical discussion rather than a vitriolic political circus headed up by Michael Arrington flying the trapeze in a leotard and Jason Chen juggling stolen prototypes while riding a unicycle. But now the Apple haters feel more self-righteous and more inspired than ever. As with American politics, even a minor win can embolden the most ill-conceived of factions. Although most people still plan to buy an iPhone 4, myself included, I do know of one individual — a normal consumer, not a Gizmodo or TechCrunch reader — who was scared off by the antenna problem. No, he didn’t buy an EVO or a Droid X, he bought an iPhone 3GS. That too is the nature of politics: the public, when subject to demagoguery, is unpredictable.

Outrage at AT&T’s Femtocell Policy is Stupid
https://joshrhoderick.wordpress.com/2010/06/17/outrage-at-atts-femtocell-policy-is-stupid/
Fri, 18 Jun 2010 03:20:54 +0000

It’s amazing how intertwined AT&T and Apple have become in the past few years. When AT&T announced their tiered data plans, judging by teh innernets, you’d think that Apple had ordered the destruction of Alderaan. Many of the major tech blogs were filled with foaming rants claiming that Apple had conspired with AT&T to introduce FaceTime and then snag its helpless customers on overage fees, forgetting that FaceTime is WiFi only for now. Right, as if Apple had to steal from its customers to make shitloads of cash.

We also saw hundreds claiming that they were ditching AT&T and going to Big Red, as if the grass were any greener over there. If the chatter is any indication, Verizon will roll out its new tiered data plans later this year.

Just today, someone made the observation that AT&T counts femtocell usage against the customer’s regular voice and data plans. And, as usual, indignant outrage ensued. AT&T’s femtocell, the MicroCell, is described by AT&T thus:

AT&T 3G MicroCell acts like a mini cellular tower in your home or small business environment. It connects to AT&T’s network via your existing broadband Internet service (such as DSL or cable) and is designed to support up to four simultaneous users in a home or small business setting. With AT&T 3G MicroCell, you receive improved cellular signal performance for both voice calls and cellular data applications like picture messaging and surfing the Web.

Sounds nice. While there are no monthly fees, you do have to pay for the device itself, and its use counts against your voice and data plan. At first glance, that seems pretty damn greedy: data is data, and if it is travelling over the customer’s pipes, it doesn’t make much sense to charge for it. According to AT&T, via Business Insider:

AT&T explains the practice by saying there is a cost to handle the data transmission once it hits AT&T’s network, after it goes through your broadband pipe. (Likewise, it charges you for the voice minutes that you use over the Micro-Cell. But that’s a different service.)

I’ll just have to take their word for it until someone smarter says otherwise, but what interests me is that in none of the articles I have read about this today did anyone bother to look into what AT&T’s competition was doing. Turns out, they all do the same thing. Verizon follows the same policy with its femtocell, the Network Extender. Sprint’s femtocell, the AIRAVE, works this way too unless you purchase an unlimited plan to go along with it, and Sprint charges an additional monthly fee.

Judging by MobileCrunch’s poor reporting of the story, you would think that no other carrier would dare to count customers’ femtocell usage against their voice and data plans. There is no mention of any of AT&T’s competitors, nor any serious thought given to why this might be the case.

I understand why some of AT&T’s customers are frustrated. Even knowing that one’s contract with a wireless carrier has never included a guarantee of service — “where available” has always applied — the frustration isn’t unwarranted. People buy a wireless phone on the assumption that they will have wireless service, and the carriers aren’t always honest about the quality of service a customer can expect. Of course, without service, the phone and the plan are worthless. But the ignorance of the Androidiots seriously makes me laugh. Just take a look at this cute little comment from a reader on the MobileCrunch article linked above:

Just another reason I went ANDROID.

There were a handful of comments to this effect, extolling Android for… what? What the hell does any of this have to do with Android? The MicroCell can be used by any of AT&T’s 3G devices on postpaid accounts, including Android phones. It’s amazing how the most uninformed are always the evangelists. They don’t even bother to read the article, they just assume that by attacking AT&T they are attacking Apple by proxy.

Because it is the sole iPhone provider in the U.S., AT&T receives an enormous amount of scrutiny that the other carriers do not. What many don’t realize is that none of the wireless carriers are angels. I can’t wait until Verizon gets the iPhone so this ridiculous demonization of AT&T stops. Just wait until Verizon sinks its teeth into its new customers.

UPDATE 06/18/2010: Fixed a few typos, made some clarifications, and removed some of my late-night gratuitous snarkiness.

Thoughts on Apple: A Switcher Story
https://joshrhoderick.wordpress.com/2010/06/15/thoughts-on-apple-a-switcher-story/
Wed, 16 Jun 2010 01:49:57 +0000

Now that the mobile titans are battling for market-share and mind-share, it seems that the golden days of the Mac switcher story are pretty much over. The future of computing lies not in the PC or the Mac, but in the more abstract mobile platforms that power the iPhone, the iPad, and their counterparts. So, before the fine art of the Mac switcher story becomes extinct, I’d like to contribute my own. But, rather than just write about how I came by a Mac, I will also write about why my story is as much about Apple itself as it is about becoming a Mac user.

Vista, Linux, or Mac?

I was late to the party. Very late. Prior to 2008, I owned no Apple products, not even the venerable iPod. It isn’t that I was a luddite. I was working on a degree in computer science and I had considerable programming experience. And I wasn’t anti-Apple. I just had no interest in an iPod, an iPhone, a Mac, or anything else made by Apple. My music was on CDs, my phone wasn’t smart, and my custom built PC dual-booted Linux and Windows. Thus, in a world full of people who either adored or loathed Apple, I was honestly apathetic toward the company. That was about to change.

In 2008, my wife’s two-year-old Dell laptop choked. Its battery failed to hold a charge, the integrated LAN port had died, the display had developed a flicker, and a few of the special keys weren’t so special anymore. But the problem that eclipsed all others was its performance: it had slowed to a crawl, as if the mere suggestion of opening Firefox caused the thing to sputter and gag and fall to its knees. It was virtually unusable, even for my wife’s meager computing needs.

Like a seasoned Windows power user, I first attempted the standard remedies: complete anti-virus and spyware sweeps, registry cleanup, cruft removal, defragging, disk analysis, and Windows updates. When none of this worked, I went for the nuclear option. I formatted the hard drive and re-installed Windows.

Sadly, there was little improvement. It was as if the hardware itself had given up on life.

Next, I wiped out Windows and installed Xubuntu, the lightweight version of Ubuntu targeted at lower-end systems. Xubuntu worked well for a few days. It wasn’t perfect, but the speed was acceptable, and my wife was browsing the web and checking her e-mail without complaints. But then a few major bugs started giving her headaches. I don’t know if these bugs originated in the Linux kernel, in X.org, in XFCE, or in some other esoteric component. In any case, the laptop would not sleep or wake reliably. Even after closing the lid, it would often run at 100% CPU consumption for hours, overheating and causing all sorts of havoc. During these episodes, the power brick felt like a chunk of pyroclastic rock recently ejected from Eyjafjallajökull, and I feared it was a fire hazard. Also, it seemed that the sound system wasn’t exactly stable, as the Dell screeched and belched every time there was a confirmation “ding” or an accusatory “dong” from the UI. Finally, the virtual workspaces went wild and switched themselves randomly as if the machine were possessed by a productivity-obsessed demon. I spent hours Googling for fixes. I perused forums, blogs, and wikis. Despite my efforts, I gave up. I was exhausted. And it was time for a new laptop.

At the time, I was quite disillusioned with laptops. They all seemed to suck. I hated the small, cramped trackpads, the heat problems, their lack of serviceability, their creaky design, and batteries that — if they didn’t burn your house down — often required replacing after just a year. So, at my urging, my wife and I decided that we were in the market for a desktop rather than another laptop.

While shopping, we walked by the Apple kiosk at Best Buy — that tiny alcove of shiny glass and aluminum amongst a sea of black and gray mottled plastic — but we immediately turned away when we glimpsed the price tags. Compared to a similar Windows machine, the Macs’ specs were weak, their list of features small and unimpressive, and they were still more than double the price. What kind of racket was Apple running, anyway?

But by this time, the summer of 2008, another variable had entered the equation: Windows Vista. Despite having mostly migrated to Linux the year before, I was still shackled to Microsoft by the demands of work and the occasional game of Company of Heroes, and I had been using Vista on my dual-boot machine. Vista was a constant frustration that I loathed even more than its hideously ugly predecessor, Windows XP.

So we were in a bind. We could choose a system running XP (an old evil), Vista (a new evil), or go for a custom build with another Linux install. The biggest hurdle with Linux was instability. As a Linux user, I knew that each kernel update brought fixes for old problems and introduced new ones. Regressions were common, and troubleshooting Linux’s weaknesses sometimes teetered on being a part-time job. I was used to it — actually, I had convinced myself that I enjoyed it — but I was not about to put my wife in that situation. She needed something simple. She needed something that didn’t require a lot of tweaking. So, despite the price, we decided to take another look at the Macs.

As I considered buying a Mac, I was attempting to rationalize the purchase in a way that justified the increased price. Having never used a Mac before, I had to resort to something other than personal experience, so I fell back to what I knew.

I knew that Mac OS X had its roots somewhere in FreeBSD. Being a *nix geek, I found that a huge selling point. I also knew that Apple had played a role in a few open source projects like CUPS and WebKit. I never suffered the delusion that Apple was an “open” company, but I wasn’t an open source zealot so I didn’t much care. I used Linux because I hated Windows, not because I was making a political statement.

And there was another, more embarrassing, factor in my decision: one of my earliest computers was an Apple IIe. I had adored that machine. Along with an early model IBM PC AT, the Apple IIe was a computer on which I had spent many long hours programming text-based fantasy games that I had lovingly referred to as my “RPGs.” Later, once I had figured out graphics mode, I had written my magnum opus, a custom version of Pong, in BASIC on an Apple IIe. Yes, I loved that machine, and those fond memories definitely softened me to the idea of buying Apple’s new machines, even if they were a pale shadow of the computers of yore.

Granted, these are all poor reasons to drop an enormous amount of cash on what appeared to be a childish imitation of a computer. I mean, come on, this thing — this iMac — was all one piece. How did you swap out the hard drive? Replace the monitor? The graphics card? Why was there no physical button to eject a DVD? And that Mighty Mouse thing? Utter rubbish! (And it remains rubbish. Despite their amazing track record on everything else, Apple has yet to produce a usable mouse.)

Ultimately, my disgust with Windows and my lack of faith in Linux left me no alternatives. It was either buy a Mac or go back to Microsoft, so I decided on the lesser of two evils and we went home with a base model 20″ iMac. Its weak specs were still far above what my wife required, and I was happy knowing that by avoiding their latest OS abomination I had smitten Microsoft.

I admit that I was a little embarrassed while carrying our new iMac out to the car. I didn’t want people to think that this thing, delicately packaged in an effeminate white box with a convenient handle, was for me. I was a programmer, a geek who liked code obfuscation contests and technical manuals and C++ GUI programming guides. I wanted to shout to the passersby, “This is for my wife. I’m just tired of servicing crappy machines and I’m looking for a break. Don’t judge me.” Who could blame me? Really, who?

This Isn’t So Bad

At home, we booted the machine and began exploring. We played with iTunes, Spotlight, Exposé, and Spaces. As a Linux user, I knew the latter three under different monikers, but all were impressive to see on a mainstream machine. We snapped some cute photos using the iSight camera. We browsed the Internet and glanced at iPhoto and iMovie. I then opened a Terminal window and tried out some of my trusty bash commands.

I was impressed. The iMac was as snappy as my custom Linux box. The UI was simple and intuitive. Every keystroke, every icon, and every menu item had a clear, defined purpose. No detail had been overlooked. My wife was comfortable with it immediately. No waiting. No tweaking. No crashing. No Googling. No man pages to read. It was nice, even though I hesitated to admit it at the time.

To be fair, we were annoyed by a few of the Mac’s quirks. For one, the mouse didn’t feel right. Despite having the sensitivity jacked up to the max, the mouse felt strangely unresponsive, as if I were mousing through molasses. It made my arm and wrist sore. I eventually discovered that the problem was with OS X’s acceleration curve. To remedy the problem, I installed USB Overdrive (which also happens to be the best set of mouse drivers I have ever used on any platform). In my opinion, this is one of Apple’s largest UI failures and it remains unresolved.

I was also annoyed with the Mac UI’s tacit policy regarding “click-through.” (Isn’t it odd that almost everything that intensely annoys me about Macs is somehow related to the mouse?) I hate having to click a window just to bring an application into focus when I can quite easily use a single click to both perform a task and bring the application into focus. John Gruber has written extensively (and eloquently) about click-through, but I disagree with him. Gruber thinks that people are confused when, after clicking to bring a background window into focus, the click inadvertently performs an additional action. In practice, I have never seen a Windows user confused by this, and the explanation is simple: people are able to logically conclude that the action of clicking has consequences. Click a button, get a response. That’s a good UI principle to adhere to, regardless of where the window is positioned. Most Windows users are quickly trained to click in empty space in background windows to avoid inadvertent actions. Sure, it isn’t elegant, but I believe it makes for a more responsive UI.

Also, to make the “it confuses people” argument stick, the Mac would first have to address the common confusion caused by a windowless application having focus. For example, if I have both Safari and iTunes running, it is quite common to close the last Safari window and see iTunes occupying my desktop, yet focus remains with Safari until I explicitly click on the iTunes window. That’s just stupid.

Ultimately, despite its quirks and its limitation to Apple hardware, I decided that Mac OS X was the best choice in a very short list of choices for desktop operating systems. My wife was pleased and I was pleased.

Enter the MacBook

A few months later the new aluminum unibody MacBooks were released. Despite my disgust with laptops, my wife’s iMac had impressed me so much that I was curious to see how a MacBook would fare. For me, the most enticing feature was the new trackpad: a giant, beautiful, solid piece of glass that fully supported gestures. Sure, it wasn’t the first trackpad to support gestures, but it was the first that I found usable. After trying it out at the local Best Buy, I proclaimed that it was the best trackpad I had ever used, a statement that I still believe to be true today.

So I ordered myself a 13″ MacBook.

That semester I was taking an advanced Unix programming class. It was the standard Unix system stuff: file I/O in C, fork, wait, exec, etc. Our final class assignment was to build a rudimentary web server to demonstrate our prowess with Unix IPC. I typically did my programming in Linux, under Cygwin on Windows, or via SSH to the university’s Solaris server, but I had accidentally mangled GRUB somehow and my PC was unbootable. Sure, I could fix it in an hour or so — I just needed to figure out how to restore GRUB and pray that I didn’t destroy anything else in the process — but I was frustrated and hurried, so I decided to try programming on the MacBook instead.

Surprisingly, everything worked as well as it had in Linux. Even the more complex programs that failed on Cygwin ran flawlessly on the Mac. I was shocked. Here was a machine that my wife could use to buy and sync music with iTunes, and yet I too, the classic computer nerd, could use the very same machine to code, test, and debug a web server from a native command-line interface running vi. That was damn cool, and I was no longer embarrassed to admit it.

Over the next few months, it became obvious that my wife liked my MacBook far more than the iMac we had purchased for her. It turns out that she preferred mobility to screen size. Since her old Dell had to be constantly plugged in, she hadn’t realized that a MacBook would last for hours on a charge. Now that she was untethered, she wanted one. I inherited her iMac after my wife commandeered the MacBook, but it worked out well because I was actually beginning to prefer the iMac to my own, custom-built Linux/Vista PC.

Although my PC had more than double the specs of the iMac, it was significantly less usable. The iMac and MacBook both slept and woke within a few seconds every single time. Not once did either crash when waking from sleep. Both Linux and Vista regularly failed to wake from a low power state and, despite hours of troubleshooting, I was never able to fix it. Furthermore, my PC sounded like a hurricane. Even with the fan speeds turned down as low as possible, the noise was annoying. In comparison, the iMac and MacBook were whisper quiet, so quiet in fact that I couldn’t tell by ear whether they were on or off.

But the OS was the most important upgrade.

It wasn’t until I saw how elegant and graceful an OS could be that I actually became discriminating enough to be annoyed by Linux’s warts. After being spoiled by Mac OS X, I found it increasingly difficult to be satisfied by the “good enough” creed by which I had lived for so long. Linux began to feel like OS X’s temperamental, half-baked doppelganger. If it weren’t for the hardware lock-in and Apple’s strict control of the OS, I would have shed a tear for the discovery of my dream OS. (Although a convincing argument can be made that Mac OS X is so great precisely because of the hardware lock-in and Apple’s strict control.)

Linux does give the user complete control over his system but, as Uncle Ben said, “With great power comes great responsibility.” Linux demands a lot of its users. It is a tangled web of independently developed components. It is so modular and flexible that the system sometimes breaks down at the seams. As I got older and had less free time to spend tinkering, I was finding it more difficult to justify spending long hours massaging Linux into compliance. Despite wanting to maintain my self-image as a geek (how incongruous is that?), I was secretly loving and using OS X for its simple usability. OS X is the antithesis to Linux: everything is designed to fit together perfectly, yet the perfect fit comes at the cost of flexibility. Whereas Linux can be one of a billion different things, a Mac is a Mac and, if you don’t like it, you can install Linux.

Then, of course, there is the aesthetic charm of Apple’s products. For many years, I thought Macs were the ugliest computers on the market. But once the Mac’s primary exterior components were made of aluminum and glass, I was lovestruck. The Mac, iPod, and the iPhone were stunning: simple, elegant, symmetrical, devoid of embossed plastic, markings, or unnecessary lines, rarely hinting at the raw industrial power that lay just millimeters beneath. My days of yearning for the futuristic devices seen on Star Trek: The Next Generation were finally over. Apple was now making them.

In which I actually pay for software

It took a few weeks but I eventually found alternatives to my favorite Linux and Windows software. And, in most cases, the Mac versions were superior to their counterparts. The only downside to this was that I found myself shelling out cash for software that I would never have paid for on Windows or Linux. This is a significant difference between the platforms. As a Windows user (or a Linux user), I was accustomed to crappy software. When people asked me to pay for it, I was offended: “Pay for it? For your crappy UI, your constant crashes, your spelling mistakes, and your restrictive license? No thanks.”

In the world of Windows, there wasn’t much worth paying for. I paid only for Microsoft Office, Adobe Photoshop, and games. These were worth the price. Everything else in the Windows-verse was bloatware, nagware, spyware, or shareware. If you could get past the horrible user interfaces and the instability, things sometimes worked just enough for you to get the job done. And this was the way of things.

This is not so on the Mac. Yes, there is crappy software in the Mac ecosystem too, but the good software is far better than anything I ever used on Windows or Linux. Applications like TextMate, BBEdit, PathFinder, Delicious Library, Things, ScreenFlow, and even iWork exceed even their closest Windows or Linux incarnations. But Mac software is rarely free, and this is no coincidence. Good software requires paying customers, and Mac owners are willing to pay for quality.

Addicted to Quality

It has been almost two years since I bought my first Mac. Like many of Apple’s customers, once I bought one Apple product, I eventually bought more. This is not just because of the quality, but also because Apple’s stuff is designed to work together. If you have one Apple product, adding a second one almost always extends the first in some way.

Paying a premium for Apple products hurts, but that pain fades quickly. On the other hand, buying cheaper, junky products often results in a device that is lackluster and underutilized, and that imparts buyer’s remorse. Years before buying a Mac or an iPod, I purchased a SanDisk Sansa MP3 player. Feature for feature, it was “better” than the iPod. But the Sansa’s software was awful, and it never became a part of my daily life. Syncing it felt like more of a chore than an upgrade and, eventually, it ended up packed away in my graveyard of obsolete gadgets. None of my Apple devices ever ended their life in that graveyard.

At no point during those intervening years did I decide to become an Apple fanboy. I was just so happy with each purchase that I became spoiled. Spoiled by quality. Spoiled by good design. Spoiled by simplicity and attention to detail. Eventually, I began demanding these qualities from everything I bought, not just computers and electronic devices. For me, the days of “good enough” have come to an end. Apple has made me a more demanding consumer.

I think this is a good trend, especially for the electronics industry, where companies thoughtlessly cram “features” into their products. These features are often unimpressive, half-baked shams, or even outright lies. Their purpose is not to deliver value to the customer, but to deliver the illusion of value to set their widget apart from their competitor’s nearly identical widget.

In the tech media, Apple products are often compared to these generic feature-stuffed products and found wanting. Some are convinced only by specs and benchmarks. Bigger numbers are always better. These consumers aren’t interested in any form of value other than those that can be directly quantified. Intangible features like the user experience are simply not factored into their equation, so they believe Apple products are less valuable and Apple’s customers are simply fanboys with no rational logic behind their purchases other than the desire to flaunt the Apple logo.

Take the Mac for example. Can you buy a computer for less than a Mac? Of course. Can it do everything a Mac can do? Absolutely. The discrepancy here isn’t function. You can buy any number of cheap computers that do virtually everything expected of a Mac. The discrepancy is with user satisfaction. Mac users are a specific type of consumer and they hate buyer’s remorse. They are willing to pay extra for the comfort associated with knowing that they will be satisfied with their purchase.

Where to Go From Here

As of this writing, Apple has surpassed Microsoft to become the United States’ second largest company in terms of market capitalization, just behind ExxonMobil. What does this mean for Apple and its fans? Hopefully, not much. I hope Apple continues to do what it does best: deliver innovative, quality products to demanding consumers.

But there is another alternative. With Apple’s explosive growth has come FTC investigations, DoJ investigations, patent spats with Nokia and HTC, fierce competition from Google, and the swelling legions of Apple haters. Let’s face it: Apple has become “the man.” Apple is a bigger target now than it has ever been before. Even relatively obvious and innocuous business decisions, like supporting H.264/AVC over Ogg or VP8, are scrutinized and criticized ad infinitum by the scores of churlish tech bloggers who want to see Apple humiliated and defeated.

I am happy with Apple’s success. The market is ripe for the type of products Apple produces and Apple’s sales figures reflect this. But, with its enormous growth, Apple is asserting greater control over its products in unprecedented ways. They are circling the wagons. I am tempted to say that Apple is getting cocky, but Apple has always been cocky. That’s part of what makes Apple so unique. They do not shy away from, and rarely even respond to, criticism. They maintain an intense focus on their product line and carry on as if the world around them weren’t churning with a hurricane of anti-Apple animus.

So I am in the awkward position of both loving a company and being afraid of what it might become. Is it bad to be dependent upon a single company for all of my computing needs? Maybe. But, considering the importance of computing, isn’t it just as bad to be dependent upon a handful of companies? Logic tells me that putting all of my eggs in one basket is not a good thing, but logic also tells me that I should use what works rather than distribute my eggs into baskets I can’t carry just to prevent a potential, ideological threat. Besides, I currently use Google’s services quite heavily, including Gmail, Google Docs, Google Calendar, and Google Reader. Google has more of my data than Apple does, so why should I fear Apple?

Don’t get me wrong. I am concerned about some of Apple’s recent actions, primarily those that directly impact users. For example, remember Apple’s decision to block all device manufacturers from syncing with iTunes? While iTunes is indeed Apple’s property, the content managed within iTunes is owned or licensed by the consumer, and this is just a form of vendor lock-in. However, there is a good business case to be made against allowing other companies to pair their devices with iTunes. Why should Apple invest in a piece of software only to have their competitors use it against them? Since Palm competes directly with the iPhone, allowing Palm to do iTunes syncing would basically free Palm from the responsibility of building and maintaining software for its own platform. Like most of these discussions, the issue is not as dualistic as the throngs of choleric tech bloggers would have you believe.

All of the indignant, self-righteous fearmongering gushing forth from the blogosphere regarding Apple is so overblown that I find it difficult to take seriously. Ultimately, most of the complaints we hear are only tangentially relevant to the real argument these people are selling. They really want us to believe that Apple is evil, or, more specifically, that Apple is more evil than is Google or Microsoft or Nokia. But we must remember that, unlike Microsoft and Google and, to a lesser extent, Nokia, Apple has never had, nor does it have now, a monopoly on anything. The market is flush with competitors nipping at Apple’s heels in every market, yet Apple is always there, at the top of the market, siphoning off the most valuable consumers and leaving behind the dregs for everyone else to scrap over. Apple can’t do this because it has a monopoly, or because it absorbs its competitors, or because it manipulates the market, or even because its customers are so locked-in that they can’t flee. Apple can do this because people spend enormous amounts of money on Apple’s products.

Most of the bitching comes from people who, for whatever reason, think Apple’s lead in the marketplace obligates them to allow their competitors a handicap. Frankly, I think that is ludicrous. I want to see competition, and companies aren’t pressed to compete when they are given an unfair advantage just for being behind. If nothing else, Apple’s rise to dominance should be a clear sign that consumers are willing to pay for quality. As Mike Davidson said regarding Apple at the helm, this is a good problem to have.

In the two years since I first purchased a Mac, my entire perspective on computing, and even my behavior as a consumer, has been permanently altered. When I bought my first Mac, I never intended to buy anything else from Apple. I never intended even to like the damn thing. But, in time, I found out that Apple really does make awesome stuff. Their products aren’t perfect, as tradeoffs must always be made in engineering, but these sacrifices are almost always made with the user experience, not the marketing or accounting departments, in mind. If Apple makes a contentious decision, it is most often because they truly believe it improves the product.

After writing this, I have finally figured out why switcher stories are so enjoyable. Each story is an individual’s tale of doubt, chronic frustration, and annoyance, ending in a splendid orgasm of user satisfaction. Until another company can prove to me — not with prototypes or vaporware but with real, shipping products — that they too can sacrifice the old, race-to-the-bottom strategy and build quality products, then I will continue opening my wallet for Apple. It really is that simple.

Apple Haters, a Critical Analysis
https://joshrhoderick.wordpress.com/2010/05/12/apple-haters-a-critical-analysis/
Wed, 12 May 2010 18:45:03 +0000

Is it just me, or have the legions of Apple haters been growing lately?

I am not referring to folks who simply don’t care for Apple products or Apple’s business practices, but to the ranting lunatics who spend every waking moment attributing blame to Apple for some failing aspect of their lives. Apple’s continued success in the marketplace is their bane. They claim that Apple is popular only because Apple is popular. The more popular Apple becomes, the more they seethe with loathing and disgust.

Apple haters — or “iHaters” — are a smug bunch. (Yes, I said “smug.”) They are the bullies of the tech world. iHaters believe that people buy Apple products only because they expect some form of social gain: status, prestige, admiration, etc. They see no inherent value in Apple products. Furthermore, they claim that all of Apple’s contributions to society, culture, or business are merely tangential aberrations, explainable by marketing gimmicks and shiny surfaces.

iHaters exhaustively compare hardware specs against anything made by Apple because these specs are the sole method by which they understand value. You can’t argue with them about this. Their focus is numbers: processor speed, screen resolution, storage, the number of ports. If you try to point out other reasons to buy Apple products, you are deemed an irrational “fanboy.” Do you like a good user interface, consistency, or attention to detail? Fanboy. Do you enjoy tight product integration, technical or aesthetic? Fanboy. Do you prefer elegance over clutter? Fanboy. Fast, responsive software? Fanboy. Do you enjoy walk-in technical support, or excellent resale value? Fanboy.

So, what makes these people tick? Aside from a personality disorder, I mean?

iHaters remind me a lot of the tea-baggers. They display a lot of anger and frustration but they have no coherent message. Not a day passes without some new faux outrage that boils their blood and affirms their allegiance to the cause. It is all a lot of bellyaching at perceived injustices rather than real, tangible concerns.

iHaters think that their non-conformity somehow displays bravery or intelligence. They define themselves in the negative, as the opposite of whatever they deem to be popular. If you like a movie, they hate it. If you like reggae, they aren’t so keen on it. They are always looking to shock people into acknowledging their raw individuality.

They are terrified of being anonymous. When they see a crowd move in one direction, they instinctually turn the other. They do not want to be viewed as conformists, nor do they want to be associated with people whom they deem to be less intelligent than themselves. Such an association may cause them to lose social status among the hordes of other iHaters with whom they regularly meet to complain about Apple.

Don’t bother trying to debate with an iHater or even explain a simple misunderstanding. Objective facts are irrelevant to their thought process. It is their perception of facts that deserves primacy. Reality is to be considered only after one’s opinion has been solidified and reinforced with emotional buttresses.

But, most of all, iHaters secretly admire Apple and its products. Buried deep within them is a desire to buy and hug an iPad and possibly even to hump Steve Jobs’s leg like an adoring but lonely terrier. They aren’t even conscious of this desire, as it escapes only in microbursts during moments of emotional weakness. Yet, when they acknowledge these breaches, they over-correct by becoming even more viciously anti-Apple. They lash out at iPods, Objective-C, innocent consumers, or anyone unfortunate enough to be standing nearby.

Apple hatred is a feedback loop. They can’t relent even for a moment because they have never yet relented. If they were to acquiesce on any of their cherished anti-beliefs, they would become anonymous, a mere sheep in the herd. No, they will not succumb. They will not budge. They hate you for buying that MacBook. And they hate, hate, hate Apple.

Android’s Looming Image Problem
https://joshrhoderick.wordpress.com/2010/05/11/androids-looming-image-problem/
Tue, 11 May 2010 15:09:57 +0000

By now, many people are either firmly pro-iPhone or pro-Android and can explain why in a few snarky sentences. Few want to see both succeed. Many well-trafficked tech blogs often descend into pissing matches between mobile platform adherents.

I am somewhere in the middle. I think their joint success is critical to the future of mobile computing and, although they are competing now, they will ultimately serve two different markets. One of the most interesting aspects of this fight is that both represent a distinct paradigm of computing. They represent a choice we were never really given at the beginning of the PC era, a tragedy for which you can thank IBM and Microsoft.

After the iPhone single-handedly created the smartphone market, a lot of people took notice. Google and a handful of handset manufacturers realized that if they did not confront the iPhone and the App Store, Apple might eclipse them all, leaving each company scrapping within its own fiefdom over Apple’s leftovers, much like what happened with the PMP market. Most handset manufacturers could not independently develop a platform for this new class of devices and expect to catch up with Apple. No, they needed something that could grow quickly enough to offset Apple’s huge lead. And they needed an app store. And they needed it all yesterday. Thus the Open Handset Alliance (OHA) and Android were created.

Don’t let the name fool you. The OHA doesn’t care much about openness so much as they care about pumping out Android devices at a dizzying pace. The goal is to smother the iPhone or, at the very least, capture a large enough piece of the pie to remain relevant in spite of it. It isn’t just hardware sales and carrier contracts at stake, but income from the burgeoning apps market and the potentially explosive mobile ads market.

On one hand, we have the iPhone, a platform in which Apple controls the software, the hardware, the apps marketplace, and the distribution channel. All of the pieces can be moved at will, if necessary, to rapidly adapt to changes in the market. And Apple and the iPhone continue to lead the market — not in bean counting, as in the pixel for pixel hardware spec comparisons that most Android fans are so adamant about — but in its ability to rapidly harness and exploit the mobile platform.

In comparison, Android’s OS is controlled mostly by Google under the auspices of its (much-debated) openness, while the hardware is controlled by a myriad of competing stakeholders. That competition is great if you like bean-counting specs but not so great if you are trying to define the Android brand. Furthermore, the distribution channel is decentralized and largely controlled by the carriers. Google tried its hand at selling the Nexus One directly, bypassing carriers with a web-based model, but since the Nexus One was quickly eclipsed by other phones, that distribution model may not work as well as Google had hoped.

The result of this loose coupling is fragmentation. It is inevitable. Android’s fragmentation is usually described in terms of the various OS flavors (1.5, 1.6, 2.0, etc.) floating around on numerous devices and carriers, but there is a deeper problem. For example, look at HTC Sense. While it is possible that HTC wasn’t happy with the stock Android UI, which some considered subpar to the standard set by the iPhone, it is much more likely that HTC added a proprietary layer to Android to differentiate itself from its “partners” in the OHA. This is similar to how PC manufacturers like Dell, HP, Sony, and Lenovo “add value” by tacking proprietary UI components (such as the Dell Dock), special service buttons, and “bonus” software onto their products. In a commodity market where every PC looks pretty much the same as the next, it becomes especially important to lure consumers in with exclusive features.

One of the problems with HTC Sense is that it makes OS updates more difficult to distribute. Owners of HTC devices running Sense may be forced to wait longer than Nexus One owners for updates. This is particularly problematic for developers, who are asked to target various incompatible versions of the environment. And it is problematic for carriers, who must juggle the OS’s various flavors, a burden that, multiplied by the number of unique devices running those variants, grows combinatorially.

Sense presents another problem: it shows how Android itself may be relegated to just another spec. Verizon’s Droid was, for a short while, the icon of Android. Then it was the Nexus One. But with new phones being released so rapidly, consumers may soon forget exactly what Android is and begin to associate it with something as uninteresting as Windows Mobile. In other words, Android won’t be a feature so much as an expectation. Like Windows, everyone will know it, but few will love it.

Of course, a lack of platform enthusiasm among mainstream consumers is not a death sentence. Android will still capture a large segment of the market, but it will be capturing customers who just don’t care much about apps, which means less revenue for developers. Right now the Android market continues to grow, largely because the platform has promise; if it becomes blasé, the market may suffer.

Despite Google’s efforts to contain it, Android’s fragmentation will continue. It is simply too difficult to coordinate so many disparate interests (which is one reason desktop Linux remains the hodgepodge that it is). We will see a flurry of devices, great and small, powered by Android, in the hope that this alternative business model can achieve through sheer volume what Apple has already achieved by being first to market.

So here we have two completely different strategies. One is a monolithic approach with Apple at the epicenter, which gives Apple agility and delivers a polished, high-end product. The other is a fragmented, semi-ordered chaos from which will emerge everything from gems to rickety, plastic me-too devices, leaving consumers to sift through the lot. That does not bode well for Android. To avoid this fate, Android needs an iconic device — much like the Nexus One — sold on numerous carriers.

The beauty of this match-up is that both business models have merit (and faults). Despite what angry bloggers may say, neither approach is universally right or wrong. My guess is that both platforms will flourish but that the smartphone market will break along familiar lines. Whereas Apple will likely continue to lead and innovate in a higher-end niche market, Android will provide a cheaper, more ubiquitous alternative for those who don’t care much about smartphones but simply like to update their Facebook status in the waiting room at the doctor’s office.

How to Block Facebook’s “Like” Button
https://joshrhoderick.wordpress.com/2010/05/10/how-to-block-facebooks-like-button/
Mon, 10 May 2010 15:12:15 +0000

Facebook has continued to integrate itself into the web like a gangrenous infection. Are the “Like” button or other Facebook widgets getting in your way or running afoul of your company’s firewall? Fortunately, there are a number of extensions out there to help you.

You can also use an ad blocker to obstruct all things Facebook. For Firefox, Adblock Plus is the way to go. For Safari, install AdBlock from Apple’s Safari Extensions Gallery, or use the older Safari AdBlocker for pre-Safari 5 browsers without official extension support.

Just add the following custom rules to your ad blocker to block only the Facebook widgets and the “Like” button:
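Assuming Adblock Plus filter syntax, rules along these lines should work; the URL patterns below reflect where Facebook served its social widgets at the time and may need updating:

```
||facebook.com/plugins/*
||facebook.com/widgets/*
||connect.facebook.net^
```

The first two patterns block the iframes that host the “Like” button and other social plugins, and the third blocks the loader for Facebook’s JavaScript SDK, which many sites use to render the widgets.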