Thursday, March 3, 2011

I'm a big DIY nut. I admit, I don't have enough time to do any big projects on my own. My writing, coding and kids keep me pretty busy. But I love reading about other people's projects, and dreaming up things that I might try to do someday (when I have enough space, money and time).

As a result, I've been following the Instructables feed for several years now. At first it was a very cool mix of interesting food and technology projects. Things that really inspired the imagination, with clear step-by-step instructions on how to do it yourself.

Recently, however, I've noticed that a growing number of Instructables posts are either inane (e.g. "How to put on pants") or are simply someone showing off their own project without bothering to provide the step-by-step instructions. As a result, I've been growing more and more frustrated with the feed. And, I hate to admit it, but the time has come. Something must be done.

This is actually a symptom of a broader internet problem. Something similar happened with Twitter's trending topics. At first, they were a great way to keep track of geek news--new movies, new technology, new video games, whatever. If it was a trending topic, I was probably interested in it. Now, however, they're almost useless to me. The Twitter demographic has clearly shifted to a more mainstream audience, and I just can't bring myself to care about Justin Bieber or Ke$ha. Ironically, I was going to use the current trending topics to prove my point, but Blade Runner is currently #2, so maybe all hope is not lost.

Now the problem with Twitter and the problem with Instructables are not exactly the same. In Twitter's case, it's more of a lowest common denominator issue. For Instructables, that's part of the problem--more kids posting low-quality 'Ibles (not to pick on kids--I like to see them getting involved and trying new things--but if I see another "I'm 13 and this is my first 'Ible, please be nice" post that goes on to show me how to draw ligers using only blue and purple crayons, I may have to gouge out my own eyes). But there's also a ton of people posting "joke" 'Ibles (I use quotes, because they really aren't funny). This feels more like the comment troll problem, where a few individuals seem intent on entertaining themselves at everyone else's expense.

In both cases (and in the internet at large), there's still a lot of really cool stuff going on. It's just getting harder and harder to find it. For Instructables, it's become painfully clear that sucking on the main RSS fire hose is no longer the way to go. I'll give the Editor's Pick or Popular feed a try. Hopefully that will give me a more-curated experience. For Twitter, I'll carefully select the people I follow, and blissfully ignore trending topics. For the internet at large--who knows. We've been struggling with this issue for years now, and while the battlefield shifts around a bit, it really hasn't gotten any better.

We're still at the early days in this technology, where rampant growth and changes are the norm. The current hope is that social search will save us. I'm not so optimistic. Still, by the time the internet matures, we should have better tools for content discovery. But, who knows when (or if) that will occur.

Thursday, February 17, 2011

Ok, let me start by admitting that I don't watch Jeopardy. I don't even own a TV. I have, however, been following the epic man-vs-machine battle that played out this week, and I must say, I'm honestly surprised. Not that the computer won. That was inevitable. If not this year, then someday soon. No, I was surprised that most of the people who commented on Watson's victory completely missed the point.

If you look online (go ahead, you know you want to), you'll find a lot of people talking about how, of course, the computer won. They say its reaction time gave it an unfair advantage. That it could push its button faster, cutting out the human opponents. Or they talk about its database. Of course, if you load all that data into a machine, you'll be able to answer any trivia question with ease.

In both cases, the commenters are fixating on the minor details and missing the main point. Watson was able to parse natural language questions and come up with reasonable answers. That was the hard part. That was the key accomplishment. Natural language processing is unbelievably difficult (trust me, I worked in the NLP lab as an undergraduate). Button pushing and database access are trivial.
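To make that contrast concrete, here's a toy sketch (my own illustration, not anything resembling Watson's actual architecture). The "database access" the commenters worry about is a one-liner; the hard part is that a real Jeopardy clue never matches your stored key verbatim, so naive lookup gets you nothing:

```python
# Hypothetical toy example: the fact store and clue are made up.
facts = {"first US president": "George Washington"}

def lookup(key):
    # Trivial: exact-match "database access".
    return facts.get(key)

# Easy for the machine:
assert lookup("first US president") == "George Washington"

# But a real clue is a paraphrase wrapped in wordplay:
clue = "This 'Father of Our Country' didn't really chop down a cherry tree"
assert lookup(clue) is None  # exact matching answers nothing
```

Bridging that gap, i.e. mapping the clue onto the stored fact, is the natural language problem, and it's the part Watson actually solved.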

The interesting thing is, this shows our natural bias. As humans, parsing the question is easy, so easy we don't even think about it. Instead, we focus on the things that give us trouble. Do we know the answer? Can we beat our opponent to the buzzer? Those are the areas that concern us, so any perceived advantage in those areas seems grossly unfair. But, in doing this, we forget the first step. You must understand the question before you can answer it.

This just highlights the differences between humans and computers. Our brains and Watson's brain have vastly different areas of competence. And let's face it, our brains work very hard to make many tasks seem trivial (object recognition, natural language processing, etc.). Even the dumbest Wheel of Fortune contestant has more processing power inside their skull than Watson could ever dream of. But here's the thing. Computers can always add more processing power. Human brains--not so much. Once computers get as complicated as a human brain, then things really get interesting.

I will say, I am a bit more moved by Noam Chomsky's criticism of Watson, dismissing it as "a bigger steamroller". Chomsky claims that Watson doesn't really understand the questions. But, I'm not so sure. How do we measure understanding? It seems like this line of argument steps into a murky, metaphysical swamp, from which we can never escape.

Instead, I tend to agree with Kurzweil's comments (from the same article):

Kurzweil says that Chomsky’s “answers are so brief that it is difficult to understand what he is trying to say. I would say that Watson is clearly not yet ‘strong AI’, but it is an important step in that direction. It is the clearest demonstration I’ve seen of computers handling the subtleties of language including metaphors, puns and jokes, something people had said would not be possible. I don’t agree with Chomsky that Watson is not impressive in that regard. As long as AI has any flaws or limitations, people will jump on these. By the time that the set of these limitations is nil, AI will have long since surpassed unaided human intelligence.”

Wednesday, January 12, 2011

For the last 6 months, I've had the good fortune to work on a number of awesome projects. Things that I would love to show to other people and brag about, but I always hesitate because they aren't unconditionally awesome--and they aren't all mine.

As part of my contracting gig, I've jumped into a number of projects that were about 75% finished. My job was to polish them up and get them ready for the app store. While I'm really proud of the work I've done, there are several lingering problems that I really wish I could go back and "do right." Most of these are existing features that worked (more or less), and we just didn't have the time or money to fix.

There were also a number of management-level decisions that I didn't entirely agree with. Things like the way we handled in-app ads. I mean, I understand. My client needs to make money. Hell, I want them to make money. After all, I want them to keep paying me. But, it would have been nice if we could have toned things down a bit. Still, they asked for it; I implemented it. No shame in that.

Mostly, though, I don't want to look like I'm taking credit for someone else's work. After all, they led the horse to water--I just got it to drink. Or, in other words, I didn't make it, I just made it awesome.

The really weird thing is that this uneasy feeling cuts the other way as well. I often find myself having trouble talking to my clients, especially when things start to go wrong.

I mean, I don't want to be THAT guy. You know the one. The guy who is always whining about the last person who had his job. Always finding some way to blame them, or to pin his own poor performance on them. In an ideal world, I want to be the one who finds the problems, owns the problems and fixes the problems. But when my clients are pestering me for results, and I'm fighting to meet a tight deadline, and I just spent the last 15 hours cleaning up someone else's mess...well, it's hard to find a constructive way to express my concerns. I mean, let's be honest, (and I'm sure Obama would have my back here) sometimes it really is the Bush Administration's fault. Or in my case, it's the fault of whatever knuckle-typing orangutan they hired to cobble this thing together.

Don't get me wrong, I love a good orangutan. But, let's face it, they are the hippies of the simian world. If you need a sidekick to ride on your hog and watch your back at the roadhouse, then an orangutan is definitely the way to go. But, despite the convincing neck beard, they should not be allowed anywhere near code.

And I guess that's the real lesson here. Don't hire orangutans. They may work for bananas, but eventually you'll have to hire someone like me to come in and shovel out their poop. In the long run, those will become the most expensive bananas you've ever purchased.

Friday, November 5, 2010

Over the last couple of days, an idea has begun to bubble up from the bottom recesses of my brain. Preparing and putting on my presentation just reminded me how much I enjoy teaching. As a result, I'm seriously considering putting together a few tech training courses. I think I'd like to start by focusing on "Intro to iOS Programming" classes, then possibly branch out into other topic areas (Ruby on Rails comes to mind).

If anyone would be interested, please drop me a line. Also, let me know what sort of topics you would be interested in covering, as well as preferences for class length, etc.

I'm sitting in the great hall on the last day of the MacTech conference. It's been a great 3 days. I've learned a lot, and had a chance to talk to so many interesting people. My presentation went over well (though, being sandwiched between two of the best sessions of the entire conference, I am afraid it may have been somewhat overshadowed). It was just about the right length. I'm starting to feel presentation fatigue just as things are wrapping up.

Friday, October 22, 2010

The upcoming App Store for Mac seems to have the internet in a mini-uproar. There seems to be a general fear that this is the thin end of the wedge. That Apple's long term plan is to lock down the Mac, the same way they have locked down the iPhone and iPad. And yes, that could be their end goal, but I can't help but wonder if maybe a desktop app store couldn't lead to more-open app stores across the board.

Limiting the app ecosystem makes sense for a phone. By its very nature, it's a limited device. It's OK if it can't do everything--as long as it lets me accomplish useful tasks while I'm on the run. On the other hand, users expect to get more work done on their iPads. Not surprisingly, the iOS SDK loosened up considerably with the iPad's release. Before, each application was kept entirely in its own sandbox. Now we can move files from one app to the next. It may not be complete access to the file system, but it has vastly improved the iPad's usefulness.

Soon we will have a full-blown desktop app store. Most of the excuses given for limiting apps simply don't apply. We're no longer dealing with devices that have severely limited resources. We no longer need to worry about upsetting AT&T. Users don't need to jailbreak their computers to load applications from outside the app store. Most importantly, users will have even higher expectations of what applications can and should be able to do. This will create a considerable amount of pressure on Apple to open up the process, and despite what many people think, Apple is not immune to pressure.

In many ways, Apple is already on a slow path towards loosening restrictions on the iOS App Store. They've lifted the ban on third-party languages. They've published a more specific list of their requirements. They continually add new features to the SDK that allow access to previously restricted features on iOS devices. I fully expect this trend will continue as Apple feels their way through what is obviously a tricky and difficult issue. I also expect this trend will accelerate once we have a desktop App Store. It won't happen overnight, and it won't be perfect. But, I have a feeling that the Mac App Store will be a good thing for the larger app ecosystem.