Ramblings from the creator of HomeSite, TopStyle, FeedDemon and Glassboard Android.

Wednesday, November 02, 2011

The Long-Term Failure of Web APIs

Years ago, when developers such as myself started the transition away from OS-specific APIs to web APIs, we believed that doing so would empower our software and save it from the confines of the desktop.

And we were right.

But we've also learned that while web APIs enable us to tap into a wealth of data, they can only be relied upon in the short term. The expiration date of software we create has been shortened due to the whims of those who create the web APIs we rely on.

I wrote the first version of HomeSite back in 1994, and seventeen years later I can still run it on the latest version of Windows.

I created FeedDemon 1.0 in 2003, and it was the first app I wrote that relied on web APIs. Now those APIs no longer exist, and almost every version of FeedDemon since then has required massive changes due to the shifting sands of the web APIs I've relied on.

You might think you're immune to this problem if you only integrate with APIs created by large players such as Twitter, Facebook and Google. But in recent years we've seen Twitter switch to a new authentication system, Facebook deprecate FBML, and Google discontinue several APIs. All of these changes have broken, or will break, existing apps.

The end result is that developers are spending more time upgrading their software to ensure that it continues to work with web APIs they've integrated with, and less time adding the features and refinements that would really benefit their customers.

I like that some of Yahoo's APIs are versioned, so you can keep relying on the old version as long as you need: as they change, add, or redefine functionality, your old code doesn't break, while new code can be written against the new APIs. I'm sad to see the Google Reader API gone, too, though.
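Versioning like this usually just means the client pins the version it was written against, so provider changes land in a new version that old code never sees. A minimal sketch of that idea, with hypothetical names and URLs:

```python
# Hypothetical sketch of version-pinned API access: each client names the
# API version it was built for, so a new v2 can't break existing v1 callers.

class VersionedClient:
    """Builds request URLs pinned to an explicit API version."""

    def __init__(self, base_url, version):
        self.base_url = base_url.rstrip("/")
        self.version = version

    def url_for(self, resource):
        # e.g. https://api.example.com/v1/feeds
        return f"{self.base_url}/{self.version}/{resource}"

old_client = VersionedClient("https://api.example.com", "v1")
new_client = VersionedClient("https://api.example.com", "v2")

print(old_client.url_for("feeds"))  # old code keeps calling v1
print(new_client.url_for("feeds"))  # new code opts into v2
```

The key design point is that upgrading is an explicit opt-in by the client, not something the provider forces on every caller at once.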

Networked computing is fundamentally different from single machine computing, not better/worse just different. I agree that API providers need to do better but the network is 'shifting sand' and we accept that compromise because of the benefits it brings.

We're already seeing a move to API intermediaries who can arbitrate this kind of stuff for a fee. I expect that all but the largest API providers will eventually use an intermediary to provide the actual API (a la CDN situation).

Final thought, there's a strong trend for sites to turn their backend into a pure API provider that can be consumed by web pages, apps, etc. This kind of 'API dogfooding' can only help drive better API stewardship.

HomeSite: Just the mention of it takes me on a trip down memory lane. I must have written a million lines of code with that editor. I sure could use a copy of it now, as I have to do some major updates to an old classic ASP web site, and it would be so much easier to do it in HomeSite than in Visual Studio.

At the same time, one could argue that APIs (usually) change for the better, and that this in turn should drive improvements in the API clients. It could create a survival-of-the-fittest ecosystem in which clients that can't keep up become obsolete. Whether that happens because the developers lack the time, skill, dedication, or funding doesn't really matter; only the outcome matters, which is that the product is not growing and improving along with the rest of the ecosystem.

The reason HomeSite still runs unchanged is not desktop software per se, but Microsoft's approach to backward compatibility. As an example of the opposite: if you had written it for whatever system Apple shipped 17 years ago, it certainly would not run on the systems they ship now.

Is there not a sense that the web is more fluid? Apps run in the cloud, so deployment is much easier (I update the web site, and all users suddenly get the new version without installs). This perceived ease of upgrade might lead folks to be willing to change APIs without the usual "this will break a zillion desktops" concern. They change the API, they give notice (hopefully), you update your code, the app keeps working, and users never notice a problem. That would be more difficult to replicate on the desktop (IE6 is still almost 50% of traffic, some say), though rolling out a patch ahead of a change is doable, and some iOS apps appear to be updated on a daily basis (I jest, but only a bit).

All fine IF you treat web APIs like features or product instead of the core of the business. Think of the APIs you use as business partners, not tech. Your examples of Twitter, Google, etc. are good examples of why communication and sound business foundations matter most.

I think this post reflects some of the growing pains of web APIs as they transition from "best effort" to professional. There are now plenty of APIs that are very stable, from companies that version very, very carefully.

I think it's important to remember that what a web API does is of huge value and not achievable in other ways. If you want Netflix data, use the Netflix API; if you want to automate your interaction with eBay, the API is there for you. Yes, there is still too much change, but the power of these interfaces is well beyond what you could achieve as a single human clicking away at a web interface.

So I'd agree people need to get better at managing the resulting system dependencies (we and other vendors in the space try to help with that), but the power of these interfaces is extremely compelling. The alternative (a DVD of netflix metadata in the mail every month?) doesn't bear thinking about any more.

I don't think it's only a problem of new versions of web APIs. It's also the missing ability or experience of many web "developers" to create apps in which new or different APIs can be swapped in or replaced.

If you rely completely on any given web API, that's your fault. As a programmer more than 25 years in the business, I know that APIs change and that systems change, so I try to develop in a very "independent" way. Of course, even then you have to face some refactoring or overhauls, but they are seldom very complex.
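One common way to stay "independent" in this sense is to hide each web API behind a small internal interface, so a provider change touches one adapter rather than the whole app. A hypothetical sketch (all names invented for illustration):

```python
# Hypothetical adapter sketch: the app depends only on FeedSource,
# so a dead or changed provider means rewriting one adapter class.

from abc import ABC, abstractmethod


class FeedSource(ABC):
    """The only feed-fetching interface the rest of the app sees."""

    @abstractmethod
    def fetch_titles(self):
        ...


class GoogleReaderSource(FeedSource):
    # Adapter for one specific (now defunct) provider; in a real app
    # this is where the provider's HTTP calls would live.
    def fetch_titles(self):
        return ["title from Google Reader"]


class LocalCrawlerSource(FeedSource):
    # Drop-in replacement when the provider disappears.
    def fetch_titles(self):
        return ["title from local crawler"]


def render(source: FeedSource):
    # Application code is written against the abstraction only.
    return ", ".join(source.fetch_titles())


print(render(GoogleReaderSource()))
print(render(LocalCrawlerSource()))
```

When an API shuts down, only the adapter behind the interface needs to change; `render` and everything above it is untouched.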

Nick, this is a great piece, thanks for writing it. But one perspective that's missing is the user's. I might be using a tool that is no longer in active development, but it works and does something no one else is doing. We do leave behind ideas in tech, sometimes lots of them.

Microsoft's approach says the user comes first. Why would they want to break the user? There's really no upside, except it means less work for the platform vendor (it costs money to keep the platform backward compatible). But there are so many more users, that work has the greatest leverage. Much better investment than investment in speculative features that very few people might use.

It also means data lives longer. For example, I can still read the sample files I created for ThinkTank in 1984 for the IBM PC. The Mac version, heh -- long-gone. Same of course with the Apple II or Apple III versions.

There are concentric circles of ego in the computer business. People who work for companies like Apple and Twitter feel they're at the top of the pyramid. People who make applications are the next layer ego-wise. And users are nothing, in the group ego of the tech world.

Of course there's another view of this that is exactly opposite. That our work only has value to the extent that users are more productive and lose less of their work.

I totally respect Microsoft for keeping my software working, both as a user and a developer. It makes me trust them more, and want to put more software on their platform.

On the other side, as soon as I saw Twitter willfully breaking developers, I stopped investing in their platform. I knew where this was going.

And Nick, I never invested in the Google Reader API. If my users had asked me to do it I would have said no. If they asked why, I would have told them that I knew what just happened would eventually happen. They might have used another product, but I don't want to build on shaky foundations.

This is also why I fought so hard to freeze RSS 2.0, despite the fact that there were more programmers who wanted to break it (note that none of them were people who actually had an investment in RSS). People didn't understand why I wanted to do that; maybe now they do.

Sorry for the length of this comment. This should be a blog post, but I'm in a place that I can't write a post unfortunately (someone thought that opening port 5337 was a bad idea, as well as the port for Remote Desktop).

I agree with what you are saying in general, but this is not true for all APIs. In our case, our API is built for our partners: it exists to assist them in making sales. As such, all changes on our end are made with our partners (the API users) in mind. Our API users are not an afterthought who have to scramble to adjust to changes we make.

Yes, what you are saying might be true in most cases but I do not think it applies to all cases.

Alas, an application is only as good as the weakest API it relies on. We've run into similar issues with APIs from the above vendors, but also from Yahoo and from smaller companies that were either acquired or went out of business. The truth is that relying on web APIs dramatically increases maintenance costs and makes products brittle.

Windows has too much longevity. The price for that was too high. The typical web API has too little longevity. The price for that is also too high. Apple takes the middle ground, and that is the right solution. When they dropped support for Classic apps, less than 1% of the user base was running the Classic system, and that 1% can still run their Classic Mac apps on the latest Macs via a free third-party emulator. That is a very small price to pay to drop 20 years of legacy. So if Facebook and Twitter care to have third-party apps, they just need to be more like the App Store.

Steve Yegge gets it right on the money. Google is not a platform vendor; neither is Twitter. Microsoft is. It's about platforms. HTML and RSS are not attached to vendors or platforms, so they are agnostic; and they are not APIs but file formats.

Obviously Hamtoolongusername has never tried to run anything of substance in said emulator (SheepShaver). And SheepShaver won't run those PPC OS X apps that didn't run in Classic. But hey, thanks for playing.

I don't think Nick is suggesting an alternative technology solution. I'm not sure he has a solution in mind.

However, the most obvious solution is NOT a technological one, but a change in (or return to old) business practices. The problem is that the creators of these web APIs do not care about backward compatibility in the way that previous platform providers did. The real question is WHETHER that behavior can be changed, and then SHOULD it be changed?

" we believed that doing so would empower our software and save it from the confines of the desktop." Well, there's your problem... Desktop apps have always trumped web apps for performance and usability, and still do.

Isn't the core issue that Twitter, Facebook, Google et al. are really big data sets on which external services are built, while the HomeSites of the world are in a different business sphere? The big data sets owe it to themselves to make APIs as easy as possible in order to maintain marketing relationships, but not to their own detriment!

Twitter, Facebook, Google et al. give external businesses access to their huge data sets, which benefits both parties. This is different from stand-alone software like HomeSite. Unfortunately, the big data sets don't seem to want to publicly acknowledge this symbiosis with the external service providers. Even so, messing with their APIs is a bad business decision!

Agree with Craig. It's because these APIs are free. Previously, backward compatibility was part of agreements and maintenance fees. Nowadays it's all free and in the cloud far away: no accountability and no dollars lead to no compatibility.