Thoughts about the web


At least for me, all links are shown as t.co. Even in clients like mine that used to resolve them, based on the information provided by the API, it’s t.co now (the actual URL is not provided).
This probably means something has gone wrong temporarily, but it also shows that shortening all inserted URLs was not a great idea. I’ve always thought it’s a bad decision for a number of reasons:

it’s not user-friendly – when something breaks, or when the countless twitter clients don’t bother resolving the URLs, the users see t.co all over, and they don’t know which site they are going to open (not to mention spam opportunities)

its main purpose is to provide analytics, but analytics are nowhere to be seen, despite being announced a number of times in the past two years.

there’s a much better way to handle it – the way facebook approached it, with l.php?u=actualUrl. That way every link clicked in the web client gets tracked. And for links provided via the API, clients could be required to record link clicks – if they don’t, the account gets disabled. Too complicated? Well, currently it’s also possible for a client to skip the t.co link entirely – it just gets the actual URL from the API and shows that to the user.
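The l.php-style approach is easy to sketch: wrap every outbound link so the click passes through your own endpoint, which records it and redirects onward. A minimal sketch, assuming a hypothetical example.com/l.php tracking endpoint (the names here are illustrative, not facebook’s actual implementation):

```python
from urllib.parse import quote, urlsplit, parse_qs

TRACKER = "https://example.com/l.php"  # hypothetical tracking endpoint

def wrap_link(url: str) -> str:
    """Rewrite an outbound link so clicks pass through the tracker."""
    return f"{TRACKER}?u={quote(url, safe='')}"

def unwrap_link(wrapped: str) -> str:
    """Recover the original URL (what l.php would redirect to)."""
    qs = parse_qs(urlsplit(wrapped).query)
    return qs["u"][0]
```

The point of the design is that the original URL stays visible inside the wrapper (and in the API), so a well-behaved client can always show the user the real destination.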

One thing t.co is good at – it saves some characters. You no longer need 3rd party shorteners. But people use them anyway, so the benefit diminishes.

Imagine the typical case when a user posts a link to twitter from some external app. Then the tweet gets sent to the user’s facebook account (via some account synchronization app). What happens when one of his friends clicks the link:

Facebook alters the link with javascript and sends the browser to http://facebook.com/l.php?u=http://… It does that for tracking purposes – all links go through l.php so that facebook knows which user clicks what.

The URL which l.php redirects to is actually http://t.co/xxxxxxx, because twitter now shortens every URL that gets sent. Twitter does that for similar reasons to those above – to track what gets clicked. Internally, twitter looks up the actual URL behind the t.co link and redirects the browser to it.

The user had already shortened the link with bit.ly (or any other shortener), so the browser is now at http://bit.ly/xxxxxx. Bitly loads the corresponding target link – the actual link that was initially shared – and redirects the browser to it. People do that because the 3rd party shortener gives them important analytics about interaction with the link. Analytics that twitter and facebook don’t give them (I don’t know whether facebook insights show such information, but regular users don’t get it anyway)

If the user opening the link is on a mobile device, the site may detect that and send another redirect to a mobile version of the site.

So that’s 4 redirects to get to the target URL. A “redirect” means a server sends special instructions (a header) to the browser to let it know where it should go. This means there are 4 different requests, to 4 different servers, just to get to the desired URL, and one more to actually load it (one less if you are not on mobile). Imagine some of the servers are down or responding slowly. That makes opening a single link, which is at the core of the web, a very slow action. Just because of a bad combination of tools.
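The chain above can be simulated without any network calls. In this sketch, `location_of` stands in for the `Location` header each server would send (the URLs are placeholders, not real links):

```python
def follow_redirects(url, location_of, max_hops=10):
    """Follow a chain of HTTP redirects, returning (final_url, hop_count).

    `location_of` maps a URL to the Location header its server would
    send, or is missing for a final 200 response; it stands in for
    real HTTP requests in this simulation.
    """
    hops = 0
    while (target := location_of.get(url)) is not None:
        hops += 1
        if hops > max_hops:
            raise RuntimeError("too many redirects - possible loop")
        url = target
    return url, hops

# The four-hop chain described above: facebook -> t.co -> bit.ly ->
# target site -> mobile version of the target site.
chain = {
    "https://facebook.com/l.php?u=ENC": "https://t.co/xxxx",
    "https://t.co/xxxx": "https://bit.ly/yyyy",
    "https://bit.ly/yyyy": "https://example.com/article",
    "https://example.com/article": "https://m.example.com/article",
}
```

Each hop is a full round-trip to a different server, which is exactly why a single slow shortener makes the whole click feel broken.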

Why did this happen? Because of twitter – its character limit, its lack of initiative to introduce its own shortener early, and its strange decision to shorten every link that gets tweeted without providing any analytics. Because of facebook and its lack of link analytics. And because of social network clients that just don’t think it’s important to integrate properly with t.co. Virtually all clients I’ve seen do nothing about it, even though the API provides the original link.

How can this improve?

twitter and facebook should duplicate the analytics functionality provided by URL shorteners

tools should integrate with t.co properly. For example my tool (welshare) is currently the only one I’ve seen that calculates the length of the actual tweet containing a link (because of t.co, a tweet’s real content can exceed 140 characters), and it automatically expands t.co links. It also expands them when resharing a tweet to facebook, which means no t.co link appears on facebook.
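The length calculation is simple once you know the rule: twitter replaces every URL with a fixed-size t.co link, so the counted length of a tweet is independent of how long the URL really is. A minimal sketch – the 20-character constant is an assumption here; the real value is published by twitter’s configuration API and has changed over time:

```python
import re

TCO_LENGTH = 20  # assumed fixed length of a t.co link; the real value
                 # comes from twitter's configuration endpoint

URL_RE = re.compile(r"https?://\S+")

def effective_length(text: str) -> int:
    """Length twitter actually counts: every URL in the text is
    replaced by a fixed-size t.co link before counting."""
    stripped, n_urls = URL_RE.subn("", text)
    return len(stripped) + n_urls * TCO_LENGTH
```

A client that uses the raw string length will either reject valid tweets with long URLs or let through tweets that twitter will truncate.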

Probably the original idea behind twitter was that people share random things that are relevant to the current moment only (or not relevant at all). And it is still a predominant use-case, but there are others. People generate funny phrases, share great videos and links that they just found. Things that will be relevant in a month and even in a year.

Unfortunately, twitter makes it really hard to get back to your old tweets. It stores only the latest 3200, but to get to the 3200th tweet from the UI you need to be persistent – there is no paging, and search just doesn’t work like that, even if you know parts of your tweet.

Having 10000 tweets, I realized my first tweets are gone. Forever. The service I used to make backups was abandoned and the data – deleted. And that’s not good. My tweets are my short thoughts, my micro blogposts. (Does wordpress delete older entries?) I don’t think it will be that expensive to store everything forever. Facebook does. Making it accessible is a bit trickier, but the user is likely to wait more if he tries to access something old, so the performance doesn’t need to be as good as with new tweets.

Perhaps twitter wants to be exclusively about what’s happening “now”, and never care about last week, but that’s not everyone’s use-case. Having created a horizontal product means you have to support horizontal use-cases. Pinterest seems like a better match for just one of these use-cases – interesting things you find on the Internet, but do I need yet another service just for that?

I’m now backing up everything in welshare, where I can search my old tweets, and I even get suggestions for sharing some of them again, so my tweets are not short-lived. But yours are.

I’m a really active twitter user and I often check twitter from my smartphone. But I’ve realized that there isn’t a single decent twitter client for mobile. Not one. Now, of course my criteria for “decent” are a bit high, but I expect it to have the following:

getting notifications for new retweets (of me) and replies – Seesmic fails on the retweets side, TweetDeck shows only some of these events, and others are skipped. The API is the reason for that, as it recently started behaving strangely when returning retweeter information, but the problem was solved in welshare (which still lacks a mobile app) by at least showing “Someone has retweeted your message”.

not getting all new tweets as notifications. Many clients do that, which means my phone’s top bar is always filled with a number I don’t care about

resolving images and showing them inline – from at least twitpic, yfrog and all raw image links.

being able to scroll to the top with one click. Tweetdeck and twitter’s own app do that; Seesmic and others don’t. Which means I have to manually scroll through tons of messages to get to the top.

being reasonably fast – twitter’s own client fails miserably on that point. Sorry, but the client is unusable – you have to wait more than a minute to get some simple details.

bonus points – facebook integration. Tweetdeck and Seesmic have that, but for a whole month now tweetdeck hasn’t had the required “read_notifications” permission to get facebook notifications, and Seesmic has a completely separate interface, so you can’t cross-post tweets to facebook.

TweetDeck is the closest to that, but it has gotten worse and worse since Twitter acquired it (no facebook notifications, not all retweets getting shown). I’m sorry Twitter, but if you tell people not to develop twitter clients, you need to have a usable client first. And you don’t. I hope welshare will have its own mobile client at some point, which will address all the above issues.

Quora is a place to ask anything. And probably get answered. But who gives these answers? I tried – I have a couple of answers, but there is nothing that makes me return to the site and answer more questions. Quora lacks the game mechanics that have turned StackExchange into such a successful Q&A engine. TechCrunch wrote that Quora adds gamification via “ask to answer” with some credits, which I honestly could not find in the UI. Then I got a direct link to see how many credits I have, and it still means nothing to me, because there are no leaderboards. If you are short of altruistic incentives, Quora gives you no reason to come back. And by the way, “ask to answer” is something you can do on StackExchange too. I’ve been asked more than once “could you look at this other question of mine”. And I do. Not because the poster will give me credits, but because others will.

Should Quora copy-paste the reputation and badge systems of StackExchange? Not at all, it has a different structure, but it should employ some game mechanics if it wants to stay. Otherwise StackExchange will be the preferred place to ask a question, where people compete to give you the best answer, not just randomly answer questions when passing by.

The Dilbert comic is just 2 years younger than me, which means there are thousands of strips. Most of them mock a real-world situation, especially life in big corporations and among geeks.

I have always wondered if managers read Dilbert. I bet they don’t, otherwise they’d say “oh, wait, this thing I’m doing was mocked by Scott Adams, and it is obviously idiotic”. Anyway, regular people who like Dilbert could sometimes adopt a Dilbert attitude and send a comic strip to their managers.

And the Dilbert website offers a pretty good search engine for that – just search for “deadline”, “raise”, “new colleague” and you’ll find a lot of good strips to send to your pointy-haired boss. Even if this isn’t going to change the habits of managers, let’s at least make some fun of them. So the next time your company hangs big posters with “Company values”, send this to everyone, cc management. And hope not to get fired.

Megaupload was taken down for hosting pirated content – and for knowing about it. So how about this: uploaders upload encrypted content and share the decryption key through another medium (skype, forums, facebook, etc.). That way the other medium is not infringing anything, and the hosting provider cannot possibly know what is being uploaded, and therefore can’t be held responsible for it.
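The scheme only requires symmetric encryption before upload. As a toy illustration – this SHA-256-based XOR keystream is for demonstration only and is not a vetted cipher (in practice you would use something like AES-GCM or NaCl’s secretbox) – the idea looks like this:

```python
import hashlib
from itertools import count

def keystream(key: bytes):
    """Toy keystream: SHA-256 of key + block counter. Illustration
    only -- use a vetted authenticated cipher in real life."""
    for i in count():
        yield from hashlib.sha256(key + i.to_bytes(8, "big")).digest()

def xor_crypt(data: bytes, key: bytes) -> bytes:
    """Encrypt or decrypt by XOR-ing with the keystream (symmetric:
    applying it twice with the same key restores the original)."""
    return bytes(b ^ k for b, k in zip(data, keystream(key)))
```

The uploader runs `xor_crypt` (or a real cipher) before uploading; the host only ever sees ciphertext, and the key travels over skype or a forum post.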

Many tools exist that tell you the best times to tweet. The analysis is usually based on the number of tweets sent by your followers, as an indication of when the most people are online. Tools that do that include tweriod, tweet o’clock, timely, etc. My service welshare shows you the currently tweeting followers. But I have to argue that the above assumption is not entirely correct. The best time to tweet is not when most of your followers are tweeting, but right before that. In the most active periods of the day the stream becomes too noisy, and it gets harder to get noticed. Whereas if you send your tweet right before everyone appears online, they will see it before the stream gets filled with noise.

It is of course hard to tell when is that time, because people don’t appear online all at the same time. But since everything in this metric is an approximation anyway, you can subtract 30-60 minutes from the estimated peak of online presence of your followers, and here’s your suggested time to tweet.
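That heuristic is a few lines of code. A minimal sketch, assuming you have the hour-of-day of each recent follower tweet and picking a 45-minute lead (the post suggests anywhere in the 30–60 minute range):

```python
from collections import Counter

def suggested_tweet_hour(tweet_hours, lead_minutes=45):
    """Pick the hour (0-23) just *before* followers are most active.

    `tweet_hours` is the hour-of-day of each recent follower tweet;
    the busiest hour is found and shifted back by `lead_minutes`,
    wrapping around midnight.
    """
    peak = Counter(tweet_hours).most_common(1)[0][0]
    minutes = peak * 60 - lead_minutes
    return (minutes % (24 * 60)) // 60
```

So if your followers peak at 18:00, the suggestion lands at 17:00 – early enough that your tweet is near the top when the crowd arrives.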