While one might opine that Amazon is entitled to the .amazon domain, what happens to domains like ‘.football’ and ‘.medicine’? Will companies like Nestle and Hershey (just to name a few) be allowed to register ‘.chocolate’ for their exclusive use? Similarly, will ‘.cola’ be granted to either Coca-Cola or Pepsi simply because they can afford $850,000, the registration fee for the domain?

I believe that this ‘expansion’ of domains is nothing but monopolistic behaviour and profit maximization simply because it is possible.

The other day, I updated my home computer using the Windows Update service. I decided to install multiple .NET security patches, and the size of the update was around 180MB. This was after I excluded the .NET 4 updates.

How much time did the installation take? 1 hour and 15 minutes. The download finished quickly enough, but then the installation beast got into its act and took an age to complete.

Whether or not anti-virus software is active, I have observed that installing .NET updates takes an enormous amount of time. Each time I run Windows Update, I shudder when I see .NET updates and defer them for as long as possible.

With the launch of quad-core smartphones, the biggest question to ask will be, ‘which apps will consume the available power?’ The simplest answer? Games.

With so much power available, gaming will try to take advantage of it, bringing more feature-rich games to the mobile world. The next candidate is the vaguely termed ‘multi-tasking’ or ‘multi-window’. While the multi-window feature has long been available on desktops, do people really use it? Most of the time we want to switch from one app to the other, but we rarely have two windows open side by side and read both at once – our brains are still single-core.

As apps grow more power-hungry, battery life is going to take a hit, and we will need batteries with more capacity to deliver juice to the demanding apps. This situation reminds me of a joke I heard a long time ago, from a few decades back, when the Japanese were way ahead of most of the world in inventing and adopting gadgets. The joke goes as follows:

A man in India, travelling to the office, noticed a Japanese tourist waiting for the train to arrive, with a couple of bags next to him. After some time the two struck up a conversation, during which the Japanese tourist mentioned that he had a watch that could tell the time for any place on earth. The Indian was impressed, but decided to check. He asked for the time in Delhi, which the tourist gave correctly. Then he asked for the time in Japan, which was also answered correctly. This was followed by queries about the time in many cities of the world; the tourist answered each one correctly.

The Indian was highly impressed and was determined to have such a watch. He requested the Japanese tourist to sell it to him. Initially the tourist was reluctant, but finally he relented. He handed the watch to the Indian, who, being mighty pleased, thanked him and started to walk away. The tourist called out and asked the Indian to take the batteries along. When the Indian extended his hand for them, the tourist instead pointed to two of the brown-coloured suitcases he was carrying.

I just finished reading the book ‘The Voyage of Trishna’, written by Brig. T P S Chowdhury. The book is about the journey undertaken by a team of Indian Army men in the 1980s to travel around the world in an 11-metre fibreglass sailboat.

While it was interesting to read, the book left me a bit let down. It reads almost like the captain’s log and is very factual in nature. The story is divided into three parts, each giving details about one leg of the journey.

While factually rich, the book is not captivating enough. In many places episodes start and end abruptly, and the story moves from point to point without much of a bridge between them.

The book would have been far more interesting if it had carried some more information about the team that undertook the journey, and why each of them undertook it. It should also have described in detail some of the incidents that took place on board – a few are described, but in a very bland manner. For example, there is a description of the team capturing a 3kg tuna. It would have interested the reader to know how this large a catch was made, who made it, and whether there was a struggle. Instead, we are simply told that the fish was caught and that the team then ate fish from daybreak to day’s end, in various forms.

The book could also have expanded on other incidents on board. For example, at the beginning the incident of the rudder breaking is mentioned, as is the fact that it was repaired with plywood, but the story of the repair itself never appears in the narration. Hence one is left with a sense of incompleteness; described in detail, it would have made the book more fascinating. The book contains a picture of one team member repairing the radio mast while atop the mast, but even this incident is not covered in the text – while masts breaking is mentioned in many places, there is no mention of someone climbing up a mast on the high seas to make repairs.

While the book was very interesting to read – it is fascinating to learn how ten people travelled around the world in a 30-foot sailboat for 18 months – it leaves out much of the excitement of the journey.

These days, search is second nature to all of us. Whenever we wish to look for information, the first thing we do is head to a search provider like Google or Bing.

Over the years, Google has made giant strides in the search market, constantly tweaking its product, such that it has become THE dominant player. While there are alternatives like Bing, DuckDuckGo and many others, Google’s mind share is very high.

Given Google’s mind share, many of its competitors have ganged up to form an alliance and are petitioning governments to make Google implement ‘search neutrality’.

‘Search neutrality’? What is that? The argument of Google’s competitors is that Google uses algorithms to rank search results, and that this very ranking constitutes ‘discriminatory’ behaviour.

Without passing judgement on what Google does, let us keep its algorithms aside for a moment and ask the question, ‘What is search?’ Search is the method that allows us to navigate through information and reach the most relevant nugget we are looking for. Inherent in this definition is the fact that we have to compare multiple possibilities and options and grade them against each other. In other words, when multiple eligible documents match our search string or pattern, we need to rate the documents against each other and hope to arrive at the most relevant one. Hence it goes without saying that ranking the candidates for relevance is unavoidable.

Search engines essentially do this analysis and comparison on our behalf. They scan through documents and try to extract meaning from them, using keywords, textual analysis and linkage to other documents. Typically, all such inferences are encoded into algorithms that take webpages as input and classify the pages according to various parameters. Whenever a user executes a search, it is matched against these patterns and relevant results are displayed.
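To make the idea concrete, here is a toy sketch of ranking documents against a query by keyword overlap. This is purely illustrative – real search engines use far richer signals (link analysis, textual context, and many other parameters) – and all the names and sample documents here are made up.

```python
# Toy ranking: score each document by how many query terms it contains,
# then sort so the highest-scoring (most 'relevant') document comes first.

def rank(documents, query):
    """Rank documents (id -> text) by keyword overlap with the query."""
    terms = set(query.lower().split())
    scored = []
    for doc_id, text in documents.items():
        words = set(text.lower().split())
        score = len(terms & words)          # number of query terms present
        scored.append((score, doc_id))
    # highest score first; ties broken alphabetically by document id
    scored.sort(key=lambda pair: (-pair[0], pair[1]))
    return [doc_id for score, doc_id in scored if score > 0]

docs = {
    'a': 'net neutrality and internet providers',
    'b': 'search engines rank pages by relevance',
    'c': 'recipes for chocolate cake',
}
print(rank(docs, 'search relevance'))   # ['b']
```

Even this trivial scorer has to grade candidates against each other, which is the point: ranking is not an add-on to search, it is search.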

Google, being a private company, does not share its ranking algorithms, and with good reason: once it did so, its competitors would implement the same algorithms and grab market share. As per the book ‘Inside the Googleplex’, we are led to believe that Google has developed algorithms that trawl through massive amounts of data to come up with a suitable ranking. While such algorithms are always written by humans, as long as there is no manual filtering of the rankings generated for pages, Google is safe from ‘search tuning’ allegations.

What happens when we compare ‘net neutrality’ with ‘search neutrality’? In net neutrality, we expect that telecom providers and ISPs will not discriminate between the services they provide and similar services provided by competitors, when their customers make use of the same infrastructure. Does a similar concept hold for a search engine? Does a search engine provide common infrastructure shared by multiple search providers? No. Each search provider has its own search infrastructure.

The concept of neutrality is relevant when competitors use some common infrastructure, which is definitely not the case for search.

The main grouse against Google is that it is the biggest player in the search market. As the task of gaining market share and mind share gets tougher by the day and takes time, filing such disingenuous complaints seems to be a way to mire the opponent down so that its attention is diverted from the main threat. Additionally, once any government gets involved in an investigation, the accused essentially goes on the back foot and needs to tread carefully.

It seems that a recent FTC investigation of Google has confirmed what the book states – Google does not tinker with the search results and rankings beyond what its algorithms produce. In other words, the algorithms are not partial to specific information such that they award undue priority, or impose an undue penalty, over other similar information available to them.

As defined on Wikipedia, network neutrality (also net neutrality or Internet neutrality) is the principle that Internet service providers and governments should treat all data on the Internet equally, not discriminating or charging differentially by user, content, site, platform, application, type of attached equipment, and modes of communication.

Given that access to the Internet has become commonplace, most of our activities now happen over it. While the Internet started as a carrier for email (textual data), these days it carries voice and video, and incidentally textual data. This development is causing heartburn for the established players, who earlier had subscriber-based services for voice and media.

As these players have been mostly reduced to carriers of data packets, they have resorted to practices that discriminate between data packets. For example, a company that offers long-distance calling will try to block data packets of VOIP services like Yahoo Chat, Google Voice and Skype. Similarly, players who offer media streaming services of their own will block or delay data packets of services like Netflix and others.

By the principle of net neutrality, the providers of connectivity should not discriminate between services; hence, they should not block traffic from rival services.

Net neutrality is a very important concept that forms the backbone of the Internet. While most countries do not officially have a net neutrality law, there should be one. This would ensure there is sufficient competition in the market, and consumers would truly benefit from the best services.

Recently, I was trying to implement an idea I had. The idea needed a Graph data structure. While implementing the Graph was very simple, I was stuck on another ‘simple’ task – finding all paths from a source to a destination.

Initially I searched on Google and found a few tips, but most of them covered finding the shortest path from the source to the destination. After giving the algorithm a shot using recursion, I decided to search once again. Then I found code by Robert Sedgewick that did what I wanted. Going through my own code, I realized that I was stuck on maintaining the backtracking information – which I had in my mind, but was not able to get working properly.
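For reference, here is a minimal sketch of the technique in question – depth-first search with explicit backtracking to enumerate every simple path between two vertices. This is not Sedgewick’s code or my original attempt; the graph representation and names are illustrative.

```python
# Enumerate all simple paths from source to destination in a graph
# given as an adjacency dict: vertex -> list of neighbouring vertices.

def all_paths(graph, source, destination):
    """Return every simple (cycle-free) path from source to destination."""
    paths = []
    path = [source]        # the path being built
    on_path = {source}     # the backtracking state: vertices on the path

    def dfs(vertex):
        if vertex == destination:
            paths.append(list(path))   # record a copy of the finished path
            return
        for neighbour in graph.get(vertex, ()):
            if neighbour not in on_path:   # skip vertices already on the path
                path.append(neighbour)
                on_path.add(neighbour)
                dfs(neighbour)
                path.pop()                 # backtrack: undo the choice
                on_path.remove(neighbour)

    dfs(source)
    return paths

g = {'A': ['B', 'C'], 'B': ['C', 'D'], 'C': ['D'], 'D': []}
print(all_paths(g, 'A', 'D'))
# [['A', 'B', 'C', 'D'], ['A', 'B', 'D'], ['A', 'C', 'D']]
```

The crux is the pair of `append`/`add` and `pop`/`remove` calls around the recursive step – exactly the backtracking bookkeeping that is easy to hold in one’s head but fiddly to get right.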

For the iPhone, India seems to have become the mobile market of focus. Either that, or Apple wants to push the iPhone aggressively and is feeling the heat from multiple Android vendors.

Multiple times each week, the iPhone features in a full-page advertisement in the newspaper. The announcement is not just about the availability of the iPhone 4 or iPhone 5, but also that they are available on EMI.

I do not remember if there was such an aggressive push for iPhone 1, iPhone 2 and iPhone 3. This time, it seems to be much more aggressive.

Most of us are familiar with the file systems of operating systems. Data is stored using files and folders, where folders nest inside one another and files are stored inside folders.

Given the large amount of data available, storing information in an organized manner is a challenge and a big task. Many of us have faced the situation where we keep a file in a certain directory and then forget where we kept it.

Can we look at a different way to store data? While we can retain the concept of files, do we need ‘folders’? Instead of folders, can we use a data store where data is organized not by folders but by tags or keywords? Such a mechanism would mean that the storage of files is opaque to the user, who accesses data using one or more tags. Obviously, a document can have multiple tags associated with it; otherwise we would not be able to store ‘versions’ of the data.
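A minimal sketch of such a tag-based store might look like the following. Everything here – the class name, the methods, the sample files – is hypothetical, just to show the two indexes (tag-to-files and file-to-tags) the idea implies.

```python
# A toy tag-based file store: files are addressed by tags, not folder paths.

class TagStore:
    def __init__(self):
        self.files_by_tag = {}   # tag -> set of file names carrying it
        self.tags_by_file = {}   # file name -> set of its tags

    def add(self, name, tags):
        """Register a file under one or more tags."""
        self.tags_by_file[name] = set(tags)
        for tag in tags:
            self.files_by_tag.setdefault(tag, set()).add(name)

    def find(self, *tags):
        """Return the files carrying every one of the given tags."""
        groups = [self.files_by_tag.get(tag, set()) for tag in tags]
        return set.intersection(*groups) if groups else set()

store = TagStore()
store.add('report-2012.doc', ['work', 'report', '2012'])
store.add('report-2013.doc', ['work', 'report', '2013'])
print(store.find('report', '2013'))   # {'report-2013.doc'}
```

Note how ‘versions’ fall out naturally: the two reports share the ‘work’ and ‘report’ tags, and the year tag distinguishes them – no folder hierarchy required.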

Even this mechanism is not fool-proof, as we can easily get lost in the ‘tag cloud’ or ‘sea of tags’.

Given the availability of desktop indexing options, the solution discussed above may not have much relevance, as implementing it would mean a change to all the software in the world.

While reading the book ‘Imagining India’ by Nandan Nilekani, I came across a topic that caused me to sit up and take notice. Nandan mentions that India has a few coal mines that have been burning underground since 1916. Yes, you read that correctly – 1916. On reading this, my first reaction was ‘what nonsense’. Then it was ‘can this be true?’

It seems that there is truth in what Nandan has written.

The question is, if the fire has been burning for such a long time – almost a century – what can be done to snuff it out? Is it possible to pour water into the mine to end the fire? Sending someone into the mine to snuff it out is a prospect full of hazards; the environment down in the mine would be hostile to people.

Is it not possible to send a robot, controlled from the surface, into the mine to pour water wherever the fire is? I know it would be a very difficult exercise, but could the cost not be justified by the fact that the mine would become usable once again?

If too much water is needed to extinguish the fire, how about using a controlled explosion to suck out all the oxygen and starve the fire? Alternately, a controlled explosion could send CO2 into the mine and starve the fire. Sending CO2 has its own problems, as the mine would then have to be pumped with clean air once again before it could become operational.