Friday, March 28, 2008

Last week I was at the Tridentcom conference in Innsbruck. There, while attending a talk, I saw first-hand the horror of a presentation gone completely wrong, the discomfort of the audience, and the pain of an embarrassed speaker as his ASUS EEPC failed miserably. The presenter's beautiful slides were cut horizontally (about 30% of the lower part of each slide was missing) because the ASUS EEPC could not drive the overhead projector properly. Every other laptop, including old clunky student-budget ones running Windows, Linux, and Mac OS, was successful in beaming presentations. But not the ASUS EEPC.

There are many objective reviews of laptops on the Internet based on specifications, design, speed, etc. They try to compare products side by side so that potential buyers can choose the product that is best for them. Still, some things are considered standard and not even mentioned - like the assurance that the power adaptor will charge the laptop's battery, the battery won't burst, the USB ports will work, and the VGA output will work. Seasoned customers are smart enough to get a mix of online and offline opinions about a product before buying it. But sometimes when products are just released, it is hard to ascertain whether a product will serve its purpose down the road. I hope this EEPC horror story gives folks some additional information before they buy it.

It's all nice to tout the small, cheap ASUS EEPC. But seriously, didn't VGA-out technology mature like 15 years ago? And ASUS cannot even get this right in 2008? My 2 cents - get a second- or third-hand Pentium 3 laptop or something instead.

Monday, March 24, 2008

I was thinking of my first algorithms class and remembered the "Towers of Hanoi" problem. This problem is used to introduce the concept of recursion. It can also use the stack data structure (LIFO) quite naturally, making it a universal favorite in assignments. An Internet search yields thousands of websites discussing the problem. Here's adding yet another discussion to this timeless classic!

Problem Setup

Here are the setup and the rules of the Hanoi puzzle:

There are 3 towers labeled 0, 1, and 2.

There are N circular discs, each of a different radius, that are initially stacked on tower 0.

A circular disc can only be placed on top of a disc of larger radius, or as the first disc on a tower.

The objective is to move the N discs from tower 0 to tower 2 following the rules mentioned above.

Figure 1 explains the problem graphically and shows the rules of the Hanoi puzzle.

Solution

To solve the puzzle recursively, look at Figure 2. The first requirement for moving the largest (red) disc from tower 0 (the source tower) to tower 2 (the destination tower) is that there be no discs on top of the red disc. This means that the other 2 discs (green and yellow) should be moved to tower 1 (the auxiliary tower), putting the setup in state 1. Then the red disc can be moved to tower 2 (state 2).

Now the problem is reduced to one with just 2 discs (yellow and green). These need to be moved from tower 1 (the source) to tower 2 (the destination) using tower 0 as a temporary go-between (the auxiliary tower).

Figure 2: Top level steps. Click to enlarge

But wait, how do we accomplish the transition from state 0 to state 1? Figure 3 shows the intermediate steps. The aim is to move the green and yellow discs from tower 0 (source) to tower 1 (destination) using tower 2 as the auxiliary tower.

Figure 3: Mini-steps from state 0 to state 1. Click to Enlarge

So the high-level idea is to have a function that recursively places the discs on the appropriate towers. This will become clear in the C++ "Hanoi" function shown below. I have used STL data structures (vector and stack) to simplify the explanation.

//Recursive function Hanoi - the most interesting function
//When this function exits, it has moved a tower of numDiscs discs
//from the sourceTower to the destTower, using the auxTower as a
//temporary holder
void Hanoi(int sourceTower, int destTower, int auxTower, int numDiscs)
{
    if (numDiscs == 0) return;
    Hanoi(sourceTower, auxTower, destTower, numDiscs - 1); //clear the way
    MoveDisc(sourceTower, destTower);  //helper: pop source stack, push dest
    Hanoi(auxTower, destTower, sourceTower, numDiscs - 1); //bring them over
}

Saturday, March 22, 2008

I visited the UN data website and downloaded data about the total electricity production of countries and matched it with their populations in order to obtain the electricity produced per 1000 inhabitants (for each country). I have plotted a histogram of this in the figure above.

This histogram is disturbing.

The smaller issue is that inhabitants of most countries have less electricity than the world average. This means that a few energy-rich countries are producing (and probably consuming) most of the world's electricity. I have marked some of the representative countries on the histogram. There is a table at the end of this post containing the parsed data in case you want to look up your own country and/or use the data (after citing the UN data source, of course).

The bigger issue is that countries with low electricity per capita (bars toward the left of the figure) are striving to move their per capita electricity production higher to improve quality of life. For example, both India and China are lower than the world average. And I have the impression that these countries are really looking to improve their citizens' living conditions. And remember, populations are increasing too (see one of my previous posts), making it necessary to pump up electricity production even faster.

Producing more electricity is a positive development. But the problems associated with increasing production are the issue here. Here are some questions:

Where are we going to get the energy to increase electricity production so rapidly?

What will be the environmental cost of creating so much additional capacity? I hope it is renewable energy. Hope. Hope. Hope.

Or, does this analysis indicate that even in the coming decades electricity will remain a premium, limited-quantity luxury for the well-off, given the lack of such a massive energy source?

Tough cookies.

We really need a breakthrough with some new technology here.

Here is the table containing the data used in the analysis. Columns: Country, million kWh per 1000 inhabitants (German notation: "," is the decimal point).

Wednesday, March 19, 2008

The Global Environment for Network Innovations (GENI) idea is to build an experimental facility that researchers can use to experiment with new communications and networking technology, distributed systems, cyber-security aspects, and applications. A couple of weeks ago I attended a talk given by Craig Partridge, chief scientist at BBN Technologies. Craig is heading the GENI project office tasked with implementing GENI. He is an engaging speaker and got me thinking about the merits of building GENI.

GENI is a sort of first for the National Science Foundation (NSF) and the network research community. NSF occasionally funds large infrastructure projects like building astronomy telescopes and particle accelerators but this is the first time that a networking infrastructure is being funded. The interesting thing is that the infrastructure comes first and then protocols or services follow. Moreover, the infrastructure will be built based on requirements specified by the research community. Is this the future of network research?

The current Internet is an engineering marvel. Its scaling property is absolutely remarkable - hundreds of millions of hosts in a federated environment with completely different underlying access technologies just work. Then the creative engineers and innovators come in to shape the Internet APIs (e.g. the IP communication stack) into myriad applications - email, VOIP, P2P, Web 2.0, digital libraries, and whatever else they can think of.

The Internet was first built by engineers, and later scientists highlighted several flaws in its design. This has sometimes served as a good feedback loop for refining the Internet over time. For example, security researchers have continually unearthed security holes in the IP socket stack. Scientists, coming from the outside, study the insides of the Internet and contribute to refining the already-built Internet.

But can researchers with limited engineering experience design a new communication infrastructure such as the GENI infrastructure? I very much doubt this. Researchers are seldom successful in building viable commercial technologies. They are very smart but usually focused on one or a few problems. It is not at all clear whether GENI could deliver the next Internet.

But let's back-pedal a little. Is GENI supposed to be building the next Internet? Answer: perhaps not. But what I find troubling about GENI (or FIRE, the European counterpart) is that almost the whole OSI stack - network, transport, session, and application layers - is supposed to come from the research community. I really, really wonder who is going to write all this? Of course there can be a module-based approach to plugging pre-existing pieces into GENI, but then how is this Internet design revolutionary, and does it justify building GENI in the first place? Why not stick with something more real like PlanetLab?

I don't believe scientists can build another Internet from the inside out. Shouldn't building a new network, from the inside, be left to the engineers?

Sunday, March 16, 2008

We all know that cellular mobile networks are rapidly expanding all over the world - in fact the rate of adoption of cellular mobile technology has been faster than that of the Internet. I am bullish about this technology's role in human development. The bar-graph says it all. Amen!

Saturday, March 8, 2008

The above plot shows the growth of the Internet in the World, again from UN data. There are 3 histograms (2004, 2001, 1998) of the number of countries (Y axis) vs. the size of the Internet population in each country (X axis). I have left out the countries with fewer than 1m users because they are the vast majority in the data - partly due to low Internet adoption in backward areas and partly due to small populations.

The key points seen immediately:

The number of Internet users is (surprise surprise) growing rapidly over the years.

USA leads every other country, but since the total US population is only about 300m (see my previous post), US Internet users' growth is saturating.

On the other hand, China is growing very quickly. I suppose by 2008 (the present), the number of users there must have come close to the US number (if not ahead of it).

But let's look at a more holistic trend here. There is a "long fat tail" in the 2004 data - the bars concentrated on the left-hand side have started spreading out to the right in the 2004 histogram. This means that the Internet is increasingly becoming multinational, with a significant representation of people from many countries, bringing up these questions:

Where is the Internet content suitable for people from different cultures and not just western cultures?

Are we doing enough to localize Internet applications for these large multicultural groups joining the Internet?

Finally, for those interested, here is a table showing the number of Internet users, as reported by the most recent data from UN Statistics (2004). I have ranked the countries based on the number of Internet users in the rightmost columns of the table. Again, if you look at the top 10 countries in this table, only 2 have a high number of English native speakers. In addition, 5 out of the top 10 countries are not "Northern hemisphere western countries".

Wednesday, March 5, 2008

I learnt of a superb data resource recently posted by the UN on the Internet, from which you can get nicely formatted data about all sorts of statistics from different countries of the World. I wrote up a dirty Python script to parse out the estimated and projected populations between 1950-2050 of 4 major chunks of the World: India, China, USA, and Europe. I have plotted these above.

It is obvious that the biggest markets by volume in this century will be in China and India. This does not say anything about margins, of course. But we are limiting this discussion to volume only.

Why? Because I want to get an idea about what sort of digital content is needed for the future. My assumption is that by 2020 or so, most media delivery technologies would have matured and moved out from the premium segments (video capable mobile phones, high speed broadband Internet, computers etc.) and into mass markets where the sheer volume pushes down the price of technology enough for everyone to afford. For example, this has happened recently in cellular voice and SMS services.

So, assuming every John Doe in China and India has a content-capable receiver (e.g. a mobile phone, or whatever they call them in 2020), where is the content?

CaNsOmEoNePlEaSeTeLl Me WhErEtHeCoNtEnTiS?

I mean content for the masses of non-westernized people in China, India, and other parts of the World.

The people who invest in making or acquiring content for these mass markets right now would land a huge windfall in 15 years. The best medium-term investment I can think of!