Saturday, November 14, 2009

Happy 40th Year, Internet - A historical perspective on the evolution of computing

While I was at a conference in Mexico, I missed a very important date: the 29th of October should be a special day for all Internet users and IT/data-networking professionals. The 29th of October 2009 marked the 40th anniversary of the Internet.

The very first transmission was suitably dramatic. The characters of the "login" command made their way down the packet-switched network from UCLA, but a crash of the SDS 940 machine at SRI ensured that only the first two characters, "lo", got through. The fault was fixed about an hour later and the full remote login succeeded, writing history as the first successful ARPANET remote login, in fact the first ever successful login of the Internet era.

Every major historical event offers an opportunity for reflection. I entered the world of computing much later, in 1987, during the era of the Zilog Z80 home computers (an Amstrad CPC 6128), and I remember the excitement of connecting to my first BBS (as well as the screaming of my parents when they saw the inflated phone bill). Eight years later, a DECstation running the Mosaic browser brought me into the era of the World Wide Web. Today we talk about the semantic web, and we have transformed our lives through Internet technologies: the way we work, communicate, and even make social connections.

Despite all the rich content now flowing across packet networks, I can't help but notice that the fundamental algorithms governing their operation (queueing theory, routing) remain the same. Sure, speeds are up and the complexity of encapsulated data is up, but the basics are unchanged.
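A small sketch of that continuity: Dijkstra's shortest-path algorithm, published in 1959, a decade before that first ARPANET login, is still the core of modern link-state routing protocols such as OSPF. The four-node topology and link costs below are purely hypothetical.

```python
import heapq

def dijkstra(graph, source):
    """Shortest-path distances from source; the same idea that
    link-state routing protocols such as OSPF run on every router."""
    dist = {source: 0}
    heap = [(0, source)]                  # priority queue of (cost, node)
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue                      # stale queue entry, skip it
        for neighbor, cost in graph[node].items():
            nd = d + cost
            if nd < dist.get(neighbor, float("inf")):
                dist[neighbor] = nd       # found a cheaper path
                heapq.heappush(heap, (nd, neighbor))
    return dist

# Hypothetical four-router network with link costs
net = {
    "A": {"B": 1, "C": 4},
    "B": {"C": 2, "D": 6},
    "C": {"D": 3},
    "D": {},
}
print(dijkstra(net, "A"))  # {'A': 0, 'B': 1, 'C': 3, 'D': 6}
```

Forty years of hardware progress changed how fast this runs, not what it computes.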

What's more fascinating is that this trend holds beyond data networking and, in my humble opinion, is universal across the entire field of computing. In modern processors, for example, the miracle of wafer manufacturing and materials technology has made computing affordable for the masses and skyrocketing speeds possible. However, the fundamental architectural framework behind the operation of all modern computers has been the same since the days of John von Neumann. In computer programming, new high-level languages have revolutionized programmer productivity by letting programmers focus on the task at hand. Garbage collection, object orientation, interfaces, and scripting are some of the techniques used today to shift the programmer's mind from low-level machine details to the details of the task to be performed. At the compiler/interpreter level, however, you still have more or less the same C/C++ runtime libraries and machine code doing the job. Many aspects of high-level languages date all the way back to Konrad Zuse's Plankalkül.
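To make the Von Neumann point concrete, here is a toy sketch (with a made-up four-instruction set) of the stored-program idea: code and data live in one shared memory, and a fetch-decode-execute loop drives everything. Strip away pipelines, caches, and branch predictors, and this loop is still what every mainstream CPU does.

```python
# A minimal Von Neumann-style machine: program and data share one
# memory list, and a fetch-decode-execute loop drives execution.
# The instruction set (LOAD/ADD/STORE/HALT) is hypothetical.

def run(memory):
    acc, pc = 0, 0                 # accumulator and program counter
    while True:
        op, arg = memory[pc]       # fetch and decode
        pc += 1
        if op == "LOAD":           # acc <- memory[arg]
            acc = memory[arg]
        elif op == "ADD":          # acc <- acc + memory[arg]
            acc += memory[arg]
        elif op == "STORE":        # memory[arg] <- acc
            memory[arg] = acc
        elif op == "HALT":
            return memory

# Cells 0-3 hold the program, cells 4-6 hold data: compute 2 + 3.
mem = [("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT", None), 2, 3, 0]
run(mem)
print(mem[6])  # 5
```

A garbage-collected, object-oriented language hides this loop from the programmer, but it is still the machine underneath.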

This raises important questions about fundamental, groundbreaking research in the field of computer architecture. We certainly have many computing inventions in the applications area, but we have also had some big setbacks (connectionist architectures, AI) when it comes to truly parallel operation. I am not pessimistic; I am just very aware that we "stand on the shoulders of giants".

If I may, there are a couple of points I do not agree with in your thoughts on the evolution of computing.

First of all, I do not think the field of computing has stayed with the Von Neumann architecture. Quantum computing and other experimental parallel architectures already exist, and they have made huge breakthroughs.

Secondly, connectionist architectures have not failed (I get that impression from the term "setback"); they have simply lacked a suitable hardware framework. We are now entering an era in which we are approaching the capability to build connectionist architectures large enough to infer and simulate thought processes.