Recursive cycles of innovation happen within many different areas, most notably in the domains of Art and Design. Even those who are completely out of touch with fashion can observe how the field is constantly rediscovering its past, recycling ideas, and incessantly mixing new trends with old influences. In Economics, it is well known that business trends and stock markets exhibit processes that tend to repeat themselves in a more or less regular fashion. In this domain, the best-known study was developed by Nikolai Kondratiev, a Russian economist who observed a series of long 50-year cycles in the modern world economy. The Kondratiev waves, as they were later called, consist of alternating periods of high and low sectoral growth, which have in most cases proved accurate since the end of the 18th century. In fact, from whichever angle you look at history, there’s always someone ready to point out a specific recursive cyclical pattern.

But even though we are accustomed to this type of process in many fields, we tend to think that technology, and in particular the computing industry, is immune to it. After all, our technological progress is made of several vertiginously rising paths that share the absence of a rear-view mirror. There’s no point in looking back, or even considering that some aspects of the past might recur, simply because there’s nothing to learn from them. Nonetheless, the act of uncovering patterns and potential cycles, particularly in an industry that prides itself on continuous fresh innovation, is an extremely appealing exercise.

The pattern I’m about to describe is divided into three periods, starting at the foundation of computing history and ending with a set of strong indicators of a third new cycle. It tries to make the case that even though individual technological components evolve at their own remarkably fast pace, the way in which they interrelate and behave might follow some level of cyclical recurrence. The three stages are separated by two periods of roughly 25 years. The first cycle started in the late 1950s, with the spread of the mainframe computing model, followed by the second stage in the early 1980s, with a succession of events that led to the emergence of the highly powerful laptop computer. Finally, the latest cycle has just begun. Led by Cloud Computing and the Netbook phenomenon, everything seems to indicate this will be a major movement for many years to come. From an initial centralized model, through a dispersion of increasingly independent machines, the new drift foresees the draining of storage and processing power from portable computers and a return to a model based on data centrality. The main distinction this time is that instead of the mainframe, the “Cloud” emerges as the central interconnected hub. Although recurring cycles might be a noticeable pattern in how data is stored and accessed, there’s still a unique common thread running through all these stages: a continuous progression towards mobility.

First Cycle | The Central Mainframe

Characterized by one central computer, responsible for most of the storage and processing power, linked to a series of satellite terminals, the mainframe computing model has been a key protagonist in the history of the modern computer since the late 1950s. Back then, people accessed and interacted with immensely large mainframes through a variety of linked terminals that underwent significant changes over time. From early punchcards and teleprinters to later video displays with their familiar green and amber screens, the interactive computer terminals of the 1960s and 70s had one thing in common: their lack of intelligence and their complete dependency on the central mainframe.

Second Cycle | The Rise of the Laptop and Its Portability Effect

By the end of the 1970s, specialized terminals, the precursors of modern-day portable computers, were becoming smarter. Initially packed with terminal emulation software, these machines were detaching themselves from the almighty mainframe and becoming self-sufficient entities with their own processing capability. This process opened the path for the desktop computer, with early pioneers like the Apple II and the IBM 5150 leading the way. The march towards computing mobility had begun, and it would be only a matter of time before laptop computers started to materialize and eventually replace desktop computers.

For the most part, the computing industry in the past 25 years has seen laptops dramatically increase their computing power and rival traditional desktop PCs. In 1986, battery-powered portable computers held about 2% of the market worldwide[1]. Today there are more laptops than desktops in business and general use, and in 2008 more laptops than desktops were sold in the US[2]. Even though some mainframes have evolved into the supercomputers of the modern age, uncovering important aspects of science, like the structure of the cosmos or the vast neuronal network of the human brain, the true hero of this story is the laptop. These tiny compact boxes have become potent, full-fledged machines with the added benefit of portability – an essential attribute in an increasingly mobile world. But how long will the mobile processing power rush last? Has it in fact reached a tipping point? Will the hero of the last decade be partially or entirely replaced by its new, weaker adversary: the Netbook?

Third Cycle | Cloud Computing: The Personal Mainframe

Cloud Computing is seen as the next computing trend and the key driver for the third cycle of data centrality. It can simply be described as a “style of computing in which dynamically scalable and often virtualized resources are provided as a service over the Internet”[3]. This model is extremely in tune with our contemporary lifestyle. We currently access the web through a variety of devices with different features, shapes and sizes. And while the number of access points keeps increasing, synching content between them is still an immense headache that rarely (if ever) results in a satisfactory user experience. The Cloud paradigm substantially alleviates this problem by increasingly relying on services, information and applications stored on online servers – the vast Cloud landscape – that can then be accessed at any time, from anywhere, as long as there is an online connection.

In an enlightening special report entitled “Let it rise”[4], The Economist asserts that the Cloud is already a common phenomenon: 69% of Americans connected to the web use some type of “cloud service”, including web-based e-mail or online data storage. Many companies are following this feverish movement, and in the same report Irving Wladawsky-Berger compares it to the Cambrian explosion some 500m years ago, when the rate of evolution sped up, in part because the cell had been perfected and standardised, allowing evolution to build more complex organisms.

Another indicator of this turning point is the Netbook. In part driven by a global economic downturn, the Netbook phenomenon might prove to be a long-lasting craze. Characterized as a lightweight, economical and energy-efficient laptop, especially suited for wireless communication and Internet access, this new mobile computer has been all over the news lately. A recent article in Newsweek magazine[5] uncovered a growing market trend in Japan, where more consumers are opting for netbook computers. While PC sales in Japan fell 4 percent in the fourth quarter of 2008, sales of netbooks shot up 43 percent. The recession has been an important driver for this consumer shift, since people have become more sensitive to price, but the growth of cloud computing is its vital ingredient. There’s also an undeniable rational deduction behind this behavioral change: many users are beginning to question whether they actually need all that speed and storage when their computers are mostly used for emailing and web browsing.

In the diagram shown above we can observe a series of laptops and netbooks linked to a central Cloud, which is in turn surrounded by a multiplicity of abstract devices. Many of these future devices will not require vast processing capability, since they will work as rendering windows for the same online services – flowing incessantly through all of them. The role of “windows for services” might be what awaits many future mobile computers, including mobile phones. This points to the growing value and significance of online services and applications as the vital glue across many systems and platforms.

Conclusion

Predictions always feel like empty promises, and there can be no certainty about what the future holds. Is the Netbook the precursor of a future class of dumb terminals entirely dependent on the Cloud? Is Cloud Computing really going to be the next big thing? If so, how long will it last? Will it prove to be a long-lasting shift, or will people grow increasingly wary of their privacy and lack of ownership and return to a model similar to the one we have today, instigating a fourth cycle of data centrality?