Technology tends to run in cycles. Microsoft ruled the 90’s by building essential software for enterprises. Then Apple created a new device-driven marketplace in which the consumer was king. What will drive the next decade?

While these things are always hard to predict with any specificity, much of the writing is already on the wall. Humanlike, no-touch interfaces will combine with a pervasive array of sensors and intelligent back-end systems to form a new Web of Things. Computing will become truly ubiquitous.

This new era of computing will be different from anything we’ve seen before. Technology will cease to be something we turn on and off and will instead become an inextricable part not only of our environment, but of ourselves. It is a future that is both utopian and dystopian (depending on your perspective), in that the human experience will change dramatically.

4 Digital Laws

When William Gibson said, “The future is already here – it’s just not evenly distributed,” he meant that the seeds of the future are sown in the present. While there is no telling the exact composition of the fruit that those seeds will bear, we can expect the stalks to grow according to laws already apparent.

Moore’s Law: Back in the 80’s and 90’s, when computers first landed on our desktops, we were mostly concerned with processing power, because we wanted to be sure that our hardware would be capable of running the software that made computers useful.

Today, however, most of us pay little attention to processing speeds because we’re confident that whatever device we buy will be fast enough. That’s because of Moore’s law, a principle first observed by Intel cofounder Gordon Moore in 1965, which in its popular form states that the power of our chips doubles about every 18 months.

Kryder’s Law: When Steve Jobs first returned to Apple, he revamped the product line and then went searching for the next big thing. An avid music fan, he was disappointed with the primitive MP3 devices on the market and envisioned a new product that would allow him to carry around 1,000 songs in his pocket.

In a matter of months, his team identified a supplier which could deliver drives that were both small enough and powerful enough to make good on his vision. The iPod was born and Apple was on its way to becoming the most valuable company on the planet.

Of course, 1,000 songs is no big deal anymore. Today’s iPods carry 40,000, and you can buy a drive that holds 1,000 full-length movies for a few hundred dollars, less than the price of those original iPods. This is thanks to Kryder’s law, which holds that storage capacity doubles about every 12 months, even faster than Moore’s law increases processing power.

Nielsen’s Law: Even after we stopped worrying about the speed of our computers and our hard drives became big enough that we didn’t need to clean out our e-mail archives every month, we still had trouble accessing content because Internet connections were so slow. Now with 4G mobile connections, we scarcely have to worry about it.

This is thanks to Nielsen’s law, which observes that effective bandwidth doubles every 21 months. That’s quite a bit slower than Moore’s law and Kryder’s law, which is why bandwidth has historically been such a limiting factor, but at current speeds we can do almost everything we want to, and 5G is expected around 2020.
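Because each of these laws is simple exponential doubling with a different period, small differences in the period compound into large gaps over time. A quick back-of-the-envelope sketch, using the doubling periods cited above (18 months for Moore, 12 for Kryder, 21 for Nielsen):

```python
def growth_factor(doubling_period_months, span_months):
    """How many times capability multiplies over a span,
    given a fixed doubling period."""
    return 2 ** (span_months / doubling_period_months)

DECADE = 120  # months

for name, period in [("Moore (processing)", 18),
                     ("Kryder (storage)", 12),
                     ("Nielsen (bandwidth)", 21)]:
    print(f"{name}: ~{growth_factor(period, DECADE):,.0f}x per decade")
```

Over a decade that works out to roughly 100x for processing, about 1,000x for storage, but only around 50x for bandwidth, which is exactly why bandwidth has so often been the bottleneck.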

Kaku’s Caveman Law: Now that we have eradicated most technical limits to everyday use, the most important law to pay attention to is what Michio Kaku calls the “caveman law”, which can be stated as follows:

Whenever there is a conflict between modern technology and the desires of our primitive ancestors, these primitive desires win each time.

It is this last law, riding the wave of the previous three, that will drive the next decade of technology. Our devices will become not only vastly more powerful, but also more natural and eventually disappear altogether. Effective computing will become less dependent on expertise and more a function of desire.

A New Digital Paradigm

While the digital laws may seem to be working steadily on our behalf, the numbers can be deceiving because they actually represent accelerating returns. Simply follow the pace of Moore’s law alone and you will quickly realize that we will advance roughly the same amount in the next 18 months as we did in the previous thirty years.
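To see why, run the arithmetic: if capability doubles every 18 months, then thirty years is twenty doublings, and the absolute gain during the next single doubling equals everything accumulated over the previous twenty combined. A minimal sketch of that calculation:

```python
DOUBLING_MONTHS = 18

def capability(months_elapsed, start=1.0):
    """Capability after some number of months, doubling every 18 months."""
    return start * 2 ** (months_elapsed / DOUBLING_MONTHS)

thirty_years = 30 * 12           # 360 months = 20 doublings
now = capability(thirty_years)   # 2**20, about a million-fold over the start

gain_past_30_years = now - capability(0)
gain_next_18_months = capability(thirty_years + 18) - now

# The next 18 months alone contributes as much absolute progress
# as the entire previous thirty years put together.
print(gain_past_30_years)    # 1048575.0
print(gain_next_18_months)   # 1048576.0
```

That is what “accelerating returns” means in practice: the curve looks steady on a log scale, but in absolute terms each new period dwarfs everything that came before.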

At some point, a difference in degree becomes a difference in kind. Having exhausted most of the possibilities we saw for computers a decade ago, we are beginning to focus our technology on completely new tasks, such as nanotechnology, genomics and energy. Clearly, we are entering a new digital paradigm.

To get an idea of how this will all play out, look at how supercomputing has progressed at IBM. In the 90’s it focused its efforts on pure computation, eventually defeating chess champion Garry Kasparov with brute force. In 2011, its Watson computer triumphed at Jeopardy!, a game show that requires intuition as well as intelligence.

Now, IBM is repurposing Watson for human professions, such as medicine, law and even customer service. The line between man and machine is blurring beyond anything we could have imagined even a few years ago.

Atoms Become The New Bits

There is probably no place where the expansion of the digital economy is as dramatic as in the field of manufacturing, which until recently was assumed to be a low-tech area best left to sweatshops and cheap labor. Today, as Steve Denning reported in Forbes, companies from Apple to GE are finding it makes more sense to keep manufacturing closer to home.

The reason is that we are in the midst of a new industrial revolution in which the informational content of manufactured goods is becoming more valuable than the physical content. An array of technologies, ranging from CAD software to 3D printing to lights-out factories run entirely by robots, is reinventing the economics of making things.

Just as people gathered in places like the Homebrew Computer Club in the 70’s, there are now dozens of fab labs scattered across the globe where hobbyists can meet and build prototypes. These designs can then be manufactured at just about any scale by services like Ponoko and Pololu.

Tech Becomes More Like Pharma

When the personal computer revolution took hold, it was driven by garage entrepreneurs. Hobbyists tinkering with homemade kits could outfox big corporations and turn a clever idea into a billion-dollar business. This trend only deepened as software became dominant and any kid with a keyboard could compete with industry giants.

Smart companies embraced the start-up culture and became more nimble. The tech industry began to resemble the entertainment industry, with the business press spending more and more time in sweaty convention halls hoping to catch a glimpse of the next blockbuster hit.

That’s changing as devices and applications become secondary to platforms. The new paradigm shifts, such as IBM’s Watson, Google Brain and Microsoft’s Azure, take years and billions of dollars to develop. The upshot is that the tech business is starting to look more like pharma, where the R&D pipeline is as important as today’s products.

And for better or worse, that’s where we’re heading. Whereas previous tech waves transformed business and communication, the next phase will be marked by technology so pervasive and important, we’ll scarcely know it’s there.