Remember Mad Libs? You'd get a book that contained paragraphs with key words missing, replaced with hints as to what might fill the blanks. You and your friends would come up with silly words to complete the sentences, with predictably hilarious results.

Well, it's time for some Gartner Mad Libs. For this game, the same word belongs in all the blanks. See if you can figure out what the missing word is (here's the source if you can't resist peeking).

The current software ecosystem allows only whole products to be commercialized - not functions or features.... While the number of existing _________ is significant, very few engineers or scientists commercialize their _________, mostly due to the limited prospects of real success.... Another profound aspect of a marketplace is its ability to allow _________ to be reused.

If this quotation came from 2005, you'd probably guess the missing word was services. Back in the day, we talked about services as ways of breaking up monolithic applications in order to support the commercialization and reuse of those services.

Perhaps this quotation came from 2012, in which case the missing word is likely to be APIs. People still talk about the API Economy, after all - what might happen when enterprises and others expose application functionality in order to commercialize and reuse it.

If you guessed either services or APIs, however, sorry, you're wrong. This quote is from 2016, and the buzzword Gartner is hyping today is algorithms.

Gartner is now treating algorithms like they are some kind of innovative addition to the modern digital discussion. Presumably the brilliant minds there have some novel insight into algorithms and, yes, the Algorithm Economy that CIOs should sit up and take notice of.

Well, it's time to let some hot air out of this balloon, and if Intellyx won't do it, who will?

Nothing New Under the Sun

The point to the Mad Libs exercise, of course, is to underscore how repetitive Gartner's thinking about algorithms actually is. If they can simply switch out one buzzword for another every few years, then are they actually bringing any new insight to the table at all?

Let's peel away the layers of hype and see whether such a hidden nugget of insight actually exists. Our starting point: the definition of algorithm.

According to Wikipedia, an algorithm is "a self-contained step-by-step set of operations to be performed." Your grandma's muffin recipe, therefore, is a type of algorithm, although the history of algorithms focuses primarily on mathematical algorithms.

Algorithms, therefore, aren't new. In fact, they date to antiquity, as the well-known Sieve of Eratosthenes algorithm for creating a list of prime numbers illustrates.
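To show just how unremarkable an algorithm is, here's the Sieve of Eratosthenes itself as a minimal sketch (in Python, chosen arbitrarily since this piece names no language) - a couple-thousand-year-old "self-contained step-by-step set of operations" in a dozen lines:

```python
def sieve_of_eratosthenes(limit):
    """Return all primes up to and including limit, Eratosthenes-style."""
    if limit < 2:
        return []
    is_prime = [True] * (limit + 1)
    is_prime[0] = is_prime[1] = False
    for n in range(2, int(limit ** 0.5) + 1):
        if is_prime[n]:
            # Cross off every multiple of n, starting at n squared
            for multiple in range(n * n, limit + 1, n):
                is_prime[multiple] = False
    return [n for n in range(limit + 1) if is_prime[n]]

print(sieve_of_eratosthenes(30))
# [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
```

Nothing here would surprise Eratosthenes - only the notation has changed in two millennia.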

The story of algorithms gets more interesting, however, with the invention of machines that could follow them - or at least, theorizing about such machines, as computing pioneers like Ada Lovelace, Charles Babbage, and Alan Turing did.

In any case, while algorithms are unquestionably fundamental to computer science generally - not to mention your grandma's cooking - we still have the question as to why Gartner is focusing their hype engine on them now. Let's go to the source and see what we can uncover.

Gartner on Algorithms

Gartner Research SVP Peter Sondergaard laid out Gartner's thinking on the Algorithm Economy in his keynote at a recent Symposium/ITxpo. He states that "algorithms are defining the future of business" because "algorithms define the way the world works."

In particular, he distinguishes algorithms from data (in particular, big data), because "big data is not where the value is." Instead, "algorithms are where the real value lies, because algorithms define action." He then ties this notion of defining action to digital transformation by stating that "dynamic algorithms are at the core of customer interactions."

For Gartner, algorithms are central to the "post app era," where by 2020, Gartner predicts that "smart agents" will facilitate 40% of interactions. According to Sondergaard, "Agents enabled by algorithms will define the post-app era." He calls out Apple Siri and Microsoft Cortana as prototypes for this new era of virtual assistants and other smart agents that will replace apps.

Siri and her brethren are unquestionably innovative, and no one doubts such technology will continue to improve rapidly. But let's take a closer look at Gartner's thinking and see how well it holds water.

The Obviousness Department

Not only are algorithms nothing new, but much of what Gartner is saying about them is obvious - in particular, Sondergaard's pronouncements on the value of data.

By stating that data alone have no value without algorithms, he's essentially stating that computing breaks down into data on the one hand and code on the other. This division is at the core of digital computing ("digital" in its original sense of using zeroes and ones), as it has been since ENIAC.

Seventy years later, Gartner suddenly discovered that digital computing divides the world into data and code, and that data aren't really that useful without code. Good for it. However, the fact you have to do something with data to get value out of them shouldn't come as a surprise to anyone.

Sondergaard also calls out algorithmic stock trading as an example of how algorithms can be disruptive. To be sure, using software to trade stocks disrupted the earlier, manual approach - but isn't he just saying that using computer programs to automate manual processes is disruptive? Again, true, but both obvious and nothing new.

The bigger picture here is that software continues to improve, and enterprises are becoming increasingly software-driven, in part because of such advancements. Algorithms are part of this story, but are not a particularly interesting or insightful angle on the broader trend of software-driven business transformation.

Heading in the Wrong Direction

What about the idea of algorithm marketplaces mentioned in the Mad Lib above? As my little game would suggest, we've been down the same road with services and APIs with increasing levels of success. Are algorithm marketplaces the next logical step in the evolution of this trend?

I'm afraid not - in fact, they're a big step backwards. The reason service marketplaces were possible - and API marketplaces are increasingly feasible - is that services and APIs abstract software functionality into loosely coupled units, supporting the consumability and reusability of the underlying software.

In the case of algorithms, however, we're stripping away all such abstraction layers and even the fundamental abstraction of written code itself, leaving the underlying patterns such code must follow laid bare for all to see.

As the limited success of Web Services in the 2000s suggests, there are perhaps better ways of packaging algorithms for consumption and reuse than services. RESTful APIs improved matters, to be sure. What, then, would be the next step on this path of evolution? Not algorithms, but microservices.

In fact, you could think of a microservice as a properly encapsulated algorithm if you like, as it is in essence a unit of execution with a well-constructed API. But strip away the encapsulation from any microservices-based system and you end up with a tightly coupled, unmaintainable tangle.
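To make the contrast concrete, here's a minimal sketch (Python standard library only; the endpoint, names, and JSON shape are my own invention, not anything Gartner or anyone else has specified) of a microservice that encapsulates an algorithm - Euclid's, fittingly another one from antiquity - behind a well-defined API:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

def gcd(a, b):
    """Euclid's algorithm: the encapsulated unit of execution."""
    while b:
        a, b = b, a % b
    return a

def gcd_response(a, b):
    """The service contract: parameters in, a JSON document out."""
    return json.dumps({"a": a, "b": b, "gcd": gcd(a, b)})

class GcdHandler(BaseHTTPRequestHandler):
    """The API boundary: callers see GET /gcd?a=12&b=18, never the algorithm."""
    def do_GET(self):
        query = parse_qs(urlparse(self.path).query)
        a = int(query.get("a", ["0"])[0])
        b = int(query.get("b", ["0"])[0])
        body = gcd_response(a, b).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

# To run it as a standalone service (not executed here):
# HTTPServer(("localhost", 8080), GcdHandler).serve_forever()
```

The consumer of this service sees only the contract - a URL and a JSON document. Trading that encapsulation away to expose the bare algorithm underneath buys you nothing and costs you the loose coupling that made the marketplace possible in the first place.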

Are We Nearing the 'End of Applications'?

Then there's the question as to whether smart agents like the Siri and Cortana virtual assistants herald the end of apps, leading to a new age of algorithmic dominance.

The crux of this issue is how we might define application. Today when we use the word app we're more likely than not referring to a mobile application, a simple bit of code we might download from an app store to our smartphone.

But of course, the word application has long meant more than that. Any tool, after all, has an application - the application of a hammer is hammering nails, for example. Early digital computers were purpose-built for a single application, like cracking Nazi codes or calculating missile trajectories.

Only with the development of programmable computers like ENIAC did the word application come to mean computer program, as these devices were essentially general purpose, and thus had many possible applications.

Today we still think of an application as a computer program, although some uses of the word are narrower than others. But certainly, wouldn't Siri or Cortana be just another kind of application?

The Intellyx Take: Are We Nearing a Post-Algorithmic Age?

Ironically, instead of considering advancements in smart agent technology to be the triumph of algorithms over applications, perhaps it's more logical to consider the reverse: are applications becoming post-algorithmic?

This question is a philosophical one that attempts to divine the essence of modern artificial intelligence (AI) - as well as where AI is heading over the next several years.

If we look at machine learning or deep learning algorithms, we're essentially teaching computers to learn on their own so that they can come up with new ways of doing things - with resulting behavior that may look nothing like "a self-contained step-by-step set of operations" that is the essence of an algorithm.
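A toy illustration of the point - with the caveat that real machine learning is vastly more sophisticated than this sketch: in the Python example below, nobody hand-codes the rule. A simple perceptron learns OR-like behavior purely from labeled examples, and the resulting "steps" live in learned weights rather than in any written-out set of operations:

```python
def train_perceptron(samples, epochs=20, lr=0.1):
    """Learn weights from labeled (x1, x2, label) examples.
    No rule is hand-coded; the behavior emerges from the data."""
    w1 = w2 = b = 0.0
    for _ in range(epochs):
        for x1, x2, label in samples:
            pred = 1 if w1 * x1 + w2 * x2 + b > 0 else 0
            err = label - pred          # mistake-driven update
            w1 += lr * err * x1
            w2 += lr * err * x2
            b += lr * err
    return w1, w2, b

# Labeled data that happens to encode logical OR -- but "or" appears nowhere
data = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 1)]
w1, w2, b = train_perceptron(data)

def predict(x1, x2):
    """The learned behavior: three floats, not a recipe."""
    return 1 if w1 * x1 + w2 * x2 + b > 0 else 0
```

The programmer wrote an algorithm (the training loop), but the behavior the program exhibits afterward was never spelled out step by step - which is precisely the two-level split described above.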

In other words, AI broadly speaking introduces two levels of computing: the human level where people code the AI programs (which are still algorithmic), and the self-learning behavior of those programs themselves that potentially leads to novel behaviors (which I posit may become post-algorithmic).

Seen in this light, we may in fact desire post-algorithmic behavior from our virtual assistants. After all, if they're simply following recipes - even complicated ones - then they're not really doing anything more than a traditional, human-coded application can do.

Only when the agent can come up with novel behaviors based upon unpredictable, independent learning will our virtual assistants become truly useful. At that point, algorithms - and Gartner's poorly thought out opinions - will become things of the past.

Mr. Bloomberg’s articles in Forbes are often viewed by more than 100,000 readers. During his career, he has published over 1,200 articles (over 200 for Forbes alone), spoken at over 400 conferences and webinars, and he has been quoted in the press and blogosphere over 2,000 times.

Mr. Bloomberg is the author or coauthor of four books: The Agile Architecture Revolution (Wiley, 2013), Service Orient or Be Doomed! How Service Orientation Will Change Your Business (Wiley, 2006), XML and Web Services Unleashed (SAMS Publishing, 2002), and Web Page Scripting Techniques (Hayden Books, 1996). His next book, Agile Digital Transformation, is due within the next year.

At SOA-focused industry analyst firm ZapThink from 2001 to 2013, Mr. Bloomberg created and delivered the Licensed ZapThink Architect (LZA) Service-Oriented Architecture (SOA) course and associated credential, certifying over 1,700 professionals worldwide. He is one of the original Managing Partners of ZapThink LLC, which was acquired by Dovel Technologies in 2011.

Prior to ZapThink, Mr. Bloomberg built a diverse background in eBusiness technology management and industry analysis, including serving as a senior analyst in IDC’s eBusiness Advisory group, as well as holding eBusiness management positions at USWeb/CKS (later marchFIRST) and WaveBend Solutions (now Hitachi Consulting), and several software and web development positions.
