
Machine learning is already used by companies such as Netflix, Google, Amazon and even the U.S. Postal Service. Yet its use in computer networking is relatively new, and its application within the networking industry has sparked interest among academics and vendors alike.

In a nutshell, machine learning, or ML, is nothing more than software that can learn from past experience. This experience is in the form of data, which makes machine learning closely related to statistics. But machine learning transcends statistics in an effort to classify and group data in such a way as to create models that can be used to predict future outcomes.

With traditional programming, computers are given all the parameters and information they need to run a program. Machine learning starts with only a few simple parameters and a data set from which it can deduce new information. This is the learning component of machine learning.
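The contrast can be made concrete with a small sketch. The scenario below -- predicting response time from request size -- is hypothetical, and the numbers are invented for illustration; the point is only that one function's parameter is supplied by the programmer, while the other's is deduced from past observations.

```python
# Traditional programming: the relationship is supplied by the programmer.
def predict_fixed(request_size):
    return 2.0 * request_size  # constant chosen by hand

# Machine learning: the relationship is deduced from past data.
def learned_slope(xs, ys):
    """Least-squares slope through the origin: the 'learned' parameter."""
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

sizes = [1.0, 2.0, 3.0, 4.0]          # hypothetical request sizes
times = [2.1, 3.9, 6.2, 7.8]          # noisy observations of roughly 2x
slope = learned_slope(sizes, times)   # ends up close to 2.0

def predict_learned(request_size):
    return slope * request_size
```

Feeding the learned model more observations refines the parameter, which is the "learning" the paragraph above describes.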

Why artificial intelligence isn't machine learning

It's important to note that machine learning and artificial intelligence (AI) are two different things. While the two concepts are related -- machine learning is one of the core techniques behind AI -- artificial intelligence goes further, aiming to create a machine that can mimic a human mind, exhibiting capacities such as reasoning and abstract thinking.

Machine learning makes use of both imperative and declarative programming. Imperative programming deals with the explicit steps a computer must take to produce an outcome. In contrast, declarative programming deals with the method a computer uses to deduce new information from known facts.
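A minimal sketch of the two styles, using a networking-flavored example of my own (the link facts are hypothetical): the imperative function spells out every step, while the declarative one states a rule -- if x reaches y and y reaches z, then x reaches z -- and lets the computer deduce new facts from known ones.

```python
# Imperative style: explicit steps to compute a result.
def doubles_imperative(values):
    out = []
    for v in values:
        out.append(v * 2)
    return out

# Declarative style: deduce new facts from known facts and a rule.
# Hypothetical known facts: direct links between network nodes.
links = {("a", "b"), ("b", "c")}

def reachable(facts):
    """Transitive closure: if x->y and y->z are facts, deduce x->z."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        new = {(x, z) for (x, y1) in facts for (y2, z) in facts if y1 == y2}
        if not new <= facts:
            facts |= new
            changed = True
    return facts
```

Here `reachable(links)` deduces the new fact ("a", "c") without that step ever being spelled out -- the declarative flavor machine learning borrows.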

The term machine learning was coined in 1959 by Arthur Samuel, an early pioneer in computer gaming and artificial intelligence. He explained that "machine learning is the field of study that gives computers the ability to learn without being explicitly programmed."

Machine learning, then, includes the classification of data, modeling and then the deduction of new facts. The heart of machine learning is the extraction of new knowledge, which is deduced from an existing data set.

Algorithms underpin machine learning

There are a number of common mathematical algorithms underpinning machine learning, among them linear and polynomial regression, gradient descent, Naive Bayes, decision trees, logistic regression, linear optimization, clustering and nearest neighbor. The complete list is quite long, but the key point is that there are numerous algorithms suited to different problems.
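To make one of these concrete, here is a toy gradient descent -- not tied to any particular product, just the bare technique: repeatedly step against the gradient until the parameter settles at a minimum.

```python
def gradient_descent(grad, w0, lr=0.1, steps=100):
    """Minimize a function by stepping against its gradient."""
    w = w0
    for _ in range(steps):
        w -= lr * grad(w)
    return w

# Minimize f(w) = (w - 3)^2, whose gradient is 2 * (w - 3).
# The minimum is at w = 3, which the descent converges toward.
w_min = gradient_descent(lambda w: 2 * (w - 3), w0=0.0)
```

The same loop, with a higher-dimensional gradient, is what trains the regression and neural models mentioned above.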

Spam filtering is a good example of how these algorithms are used in IT. A spam-filtering system can learn what normal mail looks like and what abnormal mail looks like. And, over time, filtering improves, as the system has a better idea of how to identify spam among all incoming messages.
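A stripped-down sketch of that idea, using Naive Bayes with add-one smoothing -- one common choice for spam filtering, though real filters are far more elaborate. The training messages are invented for illustration.

```python
from collections import Counter
import math

def train(messages):
    """messages: list of (text, is_spam). Returns per-class word counts."""
    counts = {True: Counter(), False: Counter()}
    totals = {True: 0, False: 0}
    for text, spam in messages:
        counts[spam].update(text.lower().split())
        totals[spam] += 1
    return counts, totals

def is_spam(text, counts, totals):
    """Naive Bayes: compare log-probability of the text under each class."""
    vocab = set(counts[True]) | set(counts[False])
    scores = {}
    for label in (True, False):
        score = math.log(totals[label] / sum(totals.values()))  # prior
        denom = sum(counts[label].values()) + len(vocab)
        for word in text.lower().split():
            # Add-one smoothing so unseen words don't zero the score.
            score += math.log((counts[label][word] + 1) / denom)
        scores[label] = score
    return scores[True] > scores[False]

training = [
    ("win free prize now", True),
    ("free money click now", True),
    ("meeting agenda attached", False),
    ("lunch at noon tomorrow", False),
]
counts, totals = train(training)
```

As more labeled messages arrive, the counts shift and the filter's idea of "normal" versus "abnormal" mail improves -- exactly the behavior described above.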

Another example is WAN optimization. Some platforms begin by operating in a pass-through mode to gather network data. In this way, the platform can build a baseline of network traffic as a data set and use it to predict which paths would be best at any given time, and how and when to apply deduplication, compression and other WAN optimization techniques.
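In miniature, the baseline-then-predict idea might look like the sketch below. The path names and latency samples are hypothetical, and real platforms weigh far more than mean latency; the sketch only shows how samples gathered in pass-through mode become a prediction.

```python
def baseline(samples):
    """Mean of samples gathered during the pass-through phase."""
    return sum(samples) / len(samples)

def pick_path(paths):
    """Predict the best path: the one with the lowest baseline latency.
    paths: {name: [latency samples in ms]} -- hypothetical data."""
    return min(paths, key=lambda name: baseline(paths[name]))

observed = {
    "mpls":     [20.0, 22.0, 21.0],
    "internet": [35.0, 60.0, 40.0],
}
best = pick_path(observed)  # "mpls" under these samples
```

Swapping mean latency for jitter, loss or a weighted score changes the prediction but not the shape of the loop: observe, baseline, predict.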

How machine learning in networking works

Machine learning can be split into two main categories: supervised learning and unsupervised learning.

Supervised learning involves training a machine with labeled data. A label is explicit metadata that describes an input in a data set. This label can be in the form of some identifier, classification or judgment that the machine learning algorithm can use to characterize the information it is processing.

For example, an input describing a person's height might also carry the label "tall" or "short." The machine learning algorithm would then tag other, unlabeled data as "tall" or "short," creating meaning in the data set. And the larger the labeled data set, the more accurately the machine can deduce new knowledge.
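The height example can be sketched with a one-nearest-neighbor classifier -- one of the simplest supervised methods, chosen here for brevity. The heights and labels are invented.

```python
def nearest_label(height, labeled):
    """1-nearest-neighbor: copy the label of the closest labeled example."""
    closest = min(labeled, key=lambda pair: abs(pair[0] - height))
    return closest[1]

# Labeled training data: (height in cm, label) -- hypothetical values.
labeled = [(150, "short"), (160, "short"), (180, "tall"), (190, "tall")]
```

Given a new, unlabeled height such as 185, `nearest_label` tags it "tall" purely because labeled examples nearby were tagged "tall" -- the deduction supervised learning performs.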

Unsupervised learning involves training the machine with a data set that has no labels. This is particularly relevant to networking because the data derived from network devices and visibility tools is generally not explicitly labeled.

In unsupervised learning, computers create structure and meaning out of what are, at first, arbitrary inputs. They don't have labels, so a machine uses methods such as clustering to identify relationships among inputs. By identifying these relationships, a machine can start to build structure and meaning among the data.
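Clustering is easiest to see in one dimension. The sketch below is a deliberately tiny k-means over hypothetical latency samples: no input carries a label, yet the loop still discovers that the values fall into two groups.

```python
def kmeans_1d(values, k=2, iters=20):
    """Tiny 1-D k-means: group unlabeled values around k centroids."""
    # Deterministic start: spread initial centroids across sorted values.
    centroids = sorted(values)[:: max(1, len(values) // k)][:k]
    for _ in range(iters):
        clusters = [[] for _ in centroids]
        for v in values:
            nearest = min(range(len(centroids)),
                          key=lambda i: abs(v - centroids[i]))
            clusters[nearest].append(v)
        # Move each centroid to the mean of its cluster.
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return sorted(centroids)
```

Run on latency samples like [21.0, 20.0, 22.0, 80.0, 82.0, 81.0], it settles on centroids near 21 and 81 -- structure created from arbitrary inputs, with no labels involved.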

A popular example is facial recognition. A computer can be taught how to recognize a human face -- in other words, the output. The computer is trained by showing it example after example of eyes, noses, mouths and complete faces. The larger the training data set, the more accurately it can recognize a face.

In this example, the new knowledge is knowing which image among many unlabeled, unclassified images in a data set is a human face. When analyzing a new data set of images, the computer isn't told which of the images is a face; instead, the computer is able to deduce that based on its training.

There are several other machine learning methods as well, such as semi-supervised learning, which combines supervised and unsupervised learning to create labels for unlabeled data, and reinforcement learning, in which the machine refines its behavior based on feedback -- rewards or penalties -- from previous actions.
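Semi-supervised learning can be sketched as self-training: use the few labeled points to pseudo-label the rest, then treat the pseudo-labels as training data. The classifier here is a one-nearest-neighbor rule on made-up 1-D values, kept minimal on purpose.

```python
def self_train(labeled, unlabeled):
    """Semi-supervised sketch: pseudo-label each unlabeled point with the
    label of its nearest labeled neighbor, then fold it into the set.
    labeled: list of (value, label); unlabeled: list of values."""
    labeled = list(labeled)
    for x in unlabeled:
        nearest = min(labeled, key=lambda pair: abs(pair[0] - x))
        labeled.append((x, nearest[1]))  # pseudo-label, not ground truth
    return labeled
```

Starting from just two labeled points, every unlabeled value ends up tagged -- which is how a mostly unlabeled network data set can be made usable for supervised methods.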

The impact of machine learning on networking

The benefits of machine learning in networking are manifold. Network infrastructure produces a tremendous amount of unlabeled information -- both short-lived data, such as link statistics, and trend data, such as bandwidth utilization over time. With that sort of data set, machine learning can be much more than an analytical tool. It can be a predictive tool and, in that role, push configuration changes automatically as a result of those predictions. The idea, then, is to apply this in every area of the infrastructure -- from the access and distribution layers to data center and security tools.

Hurdles remain before networking can fully reap the benefits of machine learning. First, a great deal of network data is ephemeral -- in other words, very short-lived -- which means the network telemetry in a machine learning data set can be highly dynamic. Second, the data isn't labeled, which makes classification much more difficult; this is why semi-supervised learning is often used. And, finally, although networks share some common traits, no two are really the same.

That's because networks, despite being loosely based on the same design ideas, don't all have the same components. The underlying technology for most networks is TCP/IP, but not all networks have a data center firewall, not all networks trombone traffic between cloud providers and not all networks have an intrusion detection or intrusion prevention appliance just behind their edge router. Furthermore, even networks that contain many of the same devices differ greatly in precisely where those devices sit in the design and how they are configured.

That said, we've seen machine learning used successfully in parts of the WAN, in messaging and in security. Machine learning is also being used by several network visibility and monitoring vendors -- among them ExtraHop and Nyansa -- which make their living collecting and analyzing network data. As networks continue to shift toward a software-centric paradigm, machine learning will play a larger role in the now mundane, but eventually sophisticated, elements of network design, telemetry and daily operations.
