
Machine learning describes the algorithms that allow machines to learn from example and past experience.

What is Machine Learning?

Machine learning involves the creation of algorithms that allow computers to learn from example and past experience rather than relying on preprogrammed rules.[1][2] In other words, instead of programming a computer to understand every aspect of what makes a letter “A” a letter “A”, an algorithm is written that allows the computer to look at many different instances of the letter “A” and learn to recognize an “A” it has never seen before.

With the right underlying algorithm, a program can learn from examples and write its own rules for recognizing patterns. This avoids the insurmountable amount of hand-programming that would otherwise be required.
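To make the letter-recognition idea concrete, here is a minimal toy sketch (not from the article): a nearest-neighbor “learner” that classifies letter shapes by comparing them to stored examples instead of following hand-written rules. The 5×5 pixel grids and letter choices below are invented purely for illustration.

```python
def distance(a, b):
    """Count mismatched pixels between two flattened grids (Hamming distance)."""
    return sum(x != y for x, y in zip(a, b))

# Training examples: flattened 5x5 grids (1 = ink, 0 = blank), invented for this demo.
example_A = [0,0,1,0,0,
             0,1,0,1,0,
             1,1,1,1,1,
             1,0,0,0,1,
             1,0,0,0,1]
example_L = [1,0,0,0,0,
             1,0,0,0,0,
             1,0,0,0,0,
             1,0,0,0,0,
             1,1,1,1,1]
training = [(example_A, "A"), (example_L, "L")]

def classify(grid):
    """Label a new grid with the letter of its closest training example."""
    return min(training, key=lambda ex: distance(grid, ex[0]))[1]

# An "A" the program has never seen: one pixel differs from example_A.
new_A = example_A[:]
new_A[0] = 1
print(classify(new_A))  # prints "A"
```

The program was never told what an “A” looks like; it inferred the label from proximity to past examples, which is the core idea behind learning from experience rather than explicit programming.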

When we pair this concept with academic and business intranets, the internet, and human expertise, and then apply it to important areas like weather, healthcare, politics, and energy, the potential is limited only by our hardware (processing power and memory storage).

Machine Learning Versus Cognitive Computing

“Machine learning” refers to the algorithms that let computers learn from past experience (machine learning algorithms). It does not refer to mimicking all the faculties of human thought, such as abstract thinking, understanding context, knowledge representation, and reasoning.

Beyond machine learning is something called cognitive computing. Cognitive computing understands, reasons, and learns. It can communicate with us in natural, human terms. It understands context and nuance, enabling it to not only uncover new insights but also unearth entirely new pathways to explore and possibilities to imagine. Cognitive computing uses machine learning at its core.[4]

Machine learning allows machines to learn from past experience, much in the way a human would. It shouldn’t be confused with other facets of human thought, or with cognitive computing in general, which seeks to mimic more than just learning.