How can one know whether a machine is intelligent? One answer is the Turing test: the ability of a machine to communicate in a human-like way. Have you noticed lately the human-like fluidity of AI-powered customer service bots?

The Start – 1950s and 1960s

The term “artificial intelligence” was not coined until 1955, by John McCarthy, who in the summer of 1956 organized the Dartmouth Artificial Intelligence Conference. It was the Precambrian era of computer technology: the first programmable computer, the Mark I, had appeared only a decade earlier, in 1944. The Dartmouth conference was soon followed by the launch of the first coordinated AI research effort at MIT in 1959, where junior faculty members John McCarthy and Marvin Minsky founded the Artificial Intelligence Project as part of the Research Laboratory of Electronics (RLE).

That zeal led to the creation of one of the first chess programs at MIT in 1962; it could beat an amateur player some of the time.

Starting in 1963, DARPA accelerated the momentum behind AI research by funding MIT, Stanford, and Carnegie Mellon. Out of that research came a few successes, including a sub-domain of AI called expert systems. The DENDRAL program arrived in 1966 to solve chemistry problems. Perhaps the most famous was MYCIN, launched in 1979 for the diagnosis and treatment of blood diseases. With an inference engine and a library of approximately 500 rules, it gave recommendations as good as those of the best medical doctors and provided “justification for the decisions”. In spite of its accuracy, it was never put into practice because of the legal question of who bears responsibility for the decision (the computer, in this case).
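The heart of an expert system like MYCIN is a small loop: an inference engine repeatedly fires if-then rules against a set of known facts until nothing new can be concluded. Here is a minimal forward-chaining sketch of that idea; the rules and fact names below are invented for illustration and are not MYCIN’s actual medical knowledge base.

```python
# Toy forward-chaining inference engine in the spirit of rule-based
# expert systems. Each rule pairs a set of premise facts with one
# conclusion fact. (Hypothetical rules, not MYCIN's real rule library.)
RULES = [
    ({"gram_negative", "rod_shaped"}, "likely_e_coli"),
    ({"likely_e_coli", "urinary_tract_site"}, "suggest_antibiotic_A"),
]

def infer(initial_facts, rules):
    """Fire every rule whose premises are all satisfied, adding its
    conclusion to the fact base, until a full pass derives nothing new."""
    facts = set(initial_facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)   # rule fires; record new fact
                changed = True
    return facts

result = infer({"gram_negative", "rod_shaped", "urinary_tract_site"}, RULES)
```

Because each fired rule is recorded as an explicit fact, a system like this can also answer “why?” by replaying the chain of rules that led to a recommendation — the “justification for the decisions” the author mentions.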

The False Prophecies – 1980s and 1990s

“There are now in the world machines that think, that learn and create. Moreover, … in a visible future – the range of problems they can handle will be coextensive with the range to which the human mind has been applied.” — Allen Newell, one of the founding fathers of AI, in 1957

The first computer vision project started as a summer project in 1966; the goal was to develop, within a few months, technology to extract features and objects from video. Only some 50 years later, with powerful compute resources coupled to machine-learning algorithms, do we have a satisfactory solution. It is a telling illustration of how vastly the pioneers of the field underestimated the AI challenge.

A 1986 trade-journal cover shows the press chasing the AI fad of the 1980s.

The key breakthrough in public consciousness came in 1997, when IBM’s Deep Blue defeated Garry Kasparov in a set of chess games. It must be remembered, though, that the triumph was not one of “intelligence” but of millions of brute-force computations. We entered this prolonged period with high hopes and left it in dark disillusionment.

The Twist – 2010’s

But then a seminal moment occurred in 2012, when the ImageNet Large Scale Visual Recognition Challenge (ILSVRC) was won by a team from the University of Toronto. Their program, called AlexNet, used GPUs to execute a class of AI algorithm called the convolutional neural network. In the competition, objects in a test set of roughly 100K images had to be detected and classified into one of 1,000 categories. AlexNet achieved an error rate of approximately 15.3%; by 2016 the error rate had fallen below 3%, surpassing human accuracy.
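The building block that made AlexNet work is the convolution: sliding a small learned kernel across an image and summing elementwise products, so the same feature detector is reused at every position. A minimal pure-Python sketch of that single operation (AlexNet stacks many such layers, plus nonlinearities, pooling, and fully connected layers, and runs them on GPUs):

```python
def conv2d(image, kernel):
    """Valid-mode 2D convolution (strictly, cross-correlation, as used in
    most deep-learning frameworks) over nested lists of numbers."""
    ih, iw = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(ih - kh + 1):          # slide kernel over rows
        row = []
        for j in range(iw - kw + 1):      # slide kernel over columns
            s = sum(image[i + di][j + dj] * kernel[di][dj]
                    for di in range(kh) for dj in range(kw))
            row.append(s)
        out.append(row)
    return out

# A hand-written vertical-edge kernel applied to a tiny image whose
# right half is bright: the response peaks where the edge sits.
image = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
]
kernel = [[-1, 1],
          [-1, 1]]
edges = conv2d(image, kernel)   # -> [[0, 2, 0], [0, 2, 0]]
```

In a real network the kernel values are not hand-written but learned from data, and early layers end up discovering edge and texture detectors much like this one.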

It is reported that AlphaGo used 1,920 CPUs and 280 GPUs. A conservative estimate puts AlphaGo at roughly 300 kW of power, versus about 20 W for a human brain.

But let there be no doubt: today we are at the cusp of the AI era. New breakthroughs will undoubtedly occur, and we will witness monumental changes in the times ahead. To help crystallize a future with AI, consider the pace of innovation in the computer industry and how it has created entirely new ways to run our lives.

The promise of AI is loud, real and now.

The Showdown

Looking back over 60 years from today’s vantage point, we have witnessed extraordinary struggles and tremendous feats of engineering. We must now see beyond the borders of what is and into the land of the new. AI has already started touching our lives in fundamentally new ways, and the field will continue to move forward with even greater consequences. It is enriching everything from cameras and speakers to data centers. AI will be the essential element of the next several decades: creating competitive advantage, delighting customers, and ultimately changing how people spend money.

Hence, business as usual is not an option. A company that does not have a precise and clear AI strategy will be left far behind to rot. We must embrace AI as the differentiating technology at its root and become unbelievably good at it. A company must invest not merely to maintain parity but to create substantial economic value.


Published by Khursheed Hassan

Khursheed is deeply passionate about human potential. He believes that each of us is capable of incredible achievements and of reaching greater heights - if only we are given the proper guidance at the right moment in our lives. His career spans technologies, from creating and advising startups to working at large Fortune 500 companies. He is a triathlete, an avid reader, and an eager student of history.