Wednesday, 24 October 2018

AI Is The Future

Technology is so pervasive today that we can only wonder how much further we can take it in the coming years. A few years back, smartphones were still a luxury that only the elite could afford, but today the masses have their own smartphones that let them access the web whenever and wherever. This became possible as cheaper, mass-produced handsets were marketed and sold to a wider demographic. As a result, even ordinary individuals now own smart gadgets that offer more or less the same features as popular brands like Apple and Samsung, but at a fraction of the price.

The technology we know today continues to move forward and traverse uncharted territories. Take artificial intelligence, for example. We are already quite familiar with the concept of artificial intelligence (AI), even if the details are still a little vague to most people, and it is already available in some of the technologies we use: think of Alexa, Siri, and the like. However, AI may go truly mainstream with news that Google is adding its AI chips to customers' computers within the year. Imagine how many people will be able to use artificial intelligence technology from the comfort of their own homes, schools, and offices.

Google, one of the top companies in the hot area of artificial intelligence, will begin letting customers directly use its custom processors for the technology starting in October.

Google's TPUs, or tensor processing units, accelerate AI tasks like understanding voice commands or recognizing objects in photos. Today, Google lets you pay to do that kind of work on its cloud-computing infrastructure. But through a program called Edge TPU, announced Wednesday, Google will let programmers install the TPUs in their own machines.

"There are also many benefits to be gained from intelligent, real-time decision-making at the point where these devices connect to the network," without having to wait for a trip over the network to Google's machines, Injong Rhee, vice president of Google Cloud's internet of things work, said in a blog post. "Your sensors become more than data collectors -- they make local, real-time, intelligent decisions."

Machine learning and deep learning are still in their infancy, and many questions remain unanswered, but making part of the technology readily available to the public may hasten progress and even answer some of the questions raised in the past about this highly advanced field. Soon enough, we may see announcements that AI is hitting the smartphone industry in full force too. That is likely to happen sooner rather than later at the rate these advancements are progressing. Competition is also instrumental here: companies racing to beat one another tend to double their efforts.

“We see there are emerging needs for edge computing, which is essentially running data analytics and intelligence services in locations where data is collected or what we call at the edge,” he said. “This is important as sometimes moving all data to the cloud from sensors can be very expensive.”

The setup is also underpinned by the Cloud IoT Edge software stack. This can run on Android Things or Linux-based devices and equips them with the capabilities they need to carry out machine learning-related data processing tasks.
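The appeal of that kind of on-device processing can be sketched with a toy example. This is plain Python with no actual Edge TPU or Cloud IoT Edge APIs; `EdgeFilter` and its simple statistical rule are illustrative assumptions, standing in for whatever model the device actually runs. The idea is the one Rhee describes: the device makes a local decision about each sensor reading and only forwards the interesting ones, instead of streaming everything to the cloud.

```python
from collections import deque

class EdgeFilter:
    """Toy edge-side decision layer: keep a rolling window of recent
    sensor readings and flag values that deviate strongly from the
    local mean, so only unusual readings need to be uploaded."""

    def __init__(self, window=10, threshold=3.0):
        self.window = deque(maxlen=window)  # recent readings kept on-device
        self.threshold = threshold          # how many std-devs counts as unusual

    def should_upload(self, value):
        if len(self.window) >= 3:
            mean = sum(self.window) / len(self.window)
            var = sum((x - mean) ** 2 for x in self.window) / len(self.window)
            std = var ** 0.5 or 1e-9  # avoid division issues on constant data
            anomalous = abs(value - mean) > self.threshold * std
        else:
            anomalous = False  # not enough local history yet
        self.window.append(value)
        return anomalous

f = EdgeFilter()
readings = [20.1, 20.0, 20.2, 19.9, 20.1, 35.0, 20.0]
uploads = [r for r in readings if f.should_upload(r)]
# Only the clear outlier (35.0) would be sent over the network.
```

A real deployment would replace the threshold rule with an on-device model, but the shape of the saving is the same: seven readings arrive, one leaves the device.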

Rhee said the setup has already been adopted by the IT services arm of electronics giant LG, which is using it to cut the amount of manpower needed in its product-testing procedures and predicts it could save the organisation around $1m a year per product line.