Google’s AI Tensor Processing Units Now Available

Tensor Processing Units are available in “limited quantities” for a select set of customers looking to run machine learning models on the Google Cloud Platform, the company announced in a blog post on Monday. “Adventurous ML experts may be able to optimize other TensorFlow models for Cloud TPUs on their own using the documentation and tools we provide,” the post said. Google also plans to scale these offerings up further, with dedicated networking and scale-out systems it is calling “TPU Pods.”

During tests, Google used a TPU Pod (with 64 TPUs) to train the ResNet-50 model in less than 30 minutes, down from 23 hours on a single TPU. If TPUs let customers use fewer machines for less time than the traditional CPUs or GPUs those workloads would otherwise run on, the company said, that translates to lower costs. That comparison includes graphics processing units, which lend themselves particularly well to machine learning models.
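As a back-of-the-envelope check, the figures Google quoted imply a large but sub-linear speedup from pod-scale training. This sketch uses only the numbers reported above; the efficiency figure is derived, not something Google stated:

```python
# Scaling implied by Google's reported ResNet-50 training times (illustrative only).
single_tpu_hours = 23.0   # training time on one TPU, per Google's tests
pod_minutes = 30.0        # training time on a 64-TPU pod (upper bound: "less than 30 minutes")
pod_tpus = 64

speedup = (single_tpu_hours * 60) / pod_minutes  # wall-clock speedup of the pod
efficiency = speedup / pod_tpus                  # fraction of ideal linear scaling

print(f"speedup ~{speedup:.0f}x, scaling efficiency ~{efficiency:.0%}")
```

Roughly a 46x speedup from 64 chips, i.e. about 72% of ideal linear scaling, which is consistent with the communication overhead typical of distributed training.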

Microsoft, for its part, is using a fleet of field-programmable gate arrays (FPGAs) to speed up its in-house machine learning operations and to provide customers of its Azure cloud platform with accelerated networking. For a start, Google is providing a variety of open-source reference ML models for Cloud TPUs, ranging across image classification, object detection, and more. “Cloud TPUs also simplify planning and managing [machine learning] computing resources,” Barrus and Stone said.

“Google Cloud TPUs are an example of innovative, rapidly evolving technology to support deep learning, and we found that moving TensorFlow workloads to TPUs has boosted our productivity by greatly reducing both the complexity of programming new models and the time required to train them.” Google said it would charge $6.50 per TPU per hour for those participating in the beta. The new Cloud TPUs make it possible for organizations to program these systems without the highly specialized skills typically required when dealing with supercomputers and custom ASICs. Google isn’t alone in pursuing its own AI chips, either. Amazon.com Inc. is developing an AI chip that would help its Echo smart speakers and other hardware using its Alexa digital assistant do more processing on the device, so they can respond more quickly than by calling out to the cloud. Intel Corp. has been touting its latest central processing units for AI workloads.
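Combining the beta price with the training times quoted earlier gives a rough sense of the cost trade-off. This assumes the $6.50-per-TPU-hour rate also applies per TPU inside a pod, which the article does not state:

```python
# Rough cost comparison from the article's figures (assumes per-TPU pod pricing).
rate = 6.50  # USD per TPU per hour, Google's quoted beta price

single_cost = rate * 23.0        # one TPU running ResNet-50 for 23 hours
pod_cost = rate * 64 * 0.5       # 64 TPUs for 30 minutes (0.5 hours)

print(f"single TPU: ${single_cost:.2f}, 64-TPU pod: ${pod_cost:.2f}")
```

Under those assumptions the pod run costs somewhat more in total but finishes in a fraction of the time, which is the trade-off Google's pitch about “fewer machines for less time” is aimed at.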