Google announced support for Core ML in TensorFlow Lite

Google has collaborated with Apple and announced support for Core ML in TensorFlow Lite. This lets iOS developers take advantage of the strengths of Core ML when deploying TensorFlow models.

Support for Core ML is provided through a tool that takes a TensorFlow model and converts it to the Core ML Model Format (.mlmodel).
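As a rough sketch, a conversion with the open-source tfcoreml converter package looks like the following. The model path, tensor names, and input shape here are illustrative placeholders, not details from the announcement; substitute the ones from your own frozen graph:

```python
import tfcoreml  # the TensorFlow -> Core ML converter package

# Convert a frozen TensorFlow graph (.pb) to a Core ML model (.mlmodel).
# 'model.pb', the tensor names, and the input shape below are assumed
# placeholders -- replace them with the values from your own model.
tfcoreml.convert(
    tf_model_path='model.pb',
    mlmodel_path='model.mlmodel',
    output_feature_names=['softmax:0'],
    input_name_shape_dict={'input:0': [1, 224, 224, 3]},
)
```

The resulting .mlmodel file can then be dropped into an Xcode project and used through the Core ML API like any other Core ML model.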

What is Core ML

Core ML is a software framework that is used across Apple products such as Siri and Camera. Core ML is optimized for on-device performance, minimizing memory usage and power consumption, and it seamlessly takes advantage of the CPU and GPU to provide maximum performance and efficiency. Because machine learning models run on the device itself, data doesn't need to leave the device to be analyzed.

What is TensorFlow Lite

TensorFlow Lite is TensorFlow's lightweight version, designed specifically for mobile and embedded devices. It enables on-device machine learning with low latency and a small binary size.

TensorFlow Lite uses many techniques to achieve low-latency performance: optimized kernels, quantized kernels that allow smaller and faster (fixed-point math) models, and, in the future, leveraging specialized machine learning hardware to get the best possible performance for a particular model on a particular device.
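To illustrate the fixed-point idea behind quantized kernels, here is a minimal sketch of 8-bit linear quantization in plain Python. It is a conceptual example, not TensorFlow Lite's actual implementation: float weights are mapped to integers in [0, 255] plus a scale and offset, which shrinks storage 4x versus 32-bit floats and enables integer-only arithmetic.

```python
# Sketch of 8-bit linear (affine) quantization; names are illustrative.

def quantize(weights, num_bits=8):
    """Map float weights onto integers in [0, 2**num_bits - 1]."""
    lo, hi = min(weights), max(weights)
    scale = (hi - lo) / (2 ** num_bits - 1)
    q = [round((w - lo) / scale) for w in weights]
    return q, scale, lo

def dequantize(q, scale, lo):
    """Recover approximate float weights from the integers."""
    return [v * scale + lo for v in q]

weights = [-1.5, -0.2, 0.0, 0.7, 1.5]
q, scale, lo = quantize(weights)
approx = dequantize(q, scale, lo)
```

The round-trip error per weight is bounded by half the scale step, which is why quantization trades a small, controlled accuracy loss for large savings in model size and compute.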

Google built TensorFlow Lite to extend TensorFlow to additional use cases on mobile devices. The points below illustrate why a dedicated framework is needed for machine learning on mobile:

Innovation at the silicon layer is enabling new possibilities for hardware acceleration, and frameworks such as the Android Neural Networks API make it easy to leverage these.

Recent advances in real-time computer-vision and spoken language understanding have led to mobile-optimized benchmark models being open sourced (e.g. MobileNets, SqueezeNet).