Abstract

Face parsing is a segmentation task over facial components that underpins many facial augmented-reality (AR) applications. We present a demonstration of face parsing for mobile platforms such as iPhone and Android. We design an efficient fully convolutional neural network (CNN) with an hourglass architecture adapted to real-time face parsing. The CNN is implemented on the iPhone with the Core ML framework. To visualize the segmentation results, we superimpose a false-color mask on the camera image so that the user gets an instant AR experience.
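As an illustration of the visualization step described above, the sketch below alpha-blends a false-color mask onto an image from a per-pixel class-label map. The palette, function name, and blending weight are all illustrative assumptions, not the paper's implementation (which runs on-device via Core ML):

```python
# Hypothetical sketch of a false-color overlay for face-parsing output.
# Each class id is mapped to a color and alpha-blended onto the frame.
# The palette and alpha value are illustrative, not from the paper.

PALETTE = {0: (0, 0, 0), 1: (255, 0, 0), 2: (0, 255, 0), 3: (0, 0, 255)}

def overlay(frame, labels, alpha=0.5):
    """Blend a per-pixel class-label map onto an RGB frame.

    frame: list of rows of (r, g, b) tuples; labels: list of rows of ints.
    Background pixels (class 0) are left unchanged.
    """
    out = []
    for row_px, row_lb in zip(frame, labels):
        out_row = []
        for (r, g, b), cls in zip(row_px, row_lb):
            if cls == 0:  # background: keep the original pixel
                out_row.append((r, g, b))
            else:
                cr, cg, cb = PALETTE.get(cls, (0, 0, 0))
                out_row.append((
                    int((1 - alpha) * r + alpha * cr),
                    int((1 - alpha) * g + alpha * cg),
                    int((1 - alpha) * b + alpha * cb),
                ))
        out.append(out_row)
    return out
```

In a live AR setting this blending would typically run on the GPU per camera frame, but the per-pixel logic is the same.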