AI Weekly 11 August 2018

The Defense Department has produced the first tools for catching deepfakes – fake video clips made with artificial intelligence can also be spotted using AI, but this may be just the beginning of an arms race. http://bit.ly/2KOVb82

When Recurrent Models Don’t Need to be Recurrent – in the last few years, deep learning practitioners have proposed a wide variety of sequence models. Although recurrent neural networks were once the tool of choice, models like the autoregressive WaveNet or the Transformer are now replacing RNNs on a diverse set of tasks. In this post, the authors explore the trade-offs between recurrent and feed-forward models: feed-forward models can offer improvements in training stability and speed, while recurrent models are strictly more expressive. http://bit.ly/2vC8eoK
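The intuition behind the post's trade-off can be illustrated with a toy sketch (hypothetical, not code from the post): a recurrent model's hidden state can depend on the entire input history, while a feed-forward model only sees a fixed window of recent inputs – yet when old inputs matter exponentially little, the windowed model approximates the recurrent one closely.

```python
def recurrent_predict(xs, decay=0.5):
    """Exponential moving average computed recurrently: h_t = decay*h_{t-1} + x_t.
    The state h can, in principle, depend on arbitrarily old inputs."""
    h = 0.0
    for x in xs:
        h = decay * h + x
    return h

def feedforward_predict(xs, decay=0.5, window=8):
    """The same quantity, computed from only the last `window` inputs,
    like a feed-forward model with a fixed receptive field."""
    recent = xs[-window:]
    return sum(x * decay ** (len(recent) - 1 - i) for i, x in enumerate(recent))

xs = [1.0] * 100
full = recurrent_predict(xs)
approx = feedforward_predict(xs, window=8)
# Because decay < 1, inputs older than the window contribute exponentially
# little, so the truncated model tracks the recurrent one closely.
print(abs(full - approx) < 1e-2)  # → True
```

The `decay` and `window` parameters here are illustrative stand-ins for a model's forgetting rate and receptive field, not anything defined in the post.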

Now anyone can train Imagenet in 18 minutes – a team of fast.ai alumnus Andrew Shaw, DIU researcher Yaroslav Bulatov, and Jeremy Howard has managed to train ImageNet to 93% accuracy in just 18 minutes, using 16 public AWS cloud instances, each with 8 NVIDIA V100 GPUs, running the fastai and PyTorch libraries. http://bit.ly/2P1bbHw
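The core idea that makes a 16-machine run like this work is data parallelism: each worker computes gradients on its own shard of the batch, then the gradients are averaged (an all-reduce) so every worker applies the identical update. A minimal pure-Python sketch of that pattern (a hypothetical simulation, not the team's actual setup):

```python
def gradient(w, batch):
    # Gradient of mean squared error for a 1-D linear model y = w * x.
    return sum(2 * (w * x - y) * x for x, y in batch) / len(batch)

def allreduce_mean(values):
    # Stand-in for an all-reduce: every worker ends up with the mean.
    m = sum(values) / len(values)
    return [m] * len(values)

data = [(x, 3.0 * x) for x in range(1, 17)]  # ground truth: w = 3
shards = [data[i::4] for i in range(4)]      # 4 simulated workers, equal shards

w = 0.0
for _ in range(200):
    grads = [gradient(w, shard) for shard in shards]  # computed in parallel in practice
    g, *_ = allreduce_mean(grads)                     # identical averaged gradient everywhere
    w -= 0.005 * g                                    # every worker applies the same update
print(round(w, 3))  # → 3.0
```

With equal-sized shards, the average of per-worker gradients equals the full-batch gradient, which is why the distributed run converges to the same model as a single-machine one.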