PyTorch is the latest newcomer to the growing list of Deep Learning frameworks and features a simple imperative interface and dynamic graph construction. Check out this 60-minute tutorial to get started.
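
Dynamic graph construction means the graph is defined by running ordinary code, so control flow can depend on the data. A minimal sketch of the idea (the tensor shape and loop bound here are arbitrary choices, not from the tutorial):

```python
import torch

# The graph is built on the fly: plain Python control flow
# decides the structure separately for every input.
x = torch.randn(3, requires_grad=True)
y = x
depth = int(x.sum().abs().item()) % 3 + 1  # data-dependent depth
for _ in range(depth):
    y = torch.tanh(y)

loss = y.sum()
loss.backward()  # gradients flow through whatever graph was just executed
```

Because the graph is rebuilt on every forward pass, a new input can produce a different `depth` without any special framework support.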

Libratus had amassed a lead of $459,154 in chips in the 49,240 hands played by the end of Day Nine. One of the pros, Jimmy Chou, said he and his colleagues initially underestimated Libratus, but have come to regard it as one tough player.

The model relies on a latent neural embedding of dialogue state and learns selective attention over the dialogue history, combined with a copy mechanism, to incorporate relevant prior context. It outperforms more complex memory-augmented models by 7% in per-response generation and is on par with the current state of the art on DSTC2. (Stanford)
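
The selective-attention idea can be sketched with plain dot-product attention over prior turns (an illustrative toy, not the paper's actual model; the embedding sizes are arbitrary):

```python
import numpy as np

def attend(state, history):
    """Softmax-weighted summary of prior-turn embeddings."""
    scores = history @ state                 # dot-product relevance scores
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                 # softmax over the history
    return weights @ history, weights        # context vector + attention

rng = np.random.default_rng(0)
state = rng.normal(size=8)         # current latent dialogue state
history = rng.normal(size=(5, 8))  # embeddings of 5 prior turns
context, weights = attend(state, history)
```

In a copy component, attention weights like these would additionally bias generation toward reusing tokens from the attended turns.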

DyNet (Github) is a toolkit for implementing neural network models based on dynamic declaration of network structure. Under DyNet's dynamic-declaration strategy, computation-graph construction is mostly transparent: the graph is built implicitly by executing the procedural code that computes the network outputs, and the user is free to use a different network structure for each input.
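
The define-by-run idea behind dynamic declaration can be illustrated in plain Python (a conceptual sketch only, not DyNet's API): the "graph" is simply whatever operations execute for a given input, so each sentence gets its own structure.

```python
# Toy define-by-run: graph depth follows sentence length, with no
# separate graph-definition step. (Conceptual sketch, not DyNet code.)
def encode(tokens, embed, combine):
    h = embed(tokens[0])
    for t in tokens[1:]:              # one step per remaining token
        h = combine(h, embed(t))      # graph built by simply running code
    return h

embed = lambda t: [float(len(t))]     # toy "embedding": token length
combine = lambda a, b: [a[0] + b[0]]  # toy composition: addition

encode(["a", "bb", "ccc"], embed, combine)  # depth 3 for a 3-token input
```

A static-declaration framework would instead require padding or a fixed unrolled graph to handle variable-length inputs.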

The goal of this paper is not to introduce a single algorithm or method but to take theoretical steps toward fully understanding the training dynamics of generative adversarial networks. To substantiate the analysis, the authors perform targeted experiments to verify assumptions, illustrate claims, and quantify the phenomena.