Learning Hierarchical Information Flow with Recurrent Neural Modules

Venue

NIPS (2017)

Publication Year

2017

Authors

Danijar Hafner, Alex Irpan, James Davidson, Nicolas Heess

Abstract

We propose a deep learning model inspired by neocortical communication via the
thalamus. Our model consists of recurrent neural modules that send features via a
routing center, endowing the modules with the flexibility to share features over
multiple time steps. We show that our model learns to route information
hierarchically, processing input data through a chain of modules. We observe common
architectures, such as feedforward neural networks and skip connections, emerging
as special cases of our architecture, while novel connectivity patterns are learned
for the text8 compression task. We demonstrate that our model outperforms standard
recurrent neural networks on three sequential benchmarks.
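The abstract describes recurrent modules that exchange features through a shared routing center. Below is a minimal sketch of that idea, not the paper's exact formulation: it assumes vanilla tanh-RNN modules with fixed linear read weights instead of the paper's GRU modules and learned reading mechanisms, and all sizes (`num_modules`, `feature_size`, `center_size`, etc.) are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (not from the paper): 4 modules, each writing 16
# features into a shared routing center of 4 * 16 = 64 features.
num_modules, feature_size, hidden_size, input_size = 4, 16, 32, 8
center_size = num_modules * feature_size

def init(shape):
    return rng.normal(0.0, 0.1, shape)

# Per-module parameters: read weights over the center, a vanilla RNN
# cell, and an output projection producing the module's features.
read_w = [init((center_size, feature_size)) for _ in range(num_modules)]
rnn_w = [init((feature_size + input_size + hidden_size, hidden_size))
         for _ in range(num_modules)]
out_w = [init((hidden_size, feature_size)) for _ in range(num_modules)]

def step(x, center, hidden):
    """One time step: every module reads from the center, updates its
    recurrent state, and writes its features back into the center."""
    new_features, new_hidden = [], []
    for m in range(num_modules):
        read = center @ read_w[m]                    # read features from the routing center
        inp = x if m == 0 else np.zeros(input_size)  # only the first module sees the raw input
        h = np.tanh(np.concatenate([read, inp, hidden[m]]) @ rnn_w[m])
        new_hidden.append(h)
        new_features.append(h @ out_w[m])            # features sent back to the center
    return np.concatenate(new_features), new_hidden

# Unroll a few steps: features written at step t become readable at t + 1,
# which is what lets modules share features across multiple time steps.
center = np.zeros(center_size)
hidden = [np.zeros(hidden_size) for _ in range(num_modules)]
for t in range(5):
    x = rng.normal(size=input_size)
    center, hidden = step(x, center, hidden)
print(center.shape)  # (64,)
```

Because each module's read weights span the whole center, routing patterns such as a strict module chain (hierarchical processing) or reads from earlier modules (skip connections) can emerge simply from how those weights are learned.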