We design and study a Contextual Memory Tree (CMT), a learning memory
controller that inserts new memories into an experience store of unbounded
size. It is designed to efficiently query for memories from that store,
supporting logarithmic time insertion and retrieval operations. Hence CMT can
be integrated into existing statistical learning algorithms as an augmented
memory unit without substantially increasing training and inference
computation. We demonstrate the efficacy of CMT by augmenting existing
multi-class and multi-label classification algorithms with it and observing
statistical improvements. We also test CMT learning on several image-captioning
tasks, demonstrating that it is computationally more efficient than a simple
nearest-neighbors memory system while benefiting from reward learning.
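To make the logarithmic-time insert/query claim concrete, here is a minimal sketch of a tree-structured memory store: leaves hold memories, and each internal node routes a query key to one child, so both operations touch only one root-to-leaf path. This is an illustration under simplifying assumptions, not the paper's method: `TreeMemory`, its random-hyperplane router, and the `leaf_capacity` parameter are all hypothetical stand-ins for CMT's learned routers and reward-driven updates.

```python
import numpy as np

class Node:
    def __init__(self):
        self.left = None
        self.right = None
        self.router = None    # hyperplane weights (learned in the real CMT)
        self.memories = []    # (key, value) pairs stored at leaves

class TreeMemory:
    """Binary-tree memory store: leaves hold memories, internal nodes
    route a key left or right, giving O(log n) insertion and retrieval."""

    def __init__(self, leaf_capacity=4, dim=8, seed=0):
        self.root = Node()
        self.cap = leaf_capacity
        self.dim = dim
        self.rng = np.random.default_rng(seed)

    def _route(self, node, key):
        # Sign of a linear score decides the branch.
        return node.left if key @ node.router <= 0 else node.right

    def _leaf(self, key):
        node = self.root
        while node.router is not None:
            node = self._route(node, key)
        return node

    def insert(self, key, value):
        key = np.asarray(key, dtype=float)
        leaf = self._leaf(key)
        leaf.memories.append((key, value))
        if len(leaf.memories) > self.cap:
            self._split(leaf)

    def _split(self, leaf):
        # A random hyperplane stands in for CMT's learned router.
        leaf.router = self.rng.standard_normal(self.dim)
        leaf.left, leaf.right = Node(), Node()
        for k, v in leaf.memories:
            self._route(leaf, k).memories.append((k, v))
        leaf.memories = []

    def query(self, key, k=1):
        key = np.asarray(key, dtype=float)
        leaf = self._leaf(key)
        # Rank the handful of memories in the reached leaf by similarity.
        scored = sorted(leaf.memories, key=lambda m: -float(key @ m[0]))
        return [v for _, v in scored[:k]]
```

Because routing only ever descends one path, the cost of both `insert` and `query` grows with the tree depth rather than the total number of stored memories, which is what lets such a memory unit augment a learner without substantially increasing training or inference computation.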
