Humans gain knowledge through lifelong learning, and the same approach can be applied to machine learning. Lifelong machine learning (LML) is thus an area of machine learning research concerned with this persistent and cumulative nature of learning. The objective of an LML system is to consolidate new information into an existing machine learning model without catastrophically forgetting prior information. One approach is task rehearsal, in which examples of the new task are interleaved with examples of the prior tasks during training. This avoids the loss of prior knowledge while integrating the new knowledge; however, it requires saving old training examples. Our research focuses on this retention problem of LML: building a network of consolidated knowledge through task rehearsal without having to retain training examples from prior tasks. We investigate two approaches to generating, from an existing network model, virtual examples that adhere to the probability distribution of the prior task's examples. We find that the reconstructions produced by a trained Restricted Boltzmann Machine (RBM) can be used to generate accurate virtual examples: a set of uniform random examples is given to the trained model, and the reconstructed set serves as the virtual examples. These virtual examples can then be used to rehearse the prior task's knowledge while consolidating new task examples. We demonstrate that the virtual examples transfer knowledge to the consolidation of a new, related task better than a set of uniform random examples does. We also define a measure, based on reconstruction MSEs, for comparing the probability distributions of two datasets given to a trained network model, and we demonstrate the viability of this measure for evaluating the accuracy of the generated virtual examples by their adherence to the prior task examples' distribution.
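The virtual-example generation described above can be sketched as follows. This is a minimal illustration, not the thesis's actual implementation: it assumes binary-valued features, uses scikit-learn's `BernoulliRBM` as the trained model, and takes one Gibbs reconstruction step over uniform random seed examples; the data, dimensions, and `reconstruction_mse` helper are all hypothetical.

```python
import numpy as np
from sklearn.neural_network import BernoulliRBM

rng = np.random.default_rng(0)

# Hypothetical prior-task data: 500 binary examples with 16 features.
prior = (rng.random((500, 16)) < 0.3).astype(float)

# Train an RBM on the prior task's examples.
rbm = BernoulliRBM(n_components=8, learning_rate=0.05, n_iter=50,
                   random_state=0)
rbm.fit(prior)

# Generate virtual examples: give uniform random examples to the trained
# model and keep the reconstructed set (one Gibbs sampling step).
seeds = rng.random((200, 16)) > 0.5
virtual = rbm.gibbs(seeds)

# A simple reconstruction-MSE measure (hypothetical helper): the mean
# squared error between a dataset and its reconstruction under the
# trained model, used here to compare how well two datasets adhere to
# the distribution the model was trained on.
def reconstruction_mse(model, X):
    return float(np.mean((model.gibbs(X.astype(bool)) - X) ** 2))
```

Under this measure, a dataset drawn from the prior task's distribution would be expected to yield a lower reconstruction MSE than an arbitrary random dataset, which is the basis for evaluating the generated virtual examples.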

Rights

The author grants permission to the University Librarian at Acadia University to reproduce, loan, or distribute copies of the thesis in microform, paper, or electronic formats on a non-profit basis. The author retains the copyright of the thesis.