Abstract

The efficiency and performance of the nearest neighbor classifier (NNC) depend on several factors: its space and classification-time requirements and, for high-dimensional data, the large training set size demanded by the curse of dimensionality. We propose novel techniques that improve the performance of NNC while reducing its computational burden. A compact representation of the training set is presented, along with an efficient NNC that performs implicit pattern synthesis. Empirical results are compared with those of relevant methods.