Risk Bounds for Randomized Sample Compressed Classifiers

Authors

Abstract

We derive risk bounds for randomized classifiers in the sample compression setting, where the specification of a classifier uses two sources of information: the compression set and the message string. By extending the recently proposed Occam's Hammer principle to the data-dependent setting, we derive pointwise versions of the bounds for stochastic sample-compressed classifiers and also recover the corresponding classical PAC-Bayes bound. We further show that these bounds compare favorably with existing results.

Neural Information Processing Systems (NIPS)