Tag Archives: information theory

This is the first in a series of two posts whose goal is to provide background on information theory. For today, the itinerary is as follows: we will start by introducing the basic information quantities, then prove tensorization statements for them, i.e. that they … Continue reading →
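The excerpt is cut off before it states the tensorization claim, but one standard statement of this kind, which the post's topic suggests, is that Kullback-Leibler divergence is additive over product measures:

```latex
% Entropy and KL divergence of discrete distributions
H(P) = -\sum_x P(x)\log P(x),
\qquad
D(P\,\|\,Q) = \sum_x P(x)\log\frac{P(x)}{Q(x)}.

% Tensorization: KL divergence is additive over independent products,
% so for n i.i.d. coordinates
D\!\left(P^{\otimes n}\,\big\|\,Q^{\otimes n}\right) = n\,D(P\,\|\,Q).
```

The additivity follows by expanding the logarithm of the product density into a sum and taking expectations coordinate by coordinate.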

There is a way to demonstrate that Chernoff bounding is, in some sense, optimal, and it relies on Sanov's theorem, which controls the empirical distribution (as a random measure) in terms of the Kullback-Leibler divergence from the … Continue reading →
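A quick numerical illustration of the connection between Chernoff bounding and KL divergence (a sketch I am adding here, not code from the post): for i.i.d. Bernoulli(p) variables, the optimized Chernoff exponent for the event that the empirical mean exceeds a > p coincides with the KL divergence between Bernoulli(a) and Bernoulli(p), which is the rate Sanov's theorem would predict.

```python
import math

def kl_bernoulli(a: float, p: float) -> float:
    """KL divergence D(Bern(a) || Bern(p))."""
    return a * math.log(a / p) + (1 - a) * math.log((1 - a) / (1 - p))

def chernoff_exponent(a: float, p: float, lam_max: float = 10.0,
                      steps: int = 20000) -> float:
    """Optimized Chernoff exponent sup_{lam >= 0} [lam*a - log E e^{lam X}]
    for X ~ Bern(p), approximated by a grid search over lam."""
    best = 0.0
    for i in range(steps + 1):
        lam = lam_max * i / steps
        cgf = math.log(1 - p + p * math.exp(lam))  # log MGF of Bern(p)
        best = max(best, lam * a - cgf)
    return best

# The two exponents agree, as the Chernoff/Sanov duality predicts.
a, p = 0.7, 0.5
print(kl_bernoulli(a, p), chernoff_exponent(a, p))
```

The grid search stands in for the exact optimizer (a convex one-dimensional problem), which suffices to see the two quantities match to several decimal places.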