Entropy is typically referred to as a measure of disorder

In physics, entropy is typically described as a measure of disorder in a physical system; however, I have also seen it referred to as a measure of statistical uncertainty for a set of data. I also recall that the function defining this statistical uncertainty took the form of an integral and is used in information theory. Could anybody shed some light on this notion of entropy as a statistical measure of uncertainty?
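For reference, the integral I recall is, I believe, the differential entropy $h(X) = -\int p(x)\,\log p(x)\,dx$, whose discrete counterpart is the Shannon entropy $H = -\sum_i p_i \log p_i$. A minimal sketch of the discrete version in Python (the function name is my own), to show how it quantifies uncertainty:

```python
import numpy as np

def shannon_entropy(p, base=2):
    """Shannon entropy H = -sum_i p_i * log(p_i) of a discrete distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # terms with p_i = 0 contribute nothing (0 * log 0 -> 0)
    return float(-np.sum(p * (np.log(p) / np.log(base))))

# A fair coin is maximally uncertain: 1 bit of entropy.
print(shannon_entropy([0.5, 0.5]))
# A heavily biased coin is nearly certain: entropy close to 0 bits.
print(shannon_entropy([0.99, 0.01]))
```

The uniform distribution maximizes this quantity, which matches the intuition that it is the "most uncertain" assignment of probabilities.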