For the analysis of
cross-classifications having ordered categories,
Tomizawa [Statistics and Probability Letters 11
(1991), 547-550] proposed a Kullback-Leibler
information type or Shannon entropy type measure
to represent the degree of departure from
uniform association. This paper proposes a
generalization of this measure. The measure
proposed is expressed by using the Cressie-Read
power-divergence or Patil-Taillie diversity
index based on the local odds ratios. It
includes Tomizawa's (1991) measure as a special
case. The proposed measure would be useful when one wants a
single summary of the distance of
the local odds ratios from uniformity, and is also
useful for comparing the degrees of departure
from uniformity in several cross-classification
tables. The eye color-hair color data and three
kinds of unaided vision data are analyzed
using the proposed measure.
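As an illustrative sketch of the quantities the abstract refers to (not the paper's exact measure, whose precise form is not given here), the following shows the standard definitions of the local odds ratios of a contingency table and of the Cressie-Read power divergence between two distributions; the function names and interfaces are assumptions for this example.

```python
import numpy as np

def local_odds_ratios(p):
    """Local odds ratios of an r x c table of cell probabilities (or counts):
    theta_ij = (p[i,j] * p[i+1,j+1]) / (p[i,j+1] * p[i+1,j]),
    for i = 0..r-2, j = 0..c-2. Returns an (r-1) x (c-1) array."""
    p = np.asarray(p, dtype=float)
    return (p[:-1, :-1] * p[1:, 1:]) / (p[:-1, 1:] * p[1:, :-1])

def power_divergence(p, q, lam):
    """Cressie-Read power divergence of order lam between distributions p and q.
    As lam -> 0 this reduces to the Kullback-Leibler information."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    if abs(lam) < 1e-12:  # Kullback-Leibler limit
        return float(np.sum(p * np.log(p / q)))
    return float(np.sum(p * ((p / q) ** lam - 1.0)) / (lam * (lam + 1.0)))

# Under independence the table factorizes into its margins,
# so every local odds ratio equals 1.
row = np.array([0.3, 0.7])
col = np.array([0.4, 0.6])
table = np.outer(row, col)
print(local_odds_ratios(table))  # all entries equal 1
```

Tomizawa's (1991) measure corresponds to the Kullback-Leibler (lam -> 0) case; the generalization described in the abstract is indexed by the power-divergence order, applied to the local odds ratios.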