st: Searching for Kullback–Leibler divergence

Tirthankar Chakravarty advised that I look into -multigof- for the
Kullback–Leibler divergence. Thanks for the response, but -multigof- is
not what I'm looking for.

Kullback–Leibler divergence is sometimes referred to as 'relative
entropy' or 'cross entropy'. The Kullback–Leibler divergence that I need
summarizes the effect of location and shape changes on the overall
relative distribution of two continuous distributions. The
Kullback–Leibler divergence has a simple interpretation in terms of the
relative distribution, and it is decomposable into location, shape,
and other components.
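To illustrate the decomposability I'm after, here is a minimal sketch in Python (not Stata), using the standard closed-form KL divergence between two normal distributions. Separating a "location" contribution by comparing against a mean-matched version of the second distribution is my own illustrative choice here, not the only possible decomposition:

```python
import math

def kl_normal(mu1, sd1, mu2, sd2):
    """Closed-form KL divergence D(P || Q) for P = N(mu1, sd1^2),
    Q = N(mu2, sd2^2)."""
    return (math.log(sd2 / sd1)
            + (sd1 ** 2 + (mu1 - mu2) ** 2) / (2 * sd2 ** 2)
            - 0.5)

# Overall divergence between N(0, 1) and N(1, 1.5^2):
# reflects both a location shift and a shape (spread) change.
overall = kl_normal(0, 1, 1, 1.5)

# Shape-only divergence: shift Q back to P's mean, keep its spread.
shape_part = kl_normal(0, 1, 0, 1.5)

# The remainder is attributable to the location shift.
location_part = overall - shape_part
```

With identical distributions the divergence is zero, and in this example the location and shape parts sum to the overall divergence by construction.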

I have -reldist-. It does a great job of plotting relative and
cumulative PDFs, location/shape shifts, and polarization
coefficients, but it doesn't provide a measure of the overall
distributional difference between two distributions. That's where the
Kullback–Leibler divergence comes to the rescue. The advantage of the
Kullback–Leibler divergence is that it is decomposable.