ImageAnalyst <imageanalyst@mailinator.com> wrote in message <02ba33ff-bbf9-4aed-838b-0f50616fb9bc@k28g2000yqn.googlegroups.com>...
> On Jan 5, 5:37 am, "Learner " <farhan7...@gmail.com> wrote:
> > Suppose I have two images.
> > After histeq, I want to know the relative entropy between the original and the histogram-equalized image.
> >
> > How do I do that?
>
> -----------------------------------------------------------------------
> Do you have a formula for it?
> Sorry but the algorithm for "relative entropy" isn't committed to my
> memory.

ImageAnalyst <imageanalyst@mailinator.com> wrote in message <642713a8-1e68-4452-8877-28ea64177359@u20g2000yqb.googlegroups.com>...
> He gave the example. Now it's your turn to show why it's not
> working. Why do you say it didn't work? Surely you must have tried
> it with some p and q, didn't you?

On 1/7/2012 1:51 AM, Learner wrote:
> ImageAnalyst <imageanalyst@mailinator.com> wrote in message
> <642713a8-1e68-4452-8877-28ea64177359@u20g2000yqb.googlegroups.com>...
>> He gave the example. Now it's your turn to show why it's not
>> working. Why do you say it didn't work? Surely you must have tried
>> it with some p and q, didn't you?
>
> Sorry..
> I mean I applied it and the answer was:
> entropy = NaN.
> I don't know why!

On 1/7/2012 10:07 AM, ImageAnalyst wrote:
> On Jan 7, 10:31 am, dpb<n...@non.net> wrote:
>> Indeed...
>>
>> Interesting, though. I did a check of log(0)/log(0) here to confirm and
>> get a warning --
...
>> Doesn't that happen with newer versions? It would seem it would have
>> been apparent to the OP.
> --------------------------------------------------------------------
> Here's an interesting experiment:
>>> log(0)
> ans =
> -Inf
>>> log(99) / log(0)
> ans =
> 0
>>> log(0) / log(0)
> ans =
> NaN
> So it appears that to get NaN, p and q must have both had 0 in the
> same location.

Yes. Interesting that you don't get any warnings for log(0), though. I
forget, without looking it up, how the default warning level is set, but it
appears to be less strict in later releases than the one I have. Not sure
that's _a_good_thing_ (tm)... wonder if S Lord will stumble on this
thread and chime in on that point.

So the code I posted before should work:
> p_nonZeroLocations = p > 0;
> q_nonZeroLocations = q > 0;
> both_nonZero = p_nonZeroLocations & q_nonZeroLocations;
> Then:
> e = sum(p(both_nonZero) .* log(p(both_nonZero) ./ q(both_nonZero)))
>
> We've both posted code. I wonder why Learner won't post his code so
> that we can help him. It's like pulling teeth to get him to let us
> help him.
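For anyone following along without MATLAB, the masked computation quoted
above can be sketched in Python/NumPy (this is a translation of the MATLAB
snippet, not the OP's code; `relative_entropy`, `p`, and `q` are
illustrative names, and `p`/`q` are assumed to be normalized histograms):

```python
import numpy as np

def relative_entropy(p, q):
    """KL divergence D(p||q) = sum(p * log(p/q)), restricted to bins
    where both histograms are nonzero -- mirroring the MATLAB masking
    approach above, which avoids log(0)/log(0) producing NaN."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    both_nonzero = (p > 0) & (q > 0)
    return np.sum(p[both_nonzero] * np.log(p[both_nonzero] / q[both_nonzero]))

# Two normalized 4-bin histograms; p has zero bins, yet no NaN results:
p = np.array([0.5, 0.5, 0.0, 0.0])
q = np.array([0.25, 0.25, 0.25, 0.25])
print(relative_entropy(p, q))  # ≈ 0.6931 (= log 2)
```

Note that dropping the bins where only q is zero silently discards the
terms that would make D(p||q) infinite, so whether this is the "right"
answer depends, as dpb says, on what the value is being used for.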

Yes, you have the best solution to get an answer. Whether that's the
right answer I don't know; guess it depends on the purpose OP has in
computing the value in the first place.

It's often the case that folks somehow think clairvoyance is a trait of
respondents to their queries in cs-sm. I often point out that my
crystal ball is in the shop or murky even when available as I'm sure you
recall... :)

>> So it appears that to get NaN, p and q must have both had 0 in the
>> same location.

Not necessarily:

log(NaN)/log(5)
log(0)/log(Inf)
log(Inf)/log(Inf)

all return NaN. log(0)/log(0) returns NaN because log(0) is -Inf and
dividing Inf by Inf (regardless of the signs) returns NaN as per section 7.2
of IEEE 754-2008. [Well, technically section 7.2 only talks about Inf/Inf,
but I believe it reasonable to generalize to combinations of +Inf and -Inf.]
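The special-value arithmetic described above can be checked directly in
Python/NumPy, which follows the same IEEE 754 semantics as MATLAB (the
`errstate` context just silences NumPy's divide-by-zero and
invalid-operation warnings; it is not part of the arithmetic):

```python
import numpy as np

# IEEE 754 special-value behavior, matching the MATLAB session above.
with np.errstate(divide='ignore', invalid='ignore'):
    print(np.log(0.0))                      # -inf
    print(np.log(99.0) / np.log(0.0))       # -0.0 (finite / -Inf)
    print(np.log(0.0) / np.log(0.0))        # nan  (-Inf / -Inf)
    print(np.log(0.0) / np.log(np.inf))     # nan  (-Inf / +Inf)
    print(np.log(np.inf) / np.log(np.inf))  # nan  (+Inf / +Inf)
    print(np.log(np.nan) / np.log(5.0))     # nan  (NaN propagates)
```

As claimed, every combination of +Inf and -Inf in the division yields NaN,
not just the same-signed case.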

> Yes. Interesting that you don't get any warnings for log(0), though. I
> forget, without looking it up, how the default warning level is set, but
> it appears to be less strict in later releases than the one I have. Not
> sure that's _a_good_thing_ (tm)... wonder if S Lord will stumble on this
> thread and chime in on that point.

That warning was removed a few releases ago -- in release R2010a, I think.

*snip*

> It's often the case that folks somehow think clairvoyance is a trait of
> respondents to their queries in cs-sm. I often point out that my crystal
> ball is in the shop or murky even when available as I'm sure you recall...
> :)

I swear the Mind Reading Toolbox is in the works -- after the last round of
tests turned the lab rats hyper-intelligent, they've been helping out with
the development ;)