Eagle et al. [1] discuss the notion of node entropy, which is captured in igraph via the diversity metric. I was wondering whether there is any relationship between these node entropies and the idea of ...
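For context on the quantity this excerpt refers to: igraph's vertex diversity is the Shannon entropy of a vertex's incident edge weights, normalized by the logarithm of its degree. A minimal hand-rolled sketch of that computation (not the igraph implementation itself):

```python
import math

def normalized_weight_entropy(weights):
    """Shannon entropy of a vertex's incident edge weights,
    normalized by log2(degree) -- the quantity igraph calls
    vertex 'diversity'. Uniform weights give 1.0."""
    if len(weights) < 2:
        return 0.0  # entropy is degenerate for degree < 2
    total = sum(weights)
    probs = [w / total for w in weights]
    h = -sum(p * math.log2(p) for p in probs if p > 0)
    return h / math.log2(len(weights))

# A vertex whose three incident edges have equal weight is
# maximally "diverse"; skewed weights lower the score.
print(normalized_weight_entropy([1.0, 1.0, 1.0]))  # -> 1.0
print(normalized_weight_entropy([10.0, 1.0, 1.0]))
```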

As far as I understand, a stationary source is a regular source, but the converse is not necessarily true.
And a stationary source is a source whose distribution is unaffected by a "...

I have been reading Section 6.2.2 of Knuth's book about lower and upper bounds on the average path length in binary search trees.
And I have trouble understanding small details of Theorem M (...

Given a set of random variables $X = \{x_1, x_2, \dots, x_n\}$, suppose the conditional entropy is known for all $Y \subset X - \{x_i\}$ where $|Y| \leq 5$. How can I approximate the conditional entropy when $|Y| = 10$ ...

Let's say I have an image where all pixel values are either 0 or 1. What I'd like to do is generate a new image with the same dimensions where each pixel represents how "ordered" the area ...

There is an information source over the alphabet $A = \{a, b, c\}$ represented by the state transition diagram below:
a) The random variable representing the $i$-th output from this ...

(This is a question about information theory, data compression, and entropy, so I believe it fits the CS forum.)
Does the fact that the computer itself is a part of the universe make it logically impossible ...

I understand that the theoretical size of a diff patch between two similar files can be calculated using Kullback-Leibler (KL) divergence as described on Wikipedia. Can anyone point me to a numerical example of ...
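Since the excerpt asks for a numerical example, here is a minimal sketch of computing KL divergence between two symbol distributions; the distributions `p` and `q` below are made up for illustration, not taken from any real pair of files:

```python
import math

def kl_divergence(p, q):
    """D(P || Q) = sum_x p(x) * log2(p(x) / q(x)), in bits.
    Assumes q(x) > 0 wherever p(x) > 0."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Hypothetical symbol frequencies of two similar files:
p = [0.5, 0.25, 0.25]   # "file A"
q = [0.4, 0.4, 0.2]     # "file B"
print(kl_divergence(p, q))  # ~0.072 bits per symbol
```

Note that KL divergence is not symmetric: `kl_divergence(p, q)` and `kl_divergence(q, p)` generally differ, which matters when interpreting it as a patch-size estimate.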