Friday, November 7, 2014

This post will be brief. I don't know anything about the Riemann $\zeta$ function, except the basic definition and the million-dollar prize that attracts mathematicians like the most terrible bounty hunters. I hardly know more about Chaitin's $\Omega$. Yet I found that they look similar in some sense. Maybe it's completely obvious and trivial. Maybe it's related to something slightly deeper. Anyway, it looks funny.

The celebrated Riemann zeta function $\zeta(s)$ has the following form, for all $s \in \mathbb{C}$ with real part greater than $1$:
$$
\begin{equation}
\zeta(s) = \sum_{n \geq 1} n^{-s}
\end{equation}
$$
Actually, $\zeta(s)$ can be analytically continued to all of $\mathbb{C} \setminus \{1\}$, with a simple pole at $s = 1$.
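For concreteness, here is a quick numerical sanity check of the Dirichlet series in the region $s > 1$ (partial sums only; the function name is mine):

```python
from math import pi

def zeta_partial(s, terms=200_000):
    """Partial sum of the Dirichlet series for zeta(s); converges for s > 1."""
    return sum(n ** (-s) for n in range(1, terms + 1))

# Euler's classical value zeta(2) = pi^2 / 6 gives a reference point.
approx = zeta_partial(2.0)
exact = pi ** 2 / 6
print(approx, exact)  # the partial sum agrees to roughly 5 decimal places
```

The tail of the partial sum at $s = 2$ is of order $1/N$, so 200,000 terms already match $\pi^2/6$ to about five decimals.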

To define Chaitin's $\Omega$, we first have to fix some universal prefix-free partial recursive function $U : \{0,1\}^* \rightarrow \{0,1\}^*$. This simply means that we fix some computer whose legal programs, encoded as binary strings, carry some sort of "end of file" marker: no legal program is a prefix of another legal program. In addition, this computer is universal, i.e., able to simulate every other computer. Denoting by $Dom(U)$ the set of legal programs of $U$, Chaitin's number is
$$
\Omega = \sum_{p \in Dom(U)} 2^{-|p|}
$$
where $|p|$ denotes the length of the program $p$. This number can be interpreted as a probability. Indeed, if we flip a fair coin infinitely many times, then $2^{-|p|}$ is the probability that the prefix of length $|p|$ of the generated infinite binary sequence is equal to $p$. Since $U$ is prefix-free, this is also the probability that $U$ reads exactly the program $p$ and halts. Hence, $\Omega$ is the probability that $U$ halts when fed the generated infinite binary sequence. It can be shown that the binary expansion of $\Omega$ is a Martin-Löf random sequence.
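To make the prefix-free condition and the Kraft-style sum concrete, here is a toy sketch: I hypothetically declare a tiny prefix-free set of "halting programs" (no real universal machine is simulated) and compute the corresponding sum, which plays the role of $\Omega$:

```python
# A purely illustrative, hypothetical domain: three "legal programs",
# none of which is a prefix of another.
halting_programs = ["0", "10", "110"]

def is_prefix_free(programs):
    """Check that no program is a proper prefix of another."""
    return not any(p != q and q.startswith(p) for p in programs for q in programs)

# The Kraft sum 2^{-|p|} over the domain: 1/2 + 1/4 + 1/8 = 0.875.
omega = sum(2 ** (-len(p)) for p in halting_programs)
print(is_prefix_free(halting_programs), omega)  # True 0.875
```

For a prefix-free set, Kraft's inequality guarantees this sum is at most $1$, which is what makes the probabilistic reading of $\Omega$ possible.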

Now, I guess everyone has caught it, but let's state the (not so) funny part bluntly. For all (real) $s > 1$
$$
\begin{align}
\zeta(s) &= \sum_{n \geq 1} \left( 2^{\log n}\right)^{-s} \\
\Omega(s^{-1}) &= \sum_{p \in Dom(U)} \left( 2^{|p|} \right)^{-s}
\end{align}
$$
Of course, these two quantities are not equal: $\Omega(s^{-1})$ is a sub-sum of $\zeta(s)$ that keeps only terms where $n$ is a power of $2$. We can make them look even more similar by introducing the function $m(l) = |\{ p \in Dom(U) : |p| = l\}|$ counting the number of legal programs of length $l$, with the convention that $m(\log n) = 0$ whenever $\log n$ is not an integer.
$$
\begin{align}
\zeta(s) &= \sum_{n \geq 1} n^{-s} \\
\Omega(s^{-1}) &= \sum_{n \geq 1} m(\log n) \cdot n^{-s}
\end{align}
$$
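The two expressions for $\Omega(s^{-1})$ above are the same sum grouped differently; a small sketch checking this on a hypothetical toy domain (the domain and the function names are mine, not a real universal machine):

```python
# A hypothetical toy prefix-free domain, for illustration only.
halting_programs = ["0", "10", "110"]

def m(l):
    """Number of legal programs of length l (zero off the support)."""
    return sum(1 for p in halting_programs if len(p) == l)

def omega_direct(s):
    """Omega(1/s) as a sum over programs: sum_p (2^{|p|})^{-s}."""
    return sum((2 ** len(p)) ** (-s) for p in halting_programs)

def omega_as_dirichlet(s, max_n=1024):
    """Same quantity grouped by n = 2^l; only powers of two contribute."""
    total, l = 0.0, 0
    while 2 ** l <= max_n:
        total += m(l) * (2 ** l) ** (-s)
        l += 1
    return total

print(omega_direct(1.0), omega_as_dirichlet(1.0))  # both 0.875
```

At $s = 1$ both groupings recover the toy $\Omega$ itself, as they should.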
The definition of $\Omega(s^{-1})$ really looks like a form of $\zeta$-regularization of the counting function $m(\log n)$. Actually, we could even define, for any universal (not necessarily prefix-free) function $C$, with $m_C(\log n)$ counting the number of legal programs of $C$ of length $\log n$,
$$
\begin{equation}
\Omega_C(s^{-1}) = \sum_{n \geq 1} m_C(\log n) \cdot n^{-s}
\end{equation}
$$
provided that the real part of $s$ is sufficiently large. In particular, because the domain of $C$ is not necessarily prefix-free, Kraft's inequality no longer applies, and the sum may diverge as $s \rightarrow 1$.
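As a sanity check of this divergence, take the extreme non-prefix-free case where every binary string is a legal program, so $m_C(l) = 2^l$ and the sum becomes $\sum_l 2^{(1-s)l}$ (a hypothetical machine, chosen only to illustrate the blow-up near $s = 1$):

```python
def omega_C(s, max_len=60):
    """Partial sum of sum_l 2^l * 2^{-s*l} = sum_l 2^{(1-s)*l}
    for a hypothetical machine C whose legal programs are *all*
    binary strings (so m_C(l) = 2^l, maximally non-prefix-free)."""
    return sum(2.0 ** ((1 - s) * l) for l in range(max_len + 1))

for s in (2.0, 1.5, 1.1, 1.01):
    print(s, omega_C(s))  # grows without bound as s approaches 1
```

For $s > 1$ this is a convergent geometric series with sum $1/(1 - 2^{1-s})$, which blows up as $s \rightarrow 1$; at $s = 1$ every term equals $1$ and the series diverges outright.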

Question. Does such a regularization give more insight into random strings?

pb

EDIT: It turns out that Tadaki has already proposed an analogy [2] between $\Omega(s^{-1})$ and partition functions in statistical physics. Since $\zeta$ naturally arises in statistical physics, the above similarity between $\Omega(s^{-1})$ and $\zeta(s)$ is not a surprise.

EDIT: Truly, there is something going on here. I found this paper [3] by Calude, introducing zeta numbers for Turing machines.