$\bf Definition.$ We define space-bounded communication as follows. A and B are supernatural beings capable of computing anything, but they have only a limited amount of memory, and that memory is shared. They wish to evaluate a given function $f$, of whose input each of them possesses one half, $x$ and $y$ respectively; the minimum size of the common memory with which they can do this shall be denoted by $S(f)$. At the beginning the memory is filled with zeros. Then, in each step, one of the players can write an arbitrary message into it, depending only on the previous message and his own input. They are finished when both of them know the value of $f(x,y)$. We can also imagine this as two people communicating who have no memory at all (though they can remember their own input) and are allowed to send each other a rewritable disk. The question is how big the disk has to be if both of them want to know the value of $f(x,y)$.
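To make the model concrete, here is a minimal sketch of such a protocol as code (the driver, function names, and termination convention are my own, not part of the definition): the players alternately overwrite the shared memory, and each write may depend only on the current memory contents and the writer's own input.

```python
def run_protocol(f_x, g_y, done):
    """Drive one shared-memory protocol (a sketch; names are mine).

    f_x, g_y: the players' update rules, each mapping the current memory
    state to a new state (they see their own input only via closure).
    done(state): the output bit once both players can read it off the
    memory, or None if the protocol should continue.  Memory states are
    modelled as tuples of bits; the memory starts filled with zeros.
    """
    state = (0, 0, 0)                 # m = 3 bits here, initially all zeros
    while True:
        state = f_x(state)            # A overwrites the shared memory
        if (out := done(state)) is not None:
            return out
        state = g_y(state)            # B overwrites the shared memory
        if (out := done(state)) is not None:
            return out

# Toy instance: equality of two single bits with m = 3.  Convention
# (hypothetical): states (1, 1, b) are terminal and mean "output b".
def make_fx(x):
    return lambda s: s if s[:2] == (1, 1) else (0, 1, x)   # "my bit is x"

def make_gy(y):
    def g(s):
        if s[:2] == (1, 1):
            return s
        return (1, 1, 1 if y == s[2] else 0)               # compare and halt
    return g

done = lambda s: s[2] if s[:2] == (1, 1) else None
```

For instance, `run_protocol(make_fx(1), make_gy(1), done)` returns 1, while mismatched input bits give 0.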

Define the identity function $I: \{0,1\}^n\times \{0,1\}^n\rightarrow \{0,1\}$ by $I(x,y)=1$ if and only if $x=y$.

$\bf Question.$ How large is $S(I)$?

$\bf Remarks.$ I know it is between $\log n - \log \log n$ and $\log n$, but which? Is it possible to solve it in $\log n-\omega(1)$ space? Has anyone heard of anything related?

$\bf Example/Easy Claim.$ $S(I) \le \log n + O(1)$.
$\bf Proof.$ We present a construction. A sends her bits one after the other, each along with its ordinal number and a leading 1, the leading 1 meaning that it is B's turn to speak. B replies to each message with his bit of the same ordinal number and a leading 0. This requires $\log n + 2$ bits of space. If in some step his bit differs from hers, they know that the answer is 0 and the algorithm is over. If they finish sending all their bits, the answer is 1. Therefore, $S(I) \leq \log n + O(1)$.
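The claimed protocol is easy to simulate. The sketch below (function names are mine) collapses each round into A's write followed by B's check, with the shared memory holding a speaker flag, an ordinal, and one data bit, i.e. $2 + \lceil\log_2 n\rceil$ bits in total.

```python
import math

def space_bounded_eq(x, y):
    """Simulate the protocol from the claim (a sketch; names are mine).

    The shared memory holds a speaker flag, an ordinal, and one data
    bit: 2 + ceil(log2 n) bits in total.  Returns 1 iff x == y.
    """
    n = len(x)
    for i in range(n):
        mem = (1, i, x[i])        # A writes her i-th bit with its ordinal
        flag, idx, bit = mem      # B reads the ordinal off the shared memory
        if y[idx] != bit:
            return 0              # bits differ: both players learn the answer is 0
        mem = (0, idx, y[idx])    # B replies with his bit and a leading 0
    return 1                      # all n rounds matched

def memory_bits(n):
    """Shared-memory size used by the simulation above."""
    return 2 + math.ceil(math.log2(n))
```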

Why can't they just send each other 1 bit at a time? In n steps, they'll know whether x=y. So 1 bit shared memory suffices. Or do you mean that they only get to pass information to each other once?
–
Joel David Hamkins, Feb 13 '10 at 20:09


Oh, I think I get it. You mean they can't even remember what they've done themselves. Are they allowed to change their input memory as the computation proceeds? (i.e. gaining useful access to that memory, but still required to answer correctly for the original input)
–
Joel David Hamkins, Feb 13 '10 at 20:15

The point is that each bit of communication costs them, so the $n$ rounds of 1 bit each amount to $n$ bits of communication.
–
Suresh Venkat, Feb 13 '10 at 20:15


How do you get the lower bound of log n - log log n?
–
Peter Shor, May 22 '10 at 0:18

1 Answer

Here's an argument that I believe shows that $\log n - \omega(1)$ is impossible. (This argument came out of a discussion I had with Steve Fenner.)

Let Alice's input be $x\in\{0,1\}^n$ and let Bob's input be $y\in\{0,1\}^n$. Assume the shared memory stores states in $\{0,1\}^m$, and its initial state is $0^m$. We are interested in lower-bounding $m$ for any protocol that computes EQ$(x,y)$ (the function $I$ defined above).

A given protocol is defined by two collections of functions $\{f_x\}$ and $\{g_y\}$, representing the functions applied to the shared memory by Alice on each input $x$ and Bob on each input $y$, respectively, along with some answering criterion. To be more specific, let us assume that if the shared memory ever contains the string $1^{m-1}b$ then the output of the protocol is $b$ (for each $b\in\{0,1\}$). For convenience, let us also assume that $f_x(1^{m-1}b) = 1^{m-1}b\:$ and $g_y(1^{m-1}b) = 1^{m-1}b\:$ for every $x,y\in\{0,1\}^n$ and $b\in\{0,1\}$. In other words, Alice and Bob don't change the shared memory once they know the answer.

Now, for each $x,y\in\{0,1\}^n$, consider what happens when Alice and Bob run the protocol on the input $(x,y)$. Define $A_{x,y}\subseteq\{0,1\}^m$ to be the set of all states of the shared memory that Alice receives at any point in the protocol, and likewise define $B_{x,y}\subseteq\{0,1\}^m$ to be the states of the shared memory that Bob receives. Also define
\[
S_{x,y} = \{0w\,:\,w\in A_{x,y}\} \cup \{1w\,:\,w\in B_{x,y}\}.
\]
We will assume Alice goes first, so $0^m \in A_{x,y}$ for all $x,y$. Let us also make the following observations:

By the definition of $A_{x,y}$ and $B_{x,y}$, it holds that $f_x(A_{x,y}) \subseteq B_{x,y}$ and $g_y(B_{x,y}) \subseteq A_{x,y}$ for all $x,y$.

For every $x,y$ with $x\not=y$ it holds that $1^{m-1}0\in A_{x,y} \cup B_{x,y}$, because Alice and Bob output 0 when their strings disagree.

For every $x$ it holds that $1^{m-1}0\not\in A_{x,x} \cup B_{x,x}$, because Alice and Bob do not output 0 when their strings agree.

Now let us prove that $S_{x,x}\not=S_{y,y}$ whenever $x\not=y$. To do this, let us assume toward contradiction that $x\not=y$ but $S_{x,x} = S_{y,y}$ (i.e., $A_{x,x} = A_{y,y}$ and $B_{x,x} = B_{y,y}$), and consider the behavior of the protocol on the input $(x,y)$. By induction on the number of steps, every state Alice receives on this input lies in $A_{x,x}$ and every state Bob receives lies in $B_{y,y}$: the initial state $0^m$ is in $A_{x,x}$; whenever Alice receives a state in $A_{x,x} = A_{y,y}$ she applies $f_x$, producing a state in $f_x(A_{x,x}) \subseteq B_{x,x} = B_{y,y}$; and whenever Bob receives a state in $B_{y,y} = B_{x,x}$ he applies $g_y$, producing a state in $g_y(B_{y,y}) \subseteq A_{y,y} = A_{x,x}$. In other words, $A_{x,y}\subseteq A_{x,x}$ and $B_{x,y} \subseteq B_{y,y}$.

But now we have our contradiction, assuming the protocol is correct: since $x\not=y$, the second observation gives $1^{m-1}0 \in A_{x,y} \cup B_{x,y} \subseteq A_{x,x}\cup B_{y,y}$, so Alice and Bob output the incorrect answer 0 on either $(x,x)$ or $(y,y)$, contradicting the third observation.

Each $S_{x,x}$ is a subset of $\{0,1\}^{m+1}$, so there are at most $2^{2^{m+1}}$ choices for $S_{x,x}$. Given that the sets $S_{x,x}$ must be distinct for distinct choices of $x$, it follows that $2^{2^{m+1}} \geq 2^n$, so $m \geq \log n - 1$.
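As a sanity check of this counting argument, one can compute the sets $S_{x,x}$ explicitly for the $\log n + O(1)$ protocol from the question on small $n$ and verify that they are pairwise distinct. The sketch below does exactly that (the encoding of memory states as tagged tuples is my own).

```python
from itertools import product

def fooling_sets(n):
    """For each x in {0,1}^n, run the simple protocol on (x, x) and
    collect S_{x,x}: states A receives, tagged 'A', together with
    states B receives, tagged 'B'.  (State encoding is mine.)"""
    S = {}
    for x in product([0, 1], repeat=n):
        alice = {("A", "init")}           # A sees the all-zero initial state
        bob = set()
        for i, b in enumerate(x):
            bob.add(("B", 1, i, b))       # A writes (1, i, x[i]); B reads it
            alice.add(("A", 0, i, b))     # B writes (0, i, x[i]); A reads it
        S[x] = frozenset(alice | bob)
    return S

# Distinct inputs must yield distinct sets, as the argument requires.
S = fooling_sets(4)
assert len(set(S.values())) == 2 ** 4
```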

Wow, this is so nice and simple! I even posted a combinatorial version of this problem which remained unsolved; of course your argument also works for that: mathoverflow.net/questions/15243/two-n-to-n-function-families In fact, your argument gives a lower bound of $\log \log F$, where $F$ is the size of the largest fooling set (as defined in the Kushilevitz-Nisan book). I guess the next interesting question would be to examine functions for which the fooling set lower bound is not sharp in the classical CC model, like the Inner Product: $\langle x,y\rangle$.
–
domotorp, Jul 31 '10 at 9:45