
Let $p_1, p_2, \ldots, p_n$ be probabilities adding up to $1$. Each $p_k$ represents the desired probability for the $k$th straw to be drawn.

Let

$q_0 = 0$

$q_1 = p_1$

$q_2 = p_1 + p_2$

$q_3 = p_1 + p_2 + p_3$

$\cdots$

$q_n = 1$

Consider the intervals $I_1 = (0, q_1), I_2=(q_1, q_2), \ldots, I_n=(q_{n-1},q_n)$ of the real line. We are going to produce a uniformly random real number between $0$ and $1$ and see which interval it lies in. The probability that it lies in the $k$th interval will then be $p_k$.
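This interval lookup can be sketched in a few lines of Python (the name `sample_interval` and the use of `bisect` are my own illustrative choices, assuming a uniform random source is available):

```python
import bisect
import random

def sample_interval(p, rng=random.random):
    """Return k (1-based) with probability p[k-1]: form the cumulative
    sums q_1 <= q_2 <= ... <= q_n and find which interval I_k contains
    a uniform draw r."""
    q, total = [], 0.0
    for pk in p:
        total += pk
        q.append(total)              # q[k-1] = p_1 + ... + p_k
    r = rng()                        # uniform in [0, 1)
    return bisect.bisect_right(q, r) + 1
```

For example, with `p = [0.2, 0.3, 0.5]` a draw of `r = 0.25` falls in the second interval $(0.2, 0.5)$.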

We will construct our number by flipping a coin repeatedly to form the bits of a binary fraction. In practice, we will stop as soon as we can determine which interval it lies in, but for simplicity we will deal with the entire sequence at once.

Let $r_1,r_2,\ldots$ be the results of the coin flips, where 1 represents heads and 0 represents tails.

Let $r = 0.r_1r_2\ldots$, interpreted as a binary fraction.

We will assume without proof that this procedure produces a uniform random distribution.
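The binary-fraction construction itself can be sketched directly; this hypothetical helper truncates $r = 0.r_1r_2\ldots$ after finitely many flips and returns it as an exact fraction:

```python
import random
from fractions import Fraction

def uniform_from_flips(n_bits, flip=lambda: random.randint(0, 1)):
    """Build the truncation 0.r_1 r_2 ... r_{n_bits} of r exactly."""
    r = Fraction(0)
    for j in range(1, n_bits + 1):
        r += Fraction(flip(), 2 ** j)    # bit r_j contributes r_j * 2^{-j}
    return r
```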

We will consider the bits of $r$ one at a time (as the coin is flipped).

Looking at the first $j$ bits of $r$ will allow us to see that $r$ lies in a certain interval $K_j$ of length $2^{-j}$: each additional bit halves the length of the interval. For example, if the first bit is $1$, then we know that $r \in [0.1, 1)$. If the first bit is $1$ and the second is $0$, then we know that $r \in [0.10, 0.11)$.

Write each endpoint in binary: $q_k = 0.b_{k,1}b_{k,2}b_{k,3}\ldots$. Now flip a fair coin. If the coin comes up $0$, then our number is in the interval $[0, 0.1)$, so we discard from consideration any interval whose left endpoint has a first bit of $1$: that is, if $b_{k-1,1} = 1$, then we discard $I_k$. If the first coin comes up $1$, then our number is in $[0.1, 1)$, so we discard from consideration any interval whose right endpoint has a first bit of $0$: that is, if $b_{k,1} = 0$, we discard $I_k$.

Flip the coin again, this time focusing on the second bit of each endpoint.

Keep flipping until only one interval is left.
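Putting the whole procedure together, here is a sketch under the assumption that the $p_k$ are given as exact rationals (the name `draw_straw` is illustrative). It tracks the interval $K_j$ of possible values of $r$ and flips until only one $I_k$ can still contain $r$:

```python
import random
from fractions import Fraction

def draw_straw(p, flip=lambda: random.randint(0, 1)):
    """Choose k with probability p[k-1] using only fair coin flips.
    Maintains K_j = [lo, lo + width) and discards intervals I_k that
    no longer overlap it.  Halts with probability 1, but not always:
    r may land exactly on a non-dyadic endpoint q_k."""
    q = [Fraction(0)]
    for pk in p:
        q.append(q[-1] + Fraction(pk))    # endpoints q_0, q_1, ..., q_n
    lo, width = Fraction(0), Fraction(1)
    while True:
        # Intervals I_k = (q_{k-1}, q_k) still overlapping K_j.
        alive = [k for k in range(1, len(q))
                 if q[k - 1] < lo + width and q[k] > lo]
        if len(alive) == 1:
            return alive[0]
        width /= 2
        if flip():                        # bit 1: keep the upper half
            lo += width
```

With $p = (\tfrac14, \tfrac14, \tfrac12)$, a first flip of heads already puts $r$ in $[0.1, 1) \subset I_3$, so the procedure stops after one flip.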

Note: there is probably a more efficient/effective way to describe this procedure, but I don't remember where I read about it.

@Dror: I don’t think that you’ve understood the method: your last two comments don’t really make much sense.
–
Brian M. Scott May 18 '13 at 19:44

@Dror: I can’t elaborate: I literally have no idea what you’re thinking or trying to say, because what you’ve said seems to have almost no connection with the method that dfeuer describes.
–
Brian M. Scott May 18 '13 at 19:51


@Dror: Since your comment about not distinguishing probabilities such as $1/2$ from ‘infinitesimal probabilities’ is nonsense, I think that another explanation is rather more likely: you don’t understand the method.
–
Brian M. Scott May 18 '13 at 20:12

Note that this method, while it is certainly a really cool way to do this, has the same basic "problem" that the simpler version above does. Namely, it provides an algorithm that is not guaranteed to terminate (one might get the sequence corresponding to $\frac{1}{3}$, in which case it will never be possible to determine whether it is the first or second person that wins). I did not calculate the expected number of coin tosses for this method, but it is $\frac{8}{3}$ for the other one, and I think it is probably close to the same for this method.
–
Tobias Kildetoft May 19 '13 at 23:06


This method may not halt in all cases, but it halts with probability 1, which is the best that you can hope for.
–
MJD Mar 9 '14 at 19:47

To expand on Tobias' comment, designate that Player 1 wins on TT (Tails followed by Tails), Player 2 wins on TH, and Player 3 wins on HT. On HH, simply discard the result and flip again. Since the three remaining outcomes are equally likely and each player has a single winning outcome, the winner is chosen uniformly at random.
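This rejection-sampling rule is a one-liner in Python (the name `pick_player` is illustrative; $0$ stands for tails, $1$ for heads):

```python
import random

def pick_player(flip=lambda: random.randint(0, 1)):
    """Choose one of three players uniformly with a fair coin:
    TT -> 1, TH -> 2, HT -> 3; on HH, discard and flip again."""
    while True:
        first, second = flip(), flip()       # 0 = tails, 1 = heads
        if (first, second) != (1, 1):        # reject HH
            return 2 * first + second + 1    # TT->1, TH->2, HT->3
```

Each round of two flips succeeds with probability $\frac34$, so the expected number of flips is $2 \cdot \frac43 = \frac83$, matching Tobias' figure for the simpler method.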