We decide to play the following game: An integer N will be randomly selected from the interval 0 - 100, inclusive. You try to guess N. After each guess, I tell you whether N is higher or lower than your guess.

If you successfully guess the integer, you win N dollars. Each guess costs you K dollars.

For each of the variants (a) and (b) below, what is the maximum value of K for which you'd be willing to play this game? Which strategy would you use to try to maximize your winnings in the long run?

(a) Once you start a round, you must continue until you guess N exactly.

(b) You may stop playing a round if you decide that N is too small to justify paying for more guesses. The money you've already spent on guesses in that round is lost, but you may then start a new round with a new N.

And perhaps this can be used to prove Steve's insight that, in the (b) variant of the game, you should quit right after your initial guess whenever you learn that N is lower.

If your first guess, x, is greater than N, then N is uniformly distributed over the x values {0, 1, ..., x-1}, so your expected winnings are (x-1)/2. Let E(x) be your expected number of additional guesses to find N among those x values; your expected additional cost is then E(x) * K. So your expected net profit if you continue is p(x) = (x-1)/2 - E(x)*K.
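E(x) is easy to compute recursively. Here's a short sketch (assuming the continuation strategy is midpoint bisection over the remaining values, which is what I used for part (a); `E` and `p` mirror the definitions above):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def E(m):
    """Expected number of guesses to find an integer uniformly
    distributed over m consecutive values, always guessing the midpoint."""
    if m == 0:
        return 0.0
    lo = (m - 1) // 2        # values below the midpoint guess
    hi = m - 1 - lo          # values above it
    # 1 guess now, then recurse on whichever side N falls into
    return 1 + (lo / m) * E(lo) + (hi / m) * E(hi)

def p(x, K):
    """Expected net profit from continuing after a first guess x > N:
    mean winnings (x-1)/2 minus expected further guessing cost E(x)*K."""
    return (x - 1) / 2 - E(x) * K
```

For example, E(3) = 5/3: one guess finds N with probability 1/3, otherwise one more guess pins it down.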

We know from part (a) that we'd be willing to play for K up to about $8.60, and variant (b) can only increase this amount. So we can set K = 8.60 and find the first value of x for which p(x) is greater than 0. It turns out that x = 101 is that point, just outside the range of possible values of N. Increasing K only increases the value of x at which p(x) becomes positive.
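This is easy to check numerically. A self-contained sketch (assuming midpoint bisection as the guessing strategy; the part-(a) break-even value of K falls out of the same recursion):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def E(m):
    """Expected guesses to find a uniform integer among m values,
    always guessing the midpoint of the remaining range."""
    if m == 0:
        return 0.0
    lo = (m - 1) // 2
    hi = m - 1 - lo
    return 1 + (lo / m) * E(lo) + (hi / m) * E(hi)

K = 8.60   # just below the part-(a) break-even 50 / E(101), about 8.603

def p(x):
    """Expected net profit from continuing after a first guess x > N."""
    return (x - 1) / 2 - E(x) * K

# smallest x for which continuing is profitable; turns out to be 101,
# i.e. no in-range first guess makes continuing worthwhile
first = next(x for x in range(1, 102) if p(x) > 0)
```

So with K = 8.60, p(x) stays at or below zero for every x from 1 to 100, and only crosses zero at x = 101.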

So I think this shows that at the maximum K for which you'd be willing to play the (b) version of the game, there is no initial guess x for which it's worth continuing once you learn that N < x.

But I haven't had my coffee yet today so hopefully someone smarter than me can take a look and see if this makes sense. :)