I know 72% of the test takers got it right, but I just can't seem to get it. Any takers?

57. Consider the following procedure for determining whether a given name appears in an alphabetical list of n names.

Step 1. Choose the name at the middle of the list (if n=2k, choose the kth name); if that is the given name, you are done; if the list is only one name long, you are done. If you are not done, go to Step 2.

Step 2. If the given name comes alphabetically before the name at the middle of the list, apply Step 1 to the first half of the list; otherwise, apply Step 1 to the second half of the list.

If n is very large, the maximum number of steps required by this procedure is close to

(A) n
(B) n^2
(C) log_2 (n)
(D) n log_2 (n)
(E) n^2 log_2 (n)

After the first split, the list size becomes floor(n/2), which is close enough to n/2 for large n. After the second, it is around n/4. So after s steps, the list size is approximately n/(2^s). The maximum number of steps occurs when you are left with a list size of 1. Therefore, n/(2^s)=1, or s=log_2(n).
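You can check that approximation empirically with a tiny simulation (a Python sketch of my own, not part of the original problem): repeatedly halve the list size and count steps, then compare against log_2(n).

```python
import math

def steps_to_one(n):
    """Count how many halvings it takes to shrink a list of n names down to 1."""
    steps = 0
    while n > 1:
        n //= 2  # each pass of the procedure keeps about half the list
        steps += 1
    return steps

# For large n, the step count tracks log_2(n) closely.
for n in [1_000, 1_000_000, 1_000_000_000]:
    print(n, steps_to_one(n), math.log2(n))
```

The floor division means the count comes out to floor(log_2(n)), which is "close to" log_2(n) in exactly the sense the question intends.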

Ivanjam's solution is pretty much spot on. However, I have one other suggestion to add.

On a problem like this, if you can't figure it out analytically, try to use an actual number. So let's say your list has n=64 items in it. (We pick 64 because we see all those log_2's in our answer, and it's nice to take the log_2 of a power of 2.)

Now, suppose that you have the worst-case scenario: you keep going until you have a list of 1. So let's take a look at how many steps it takes.

64 items get cut down to 32. Then 32 -> 16, 16 -> 8, 8 -> 4, 4 -> 2, and finally 2 -> 1, and we're done. So it took 6 steps to get from 64 down to 1.
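That chain can be spelled out as a small loop (my own Python sketch) that prints each halving:

```python
n = 64
steps = 0
while n > 1:
    print(n, "->", n // 2)  # 64 -> 32, 32 -> 16, ..., 2 -> 1
    n //= 2
    steps += 1
print("steps:", steps)  # 6 steps, which is exactly log_2(64)
```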

This is basic binary search. If you've taken any CS, you'll know the answer without much thought.

If you want a proof, look at a recurrence relation for the algorithm: T(n) = T(n/2) + 3 (a constant amount of work per step) and solve. Or you can apply the Master theorem to see T(n) = O(log n). Even this imprecise, asymptotic answer rules out all the other answer choices.
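Since the procedure is just binary search, here is a minimal Python version with a step counter (my own sketch; the names list and helper are invented for illustration), so you can see the worst case land near log_2(n):

```python
def search_steps(names, target):
    """Binary search over a sorted list; returns (found, number of halving steps)."""
    lo, hi = 0, len(names) - 1
    steps = 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if names[mid] == target:
            return True, steps
        if target < names[mid]:
            hi = mid - 1  # keep the first half
        else:
            lo = mid + 1  # keep the second half
    return False, steps

names = [f"name{i:03d}" for i in range(64)]
# Worst case: a name beyond the end of the list takes 7 steps,
# one more than log_2(64) = 6 because of the final one-element check.
print(search_steps(names, "zzz"))  # (False, 7)
```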

Hey, Ivanjam, DMashura, and p-adic, you're all amazing: A, for immediately knowing how to do this, and B, for taking the time to help me and others out. That's crystal clear now. All great approaches. I attempted some sort of number plug-in, but I failed to recognize that I should use a number that's a power of two. I still need to work on spotting those cues sometimes, since the log_2 really threw me off. Thanks again!!!!!

Normally in CS, log is assumed to be base 2. (Using other bases doesn't affect the asymptotic runtime if you're familiar with big-Oh notation.)

Not assuming things to be a power of 2 makes it a little more interesting, actually. If you were to prove results about the recurrence relation, you might start by assuming n = 2^k and doing some sort of induction, and then handle the other cases by bounding n between two powers of 2, 2^s < n < 2^t, and arguing with inequalities from there.
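Sketched out, that bounding step looks like this (my own fill-in of the argument): suppose 2^s <= n < 2^(s+1), and let T(n) be the worst-case step count. Since T is nondecreasing (a bigger list can't take fewer steps),

    T(2^s) <= T(n) <= T(2^(s+1)),

and the power-of-2 case gives T(2^k) = k, so

    s <= T(n) <= s + 1,

i.e. T(n) is log_2(n) rounded up or down, which is still "close to log_2(n)" for large n.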

The way I thought of it was to consider what happens if you double the number of entries in the list. After the first step, you're back to a list of the original size (either the first half or second half of the doubled list), so doubling the size added exactly 1 step. That is exactly how log base 2 behaves.
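That doubling observation is easy to check empirically (a Python sketch of my own):

```python
def halvings(n):
    """Number of halvings to get from n items down to 1."""
    steps = 0
    while n > 1:
        n //= 2
        steps += 1
    return steps

# Doubling the list size adds exactly one step -- the signature of log base 2.
for n in [10, 100, 5000]:
    print(n, halvings(n), 2 * n, halvings(2 * n))
```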