CS 410 Fall 99
Homework 1 Solutions

a: 26 b: 600,000 c: 133,378,058

The input is of length 3.5E+9. An O(n^2) algorithm performs about (3.5E+9)^2 = 1.225E+19
operations; at 10^9 operations per second (the rate implied by the given figures), that is
1.225E+10 seconds, or over 388 years. Most people don't want to wait that long for a
solution. An O(n lg n) algorithm performs about 3.5E+9 * lg(3.5E+9) = 1.11E+11 operations,
or about 111 seconds, which is definitely feasible.
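The arithmetic above can be reproduced directly. The 10^9 operations-per-second machine speed is an assumption inferred from the quoted figures:

```python
import math

OPS_PER_SEC = 1e9  # assumed machine speed, implied by the solution's numbers
SECS_PER_YEAR = 365.25 * 24 * 3600

n = 3.5e9
quadratic_secs = n ** 2 / OPS_PER_SEC          # O(n^2) running time
loglinear_secs = n * math.log2(n) / OPS_PER_SEC  # O(n lg n) running time

print(f"O(n^2):    {quadratic_secs:.3e} s "
      f"= {quadratic_secs / SECS_PER_YEAR:.0f} years")
print(f"O(n lg n): {loglinear_secs:.0f} s")
```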

a. Show (n choose k) = Theta(n^k), for constant k.
(n choose k) = [n * (n-1) * ... * (n-k+1)] / [k * (k-1) * ... * 1]
= [n/k] * [(n-1)/(k-1)] * ... * [(n-k+1)/1] < n^k by inspection. Thus (n choose k) = O(n^k).
To show (n choose k) is Omega(n^k), we show (n choose k) >= (n/k)^k.
From the expansion above, it suffices to show that (n-i)/(k-i) >= n/k for 0 <= i
<= k-1. Cross-multiplying, we need k(n-i) >= n(k-i), i.e. nk - ik >= nk - ni, or
ni - ki >= 0, i.e. i(n-k) >= 0. But this clearly holds, since k < n and i >= 0
(with equality at i = 0). Thus (n choose k) >= (n/k)^k = n^k / k^k. Since k is a
constant, k^k is a constant, so (n choose k) = Omega(n^k).
Since (n choose k) is now both Omega(n^k) and O(n^k), we have (n choose k) = Theta(n^k).
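The two bounds in the proof can be spot-checked numerically. This is a sanity check of the squeeze (n/k)^k <= (n choose k) <= n^k, not a proof:

```python
from math import comb

k = 5  # any fixed constant k works
for n in (50, 500, 5000):
    c = comb(n, k)
    # Lower bound from the proof: each factor (n-i)/(k-i) is at least n/k.
    assert (n / k) ** k <= c
    # Upper bound from the proof: each factor is less than n.
    assert c <= n ** k
    print(n, c / n ** k)  # ratio stays bounded between 1/k^k and 1
```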

c. Show log_a n = Theta(log_b n) for any positive constants a and b.
We simply note that log_a n = [log_b n] / [log_b a], and since log_b a
is a constant, the two logs differ by a constant factor, which is exactly what Theta
notation is designed to absorb. Thus log_a n = Theta(log_b n).
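The change-of-base identity behind this argument can be checked directly. The bases 3 and 7 are arbitrary choices for illustration:

```python
import math

a, b = 3.0, 7.0
factor = 1 / math.log(a, b)  # the constant 1/log_b(a), independent of n
for n in (10, 1e4, 1e8):
    # log_a(n) equals log_b(n) scaled by the same constant for every n
    assert math.isclose(math.log(n, a), factor * math.log(n, b))
print("constant factor:", factor)
```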

lg(lg n), lg n, (lg n)^2, sqrt(n), n, n^2/lg n, n^2, 100^sqrt(n), 2^n
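The ordering is asymptotic, so a numeric check must use a sufficiently large n; for instance, (lg n)^2 only drops below sqrt(n) once n > 2^16. A spot-check at n = 2^20:

```python
import math

n = 2 ** 20
lg = math.log2
growth = [
    lg(lg(n)),
    lg(n),
    lg(n) ** 2,
    math.sqrt(n),
    n,
    n ** 2 / lg(n),
    n ** 2,
    100 ** int(math.sqrt(n)),  # 100^sqrt(n), exact integer arithmetic
    2 ** n,                    # 2^n dominates everything above
]
# The list as written is already in ascending order at this n.
assert growth == sorted(growth)
```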

No. While substituting binary search for the linear search will find the correct
location in O(log N) time, it still takes linear time to shift all the following items
over to make space at that location. Thus, the algorithm would still be O(N^2).
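A sketch of insertion sort with the binary-search substitution makes the point concrete: finding the slot is cheap, but the shift on the next line still costs O(i) per insertion.

```python
import bisect

def binary_insertion_sort(a):
    """Insertion sort that locates each insertion point by binary search.

    bisect_right finds the slot in O(lg i) comparisons, but the slice
    assignment still shifts up to i elements, so the sort stays O(n^2).
    """
    for i in range(1, len(a)):
        x = a[i]
        j = bisect.bisect_right(a, x, 0, i)  # O(lg i): find the slot
        a[j + 1 : i + 1] = a[j:i]            # O(i): shift items right
        a[j] = x
    return a

print(binary_insertion_sort([5, 2, 4, 6, 1, 3]))  # [1, 2, 3, 4, 5, 6]
```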

Phase 1: Sort the numbers in ascending order using a sorting algorithm with time
complexity Theta(n log n), such as mergesort.
Phase 2: Suppose the sorted numbers are stored in an array A[1..n]. For each A[i], search
for the number X - A[i] in A[i..n] using binary search. If found, return Yes; otherwise
return No.
The first phase takes Theta(n log n) time. In the second phase, binary search is performed
n times, each taking O(log n), so the second phase takes O(n log n). Combined, the
algorithm has Theta(n log n) time complexity.
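The two phases can be sketched as follows. The function name is illustrative, and the search starts at i+1 rather than i, a slight refinement of the A[i..n] range above that avoids pairing an element with itself:

```python
import bisect

def has_pair_with_sum(nums, x):
    """Return True if two distinct entries of nums sum to x."""
    a = sorted(nums)                    # Phase 1: Theta(n lg n) sort
    for i, v in enumerate(a):           # Phase 2: n binary searches
        target = x - v
        j = bisect.bisect_left(a, target, i + 1)  # O(lg n) per probe
        if j < len(a) and a[j] == target:
            return True
    return False

print(has_pair_with_sum([8, 3, 5, 1], 9))  # True  (8 + 1)
print(has_pair_with_sum([8, 3, 5, 1], 7))  # False
```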

We are given the recurrence T(n) = T(n/2) + Theta(1). The master method applies to
recurrences of the form T(n) = aT(n/b) + f(n), so this recurrence can be solved by the
master method. We note that for this case, a=1, b=2, and f(n)=Theta(1). For these values
of a and b, we note that log_b a = log_2 1 = 0, so n^(log_b a) = n^0 = 1. We wish to show
that we are in case 2 of the master theorem, i.e. that f(n) = Theta(n^(log_b a)).
RHS = Theta(n^(log_b a)) = Theta(n^0) = Theta(1) = LHS. Thus we are in case 2. In this
case, the master theorem tells us that T(n) = Theta(n^(log_b a) lg n), or that
T(n) = Theta(lg n).
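The Theta(lg n) solution can be confirmed empirically by unrolling the recurrence and counting the units of Theta(1) work:

```python
import math

def steps(n):
    """Count the Theta(1) work in T(n) = T(n/2) + Theta(1), with T(1) = 1."""
    count = 1
    while n > 1:
        n //= 2
        count += 1
    return count

# The count is lg(n) + 1 for powers of two, i.e. Theta(lg n).
for n in (2 ** 10, 2 ** 20, 2 ** 30):
    print(n, steps(n), math.log2(n))
```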

Details of the data structure were covered in class. Basically, we maintain the
invariant that check[check_index[key]] == key when the information in the
dictionary for key is valid, and one of the indices will be out of range or have an
incorrect value otherwise.
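A minimal sketch of that invariant, assuming integer keys from a known universe. The names check and check_index follow the text; the class structure and value storage are illustrative assumptions, and since Python lists cannot actually be left uninitialized, this models the invariant rather than the uninitialized-memory trick itself:

```python
class SparseDict:
    """Dictionary over keys 0..universe-1 with no initialization pass.

    Invariant from the text: the entry for key is valid iff
    check[check_index[key]] == key; otherwise check_index[key] is out
    of range for check, or points at a different key.
    """

    def __init__(self, universe):
        self.check_index = [0] * universe  # stands in for uninitialized memory
        self.check = []                    # keys, in order of first insertion
        self.values = []                   # value slots, parallel to check

    def __contains__(self, key):
        i = self.check_index[key]
        return 0 <= i < len(self.check) and self.check[i] == key

    def __setitem__(self, key, value):
        if key in self:
            self.values[self.check_index[key]] = value
        else:
            self.check_index[key] = len(self.check)
            self.check.append(key)
            self.values.append(value)

    def __getitem__(self, key):
        if key not in self:
            raise KeyError(key)
        return self.values[self.check_index[key]]

d = SparseDict(10 ** 6)
d[123456] = "x"
print(123456 in d, 999999 in d)  # True False
```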