Let $A[1 \dots n]$ be an array storing a bit ($1$ or $0$) at each location, and let $f(m)$ be a function whose time complexity is $\Theta(m)$. Consider the following program fragment written in a C-like language:


10 Answers

+41 votes

Best answer

The key part of the code is the "$counter = 0$" statement in the else part, as we can see below.

Let's take the best case. This happens when $a[i] = 1$ for all $i$: the loop then does $\Theta(1)$ work per iteration, for an overall time complexity of $\Theta(n)$. So the running time of the code fragment is $\Omega(n)$, and options A and B are false.

Now, consider the worst case. This happens when $a[i] = 0$, i.e., when the else part is executed. Such an iteration has time complexity $\Theta(counter)$, and after each else the counter is reset to $0$. Let $k$ iterations go to the else part in the worst case. Then the worst case time complexity is $\Theta(x_1) + \Theta(x_2) + \dots + \Theta(x_k) + \Theta(n-k)$, where $x_i$ is the value of the counter when $A[i] = 0$ and $f(counter)$ is called.

In total there are $n$ iterations: $k$ of them execute the else part and the remaining $n-k$ execute the if part. Because $counter$ is reset to $0$ after each call to $f()$, every unit counted in some $x_i$ corresponds to a distinct earlier iteration that incremented the counter, so $x_1 + x_2 + \dots + x_k \leq n - k$. (In particular, if $counter = 0$ when the else part runs, the call costs only $\Theta(1)$.) Hence the worst case time complexity is at most $\Theta(n-k) + \Theta(n-k) = \Theta(n)$, i.e., $O(n)$.

Since the time complexity is $\Omega(n)$ and $O(n)$, we can say it is $\Theta(n)$ - Option (C). (Option D is false because little $o$ requires the growth rate to be STRICTLY lower, not lower-or-equal as is the case for big $O$.)

If $counter = 0$ were not there in the else part, then the time complexity would be $\Omega(n)$ and $O(n^2)$, as in the worst case we can have an equal number of $0$'s and $1$'s in array $a$. With the first half all $1$'s and the second half all $0$'s, the counter reaches $n/2$ and is never reset, so each of the $n/2$ zeros triggers a $\Theta(n/2)$ call, giving $\Theta(n/2) \cdot (n/2) = \Theta(n^2)$; with alternating $1$'s and $0$'s, the calls cost $\Theta(1) + \Theta(2) + \dots + \Theta(n/2)$, which is again $\Theta(n^2)$.



+10 votes

Please note that inside the else condition, f() is called first, then counter is set to 0.

Consider the following cases:

a) All 1s in A[]: time taken is Θ(n), as only counter++ is executed, n times.
b) All 0s in A[]: time taken is Θ(n), as only f(0) is called, n times.
c) First half 1s, then half 0s: time taken is Θ(n), as f(n/2) is called only once.

I think your analysis of the worst case time complexity is wrong:
each time f(m) is invoked, counter is set back to 0, and since the
time complexity of f(m) is Θ(m), a call made while counter is zero
takes only constant time.
The worst case time would be O(n).


+1 vote

The answer should be (C), which is Θ(n).

In the best case the time complexity is Ω(n): when the array contains all 1's, only the if branch executes, n times.

In the worst case the time complexity is O(n).

For example, suppose the if branch executes for the first n/2 iterations, so the counter reaches n/2; on the (n/2 + 1)-th iteration the else branch executes, f(n/2) runs once doing Θ(n/2) work, and the counter is reset to 0; the remaining iterations each run in constant time. The total is (n/2) + (n/2) + (n/2), so the worst case time complexity is O(n).

If there are N elements and the array consists of alternating 1's and 0's, then half of the iterations take constant time and the other half call f(1), which (since f(m) is Θ(m)) also takes constant time. Total = (N/2)·c + (N/2)·Θ(1) = Θ(N).


0 votes

answer = option D

Consider the best case for this code snippet, for which the test case is a string of all 1's;
so the best case time complexity = $\mathcal{O}(n)$

For the worst case, the snippet has a variable time complexity, because it depends on the function $f(m)$, whose time complexity is $\Theta(m)$.
To show this variability, consider two cases:

Case 1: $0\ 1 \ 0 \ 11 \ 0 \ 111 \ 0 \ 1111 \ 0 \dots$
in such a case the counter grows linearly between calls to $f$, so the snippet's time complexity is $\mathcal{O}(n)$