Asymptotic Analysis

The Big-O notation is typically written as: f(n) = O(g(n)). It reads: "f(n) is of the order of g(n)" or "f(n) is proportional to g(n)". If f(n) = O(g(n)), then there exist constants c and n0 such that:

    f(n) <= c * g(n)    for all n >= n0

The inequality states that as n increases, the algorithm's complexity grows no faster than a constant multiple of g(n).

[Figure: required time versus input size, with c * g(n) bounding f(n) for all n >= n0]
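The definition above can be checked numerically. A minimal sketch, using an assumed example f(n) = 3n + 5 with g(n) = n (these functions and the witnesses c = 4, n0 = 5 are illustrative, not from the slides):

```java
// Numerically checking the Big-O definition: with c = 4 and n0 = 5,
// f(n) <= c * g(n) must hold for every n >= n0.
public class BigODefinition {
    static long f(long n) { return 3 * n + 5; } // hypothetical cost function
    static long g(long n) { return n; }         // candidate bound

    // Returns true if f(n) <= c * g(n) for all n in [n0, limit]
    static boolean boundHolds(long c, long n0, long limit) {
        for (long n = n0; n <= limit; n++) {
            if (f(n) > c * g(n)) return false;
        }
        return true;
    }

    public static void main(String[] args) {
        System.out.println(boundHolds(4, 5, 1_000_000)); // true: 3n + 5 <= 4n once n >= 5
    }
}
```

Note that 3n + 5 <= 4n is equivalent to 5 <= n, which is exactly why n0 = 5 works here.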

Big-O Notation: Limitation

The big-O notation has some problems. Given a complexity function f(n), we can find many g(n) that are valid upper bounds of f(n), so one cannot decide which g(n) is best for the algorithm at hand. Different choices of c and n0 are also possible for the same g(n).
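A small sketch of this ambiguity, using an assumed f(n) = 2n (the example functions and constants are illustrative, not from the slides): with c = 2 and n0 = 2, each of g(n) = n, g(n) = n log n, and g(n) = n^2 is a valid upper bound.

```java
// For f(n) = 2n, several different g(n) all satisfy the Big-O
// definition, so Big-O alone does not single out a "best" bound.
public class ManyBounds {
    static double f(double n) { return 2 * n; } // hypothetical cost function

    public static void main(String[] args) {
        double n = 1024;
        double c = 2;
        System.out.println(f(n) <= c * n);                            // g(n) = n
        System.out.println(f(n) <= c * n * (Math.log(n) / Math.log(2))); // g(n) = n log n
        System.out.println(f(n) <= c * n * n);                        // g(n) = n^2
    }
}
```

All three comparisons print true, which is why the convention is to report the tightest simple bound (here, O(n)).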

Computing Big-O Notation: Guideline

Loops such as for, while, and do-while:
–The number of operations equals the number of iterations (e.g., n) times the cost of all the statements inside the loop body.
Nested loops:
–The cost of the statements in the innermost loop times the product of the sizes of all the loops.
Consecutive statements:
–Use the addition rule of order arithmetic: O(f(n) + g(n)) = O(max[f(n), g(n)]).
if/else and if/else if statements:
–The number of operations equals the running time of the condition evaluation plus the maximum of the running times of the if and else clauses. So, the complexity is: O(Cond) + O(max[if, else])
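The counting rules above can be sketched with small loops. This is an illustrative example (the specific loops and the operation counter are assumed, not from the slides):

```java
// Counting primitive operations for the loop patterns in the guideline.
public class GuidelineExamples {
    static int countOps(int n) {
        int ops = 0;

        // Single loop: n iterations of O(1) work -> O(n)
        for (int i = 0; i < n; i++) ops++;

        // Nested loops: n * n iterations -> O(n^2)
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++) ops++;

        // Consecutive statements (addition rule):
        // O(n) + O(n^2) = O(max[n, n^2]) = O(n^2)

        // if/else: O(cond) + O(max[if, else])
        if (n % 2 == 0) {
            for (int i = 0; i < n; i++) ops++; // O(n) branch
        } else {
            ops++;                             // O(1) branch
        }
        // Worst case: O(1) + O(max[n, 1]) = O(n)

        return ops;
    }

    public static void main(String[] args) {
        System.out.println(countOps(8)); // 8 + 64 + 8 = 80
    }
}
```

Doubling n roughly quadruples the count (the O(n^2) nested loop dominates), which matches the addition rule's prediction.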

Computing Big-O Notation: Guideline (Contd.)

switch statements:
–Take the complexity of the most expensive case (the one with the highest number of operations).
Method calls:
–First, evaluate the complexity of the method being called.
Recursive methods:
–If it is a simple recursion, convert it to a for loop and analyze the loop.
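The recursion rule can be sketched with a simple linear recursion and its loop equivalent (the sum-of-1-to-n example is assumed for illustration, not from the slides); both versions do O(n) work:

```java
// A simple recursion and the equivalent for loop. Rewriting the
// recursion as a loop makes the iteration count, and hence the
// complexity, explicit: both are O(n).
public class RecursionToLoop {
    // Recursive sum of 1..n: one call per value of n -> O(n)
    static long sumRec(long n) {
        if (n == 0) return 0;
        return n + sumRec(n - 1);
    }

    // Same computation as a for loop: n iterations -> O(n)
    static long sumLoop(long n) {
        long s = 0;
        for (long i = 1; i <= n; i++) s += i;
        return s;
    }

    public static void main(String[] args) {
        System.out.println(sumRec(10));  // 55
        System.out.println(sumLoop(10)); // 55
    }
}
```

By the method-call rule, any caller of sumLoop adds O(n) to its own running time, since that is the complexity of the method being called.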