In the best case the polynomials' terms have the same
exponents: the while loop runs once per term and the
for loops are not executed at all.
The count is 12N + 8, for N terms.

In the worst case the polynomials' terms all have different
exponents: the while loop runs twice the
minimum number of terms times, and one of the for loops runs
the difference between the numbers of terms times.
The count is 8(N+M) + 5(N-M) + 8 = 13N + 3M + 8, for
N and M terms, N > M.
If N = M, that is 16N + 8.
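The counts above come from merging the two term lists. A minimal Python sketch of that merge (the representation and the function name are illustrative, not HSM's code):

```python
def poly_add(p, q):
    """Add two polynomials, each a list of (coefficient, exponent)
    pairs sorted by decreasing exponent."""
    result = []
    i = j = 0
    # While both polynomials have terms left: in the best case every
    # pair of exponents matches, so this loop runs once per term; in
    # the worst case (all exponents different) each iteration copies
    # a single term, so it runs up to twice the shorter length.
    while i < len(p) and j < len(q):
        ci, ei = p[i]
        cj, ej = q[j]
        if ei == ej:
            if ci + cj != 0:
                result.append((ci + cj, ei))
            i += 1
            j += 1
        elif ei > ej:
            result.append((ci, ei))
            i += 1
        else:
            result.append((cj, ej))
            j += 1
    # Leftover terms of the longer polynomial: these loops run only
    # in the unequal case, |N - M| times in total.
    for k in range(i, len(p)):
        result.append(p[k])
    for k in range(j, len(q)):
        result.append(q[k])
    return result
```

In the best case (identical exponent sets) only the while loop does any work; in the worst case the while loop exhausts the shorter polynomial and a for loop copies the rest.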

Building a worst-case table

In any case, the step count does not give exact data, since the
size of a step is not constant.

Asymptotic Complexity Analysis
(HSM Ch. 1.6.1.3)

A motivation for analysis is to compare algorithms

Exact step count may be difficult or impossible

Step count depends on step size

Complexity is typically proportional to input "size"

For small input, complexity is often small

For large input, complexity comparison may be based on
orders of magnitude

Orders of magnitude

Given an array of N integers, how complex is:

Add 1 to the first element. Write down the result.

Look at the middle element; if it is odd then repeat this
operation on the top half of the array, otherwise repeat
it on the bottom half of the array, until the portion
of the array being used has only one element. Write down
that last element.

Add 7 to each element. Write down each result.

For each element do the operation of recursively looking at
the middle element, described above, but making the
decision based on the sign of the difference between
the middle element and the element under consideration.

For each element, multiply it by each other element. Write
down each result.

Write down 0. For each element of the array, write down the
sum and difference between the element and each value
written down when using the previous element (starting
with 0 the first time round).
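The second operation above is the least obvious one. A sketch in Python, assuming "top half" means the upper-index half of the portion in use (an assumption; the notes do not say):

```python
def halving(a):
    """Repeatedly look at the middle element of the active portion of
    the list a: if it is odd keep the top half, otherwise keep the
    bottom half, until one element remains."""
    lo, hi = 0, len(a)
    while hi - lo > 1:
        mid = (lo + hi) // 2
        if a[mid] % 2 == 1:   # odd: continue on the top half
            lo = mid
        else:                 # even: continue on the bottom half
            hi = mid
    return a[lo]              # the last remaining element
```

Each pass halves the portion in use, so one element remains after about log2(N) passes: logarithmic complexity.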

Commonly used orders of magnitude are:

Constant

1

Logarithmic

log_b(N)

Linear

N

N log N

N * log_b(N)

Polynomial (of order m)

N^m

Exponential (of base k)

k^N

For large N, differences due to constant factors of proportion,
and differences due to added terms of a smaller order of
magnitude, are insignificant compared to the differences
caused by a higher order of complexity.
See HSM Figure 1.3 and Table 1.7.
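A few lines of Python make the growth rates concrete, loosely in the spirit of HSM Table 1.7 (the values, not its exact layout):

```python
import math

# Growth of the common orders of magnitude for a few sizes N.
funcs = [
    ("log2 N",   lambda n: math.log2(n)),
    ("N",        lambda n: n),
    ("N log2 N", lambda n: n * math.log2(n)),
    ("N^2",      lambda n: n ** 2),
    ("2^N",      lambda n: 2 ** n),
]
for n in (2, 8, 32, 64):
    row = ", ".join(f"{name}={f(n):.0f}" for name, f in funcs)
    print(f"N={n}: {row}")
```

Already at N = 64, 2^N dwarfs every polynomial column, while log2(N) has barely moved.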

Example: an algorithm of complexity
c1 * N^2 + c2 * N is
worse than one of complexity c3 * N * log(N), for
sufficiently large values of N.

For smaller N, the factors of proportion and smaller added terms
are significant.
There is some value of N after which the larger order of
magnitude dominates.
This is called the break-even point.

The break-even point usually cannot be determined analytically,
since the constants are not known exactly; empirical evaluation
is needed.
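One way to locate a break-even point empirically is to scan N upward until the comparison flips. A sketch with made-up constants c1, c2, c3 (a real evaluation would compare measured running times, not assumed cost formulas):

```python
import math

# Hypothetical cost functions for two algorithms; the constants
# c1, c2, c3 are invented for illustration only.
def cost_quadratic(n, c1=1.0, c2=5.0):
    return c1 * n * n + c2 * n

def cost_nlogn(n, c3=20.0):
    return c3 * n * math.log2(n)

def break_even(limit=10_000):
    """Return the smallest N at which the quadratic algorithm
    becomes the more expensive of the two."""
    for n in range(2, limit):
        if cost_quadratic(n) > cost_nlogn(n):
            return n
    return None
```

Below the break-even point the N log N algorithm's larger constant makes it the slower choice; above it, the N^2 term dominates regardless of constants.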

For small N, and equal asymptotic complexity, factors of
proportion and differences due to added terms of a smaller order
do matter.

Constant complexity is boring,
logarithmic complexity is magic,
linear complexity is good,
N-logarithmic complexity is realistic,
polynomial complexity can get bad,
exponential complexity is ugly, and it could be worse.
See HSM Table 1.8.