Wow. Why is sqrt(x) lagging? Common square root algorithms work very similarly to division and reciprocal algorithms. Unless they totally ran out of ROM space and picked a reduction in sqrt(x) precision as the best compromise, this looks like an oversight.

I believe (it's been a while) that there's a CORDIC-like shift/add algorithm for square root too: each iteration delivers one digit of the result (a binary bit, hex digit, or BCD digit, depending on the base of the number system and the algorithm flavor). Square root is fairly well-behaved too, unlike tan(x), which is asymptotic near 90 degrees, or log(x) and ln(x) near 0.