# Square roots with a calculator (Part 11)

This is the last in a series of posts about square roots and other roots, hopefully providing a deeper look at an apparently simple concept. In this final post, we discuss how calculators are programmed to compute square roots quickly.


So how do calculators find square roots anyway? First, we recognize that $\sqrt{a}$ is a root of the polynomial $f(x) = x^2 - a$. Therefore, Newton’s method (or the Newton-Raphson method) can be used to find a root of this function. Newton’s method dictates that we begin with an initial guess $x_1$ and then iteratively compute successive guesses using the recursively defined sequence

$x_{n+1} = x_n - \displaystyle \frac{f(x_n)}{f'(x_n)}$

For the case at hand, since $f'(x) = 2x$, we may write

$x_{n+1} = x_n - \displaystyle \frac{x_n^2 -a}{2 x_n}$,

which reduces to

$x_{n+1} = \displaystyle \frac{2x_n^2 - (x_n^2 -a)}{2 x_n} = \frac{x_n^2 + a}{2x_n} = \frac{1}{2} \left( x_n + \frac{a}{x_n} \right)$.
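This recurrence is easy to turn into code. Here is a minimal sketch in Python (the function name, default starting guess, and stopping tolerance are my own choices, not part of the algorithm itself):

```python
def newton_sqrt(a, x=1.0, tol=1e-12, max_iter=100):
    """Approximate sqrt(a) via the iteration x_{n+1} = (x_n + a/x_n) / 2."""
    for _ in range(max_iter):
        next_x = 0.5 * (x + a / x)
        if abs(next_x - x) < tol:  # stop once successive guesses agree
            return next_x
        x = next_x
    return x

print(newton_sqrt(2000, x=1000))  # close to 44.72135955
```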

This algorithm can be programmed using C++, Python, etc. For pedagogical purposes, however, I’ve found that a spreadsheet like Microsoft Excel is a good way to sell this to students. In the spreadsheet below, I use Excel to find $\sqrt{2000}$. In cell A1, I entered $1000$ as a first guess for $\sqrt{2000}$. Notice that this is a really lousy first guess! Then, in cell A2, I typed the formula

=1/2*(A1+2000/A1)

So Excel computes

$x_2 = \frac{1}{2} \left( x_1 + \displaystyle \frac{2000}{x_1} \right) = \frac{1}{2} \left( 1000 + \displaystyle \frac{2000}{1000} \right) = 501$.

Then I filled down that formula into cells A3 through A16.
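Outside Excel, the same fill-down can be mimicked with a short loop. As a sketch in Python, using the same starting guess as cell A1:

```python
a, x = 2000, 1000.0          # a = radicand, x = first guess (cell A1)
for n in range(1, 17):       # mirrors cells A1 through A16
    print(f"A{n}: {x:.7f}")
    x = 0.5 * (x + a / x)    # the spreadsheet formula =1/2*(A1+2000/A1)
```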

Notice that this algorithm quickly converges to $\sqrt{2000}$, even though the initial guess was terrible. After 7 steps, the answer is correct to only 2 significant digits ($45$). After 8 steps, it is correct to 4 significant digits ($44.72$). On the 9th step, it is correct to 9 significant digits ($44.7213595$).

Indeed, there’s a theorem which states that, once this algorithm gets close to the root, the number of correct digits roughly doubles with each successive step. That’s far better than the methods shown at the start of this series of posts, which produced only one extra digit with each step.
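One rough way to watch this doubling is to count correct digits as $-\log_{10}$ of the absolute error (my own measure, purely for illustration):

```python
import math

a, x = 2000, 1000.0
exact = math.sqrt(a)
for n in range(1, 12):
    err = abs(x - exact)
    digits = -math.log10(err) if err > 0 else float("inf")
    print(f"step {n}: {max(digits, 0):.1f} correct digits")
    x = 0.5 * (x + a / x)    # Newton step for sqrt(2000)
```

From a poor starting guess the first several steps gain little, but once the iterate is close, the digit count roughly doubles from one step to the next.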

This algorithm works for finding $k$th roots as well as square roots. Since $\sqrt[k]{a}$ is a root of $f(x) = x^k - a$, Newton’s method reduces to

$x_{n+1} = x_n - \displaystyle \frac{x_n^k - a}{k x_n^{k-1}} = \displaystyle \frac{(k-1) x_n^k + a}{k x_n^{k-1}} = \displaystyle \frac{k-1}{k} \, x_n + \frac{1}{k} \cdot \frac{a}{x_n^{k-1}}$,

which reduces to the above sequence if $k = 2$.
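The general $k$th-root iteration can be sketched in Python much like the square-root case (again, the function name, default guess, and tolerance are my own choices):

```python
def newton_kth_root(a, k, x=1.0, tol=1e-12, max_iter=200):
    """Approximate a**(1/k) via x_{n+1} = ((k-1)*x_n + a/x_n**(k-1)) / k."""
    for _ in range(max_iter):
        next_x = ((k - 1) * x + a / x ** (k - 1)) / k
        if abs(next_x - x) < tol:
            return next_x
        x = next_x
    return x

print(newton_kth_root(2000, 3))  # cube root of 2000, about 12.5992105
```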