
A quadratic equation of the form $x^2+bx+c=0$ can be solved with the classical quadratic formula, which gives both solutions.

Here I want to discuss some other methods for finding one solution. The best known is by means of continued fractions. In this case, from the given equation we find: $$ x^2+bx+c=0 \Rightarrow x+b+\dfrac{c}{x}=0 \Rightarrow x=-b-\dfrac{c}{x} \Rightarrow x=-b+\cfrac{-c}{-b+\cfrac{-c}{-b+\ddots}} $$ But there are at least two other methods:

Solving the given equation for $x^2$ we find $$ x^2=-bx-c \Rightarrow x=\pm\sqrt{-c-bx} \Rightarrow x=\sqrt{-c\mp b\sqrt{-c\mp b\sqrt{-c\mp \dots}}} $$ which is a solution in the form of infinitely nested radicals.

And solving for $x$ we find: $$ x=\dfrac{1}{b}\left( -c-x^2 \right) \Rightarrow x=\dfrac{1}{b}\left(-c-\dfrac{1}{b}\left(-c-\dfrac{1}{b}\left(-c \cdots\right)^2\right)^2\right) $$ which is a solution in the form of infinitely nested squares.

These methods cannot give both solutions, but only an approximation of one of them. The continued fraction method is well studied, and its limits and potentialities are documented (here), but for the other methods I found little or nothing on the Web.

I have done some numerical experiments with a spreadsheet (see the figure for $x^2-3x-4=0$), and it seems the methods work (with some care about the signs in the case of the nested radicals).

[Figure: spreadsheet iterations of the three methods for $x^2-3x-4=0$.]
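For reference, here is a minimal Python sketch of the three iterations for $x^2-3x-4=0$ (equivalent to the spreadsheet computation; the starting value $0.5$ is arbitrary):

```python
# Fixed-point iterations for x^2 + b*x + c = 0 with b = -3, c = -4 (roots 4 and -1)
from math import sqrt

b, c = -3.0, -4.0
x_cf = x_rad = x_sq = 0.5          # arbitrary starting value

for n in range(15):
    x_cf = -b - c / x_cf           # continued fraction step
    x_rad = sqrt(-c - b * x_rad)   # nested radical step (positive branch)
    x_sq = (-c - x_sq**2) / b      # nested square step
    print(n, x_cf, x_rad, x_sq)    # x_cf and x_rad approach 4, x_sq approaches -1
```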

From these experiments I see that the nested radical method gives the root with the largest absolute value (as does the continued fraction), while the nested square method seems to give the solution with the smallest absolute value, with a convergence rate that does not seem very different from that of continued fractions. I'm interested to know whether we can prove that this is always true and whether there are general results about these methods.

My final interest is to know whether such methods can be extended to the solution of higher-degree equations.

Emilio Novati
  • For the squares, I would consider the sequence $x_{n+1} = \frac 1 b (-c - x_n ^2)$ and try to show that for any $x_0$ it converges. For the radicals, $x_{n+1} = \sqrt {-c -b x_n}$ should do the job, but this one might have issues when the quantity under the root is negative. I would either try to show that these sequences are bounded and monotone, or show that $f(x) = r.h.s. (x)$ is defined on some interval $I$ with values in itself and is a contraction on it. Depending on the signs of $b$ and $c$, the r.h.s. might sometimes decrease, showing that $x = f(x)$ has at most 1 solution (if any). – Alex M. Oct 08 '15 at 19:08
  • Also, you can try Newton-Raphson. – L F Oct 09 '15 at 15:39
  • @EmilioNovati: Thank you for this wonderful and thoughtful question and for choosing my answer. I appreciate it! – hbp Oct 14 '15 at 17:01
  • @hbp: You are welcome! I'm working on your answer, especially on the cases of chaotic behavior, which it seems we can also have for nested radicals and continued fractions when the given equation has no real solutions. Maybe another question will come from this :). Do you know of some resources on the web about this topic? – Emilio Novati Oct 14 '15 at 17:07
  • @EmilioNovati: This would be interesting. A related issue is the apparent convergence. For example, for $b = -2, c = 1$, we have an expression of $\sqrt{-1+2\sqrt{-1+2\sqrt{-1+\dots}}}$, which should be $1$. But given so many $\sqrt{-1 +\cdots}$, can we really argue that this expression makes sense? Particularly, the sign of the square root of a non-positive or even complex number appears to be somewhat ambiguous. It would be an interesting question though. Looking forward to it. – hbp Oct 14 '15 at 17:28

1 Answer


This is an interesting observation. Let us first address the problem of convergence.

Convergence of the nested radical formula

For the nested radical formula, we have \begin{align} x_{n+1}^2 = -c -b \, x_{n}. \end{align} This means, around the solution $x^*$, we have \begin{align} 2 \, x_{n+1} \, \Delta x_{n+1} \approx -b \, \Delta x_{n}, \end{align} where $\Delta x_n \equiv x_n - x^*$. In other words, after a round of iteration, the error is reduced by a factor of $$ \left| \frac{b}{2\,x_{n+1}} \right| \approx \left| \frac{b}{2\,x^*} \right|. $$ This means the nested radical formula works only if $$ |x^*| > \frac{|b|}{2}. $$
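As a quick numerical check (a Python sketch using the question's example $b=-3$, $c=-4$ and its larger root $x^*=4$, for which $|b/(2\,x^*)| = 3/8$; the starting value is arbitrary):

```python
# Verify that the nested-radical error shrinks by roughly |b/(2 x*)| per step
from math import sqrt

b, c, xstar = -3.0, -4.0, 4.0      # x^2 - 3x - 4 = 0, larger root x* = 4
x, prev_err = 1.0, abs(1.0 - 4.0)
for n in range(10):
    x = sqrt(-c - b * x)           # one nested-radical step
    err = abs(x - xstar)
    print(n, x, err / prev_err)    # ratio tends to |b/(2 x*)| = 0.375
    prev_err = err
```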

Convergence of the nested square formula

The nested square formula is the opposite: we can similarly show from \begin{align} x_{n+1} = -\frac{c+x_{n}^2}{b}, \end{align} that, around the solution $x^*$, \begin{align} \frac{ \Delta x_{n+1} } { \Delta x_n } \approx -\frac{ 2 \, x_n }{b} \approx -\frac{ 2 \, x^* }{b}, \end{align} which means it only works for $$ |x^*| < \frac{|b|}{2}. $$
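The analogous check for the nested square formula (same example, now the smaller root $x^* = -1$, so $|2\,x^*/b| = 2/3$; the starting value is arbitrary):

```python
# Verify that the nested-square error shrinks by roughly |2 x*/b| per step
b, c, xstar = -3.0, -4.0, -1.0     # x^2 - 3x - 4 = 0, smaller root x* = -1
x, prev_err = 0.5, abs(0.5 - (-1.0))
for n in range(15):
    x = (-c - x**2) / b            # one nested-square step
    err = abs(x - xstar)
    print(n, x, err / prev_err)    # ratio tends to |2 x*/b| = 2/3
    prev_err = err
```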

Comparison of the convergence of the two formulas

This explains your observation that the nested radical formula often works for the larger root and the nested square formula works for the smaller root. In fact the nested radical formula works for at least one of the roots, since $$ \left| \frac{2 \, x^*}{b} \right| = \left| 1 \pm \sqrt{1 - \frac{4c}{b^2}} \right|. $$ Now with the plus sign, we always have $|x^*| > |b|/2$. Further, if $$ -c > \frac 3 4 \, b^2, $$ it works for both roots. For example, if $b=-1, c=-2$, with roots $x_1 = 2$ and $x_2 = -1$, the nested radical formula works for both roots: \begin{align} 2 &= +\sqrt{2 + \sqrt{2 + \sqrt{2 + \cdots}}}, \\ -1 &= -\sqrt{2 - \sqrt{2 - \sqrt{2 - \cdots}}}. \end{align}
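A small sketch of this example: both roots of $x^2 - x - 2 = 0$ are reached by nested radicals simply by choosing the sign of the outer square root (the starting points are arbitrary):

```python
# b = -1, c = -2: nested radicals reach both roots, 2 and -1
from math import sqrt

b, c = -1.0, -2.0
x_plus, x_minus = 0.5, -0.5            # arbitrary starting points
for n in range(30):
    x_plus = sqrt(-c - b * x_plus)     # +sqrt branch, converges to 2
    x_minus = -sqrt(-c - b * x_minus)  # -sqrt branch, converges to -1
print(x_plus, x_minus)
```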

Conversely, it means that the nested radical formula works for at most one root (the one with the plus sign) if $$ -c < \frac{3}{4} \, b^2, $$ which is, fortunately, your case, with $b = -3, c = -4$; the smaller root is then left to the nested square formula.

The nested square formula is actually a variant of the logistic map, or more generally a quadratic map, which is a standard model for studying chaos. So it is perhaps not the best formula as far as convergence is concerned.
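For instance, with the $b=-1$, $c=-2$ example above, the nested square iteration is $x_{n+1} = x_n^2 - 2$, which on $[-2,2]$ is conjugate to the logistic map with $r = 4$: for typical starting points it stays bounded but never settles on either root. A tiny sketch:

```python
# b = -1, c = -2: the nested-square map x -> x^2 - 2 is chaotic on [-2, 2]
x = 0.3                             # arbitrary starting point in [-2, 2]
for n in range(20):
    x = x * x - 2.0
    print(n, x)                     # wanders in [-2, 2], never converges to 2 or -1
```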

Convergence of the continued fraction method

For the continued fraction formula, we have \begin{align} x_{n+1} = -b -\frac{c}{x_n}, \end{align} and, around the solution $x^*$, we have \begin{align} \Delta x_{n+1} \approx \frac{c}{ x_n^2} \, \Delta x_{n}, \end{align} with the rate of convergence being $$ \left| \frac{c}{x_{n}^2} \right| \approx \left| 1 + \frac{b \, x^*}{c} \right|^{-1}. $$ This means the continued fraction formula works only if $$ -\frac{c}{b \, x^*} < \frac{1}{2}, $$ or, equivalently, with $1/x^* = (-b\pm\sqrt{b^2-4c})/(2c)$, we have $$ 1 \pm \sqrt{1 - \frac{4 \, c}{b^2} } < 1. $$ So there is precisely one root (with the minus sign) that satisfies this condition.
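The same kind of check for the continued fraction (again the question's example $b=-3$, $c=-4$, root $x^*=4$, predicted rate $|c/x^{*2}| = 1/4$):

```python
# Verify the continued-fraction error ratio |c / x*^2|
b, c, xstar = -3.0, -4.0, 4.0
x, prev_err = 1.0, abs(1.0 - 4.0)
for n in range(10):
    x = -b - c / x                  # one continued-fraction step
    err = abs(x - xstar)
    print(n, x, err / prev_err)     # ratio tends to |c / x*^2| = 0.25
    prev_err = err
```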

Generalization to higher-order polynomial equations

This method is useful for numerically solving higher-order polynomial equations, although I don't suppose it is new. For example, for $$ x^7 - 3 \, x + 1 = 0, $$ the nested radical formula $x = \sqrt[7]{3x-1}$ is definitely a convenient way of solving it. But it usually works for the largest root. For example, for $$ x^7 - 2 \, x^6 + 1 = 0 $$ the formula $x = \sqrt[7]{2 \, x^6 - 1}$ does not converge to the root $x = 1$, because there is a larger root $x \approx 1.98358$. And, for $$ 2 \, x^7 + 2 \, x^6 - 1 = 0 $$ the formula $x = \sqrt[7]{\frac 1 2 -x^6}$ does not converge at all.
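A sketch of the first degree-7 example; the starting value $x_0 = 1$ is an arbitrary choice with $3x_0 - 1 > 0$:

```python
# x^7 - 3x + 1 = 0, iterated as x = (3x - 1)^(1/7)
x = 1.0                             # starting point with 3x - 1 > 0
for n in range(30):
    x = (3.0 * x - 1.0) ** (1.0 / 7.0)
print(x, x**7 - 3.0*x + 1.0)        # a real root near 1.13, residual close to 0
```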

Improving the convergence

We can improve the convergence. Take the nested radical formula for example. For a suitable value of $\epsilon$, we have $$ x_{n+1} + \epsilon = \sqrt{-c + \epsilon^2 - (b - 2 \epsilon) \, x_n}. $$ This formula is convergent if $|x^* + \epsilon| > |b/2 - \epsilon|$. So if we can make $\epsilon$ close to $b/2$, the nested radical formula will almost always be convergent. Indeed, if $\epsilon = b/2$, this becomes the exact formula, and no iteration is needed.
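A sketch with the question's example $b=-3$, $c=-4$ and the (arbitrary) shift $\epsilon = -1$, for which the error ratio drops from $3/8$ to $|b - 2\epsilon|/(2\,|x^* + \epsilon|) = 1/6$:

```python
# Shifted nested radical: x + eps = sqrt(-c + eps^2 - (b - 2*eps)*x)
from math import sqrt

b, c, eps = -3.0, -4.0, -1.0
x = 1.0
for n in range(12):
    x = -eps + sqrt(-c + eps * eps - (b - 2.0 * eps) * x)
    print(n, x)                     # converges to the root 4 faster than with eps = 0
```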

We mention in passing that if rapid convergence is the aim, one should also consider series acceleration methods.
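For instance (a standard technique, not specific to this answer), Aitken's $\Delta^2$ process applied to the nested radical iterates for $x^2 - 3x - 4 = 0$ reaches the root much faster than the plain iteration:

```python
# Aitken's delta-squared acceleration of the nested-radical iteration
from math import sqrt

b, c = -3.0, -4.0
f = lambda x: sqrt(-c - b * x)       # one nested-radical step
x = 1.0
for n in range(4):
    x1, x2 = f(x), f(f(x))
    denom = x2 - 2.0 * x1 + x
    if denom == 0.0:                 # already converged
        break
    x = x - (x1 - x) ** 2 / denom    # Aitken / Steffensen update
    print(n, x)                      # reaches the root 4 in a few steps
```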

hbp