Recently I came up with an algebra problem with a nice geometric representation. Basically, I would like to know what happens if we repeatedly circumscribe a rectangle by another rectangle which is rotated by $\alpha \in \left( 0, \frac {\pi} {4}\right)$ radians. Use this picture as reference:

[Figure: a rectangle repeatedly circumscribed by rectangles, each rotated by $\alpha$ relative to the previous one.]

In particular, do the resulting rectangles converge to a square?

It is rather easy to show that the rectangles do converge to a square if $\alpha$ is constant throughout the process. However, if we generalize by defining a sequence $\left(\alpha_n\right)_{n=1}^{\infty}$ of angles and using $\alpha_i$ in the $i$-th operation, then the answer seems to depend on the chosen sequence. So, for which sequences $\left(\alpha_n\right)_{n=1}^{\infty}$ do the rectangles converge to a square?

Algebraically this problem can be defined like this:

Define two real sequences by $A_0=a$, $B_0=b$ with $a \neq b$, $a,b \in \mathbb{R}_{>0}$, and $$A_{n+1}=B_n\sin\alpha_n + A_n\cos\alpha_n, \qquad B_{n+1}=A_n\sin\alpha_n + B_n\cos\alpha_n \quad \forall n \in \mathbb{N}_{\ge 0},$$ where $\alpha_i \in \left( 0, \frac {\pi} {4}\right)$ for all $i \in \mathbb{N}_{\ge 0}$. Is it true that $\lim_{n \to \infty}\frac{A_n}{B_n}=1$?

I tried out a few sequences in C++ to notice some patterns. Interestingly, rectangles seem to converge to a square if and only if $\lim_{n \to \infty} \left( \sum_{i=0}^n \alpha_i \right) = \infty$. In particular, for $\alpha_n = \frac{1}{n}$ the convergence is really slow, however, it still seems to be converging.

Also, I believe that showing $\lim_{n \to \infty}\left(A_n-B_n\right)=0$ would be an even stronger result for this problem. Does such a replacement have any influence on the answer?

For the record, I am still a high school student, so I have no idea how hard this problem might actually be. Any help would be highly appreciated.

P.S. This is my first question on the site, so please don't judge my wording and style too much. Feel free to ask questions if anything is unclear.

  • This is an excellent first question. – Chappers Jun 03 '17 at 23:16
  • Oh goodness... I think you'll want $$A_n-B_n\stackrel{n\to\infty}\longrightarrow0$$ and the ratio $\frac{A_n}{B_n}$ won't suffice. For a classic example, consider Stirling's approximation to the factorial. – Simply Beautiful Art Jun 03 '17 at 23:25
  • @SimplyBeautifulArt: I think that's a matter of definition, and both definitions seem reasonable to me. Your criterion is stricter -- it excludes something like { (9 x 10), (99 x 100), (999 x 1000), (9999 x 10,000), ... }, which the OP's criterion allows -- but I'm not sure there's any *a priori* reason to say that your criterion is a better formalization of "converges to a square". – ruakh Jun 04 '17 at 04:27
  • Hm, then I suppose $\frac{A_n}{B_n}\to1$ is the best option to define a square, as one can have $A_n=n^2+n$ and $B_n=n^2$ and it'll tend to look like a square even though their differences are unbounded. – Simply Beautiful Art Jun 04 '17 at 12:03
  • A problem like billiards? http://mathworld.wolfram.com/Billiards.html It's an open problem, even now. – Takahiro Waki Jun 04 '17 at 17:40

2 Answers


This is a particularly good first question!

Using a little linear algebra, we can actually write sufficiently explicit formulas for $A_n$ and $B_n$ to confirm your conjecture, namely that the divergence of $\sum_{i = 1}^{\infty} \alpha_i$ is necessary and sufficient for convergence to a square.

First, by writing the recurrence relations for $A_n$ and $B_n$ in matrix notation and applying a straightforward induction, we get that, for all nonnegative integers $n$, $$\pmatrix{A_n \\ B_n} = \left(\prod_{i = 1}^n \underbrace{\pmatrix{\cos \alpha_i & \sin \alpha_i \\ \sin \alpha_i & \cos \alpha_i}}_{\Gamma_i}\right) \pmatrix{a \\ b} . $$ We can make this more explicit by employing the standard trick of diagonalization: The eigenvalues of $\Gamma_i$ are $\lambda_i^{\pm} := \cos \alpha_i \pm \sin \alpha_i$, so for $\alpha_i$ in the given range, we have that $$0 < \lambda_i^- < \lambda_i^+ ,$$ and in particular the eigenvalues are distinct, guaranteeing that we can write $$\Gamma_i = P_i \pmatrix{\lambda_i^- & 0 \\ 0 & \lambda_i^+} P_i^{-1}$$ for some $P_i$. In fact, since the matrices $\Gamma_i$ all commute with one another (and are diagonalizable), they are simultaneously diagonalizable; that is, we can choose all of the $P_i$ to be the same matrix $P$. Computing eigenvectors corresponding to the eigenvalues shows that we may take $$P = \pmatrix{\frac{1}{\sqrt{2}} & \frac{1}{\sqrt{2}} \\ -\frac{1}{\sqrt{2}} & \frac{1}{\sqrt{2}}},$$ the rotation matrix for a clockwise rotation of $\frac{\pi}{4}$ (see the remark below).

Our observation about being able to use the same $P$ for all rotations pays off immediately, as the expression for the product $\prod_{i = 1}^n \Gamma_i$ of rotations simplifies dramatically: $$ \prod_{i = 1}^n \Gamma_i = \prod_{i = 1}^n \left[P \pmatrix{\lambda_i^- & 0 \\ 0 & \lambda_i^+} P^{-1} \right] = P \pmatrix{\prod_{i = 1}^n \lambda_i^- & 0 \\ 0 & \prod_{i = 1}^n \lambda_i^+} P^{-1}. $$ This yields explicit formulas for $A_n, B_n$ in terms of $a, b, (\alpha_i)$: $$\pmatrix{A_n \\ B_n} = \left(\prod_{i = 1}^n \Gamma_i\right) \pmatrix{a \\ b} = \pmatrix{\frac{1}{2}(a + b) \prod_{i = 1}^n \lambda_i^+ + \frac{1}{2} (a - b) \prod_{i = 1}^n \lambda_i^- \\ \frac{1}{2}(a + b) \prod_{i = 1}^n \lambda_i^+ + \frac{1}{2}(b - a) \prod_{i = 1}^n \lambda_i^-} .$$

Dividing and rewriting gives $$\frac{A_n}{B_n} = \frac{\frac{1}{2}(a + b) \prod_{i = 1}^n \lambda_i^+ + \frac{1}{2} (a - b) \prod_{i = 1}^n \lambda_i^-}{\frac{1}{2}(a + b) \prod_{i = 1}^n \lambda_i^+ + \frac{1}{2}(b - a) \prod_{i = 1}^n \lambda_i^-} = \frac{1 + \mu_n}{1 - \mu_n},$$ where $$ \mu_n := \frac{a - b}{a + b} \prod_{i = 1}^n \frac{\lambda_i^-}{\lambda_i^+} = \frac{a - b}{a + b} \prod_{i = 1}^n \frac{\cos \alpha_i - \sin \alpha_i}{\cos \alpha_i + \sin \alpha_i} = \frac{a - b}{a + b} \prod_{i = 1}^n \tan \left(\frac{\pi}{4} - \alpha_i\right) , $$ and $0 < |\mu_n| < 1$.

So, if $\limsup \alpha_i > 0$, then $\mu_n \to 0$ and hence $\frac{A_n}{B_n} \to 1$. If, on the other hand, $\limsup \alpha_i = 0$, we need not have $\mu_n \to 0$. For small $\alpha_i$, $\tan \left(\frac{\pi}{4} - \alpha_i\right) \sim 1 - 2 \alpha_i$, so for $\frac{A_n}{B_n} \to 1$ I believe it's sufficient and necessary for $\sum_{i = 1}^{\infty} \alpha_i$ to diverge.

Similarly, we get $$A_n - B_n = (a - b) \prod_{i = 1}^n (\cos \alpha_i - \sin \alpha_i) . $$ Again, $\limsup \alpha_i > 0$ is sufficient to guarantee vanishing. For the case $\limsup \alpha_i = 0$ we can use $\cos \alpha_i - \sin \alpha_i \sim 1 - \alpha_i$ to conclude that $A_n - B_n \to 0$ iff $\sum_{i = 1}^{\infty} \alpha_i$ diverges.

Remark The diagonalization $\Gamma_i = P \pmatrix{\lambda_i^- & 0 \\ 0 & \lambda_i^+} P^{-1}$ is more than a convenient tool in this problem: it also gives some geometric insight into why rotating and circumscribing makes rectangles more squarelike. The matrix $P^{-1}$ rotates a vector in the $ab$-plane (which we can view as a vector from the center of the rectangle to one of its corners) by $\frac{\pi}{4}$. Then the diagonal factor scales the rotated vector, lengthening its component in the direction that corresponds to the quantity $a + b$ and shortening its component in the direction that corresponds to $a - b$, which we may view as the absolute defect from squareness. Finally, the matrix $P$ rotates back by one-eighth of a turn, which we may view as returning to the original $ab$-coordinates.

Travis Willse
  • Are $\dfrac{A_n}{B_n}\to1$ and $(A_n-B_n)\to0$ necessarily the same thing? The latter implies the former but not necessarily vice versa... – Steven Stadnicki Jun 04 '17 at 01:03
  • Like you say, for general sequences they need not be, but $A_n$ and $B_n$ here are not general. It is, of course, possible that I've anyway overlooked something. – Travis Willse Jun 04 '17 at 01:06

This will not explain all your observations, but might shed some first light on what is going on.

As you noted we can find this nice recursive definition for the sequence of the side lengths:

\begin{align} A_{n+1} &= \cos\alpha \cdot A_n+\sin\alpha \cdot B_n,\\ B_{n+1} &= \sin\alpha \cdot A_n+\cos\alpha \cdot B_n. \end{align}

This looks pretty linear and can be expressed via some vectors and matrices:

$$ \begin{pmatrix}A_{n+1}\\B_{n+1}\end{pmatrix} = \underbrace{\begin{pmatrix} \cos \alpha &\sin\alpha \\ \sin\alpha &\cos \alpha \end{pmatrix}}_{R(\alpha)} \begin{pmatrix}A_{n}\\B_{n}\end{pmatrix}. $$

So the problem reduces to studying the behavior of the matrices $R(\alpha)$ and their compositions $\prod_i R(\alpha_i)$. This is because

$$ \begin{pmatrix}A_{n+1}\\B_{n+1}\end{pmatrix} = R(\alpha_n)R(\alpha_{n-1})\cdots R(\alpha_0) \begin{pmatrix}A_{0}\\B_{0}\end{pmatrix}. $$

Note that

$$\det R(\alpha)=\cos^2\alpha-\sin^2\alpha=\cos 2\alpha\in(0,1),\qquad\text{for }\alpha\in(0,\pi/4).$$

This means that (at least for $\alpha_i$ bounded from below by some $\epsilon$) the matrix defined by the product

$$\prod_{i=0}^n R(\alpha_i)$$

becomes more and more singular for $n\to\infty$ as $\prod_i \det R(\alpha_i)\to 0$. Further, note that $R(\alpha)$ is in the form

$$\begin{pmatrix} a&b\\b&a \end{pmatrix}$$

and that the product of two matrices of this form is again of this form. Hence the product converges to a singular matrix of this form (this is not rigorous, but certainly can be made so). Since all entries are positive, such a singular matrix must be of the form

$$\begin{pmatrix} c&c\\c&c \end{pmatrix}=c\begin{pmatrix} 1&1\\1&1 \end{pmatrix}.$$

Applying this to the vector $(A_0,B_0)^\top$ describing your initial rectangle will give you

$$c\begin{pmatrix} A_0+B_0\\A_0+B_0 \end{pmatrix}.$$

And this is indeed a square as both side lengths are equal (in the limit of course).

As I said, this is neither complete nor fully rigorous. For example, it does not cover the case $\alpha_i=1/i$, since I assumed $\alpha_i>\epsilon>0$. In such a case we probably have $A_n,B_n\to\infty$, and it is not obvious from this argument alone whether $A_n-B_n\to 0$, but my reasoning suggests that we can still expect $A_n/B_n\to 1$.

M. Winter