
Showing an infinite sequence is constant under some condition

Mathematics Asked on November 19, 2021

Let $a_1,a_2,\ldots$ be an infinite sequence of positive real numbers such that for each positive integer $n$ we have

$$
\frac{a_1+a_2+\cdots+a_n}{n}\ge\sqrt{\frac{a_1^2+a_2^2+\cdots+a_{n+1}^2}{n+1}}.
$$

Prove that the sequence $a_1,a_2,\ldots$ is constant.
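As a quick sanity check of the statement: for a constant sequence $a_k=c>0$ both sides are equal, since
$$\frac{a_1+a_2+\cdots+a_n}{n}=c=\sqrt{\frac{(n+1)c^2}{n+1}},$$
so constant sequences satisfy the hypothesis with equality, and the claim is that they are the only ones.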

MY ATTEMPT/THOUGHTS:

My initial plan is to show that the sequence is bounded and then prove that it is constant.

For that I considered the following.

Let $m_n=\min\{a_1,a_2,\ldots,a_n\}$, $M_n=\max\{a_1,a_2,\ldots,a_n\}$, and $$S_n=\frac{a_1^2+a_2^2+\cdots+a_n^2}{n}.$$

Then we have $$m_n^2\le S_{n+1}\le M_n^2.$$

Also, from the given inequality we have, on squaring,

$$\frac{1}{n}S_n+2\,\frac{a_1a_2+a_1a_3+\cdots+a_{n-1}a_n}{n^2}\ge S_{n+1}.$$

I have no idea how to proceed after this or even if I am moving in the right direction!

Do you have any suggestions? Thanks for your time.

One Answer

We have $$ \underbrace{\text{QM}(a_1,\ldots,a_n)}_{Q(n)}\geq \underbrace{\text{AM}(a_1,\ldots,a_n)}_{A(n)} \geq \underbrace{\text{QM}(a_1,\ldots,a_{n+1})}_{Q(n+1)}\geq \underbrace{\text{AM}(a_1,\ldots,a_{n+1})}_{A(n+1)} \tag{0}$$ so both $A(n)$ and $Q(n)$ are non-increasing and $a_{n+1}\leq A(n)$. The central inequality can be written as

$$ a_{n+1}^2 \leq (n+1)A(n)^2 - nQ(n)^2 \tag{1} $$ so we must have $$ A(n)^2 \geq \frac{n}{n+1}\,Q(n)^2,\qquad A(n)\geq Q(n)\sqrt{1-\tfrac{1}{n+1}}.$$ We may regard $A(n)$ as the average value of $a_1,\ldots,a_n$ and set $$ V(n)=\frac{1}{n}\sum_{k=1}^{n}(a_k-A(n))^2 = Q(n)^2-A(n)^2\leq \frac{Q(n)^2}{n+1}.$$ Since $Q(n)$ is non-increasing and $\frac{1}{n+1}$ decreases to zero, the variance goes to zero as $n\to +\infty$.
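To spell out two steps used above (only the definitions $A(n)=\frac1n\sum_{k=1}^n a_k$ and $Q(n)^2=\frac1n\sum_{k=1}^n a_k^2$ are needed): the central inequality $A(n)\geq Q(n+1)$ gives
$$ a_{n+1}^2=(n+1)Q(n+1)^2-nQ(n)^2\leq (n+1)A(n)^2-nQ(n)^2, $$
which is exactly $(1)$, and the variance identity is the usual expansion
$$ V(n)=\frac1n\sum_{k=1}^{n}\left(a_k^2-2a_kA(n)+A(n)^2\right)=Q(n)^2-2A(n)^2+A(n)^2=Q(n)^2-A(n)^2. $$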
We may write $(1)$ as

$$ a_{n+1}^2 \leq A(n)^2 - nV(n) \tag{2}$$

and define a sequence in the following way:

$$ a_1=2,\quad a_2=1,\quad a_{n+1}=\sqrt{A(n)^2-nV(n)} $$

leading to

$$ \{a_n\}_{n\geq 1}=\left\{2,\,1,\,\frac{\sqrt{7}}{2},\,\frac{1}{6}\sqrt{48\sqrt{7}-71},\,\frac{1}{12}\sqrt{\frac{15}{2}\sqrt{979+1212\sqrt{7}}+3\sqrt{7}-293},\,\ldots\right\}. $$ This seems to work for a few terms, but at some point $nV(n)=\sum_{k=1}^{n}(a_k-A(n))^2$ becomes larger than $A(n)^2$. Now we have to prove that unless $\{a_n\}_{n\geq 1}$ is constant, we cannot avoid this phenomenon.
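Here is a quick numerical sketch of that breakdown in Python (the variables `A`, `Q2`, `V` below simply mirror $A(n)$, $Q(n)^2$, $V(n)$ as defined above; the loop just iterates the recursion $a_{n+1}=\sqrt{A(n)^2-nV(n)}$ until the radicand turns negative):

```python
# Iterate a_{n+1} = sqrt(A(n)^2 - n*V(n)) starting from a_1 = 2, a_2 = 1,
# and stop as soon as the radicand A(n)^2 - n*V(n) becomes negative,
# i.e. the sequence can no longer be continued with positive reals.
import math

a = [2.0, 1.0]  # a_1, a_2
for _ in range(20):
    n = len(a)
    A = sum(a) / n                      # arithmetic mean A(n)
    Q2 = sum(x * x for x in a) / n      # mean of squares Q(n)^2
    V = Q2 - A * A                      # variance V(n)
    radicand = A * A - n * V
    if radicand < 0:
        print(f"n = {n}: A(n)^2 - n*V(n) = {radicand:.6f} < 0, stuck")
        break
    a.append(math.sqrt(radicand))
    print(f"a_{n + 1} = {a[-1]:.6f}")
```

The first few printed values match the closed forms listed above (e.g. $a_3=\frac{\sqrt7}{2}\approx 1.3229$), and the radicand does become negative within the first dozen or so terms.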

$$\begin{eqnarray*} (n+1)V(n+1)-n V(n) &=& (n+1)Q(n+1)^2-(n+1)A(n+1)^2-n Q(n)^2+n A(n)^2\\&=&(a_{n+1}-A(n+1))^2+n(A(n)-A(n+1))^2\end{eqnarray*}$$ shows that $n V(n)$ is weakly increasing.
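In case the second equality is not obvious: expanding the squares and using $a_{n+1}+nA(n)=(n+1)A(n+1)$ together with $(n+1)Q(n+1)^2=nQ(n)^2+a_{n+1}^2$,
$$ (a_{n+1}-A(n+1))^2+n(A(n)-A(n+1))^2 = a_{n+1}^2+nA(n)^2+(n+1)A(n+1)^2-2A(n+1)\bigl(a_{n+1}+nA(n)\bigr) = a_{n+1}^2+nA(n)^2-(n+1)A(n+1)^2, $$
which agrees with the first line after substituting $(n+1)Q(n+1)^2-nQ(n)^2=a_{n+1}^2$.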

$$ (n+1)V(n+1)=\sum_{k=1}^{n}\bigl((k+1)V(k+1)-k V(k)\bigr)\geq \sum_{k=1}^{n}k(A(k)-A(k+1))^2 $$ (the sum telescopes since $V(1)=0$), and the inner sum in $$ n\sum_{k=1}^{n}k(A(k)-A(k+1))^2\stackrel{\text{CS}}{\geq}\left(\sum_{k=1}^{n}\sqrt{k}\,(A(k)-A(k+1))\right)^2 $$ can be lower-bounded by using summation by parts:

$$ \sum_{k=1}^{n}\sqrt{k}\,(A(k)-A(k+1)) \geq (A(1)-A(n+1))\sqrt{n}. $$
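Written out, the Abel summation behind this last step is the identity
$$ \sum_{k=1}^{n}\sqrt{k}\,(A(k)-A(k+1)) = A(1)+\sum_{k=2}^{n}\left(\sqrt{k}-\sqrt{k-1}\right)A(k)-\sqrt{n}\,A(n+1), $$
where the coefficients $\sqrt{k}-\sqrt{k-1}$ are positive and $A$ is non-increasing.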

Answered by Jack D'Aurizio on November 19, 2021
