Maximum eigenvalue of a covariance matrix of Brownian motion

MathOverflow Asked by Weiqiang Yang on December 6, 2021

$$A := \begin{pmatrix}
1 & \frac{1}{2} & \frac{1}{3} & \cdots & \frac{1}{n}\\
\frac{1}{2} & \frac{1}{2} & \frac{1}{3} & \cdots & \frac{1}{n}\\
\frac{1}{3} & \frac{1}{3} & \frac{1}{3} & \cdots & \frac{1}{n}\\
\vdots & \vdots & \vdots & \ddots & \frac{1}{n}\\
\frac{1}{n} & \frac{1}{n} & \frac{1}{n} & \frac{1}{n} & \frac{1}{n}
\end{pmatrix}$$

How to prove that all the eigenvalues of $A$ are less than $3 + 2\sqrt{2}$?
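
A quick numerical sanity check of the claimed bound (not a proof), using NumPy; the sizes $n$ below are arbitrary illustrative choices:

```python
import numpy as np

# Check that the largest eigenvalue of A stays below 3 + 2*sqrt(2) ~ 5.828
bound = 3 + 2 * np.sqrt(2)
for n in (5, 50, 500):
    idx = np.arange(1, n + 1)
    A = 1.0 / np.maximum.outer(idx, idx)   # A[i, j] = 1/max(i, j) in 1-based terms
    lam_max = np.linalg.eigvalsh(A).max()  # eigvalsh: A is symmetric
    print(n, lam_max, lam_max < bound)
```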

This question is similar to this one.

I have tried the Cholesky decomposition $A = L^{T} L$, where

$$L^{T} = \left(\begin{array}{ccccc}
1 & 0 & 0 & \cdots & 0\\
\frac{1}{2} & \frac{1}{2} & 0 & \cdots & 0\\
\frac{1}{3} & \frac{1}{3} & \frac{1}{3} & \cdots & 0\\
\vdots & \vdots & \vdots & \ddots & \vdots\\
\frac{1}{n} & \frac{1}{n} & \frac{1}{n} & \frac{1}{n} & \frac{1}{n}
\end{array}\right)$$

then

$$(L^{T})^{-1}=\left(\begin{array}{ccccc}
1 & & & & \\
-1 & 2 & & & \\
 & -2 & 3 & & \\
 & & \ddots & \ddots & \\
 & & & -(n-1) & n
\end{array}\right)$$

$$A^{-1}=L^{-1}(L^{T})^{-1}$$
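
For what it's worth, the factorization and the claimed bidiagonal inverse are easy to check numerically; a sketch assuming NumPy and an arbitrary test size $n = 8$:

```python
import numpy as np

n = 8
idx = np.arange(1, n + 1)
A = 1.0 / np.maximum.outer(idx, idx)            # A[i, j] = 1/max(i, j)

# Lower-triangular factor: row i of L^T equals 1/i in columns 1..i
LT = np.tril(np.outer(1.0 / idx, np.ones(n)))

# Claimed inverse of L^T: i on the diagonal, -(i-1) on the subdiagonal
M = np.diag(idx.astype(float)) - np.diag(np.arange(1.0, n), k=-1)

print(np.allclose(LT @ LT.T, A))                # A = L^T L with L = (L^T)^T
print(np.allclose(M @ LT, np.eye(n)))           # M = (L^T)^{-1}
print(np.allclose(M.T @ M, np.linalg.inv(A)))   # A^{-1} = L^{-1} (L^T)^{-1}
```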

How to prove that the eigenvalues of $A^{-1}$ satisfy

$$\lambda_{i}\geq\frac{1}{3+2\sqrt{2}}\,?$$

Further, I find that $A$ is the covariance matrix of Brownian motion at times $1, 1/2, 1/3, \ldots, 1/n$: since $\operatorname{Cov}(B_{s},B_{t})=\min(s,t)$, we have $A_{ij}=\min(1/i,1/j)=1/\max(i,j)$.

2 Answers

Inspired by @Mateusz Wasilewski's answer, I found another method.

\begin{eqnarray*}
\langle x,Ax\rangle & = & \langle Lx,Lx\rangle\\
& = & \sum_{i=1}^{n}u_{i}^{2}
\end{eqnarray*}

where $u_{i}=\sum_{j=i}^{n}\frac{1}{j}x_{j}$.

\begin{eqnarray*}
\sum_{i=1}^{n}u_{i}^{2} & = & \sum_{i=1}^{n}\Big(\sum_{k=i}^{n}b_{k}\Big)^{2}\quad(\text{where } b_{k}=\tfrac{1}{k}x_{k})\\
& = & \sum_{i=1}^{n}\Big(\sum_{k=i}^{n}b_{k}^{2}+2\sum_{k>j\geq i}b_{k}b_{j}\Big)\\
& = & \sum_{k=1}^{n}\sum_{i=1}^{k}b_{k}^{2}+2\sum_{j=1}^{n-1}\sum_{k=j+1}^{n}\sum_{i=1}^{j}b_{k}b_{j}\\
& = & \sum_{k=1}^{n}\frac{x_{k}^{2}}{k}+2\sum_{j=1}^{n-1}\sum_{k=j+1}^{n}b_{k}x_{j}\\
& = & \sum_{k=1}^{n}\frac{x_{k}^{2}}{k}+2\sum_{k=2}^{n}\sum_{j=1}^{k-1}b_{k}x_{j}\\
& = & \sum_{k=1}^{n}\frac{x_{k}^{2}}{k}+2\sum_{k=2}^{n}b_{k}z_{k-1}\\
& = & x_{1}^{2}+\sum_{k=2}^{n}\frac{(z_{k}-z_{k-1})^{2}}{k}+2\sum_{k=2}^{n}\frac{z_{k}-z_{k-1}}{k}\,z_{k-1}\\
& = & x_{1}^{2}+\sum_{k=2}^{n}\frac{z_{k}^{2}-z_{k-1}^{2}}{k}\\
& = & \sum_{k=1}^{n}\frac{z_{k}^{2}}{k}-\sum_{k=1}^{n-1}\frac{z_{k}^{2}}{k+1}\\
& = & \sum_{k=1}^{n-1}z_{k}^{2}\Big(\frac{1}{k}-\frac{1}{k+1}\Big)+\frac{z_{n}^{2}}{n}
\end{eqnarray*}

where $z_{k}:=\sum_{j=1}^{k}x_{j}$ (so $x_{k}=z_{k}-z_{k-1}$). Since $\frac{1}{k}-\frac{1}{k+1}=\frac{1}{k(k+1)}\leq\frac{1}{k^{2}}$, Hardy's inequality bounds the first sum by $4\sum_{k=1}^{n}x_{k}^{2}$, and Cauchy–Schwarz gives $z_{n}^{2}\leq n\sum_{k=1}^{n}x_{k}^{2}$, so $\langle x,Ax\rangle\leq 5\,\|x\|^{2}$, the same bound as in the other answer.
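
The final identity can be tested numerically; a small sketch with a random vector (seed and size are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10
x = rng.standard_normal(n)

idx = np.arange(1, n + 1)
A = 1.0 / np.maximum.outer(idx, idx)   # A[i, j] = 1/max(i, j)
z = np.cumsum(x)                       # z_k = x_1 + ... + x_k

lhs = x @ A @ x
rhs = np.sum(z[:-1] ** 2 * (1.0 / idx[:-1] - 1.0 / idx[1:])) + z[-1] ** 2 / n
print(np.isclose(lhs, rhs))  # <x, Ax> = sum z_k^2 (1/k - 1/(k+1)) + z_n^2 / n
```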

Answered by Weiqiang Yang on December 6, 2021

In this answer I show that the largest eigenvalue is bounded by $5 < 3 + 2\sqrt{2}$. I will first use the interpretation of this matrix as the covariance matrix of Brownian motion at times $(\frac{1}{n},\dots, 1)$ (I reversed the order so that the sequence of times is increasing, which is more natural for me).

We have $A_{ij} = \mathbb{E}(B_{t_{i}} B_{t_{j}})$. The largest eigenvalue is the supremum over the unit ball of the expression $\langle x, Ax\rangle$, which is equal to $\sum_{i,j} A_{ij} x_{i} x_{j}$. This is equal to $\mathbb{E}\big(\sum_{i=1}^{n} x_{i} B_{t_{i}}\big)^{2}$. In order to exploit the independence of increments of Brownian motion, we rewrite the sum $\sum_{i=1}^{n} x_{i} B_{t_{i}}$ as $\sum_{i=1}^{n} y_{i}(B_{t_{i}} - B_{t_{i-1}})$, where $y_{i} := \sum_{k=i}^{n} x_{k}$ and $t_{0} := 0$. Thus we have

$$\mathbb{E}\Big(\sum_{i=1}^{n} x_{i} B_{t_{i}}\Big)^{2} = \sum_{i=1}^{n} y_{i}^{2}\,(t_{i}-t_{i-1}).$$

The case $i=1$ is somewhat special and its contribution is $\frac{y_{1}^{2}}{n} \leqslant \sum_{k=1}^{n} x_{k}^{2} = 1$. For the other ones we have $t_{i} - t_{i-1} = \frac{1}{(n-i+1)(n-i+2)} \leqslant \frac{1}{(n-i+1)^{2}}$. At this point, to get a nicer expression, I will reverse the order again by defining $z_{i} := y_{n-i+1}$. So we want to estimate the expression

$$\sum_{i=1}^{n} \left(\frac{z_{i}}{i}\right)^{2}.$$

We can now use Hardy's inequality, $\sum_{i=1}^{n}\big(\frac{1}{i}\sum_{j=1}^{i}a_{j}\big)^{2}\leqslant 4\sum_{i=1}^{n}a_{i}^{2}$, applied to $a_{j}=x_{n-j+1}$ (note that $z_{i}$ is the sum of the last $i$ coordinates of $x$), to bound it by $4\sum_{i=1}^{n} x_{i}^{2}=4$. So in total we get $5$ as an upper bound, if I haven't made any mistakes.
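
Both main steps here (the increment identity and the resulting bound of $5$) can also be checked numerically; a sketch assuming NumPy, with an arbitrary random unit vector:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10
x = rng.standard_normal(n)
x /= np.linalg.norm(x)                 # unit vector, so <x, Cx> <= lambda_max

t = 1.0 / np.arange(n, 0, -1)          # increasing times t_i = 1/(n-i+1)
C = np.minimum.outer(t, t)             # C[i, j] = min(t_i, t_j) = E[B_{t_i} B_{t_j}]

y = np.cumsum(x[::-1])[::-1]           # y_i = x_i + ... + x_n
dt = np.diff(t, prepend=0.0)           # t_i - t_{i-1}, with t_0 = 0

print(np.isclose(x @ C @ x, np.sum(y ** 2 * dt)))  # E(sum x_i B_{t_i})^2 identity
print(x @ C @ x <= 5.0)                             # the bound proved above
```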

Answered by Mateusz Wasilewski on December 6, 2021
