
Variance of random variable decreasing in parameter

Asked on MathOverflow on November 7, 2021

I did quite a few numerical computations and think the following is true, but I cannot prove it:

Let $\varphi(x):=\sum_{i=1}^n \varphi_i(x_i)$, where $x=(x_1,\dots,x_n) \in \mathbb{R}^n$ and the $\varphi_i \in C^{\infty}$ are even scalar convex functions such that $\varphi_i''$ is strictly increasing on $[0,\infty)$.

We then define a probability measure (under appropriate normalization) by $$p_y(dx) \propto e^{\langle y, x\rangle}e^{-\varphi(x)}\, dx.$$

Can we show that for all unit vectors $z \in \mathbb{R}^n$ and all $y \in \mathbb{R}^n$ we have

$$\text{Var}_{p_0}(\langle z,X_0 \rangle_{\mathbb{R}^n}) \ge \text{Var}_{p_y}(\langle z,X_y \rangle_{\mathbb{R}^n})?$$

In other words, the variance of $\langle z,X_y\rangle$, where $X_y$ is distributed according to $p_y$, is maximized at $y=0$ for any unit vector $z$.
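For illustration, here is a minimal sketch of the kind of numerical check mentioned above, in the 1d case and with the hypothetical choice $\varphi(x)=x^4$ (even, convex, with $\varphi''$ strictly increasing on $[0,\infty)$); the variance under $p_y$ is approximated by quadrature on a grid:

```python
# Minimal 1d sketch (hypothetical choice phi(x) = x**4): approximate
# Var_{p_y}(X) on a grid and check that it is largest at y = 0.
import numpy as np

def variance_py(y, phi, xmax=10.0, n=20001):
    """Variance of X ~ p_y(dx) proportional to exp(y*x - phi(x)) dx, by grid quadrature."""
    x = np.linspace(-xmax, xmax, n)
    logw = y * x - phi(x)
    w = np.exp(logw - logw.max())   # unnormalized density, numerically stabilized
    w /= w.sum()                    # discrete normalization (the dx factor cancels)
    m1 = np.dot(w, x)
    m2 = np.dot(w, x**2)
    return m2 - m1 * m1

phi = lambda x: x**4
ys = np.linspace(-5.0, 5.0, 101)
vs = np.array([variance_py(y, phi) for y in ys])
print(np.all(vs <= variance_py(0.0, phi) + 1e-12))   # expected: True (maximum at y = 0)
```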

Is this a known theorem, or is it somehow easy to show? Any pointers are highly appreciated, and please let me know if there are any questions.

One Answer

Your probability measure is a product measure, so by
$$\text{Var}_y(\langle z,X\rangle) = \sum_{i=1}^n z_i^2\,\text{Var}_{y_i}(X_i)$$
everything reduces to the 1d case. Let $q_y(dx)=e^{xy-\varphi(x)-C(y)}dx$ be one of the marginals, where $y\in\mathbb{R}$ and $C(y)$ is chosen such that $q_y$ is normalized, and denote by $U_y$ the 1d r.v. with distribution $q_y$. It can be shown that $C(y)-C(0)$ is the cumulant-generating function of $U_0$, and $\text{Var}_y(U_y)=C''(y)$. Thus the variance has a local maximum at $y=0$ if the third cumulant $\kappa_0^{(3)}$ of $U_0$ vanishes and the fourth cumulant $\kappa_0^{(4)}$ is negative. Indeed, if we denote by $m_y^{(k)}$ the $k$-th moment of $U_y$, we have
$$\kappa_0^{(3)}=m_0^{(3)}-3m_0^{(1)}m_0^{(2)}+2\left(m_0^{(1)}\right)^3=0,$$
because $q_0$ is symmetric and therefore the first and third moments are zero. Similarly, dropping the odd-numbered moments, we have
$$\kappa_0^{(4)}=m_0^{(4)}-3\left(m_0^{(2)}\right)^2.$$
Therefore, assuming that $\varphi(x)>\lambda x^2$ a.e. for some $\lambda>0$ $(\star)$, we get
\begin{align*}
\kappa_0^{(4)}&=\int_{\mathbb{R}^2}\left(x^4-3x^2y^2\right)e^{-\varphi(x)-\varphi(y)-2C(0)}\,dx\,dy \\
&< e^{-2C(0)}\int_{\mathbb{R}^2}\left(x^4-3x^2y^2\right)e^{-\lambda x^2-\lambda y^2}\,dx\,dy\\
&=0,
\end{align*}
because the last expression is proportional to the fourth cumulant of a Gaussian.

In order to show that this is the global maximum, let us show that $C''(y)$ is concave, i.e. $C^{(4)}(y)<0$ for all $y\in\mathbb{R}$. In fact, since
$$C(y)=\log\int_{-\infty}^{\infty}e^{xy-\varphi(x)}\,dx,$$
$C^{(4)}(y)$ is the fourth cumulant of $U_y$, i.e.
\begin{align*}
C^{(4)}(y)&=m_y^{(4)}-4m_y^{(3)}m_y^{(1)}-3\left(m_y^{(2)}\right)^2+12m_y^{(2)}\left(m_y^{(1)}\right)^2-6\left(m_y^{(1)}\right)^4\\
&\propto\int_{\mathbb{R}^4}\left(x_1^4-4x_1^3x_2-3x_1^2x_2^2+12x_1^2x_2x_3-6x_1x_2x_3x_4\right)e^{\sum_{i=1}^4(x_iy-\varphi(x_i))}\,dx\\
&<\int_{\mathbb{R}^4}\left(x_1^4-4x_1^3x_2-3x_1^2x_2^2+12x_1^2x_2x_3-6x_1x_2x_3x_4\right)e^{\sum_{i=1}^4(x_iy-\lambda x_i^2)}\,dx\\
&=0,
\end{align*}
again because the last expression is proportional to the fourth cumulant of a Gaussian (with non-zero mean).

$(\star)$ This is slightly different from your 'increasing convexity' assumption, but seems sufficiently close.
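As a quick numerical sanity check of the identities used above (a sketch only, assuming the hypothetical 1d choice $\varphi(x)=x^4$ and plain grid quadrature), one can verify both $\text{Var}_y(U_y)=C''(y)$ and $\kappa_0^{(4)}<0$:

```python
# Sanity check of Var_y(U_y) = C''(y) and kappa_0^(4) < 0 for the
# hypothetical choice phi(x) = x**4, using grid quadrature.
import numpy as np

x = np.linspace(-10.0, 10.0, 40001)
dx = x[1] - x[0]
phi = x**4

def C(y):
    """Log-normalizer C(y) = log of the integral of exp(x*y - phi(x)) dx."""
    return np.log(np.sum(np.exp(x * y - phi)) * dx)

def moments(y):
    """First, second and fourth moments of U_y ~ q_y."""
    w = np.exp(x * y - phi)
    w /= w.sum()
    return np.dot(w, x), np.dot(w, x**2), np.dot(w, x**4)

y0, h = 1.3, 1e-3
C2 = (C(y0 + h) - 2.0 * C(y0) + C(y0 - h)) / h**2   # finite-difference C''(y0)
m1, m2, _ = moments(y0)
print(np.isclose(C2, m2 - m1**2, rtol=1e-3))         # Var_y(U_y) = C''(y): True

_, m2_0, m4_0 = moments(0.0)
print(m4_0 - 3.0 * m2_0**2 < 0)                      # kappa_0^(4) < 0: True
```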

Answered by S.Surace on November 7, 2021
