
Proof of the Cramér-Rao lower bound

Cross Validated, asked by manav on November 2, 2021

I am trying to understand the proof of this theorem from Casella and Berger (2nd ed.), page 336. $W(X)$ is any estimator for samples $X_1,\ldots,X_n$ from a distribution $f(X|\theta)$. I notice that there is no assumption of unbiasedness on $W(X)$ up to equation 7.3.8, which reads
$$E_{\theta}\left(\frac{\partial}{\partial\theta}\log f(X|\theta)\right)=\frac{d}{d\theta}E_{\theta}[1]=0.$$
This makes sense given the argument in the book, where they substitute $W(X)=1$ to arrive at the above result. But I fail to see why this equation holds in general. More specifically, if I take the uniform distribution on $(0,\theta)$ (where $\theta$ parameterizes the distribution), then $f(x|\theta)=\frac{1}{\theta}$, and hence
$$E_{\theta}\left(\frac{\partial}{\partial\theta}\log f(X|\theta)\right)=\int_{\mathcal{X}}\frac{\partial}{\partial\theta}\log f(x|\theta)\,f(x|\theta)\,dx=\int_{\mathcal{X}}\frac{\partial}{\partial\theta}f(x|\theta)\,dx=-\int_{0}^{\theta}\frac{dx}{\theta^{2}}=-\frac{1}{\theta}\neq 0.$$
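
For reference, the general derivation of 7.3.8 as I understand it is the following sketch, which assumes that differentiation and integration can be interchanged:
$$E_{\theta}\left(\frac{\partial}{\partial\theta}\log f(X|\theta)\right)=\int_{\mathcal{X}}\frac{\partial}{\partial\theta}f(x|\theta)\,dx\stackrel{?}{=}\frac{d}{d\theta}\int_{\mathcal{X}}f(x|\theta)\,dx=\frac{d}{d\theta}1=0,$$
so the step marked $\stackrel{?}{=}$ seems to be the only one that could fail in my example.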

What am I not understanding?

Edit: Following πr8's suggestion, I tried this with the exponential and normal distributions, and the expectation does come out to 0 in both cases. Is the discontinuity at the boundary of the support in my example the reason?
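
A minimal numerical check of the three cases (a sketch assuming SciPy is available; the score functions $\frac{\partial}{\partial\theta}\log f(x|\theta)$ are derived by hand for each family):

```python
import numpy as np
from scipy.integrate import quad

theta = 2.0

# (pdf, score d/dtheta log f, support) for each family, derived by hand.
cases = {
    # Normal(theta, 1): score = x - theta
    "normal": (
        lambda x: np.exp(-(x - theta) ** 2 / 2) / np.sqrt(2 * np.pi),
        lambda x: x - theta,
        (-np.inf, np.inf),
    ),
    # Exponential with rate theta: f = theta*exp(-theta*x), score = 1/theta - x
    "exponential": (
        lambda x: theta * np.exp(-theta * x),
        lambda x: 1 / theta - x,
        (0, np.inf),
    ),
    # Uniform(0, theta): f = 1/theta on (0, theta), score = -1/theta
    "uniform": (
        lambda x: 1 / theta,
        lambda x: -1 / theta,
        (0, theta),
    ),
}

for name, (pdf, score, (a, b)) in cases.items():
    # E[score] = integral of score(x) * f(x|theta) over the support
    val, _ = quad(lambda x: score(x) * pdf(x), a, b)
    print(f"{name:12s} E[d/dtheta log f] = {val:+.6f}")
```

This prints approximately $0$ for the normal and exponential cases and $-1/\theta=-0.5$ for the uniform case, matching the computation above.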
