
What does it mean if magnitude of the variance of each measurement is allowed to be a function of its predicted value?

Asked on Cross Validated by Kurtis Pykes on November 2, 2021

To better understand logistic regression and why it is still called regression, I was reading about Generalized Linear Models on Wikipedia, and I came across the following statement:
"The GLM generalizes linear regression by allowing the linear model to be related to the response variable via a link function and by allowing the magnitude of the variance of each measurement to be a function of its predicted value."

I am not quite sure I understand the latter part, the one quoted in the question title. Could this be clarified for me, please?

One Answer

The link function is a transformation of the outcome (strictly, of its predicted mean) that is used to associate the predictors with the outcome. In linear regression you construct a linear predictor* to estimate the outcome. Ordinary least squares can be thought of as using an identity link function; that is, the value of the linear predictor is itself the prediction. In logistic regression, by contrast, the linear predictor is set equal to the logit, the link function, of the probability. The logit spreads the [0, 1] range of probabilities out over the entire real line, where the linear predictor lives.
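To write that out in symbols (notation added here just for illustration), let $\eta_i = x_i^\top \beta$ be the linear predictor for observation $i$ and $\mu_i$ its predicted mean. The identity link of ordinary linear regression and the logit link of logistic regression are then

$$\text{identity: } \mu_i = \eta_i, \qquad \text{logit: } \log\frac{p_i}{1 - p_i} = \eta_i \iff p_i = \frac{1}{1 + e^{-\eta_i}}.$$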

Such generalized linear models don't have closed-form solutions like ordinary linear regression, so they are fit by maximum-likelihood methods. You need to take the actual relationship between the mean and the variance into account to calculate the likelihood.
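For example, with the Bernoulli outcomes of logistic regression the quantity being maximized is the log-likelihood

$$\ell(\beta) = \sum_i \big[\, y_i \log p_i + (1 - y_i) \log(1 - p_i) \,\big],$$

with $p_i$ as defined above, so the distributional assumption, and with it the mean-variance relationship, enters the fit directly through the likelihood.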

One simple example of a mean-variance relationship is the Poisson distribution for count data: if data are distributed that way, the true mean and variance are identical. For the individual Bernoulli trials with probability of success $p$ that underlie logistic regression, the variance is $p(1-p)$. Both differ from the normal distribution, for which the mean and variance can be specified independently.
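As a rough sketch of how this looks in practice (not part of the original answer), here is a minimal example using Python's statsmodels, assuming that package is available. Choosing the family is what tells the fitting routine which mean-variance relationship to use:

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    x = rng.normal(size=200)
    X = sm.add_constant(x)        # design matrix with an intercept column
    eta = 0.5 + 0.8 * x           # linear predictor used to simulate data

    # Poisson counts: the variance equals the mean exp(eta)
    y_counts = rng.poisson(np.exp(eta))
    poisson_fit = sm.GLM(y_counts, X, family=sm.families.Poisson()).fit()

    # Bernoulli outcomes: the variance is p(1 - p), with p from the logit link
    p = 1 / (1 + np.exp(-eta))
    y_binary = rng.binomial(1, p)
    logistic_fit = sm.GLM(y_binary, X, family=sm.families.Binomial()).fit()

    print(poisson_fit.params)     # estimated intercept and slope on the log scale
    print(logistic_fit.params)    # estimated intercept and slope on the logit scale

Both fits use iteratively reweighted least squares, a maximum-likelihood algorithm whose weights come from the variance function of the chosen family.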

So it's the combination of the link function and the model for the variance that generalizes ordinary linear regression to these other situations.


*The linear predictor is a linear function of the model coefficients, but those can be coefficients of non-linear transformations of the original predictor variables. That's another way in which the term "linear regression" can seem misleading.
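For instance, a model with

$$\eta = \beta_0 + \beta_1 x + \beta_2 x^2 + \beta_3 \log x$$

still has a linear predictor, because $\eta$ is linear in the coefficients $\beta_j$ even though it is not linear in $x$.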

Answered by EdM on November 2, 2021
