# R lmer model: degree of freedom and chi square values are zero

Cross Validated Asked by RoroMario on December 9, 2020

I have built the following models:

full <- lmer(DV~ A*B + (1|speaker), data, REML=FALSE)

A <- lmer(DV~ A+ A:B + (1|speaker), data, REML=FALSE)
B <- lmer(DV~ B+ A:B + (1|speaker), data, REML=FALSE)
interaction <- lmer(DV~ A + B + (1|speaker), data, REML=FALSE)


I use anova() to compare the full model with each of the others:

anova(full, A)
anova(full, B)
anova(full, interaction)


The first two comparisons produced results in which both the df and the chi-square values were zero (output not shown here). However, I have also compared a null model with models that include only A, only B, or only A:B:

null <- lmer(DV~ 1 + (1|speaker), data, REML=FALSE)
AA <- lmer(DV~ A + (1|speaker), data, REML=FALSE)
BB <- lmer(DV~ B + (1|speaker), data, REML=FALSE)
AB <- lmer(DV~ A:B + (1|speaker), data, REML=FALSE)


All of these comparisons produced reasonable results (i.e. non-zero df, and all of them were significant).
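For context on where the df in these anova() tables comes from: in a likelihood-ratio test, df is simply the difference in the number of estimated parameters between the two nested models. A minimal sketch with simulated data and ordinary least squares (numpy only; the data and variable names are made up, and OLS stands in for lmer, but the df logic is the same):

```python
import numpy as np

# Hypothetical simulated data standing in for the poster's dataset.
rng = np.random.default_rng(0)
n = 200
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = 1.0 + 0.5 * x1 + rng.normal(size=n)

def gaussian_loglik(X, y):
    """Profile Gaussian log-likelihood of an OLS fit (sigma^2 = RSS / n)."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = float(np.sum((y - X @ beta) ** 2))
    return -n / 2 * (np.log(2 * np.pi) + np.log(rss / n) + 1)

X_null = np.ones((n, 1))                       # intercept only
X_alt = np.column_stack([np.ones(n), x1, x2])  # intercept + two predictors

# LRT: statistic = 2 * difference in log-likelihoods,
# df = difference in the number of estimated coefficients.
stat = 2 * (gaussian_loglik(X_alt, y) - gaussian_loglik(X_null, y))
df = X_alt.shape[1] - X_null.shape[1]          # 2 here
```

If the two design matrices spanned the same column space, both the statistic and df would come out as 0, which is exactly the symptom described above.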

I have looked online and found this post: https://www.researchgate.net/post/What_is_a_Likelihood_ratio_test_with_0_degree_of_freedom

My guess is that, in my full model, the interaction term may already account for everything the main effects (A and B) would explain.

I have a few questions:

1. Is my guess possibly true?
2. If it is true, why did the comparison with the null model show a significant effect?
3. More generally, when I build linear mixed effects models, can I start from the null model and add one factor at a time, comparing each model with the previous one? Or do I have to reduce from the full model?
4. If I use A+B as the base model:
base <- lmer(DV~ A+B + (1|speaker), data, REML=FALSE)

A <- lmer(DV~ A + (1|speaker), data, REML=FALSE)
B <- lmer(DV~ B + (1|speaker), data, REML=FALSE)
interaction <- lmer(DV~ A*B + (1|speaker), data, REML=FALSE)


Is it OK to report the comparisons between the base model and A, B, and interaction, respectively?

Please find the data file and the R markdown document here: dropbox.com/sh/88m8h6blow2xbn5/AABiNccsUlu3AlfPyamQP4n_a?dl=0
I also asked a question about the procedures I used in the R script in this post: R lmer model: add factors or reduce factors.

I’d be most grateful if you could help me please. Thank you!

This happens because models full, A and B are in fact the same model; they are just parameterised differently. To see this, inspect the fixed-effect estimates for the full model:

            Estimate Std. Error t value
(Intercept)  6.03977    0.34949  17.282
AT2         -0.55051    0.07597  -7.246
AT3         -1.16472    0.07597 -15.331
AT4          0.48228    0.07597   6.348
BS          -0.64024    0.07597  -8.427
AT2:BS       0.35379    0.10744   3.293
AT3:BS       0.47244    0.10824   4.365
AT4:BS       0.05247    0.10744   0.488


In model A, we have removed the main effect of the variable B, and obtain:

            Estimate Std. Error t value
(Intercept)  6.03977    0.34949  17.282
AT2         -0.55051    0.07597  -7.246
AT3         -1.16472    0.07597 -15.331
AT4          0.48228    0.07597   6.348
AT1:BS      -0.64024    0.07597  -8.427
AT2:BS      -0.28645    0.07597  -3.770
AT3:BS      -0.16781    0.07710  -2.177
AT4:BS      -0.58777    0.07597  -7.737


We immediately see that the estimates for the intercept and for AT2-AT4 are the same. The estimate for AT1:BS in model A is identical to the estimate for the main effect of B in the full model (because model A does not include a main effect for B). For the same reason, each remaining interaction estimate in model A is the sum of the full model's main effect for B and the corresponding interaction term:

> -0.64024 + 0.35379
[1] -0.28645
> -0.64024 + 0.47244
[1] -0.1678
> -0.64024 + 0.05247
[1] -0.58777
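The same identity can be checked programmatically; a quick sketch in pure Python, with the estimates copied from the two summaries above (differences only at the rounding level of the printed output):

```python
# BS main effect in the full model.
b_main = -0.64024
# Interaction estimates in the full model and in model A.
full_int = {"AT2:BS": 0.35379, "AT3:BS": 0.47244, "AT4:BS": 0.05247}
a_int = {"AT2:BS": -0.28645, "AT3:BS": -0.16781, "AT4:BS": -0.58777}

for term in full_int:
    # model A's interaction = full model's B main effect + its interaction
    assert abs((b_main + full_int[term]) - a_int[term]) < 1e-4
```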


I think it is good general advice to always include both main effects in any model that includes their interaction; this type of problem will then not occur.
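The 0-df result can also be seen directly from the design matrices: DV ~ A*B and DV ~ A + A:B produce matrices that span the same column space, so the fitted values, and hence the likelihoods, are identical. A minimal numpy sketch with a hypothetical balanced 4x2 design (treatment coding built by hand; not the poster's data):

```python
import numpy as np
from itertools import product

# One row per cell of a 4 (A) x 2 (B) design.
rows = list(product(range(4), range(2)))

def col(pred):
    """Indicator column over the design cells."""
    return np.array([1.0 if pred(a, b) else 0.0 for a, b in rows])

intercept = col(lambda a, b: True)
A_dummies = [col(lambda a, b, i=i: a == i) for i in (1, 2, 3)]      # AT2..AT4
B_dummy = col(lambda a, b: b == 1)                                  # BS
AB_full = [col(lambda a, b, i=i: a == i and b == 1) for i in (1, 2, 3)]
# When B's main effect is dropped, A:B expands to one dummy per level of A:
AB_noB = [col(lambda a, b, i=i: a == i and b == 1) for i in (0, 1, 2, 3)]

X_full = np.column_stack([intercept, *A_dummies, B_dummy, *AB_full])  # A*B
X_A = np.column_stack([intercept, *A_dummies, *AB_noB])               # A + A:B

# Both matrices have rank 8, and stacking them adds no new directions:
# they span the same column space, so the LRT comparing them has 0 df.
print(np.linalg.matrix_rank(X_full),                  # 8
      np.linalg.matrix_rank(X_A),                     # 8
      np.linalg.matrix_rank(np.hstack([X_full, X_A])))  # still 8
```

The same reasoning applies to model B: dropping either main effect just forces the interaction dummies to absorb it, which is why the estimates differ but the models do not.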

Answered by Robert Long on December 9, 2020
