# Why does scipy use Wald Statistic + t-test as opposed to Wald Statistic + Wald test for linear regression?

When performing linear regression, why does scipy use Wald Statistic followed by a t-test, as opposed to Wald Statistic followed by a Wald test?

The following code performs linear regression on two random 5-element vectors using scipy:

```python
import scipy.stats
import random

# Draw some random X and Y
sampleSize = 5
X = [random.uniform(0, 1) for i in range(sampleSize)]
Y = [random.uniform(0, 1) for i in range(sampleSize)]

# Perform the regression; linregress returns a named tuple
resultScipy = scipy.stats.linregress(X, Y)

print(resultScipy)
```


Example output is:

```
LinregressResult(slope=-0.36654925096390012, intercept=0.67985896267369861, rvalue=-0.32357812225448918, pvalue=0.59531436421063166, stderr=0.61883685654551934)
```


I wanted to know how scipy obtains the p-value, in this case 0.59531436421063166. The scipy documentation isn't particularly enlightening:

> **p-value : float**
>
> Two-sided p-value for a hypothesis test whose null hypothesis is that the slope is zero.

After some Python heavy-lifting, I wrote the following code, which fully reproduces scipy's behaviour:

```python
import random
import math
import collections
import scipy.stats
import scipy.stats.distributions as distributions

# Draw some random X and Y
sampleSize = 5
X = [random.uniform(0, 1) for i in range(sampleSize)]
Y = [random.uniform(0, 1) for i in range(sampleSize)]

# Compute their averages
avgX = sum(X)/sampleSize
avgY = sum(Y)/sampleSize

# Partial steps to compute the estimators of the linear regression parameters
XDiff = [X_i - avgX for X_i in X]
XDiffSquared = [i*i for i in XDiff]
YDiff = [Y_i - avgY for Y_i in Y]

# B1 is the estimator of the slope.
# B0 is the estimator of the intercept.
# r is the fitted regression line: the predicted Y for a given X.
B1 = sum(x * y for x, y in zip(XDiff, YDiff)) / sum(XDiffSquared)
B0 = avgY - B1*avgX
r = lambda x: B0 + B1*x

# Partial steps to compute the Wald statistic
errs = [y - r(x) for x, y in zip(X, Y)]
errStd = math.sqrt((1/(sampleSize-2))*sum(err**2 for err in errs))
XStd = math.sqrt((1/sampleSize)*sum(diff**2 for diff in XDiff))
stdB1 = errStd / (XStd * math.sqrt(sampleSize))

# Wald statistic
W = (B1 - 0)/stdB1

# p-value of the Wald test of B1 = 0
pvalueWald = 2*scipy.stats.norm.cdf(-abs(W))

# p-value of the t-test of B1 = 0
pvalueT = 2*distributions.t.sf(abs(W), sampleSize - 2)

# Named tuples for our linear regression, with the p-value coming from
# either the Wald test or the t-test
LinregressResult = collections.namedtuple(
    "LinregressResult", ["slope", "intercept", "rvalue", "pvalue", "stderr"])
resultWald = LinregressResult(slope=B1, intercept=B0, rvalue=None,
                              pvalue=pvalueWald, stderr=stdB1)
resultT = LinregressResult(slope=B1, intercept=B0, rvalue=None,
                           pvalue=pvalueT, stderr=stdB1)

# scipy's linear regression, for comparison
resultScipy = scipy.stats.linregress(X, Y)

print(resultWald)
print(resultT)
print(resultScipy)
```


The code computes the Wald statistic for the estimator of the slope (gradient), then obtains the p-value from a t-distribution with (sampleSize - 2) degrees of freedom.
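In symbols, writing $\hat\varepsilon_i = Y_i - \hat\beta_0 - \hat\beta_1 X_i$ for the residuals and $n$ for sampleSize, the quantities computed above are

$$
\hat\sigma^2 = \frac{1}{n-2}\sum_{i=1}^n \hat\varepsilon_i^2,
\qquad
\widehat{\operatorname{se}}(\hat\beta_1) = \frac{\hat\sigma}{\sqrt{\sum_{i=1}^n (X_i - \bar X)^2}},
\qquad
W = \frac{\hat\beta_1 - 0}{\widehat{\operatorname{se}}(\hat\beta_1)}.
$$

The Wald test compares $W$ against a standard normal distribution; the t-test compares the same $W$ against Student's t with $n - 2$ degrees of freedom.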

Why is the p-value computed this way? My go-to statistics textbook by Larry Wasserman suggests computing the p-value of linear regression differently: compute the Wald statistic, then apply a Wald test.

The two tests give very different results when the sample size is small and the slope is extreme, with the Wald test's p-values being consistently lower than the t-test's (sometimes by orders of magnitude).
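To make the gap concrete, here is a minimal sketch (my own illustration, not from the question): the same hypothetical test statistic W = 3 evaluated against both reference distributions, for a sample of size 5 (3 degrees of freedom).

```python
import scipy.stats as st

# Hypothetical test statistic and sample size, chosen for illustration
W = 3.0
n = 5

pWald = 2 * st.norm.sf(abs(W))   # Wald test: standard normal reference
pT = 2 * st.t.sf(abs(W), n - 2)  # t-test: Student's t with n - 2 df

print(pWald)  # about 0.0027
print(pT)     # about 0.0577
```

Because the t-distribution with 3 degrees of freedom has much heavier tails than the normal, the same statistic looks far more "significant" under the Wald test.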

Which approach is more correct, and which one should I use?

Asked on Cross Validated by Adam Kurkiewicz on December 29, 2020

This was discussed on scipy's issue tracker:

https://github.com/scipy/scipy/issues/7074

It turns out that using the Wald statistic with a normal reference distribution inflates the type 1 error rate, as the plots in that issue show.
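A quick Monte Carlo sketch (my own check, under the same null setup as the question's code: X and Y independent, so the true slope is zero) reproduces the inflation: at a nominal alpha of 0.05 with n = 5, the Wald test rejects far more often than the t-test.

```python
import random
import scipy.stats as st

random.seed(0)
n, trials, alpha = 5, 2000, 0.05
rejectWald = rejectT = 0

for _ in range(trials):
    # Simulate under the null: Y is independent of X, true slope = 0
    X = [random.uniform(0, 1) for _ in range(n)]
    Y = [random.uniform(0, 1) for _ in range(n)]
    res = st.linregress(X, Y)
    W = res.slope / res.stderr
    if 2 * st.norm.sf(abs(W)) < alpha:  # Wald test
        rejectWald += 1
    if 2 * st.t.sf(abs(W), n - 2) < alpha:  # t-test
        rejectT += 1

# The Wald rejection rate lands well above the nominal 0.05;
# the t-test rate stays close to it
print(rejectWald / trials, rejectT / trials)
```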

Correct answer by Adam Kurkiewicz on December 29, 2020
