
Fast calculation of normalization factor for normalized cross correlation

Signal Processing Asked on October 24, 2021

There have been a number of posts here that explain implementations of normalized cross-correlation in Python. One frequently cited implementation is reproduced below.

import numpy as np
from scipy.signal import fftconvolve

def normxcorr2(template, image, mode="full"):
    template = template - np.mean(template)
    image = image - np.mean(image)

    a1 = np.ones(template.shape)
    # Faster to flip up-down and left-right, then use fftconvolve,
    # instead of scipy's correlate
    ar = np.flipud(np.fliplr(template))
    out = fftconvolve(image, ar.conj(), mode=mode)

    image = fftconvolve(np.square(image), a1, mode=mode) - \
            np.square(fftconvolve(image, a1, mode=mode)) / np.prod(template.shape)

    # Remove small machine-precision errors after subtraction
    image[np.where(image < 0)] = 0

    template = np.sum(np.square(template))
    out = out / np.sqrt(image * template)

    # Remove any divisions by 0 or very close to 0
    out[np.where(np.logical_not(np.isfinite(out)))] = 0

    return out

https://github.com/Sabrewarrior/normxcorr2-python/blob/master/normxcorr2.py
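For reference, here is a quick sanity check of that implementation, restated with its imports so it runs standalone: cropping a patch out of an image and correlating it back should yield a correlation peak of exactly 1.0 at the patch's top-left corner. The image size, patch location, and seed below are arbitrary choices for the demo.

```python
import numpy as np
from scipy.signal import fftconvolve

def normxcorr2(template, image, mode="full"):
    template = template - np.mean(template)
    image = image - np.mean(image)

    a1 = np.ones(template.shape)
    # Flip the template so fftconvolve performs correlation
    ar = np.flipud(np.fliplr(template))
    out = fftconvolve(image, ar.conj(), mode=mode)

    # Windowed sum of squared deviations of the image (the line in question)
    image = fftconvolve(np.square(image), a1, mode=mode) - \
            np.square(fftconvolve(image, a1, mode=mode)) / np.prod(template.shape)
    image[image < 0] = 0  # clip small negative round-off

    template = np.sum(np.square(template))
    out = out / np.sqrt(image * template)
    out[~np.isfinite(out)] = 0  # zero out divisions by ~0
    return out

rng = np.random.default_rng(0)
image = rng.standard_normal((32, 32))
template = image[10:15, 20:25].copy()  # 5x5 patch cropped from the image

corr = normxcorr2(template, image, mode="valid")
peak = np.unravel_index(np.argmax(corr), corr.shape)
# peak is (10, 20), the patch's top-left corner, with corr[peak] ~ 1.0
```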

My understanding is that the normalization term is the square root of the product of the two inputs' sums of squared deviations (their zero-lag autocorrelations):
\begin{equation}
C=\frac{\sum_{i=1}^{n}\left(I(i)-\bar{I}\right)\left(J(i)-\bar{J}\right)}{\sqrt{\sum_{i=1}^{n}\left(I(i)-\bar{I}\right)^{2}\,\sum_{i=1}^{n}\left(J(i)-\bar{J}\right)^{2}}}
\end{equation}

for two images $I$ and $J$ respectively. I am having difficulty understanding this line:

image = fftconvolve(np.square(image), a1, mode=mode) - \
        np.square(fftconvolve(image, a1, mode=mode)) / np.prod(template.shape)

Is this a clever way of calculating the windowed autocorrelation of the image? Why is it not computed the same way as for the template, i.e. with np.sum(np.square(...))?
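For what it's worth, that line can be checked numerically: per window of size $n$, it computes $\sum x^2 - (\sum x)^2/n$, which expands to the windowed sum of squared deviations $\sum (x-\bar{x})^2$. A small sketch (image size, window size, and seed are arbitrary):

```python
import numpy as np
from scipy.signal import fftconvolve

rng = np.random.default_rng(0)
image = rng.standard_normal((8, 8))
template_shape = (3, 3)
n = np.prod(template_shape)
a1 = np.ones(template_shape)

# The expression from the question: two convolutions with a box of ones
# give the local sum of squares and the local sum at every position.
fast = fftconvolve(np.square(image), a1, mode="valid") - \
       np.square(fftconvolve(image, a1, mode="valid")) / n

# Direct computation: sum of squared deviations from each window's mean.
direct = np.empty_like(fast)
for i in range(fast.shape[0]):
    for j in range(fast.shape[1]):
        w = image[i:i + 3, j:j + 3]
        direct[i, j] = np.sum((w - w.mean()) ** 2)

# fast and direct agree to machine precision
```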
