
General matrix determinant lemma

Mathematics Asked by Silbraz on November 19, 2021

Starting from Prove $\det(I+xy^T+uv^T)=(1+y^Tx)(1+v^Tu)-(x^Tv)(y^Tu)$, for example, is it possible to generalize as follows?

If $w_i$, $1\leq i\leq n$, is a basis for $\mathbb{R}^n$, is there a closed formula for $\det(nI_n-\sum_{i=1}^n w_i\otimes w_i)$?

For example, it is very easy if the $w_i$ form the standard basis.
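A quick numerical check of that special case (my own sketch in NumPy, not part of the original question): when $w_i=e_i$, the outer products sum to $I_n$, so the determinant is $\det((n-1)I_n)=(n-1)^n$.

```python
# Sketch (not from the question): verify det(n I_n - sum_i w_i w_i^T) = (n-1)^n
# when the w_i are the standard basis vectors e_i, so the outer products sum to I_n.
import numpy as np

n = 4
W = np.eye(n)                               # columns are the standard basis e_1, ..., e_n
S = sum(np.outer(w, w) for w in W.T)        # equals I_n in this special case
val = np.linalg.det(n * np.eye(n) - S)      # det((n-1) I_n) = (n-1)^n
print(val, (n - 1) ** n)                    # both give 81 for n = 4
assert np.isclose(val, (n - 1) ** n)
```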

One Answer

The formula you linked to is just Leibniz's formula in disguise. If you put $U=\pmatrix{y&v},\,V=\pmatrix{x&u}$ and $A=U^TV=\pmatrix{a&b\\ c&d}$, so that $xy^T+uv^T=VU^T$, then
$$\begin{aligned} \det(I_n+xy^T+uv^T) &=\det(I_n+VU^T)=\det(I_2+U^TV)=\det(I_2+A)\\ &=(1+a)(1+d)-bc\\ &=(1+y^Tx)(1+v^Tu)-(y^Tu)(v^Tx), \end{aligned}$$
where the second equality is Sylvester's determinant theorem. This, of course, can be generalised to higher dimensions, because Leibniz's formula also works in higher dimensions, but from a computational point of view the practical usefulness of this formula diminishes quickly as the matrices get larger.
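A small numerical sanity check of the rank-two formula (my own sketch, assuming NumPy; the vectors and seed are arbitrary):

```python
# Sketch: compare det(I + x y^T + u v^T) with (1 + y^T x)(1 + v^T u) - (y^T u)(v^T x)
# for random vectors x, y, u, v of length n.
import numpy as np

rng = np.random.default_rng(0)
n = 5
x, y, u, v = (rng.standard_normal(n) for _ in range(4))

lhs = np.linalg.det(np.eye(n) + np.outer(x, y) + np.outer(u, v))
rhs = (1 + y @ x) * (1 + v @ u) - (y @ u) * (v @ x)

print(lhs, rhs)             # the two values should agree up to rounding
assert np.isclose(lhs, rhs)
```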

In general, suppose $U$ and $V$ are two $m\times n$ matrices. Denote the $j$-th columns of $U$ and $V$ by $u_j$ and $v_j$ respectively, and let $A=U^TV$, so that $a_{ij}=u_i^Tv_j$. Then
$$\begin{aligned} \det(xI_n-U^TV)&=\det(xI_n-A)\\ &=\sum_{r=0}^n\sum_{|J|=r}(-1)^r\det\left(A(J,J)\right)x^{n-r}\\ &=\sum_{r=0}^{\min(m,n)}\sum_{|J|=r}(-1)^r\det\left(A(J,J)\right)x^{n-r}\\ &=\sum_{r=0}^{\min(m,n)}\sum_{|J|=r}(-1)^r \sum_{\sigma\in S_r}\operatorname{sign}(\sigma)\prod_{i=1}^r a_{J(i),J(\sigma(i))}\, x^{n-r}\\ &=\sum_{r=0}^{\min(m,n)}\sum_{|J|=r}(-1)^r \sum_{\sigma\in S_r}\operatorname{sign}(\sigma)\prod_{i=1}^r u_{J(i)}^Tv_{J(\sigma(i))}\, x^{n-r}, \end{aligned}$$
where $J$ runs over the $r$-element subsets of $\{1,\dots,n\}$ and $A(J,J)$ denotes the principal submatrix of $A$ with rows and columns indexed by $J$; the sum truncates at $r=\min(m,n)$ because $A=U^TV$ has rank at most $\min(m,n)$. Using the identity $x^m\det(xI_n-U^TV)=x^n\det(xI_m-VU^T)$, one obtains $\det(xI_m-VU^T)$ as well.
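The principal-minor expansion above can also be checked numerically. Below is a minimal sketch (my addition, assuming NumPy and small random matrices) that sums over all index sets $J$ with $|J|\le\min(m,n)$ and compares the result with a direct evaluation of $\det(xI_n-U^TV)$; it also checks the companion identity relating $\det(xI_n-U^TV)$ and $\det(xI_m-VU^T)$.

```python
# Sketch: evaluate det(x I_n - U^T V) via the sum over principal minors of A = U^T V
# of order r <= min(m, n), and compare with a direct determinant computation.
import itertools
import numpy as np

rng = np.random.default_rng(1)
m, n, x0 = 3, 5, 2.0
U = rng.standard_normal((m, n))
V = rng.standard_normal((m, n))
A = U.T @ V                                  # n x n, rank at most min(m, n)

total = 0.0
for r in range(min(m, n) + 1):
    for J in itertools.combinations(range(n), r):
        minor = np.linalg.det(A[np.ix_(J, J)]) if r > 0 else 1.0
        total += (-1) ** r * minor * x0 ** (n - r)

direct = np.linalg.det(x0 * np.eye(n) - A)
assert np.isclose(total, direct)             # principal-minor expansion matches

# Companion identity: x^m det(x I_n - U^T V) = x^n det(x I_m - V U^T)
lhs = x0 ** m * direct
rhs = x0 ** n * np.linalg.det(x0 * np.eye(m) - V @ U.T)
assert np.isclose(lhs, rhs)
```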

Answered by user1551 on November 19, 2021
