
Show that if the matrix of $T$ with respect to all bases of $V$ is the same, then $T = \alpha I$, where $T$ is a linear operator on $V$

Mathematics Asked on November 24, 2021

The only hint I could derive from the question is that we might have to use eigenvalues, since $Tx = \lambda x$ when $T = \lambda I$. I think eigenvalues are invariant under a change of basis.

Any help would be appreciated

4 Answers

To complement the other answers, here is a straightforward method that just uses the definition of the matrix of $T$.

Fix a basis $v_1,\ldots,v_n$. We have
\begin{align}
Tv_1&=T_{11}v_1+T_{21}v_2+\ldots+T_{n1}v_n\\
Tv_2&=T_{12}v_1+T_{22}v_2+\ldots+T_{n2}v_n
\end{align}
As the matrix of $T$ is the same with respect to the basis $v_2,v_1,v_3,\ldots,v_n$,
\begin{align}
Tv_2&=T_{11}v_2+T_{21}v_1+\ldots+T_{n1}v_n\\
Tv_1&=T_{12}v_2+T_{22}v_1+\ldots+T_{n2}v_n
\end{align}
Comparing the expressions, and using the linear independence of the basis, we get $T_{11}=T_{22}$ and $T_{21}=T_{12}$. As we can do this with any pair of elements in the basis, we get
$$
T_{11}=T_{22}=\cdots=T_{nn}
$$
and $T_{kj}=T_{jk}$ for any $k\ne j$. Now consider the basis
$$
v_1-v_2,\ v_1+v_2,\ v_3,\ldots,v_n.
$$
Using this basis,
\begin{align}
Tv_1-Tv_2&=T_{11}(v_1-v_2)+T_{21}(v_1+v_2)+T_{31}v_3+\cdots+T_{n1}v_n\\
Tv_1+Tv_2&=T_{12}(v_1-v_2)+T_{22}(v_1+v_2)+T_{32}v_3+\cdots+T_{n2}v_n
\end{align}
Adding,
$$
2Tv_1=(T_{11}+T_{12}+T_{21}+T_{22})v_1+(T_{21}+T_{22}-T_{11}-T_{12})\,v_2+(T_{31}+T_{32})v_3+\cdots
$$
The previous relations we found allow us to reduce this to
$$
Tv_1=(T_{11}+T_{12})v_1+0\,v_2+\tfrac{1}{2}(T_{31}+T_{32})v_3+\cdots
$$
Comparing with the first expression for $Tv_1$ we obtain $T_{11}+T_{12}=T_{11}$, so $T_{12}=0$. As we can do this for any pair $k\ne j$ of indices, we obtain $T_{kj}=0$. Then $T=\alpha\,I$, with $\alpha=T_{11}$.
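As a numerical illustration of the idea behind this answer (not part of the original), here is a small NumPy sketch: swapping two basis vectors conjugates the matrix of $T$ by a permutation matrix, which changes a non-scalar matrix but leaves $\alpha I$ fixed.

```python
import numpy as np

# Change of basis: if the new basis vectors are the columns of P,
# the matrix of T in the new basis is P^{-1} T P.
def matrix_in_basis(T, P):
    return np.linalg.inv(P) @ T @ P

T = np.array([[2.0, 1.0], [0.0, 3.0]])      # a non-scalar operator
swap = np.array([[0.0, 1.0], [1.0, 0.0]])   # basis v2, v1 (swap the two vectors)

# The swap changes this T, so its matrix is NOT basis-independent.
print(np.allclose(matrix_in_basis(T, swap), T))   # False

S = 5.0 * np.eye(2)                          # alpha * I
# A scalar matrix commutes with everything, so it survives any change of basis.
print(np.allclose(matrix_in_basis(S, swap), S))   # True
```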

Answered by Martin Argerami on November 24, 2021

There are several possible proofs. One goes like this: it uses elementary matrices, but implicitly.

Suppose first of all that there is a vector $v$ such that $v, Tv$ are linearly independent. Choose a basis that starts with $v, Tv$. Then the first column of the matrix of $T$ will have a $1$ in the second position and zeroes elsewhere. Now choose a basis that starts with $v, v + Tv$. This time the first column will have $1$'s in the first two positions and zeroes below.

Since this is a contradiction, $v, Tv$ must always be dependent, for every vector $v$. Let $v, w$ be two independent vectors (if there aren't any, the space has dimension $1$, and we are done). Let $T v = a v$, $T w = b w$, and $T (v + w) = c (v + w)$ for some $a, b, c$. Then
$$
c v + c w = T(v + w) = T v + T w = a v + b w,
$$
so that $a = c = b$, and you are done.
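A quick sketch of the first step (my illustration, with an example matrix of my choosing): for a non-scalar $T$, some vector $v$ makes $v, Tv$ independent, which we can witness by a nonzero determinant of the matrix with columns $v$ and $Tv$.

```python
import numpy as np

# Diagonal but not scalar: eigenvectors e1, e2 give dependent pairs (v, Tv),
# but v = e1 + e2 does not.
T = np.array([[2.0, 0.0], [0.0, 3.0]])

candidates = [np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([1.0, 1.0])]
# det([v | Tv]) != 0 exactly when v and Tv are linearly independent.
witness = next(v for v in candidates
               if abs(np.linalg.det(np.column_stack([v, T @ v]))) > 1e-9)
print(witness)  # [1. 1.]: T v = (2, 3) is not a multiple of v
```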

Answered by Andreas Caranti on November 24, 2021

This is related to isotropic matrices, i.e. matrices proportional to the identity. A change of basis transforms $T$ into something else (in general the components of the transformed matrix are different, but that does not happen here). Recall the transformation of $T$ into $Q$: $Q = V^{-1} T\, V$. You can replace $T$ by $\alpha I$ to check the result for any $V$ (hint: pull out the scalar and use properties of the identity).
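Following the hint, here is a small numerical check (my addition, with an arbitrary random $V$): pulling the scalar out gives $V^{-1}(\alpha I)V = \alpha V^{-1}V = \alpha I$.

```python
import numpy as np

rng = np.random.default_rng(0)
alpha = 7.0
V = rng.standard_normal((4, 4))  # a generic change-of-basis matrix (invertible with probability 1)

# Q = V^{-1} (alpha I) V = alpha V^{-1} V = alpha I, independent of V.
Q = np.linalg.inv(V) @ (alpha * np.eye(4)) @ V
print(np.allclose(Q, alpha * np.eye(4)))  # True
```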

Answered by Basco on November 24, 2021

$B^{-1}AB=A$ is the same as $AB=BA$ for every invertible $B$. Now use elementary non-diagonal matrices $B$ to prove the result.
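To see what commuting with an elementary matrix forces, here is a sketch (my own example values): with $B = I + E_{12}$, the commutator $AB - BA$ equals $\begin{pmatrix}-a_{21} & a_{11}-a_{22}\\ 0 & a_{21}\end{pmatrix}$, so $AB = BA$ forces $a_{21}=0$ and $a_{11}=a_{22}$; the transposed elementary matrix kills $a_{12}$ the same way.

```python
import numpy as np

A = np.array([[2.0, 1.0], [4.0, 3.0]])            # a11=2, a12=1, a21=4, a22=3
B = np.eye(2) + np.array([[0.0, 1.0], [0.0, 0.0]])  # elementary, invertible

# AB - BA = [[-a21, a11 - a22], [0, a21]]; it vanishes iff a21 = 0 and a11 = a22.
comm = A @ B - B @ A
print(comm)  # [[-4. -1.], [ 0.  4.]]
```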

Answered by markvs on November 24, 2021

