# Decomposition of a linear operator to a partially orthogonal operator and a semi-definite self-adjoint operator

Mathematics Asked by Zhanxiong on January 7, 2022

$$\DeclareMathOperator{\A}{\mathscr{A}}$$
$$\DeclareMathOperator{\B}{\mathscr{B}}$$
$$\DeclareMathOperator{\C}{\mathscr{C}}$$
$$\DeclareMathOperator{\kernel}{\mathrm{Ker}}$$
$$\DeclareMathOperator{\diag}{\mathrm{diag}}$$
$$\DeclareMathOperator{\span}{\mathrm{span}}$$
$$\DeclareMathOperator{\real}{\mathbb{R}^2}$$
$$\DeclareMathOperator{\rank}{\text{rank}}$$

The question is:

Let $$A$$ be a linear operator on an $$n$$-dimensional Euclidean space $$V$$. Prove that there exist a partially orthogonal operator $$B$$ and a semi-definite self-adjoint operator $$C$$ with $$\kernel(B) = \kernel(C)$$ such that $$A = BC$$, and that the operators $$B$$ and $$C$$ are unique. Prove also that the linear operator $$A$$ is normal if and only if $$B$$ commutes with $$C$$.

An operator $$B$$ is called partially orthogonal if there exists a $$B$$-invariant subspace $$U$$ such that $$\|B(\alpha)\| = \|\alpha\|$$ for any $$\alpha \in U$$, and $$B(\alpha) = 0$$ for any $$\alpha \in U^\bot$$. It is easy to show that an operator is partially orthogonal if and only if there exists an orthonormal basis $$\{\xi_1, \ldots, \xi_n\}$$ such that $$B(\xi_1, \ldots, \xi_n) = (\xi_1, \ldots, \xi_n)\diag(O, 0)$$, where $$O$$ is an orthogonal matrix of order $$r = \dim(U)$$.
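For concreteness, this basis characterization can be checked numerically. The sketch below is my own illustration (the rotation angle, the dimension $$n = 3$$, and the basis $$Q$$ are arbitrary choices): it constructs $$B = Q\,\diag(O, 0)\,Q'$$ with $$O$$ a $$2 \times 2$$ rotation and verifies the two defining properties.

```python
import numpy as np

# Build a partially orthogonal operator on R^3 with r = dim(U) = 2.
theta = 0.7
O = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # 2x2 orthogonal "core"
core = np.zeros((3, 3))
core[:2, :2] = O                                   # diag(O, 0)

# An arbitrary orthonormal basis Q (columns q1, q2, q3); U = span(q1, q2).
Q, _ = np.linalg.qr(np.random.default_rng(0).normal(size=(3, 3)))
B = Q @ core @ Q.T

u = Q[:, 0] + 2 * Q[:, 1]   # a vector in U
w = Q[:, 2]                 # a vector in U^perp
print(np.isclose(np.linalg.norm(B @ u), np.linalg.norm(u)))  # True: norms preserved on U
print(np.allclose(B @ w, 0))                                 # True: U^perp is annihilated
```

Note that $$U$$ is automatically $$B$$-invariant here, since $$\diag(O, 0)$$ maps the span of the first two coordinates to itself.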

Oddly enough, so far I am able to prove the uniqueness and the commuting statement, but not the construction itself. Inspired by the polar decomposition, I tried using the singular value decomposition of $$A$$ (the matrix of $$A$$ under a fixed orthonormal basis) as follows:
$$\begin{align*} A = O_1\diag(M, 0)O_2 = O_1\diag(I_{(r)}, 0)O_2 \times O_2'\diag(M, 0)O_2 =: BC, \end{align*}$$
where $$M = \diag(\mu_1, \ldots, \mu_r)$$ is a diagonal matrix whose diagonal entries are the nonzero singular values of $$A$$. However, the $$B$$ defined in this way is not partially orthogonal in general. If we force $$B$$ to be partially orthogonal, then $$C$$ cannot be made symmetric.
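This failure can be observed numerically. The sketch below is my own illustration, using $$A = \begin{pmatrix}0 & 1 \\ 0 & 0\end{pmatrix}$$ as a test matrix: it builds $$B$$ and $$C$$ exactly as above from numpy's SVD and confirms that $$A = BC$$ with $$C$$ symmetric, while $$B$$ fails to satisfy $$B'B = BB'$$ (which a partially orthogonal operator, having the form $$Q\,\diag(O,0)\,Q'$$, must satisfy).

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
U, s, Vt = np.linalg.svd(A)                         # A = U @ diag(s) @ Vt
B = U @ np.diag(np.where(s > 1e-12, 1.0, 0.0)) @ Vt  # O1 diag(I_r, 0) O2
C = Vt.T @ np.diag(s) @ Vt                           # O2' diag(M, 0) O2

print(np.allclose(B @ C, A))          # True: the product recovers A
print(np.allclose(C, C.T))            # True: C is symmetric (and PSD)
print(np.allclose(B.T @ B, B @ B.T))  # False: B is not normal, hence
                                      # not partially orthogonal
```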

So probably a new perspective is needed here. I appreciate any insights.

As pointed out by @Ben Grossmann, such a decomposition is not universal. For example, consider the matrix $$A = \begin{pmatrix}0 & 1 \\ 0 & 0\end{pmatrix}$$. Below we show this $$A$$ does not admit the required decomposition.

Suppose $$A = BC$$, where $$B$$ is partially orthogonal and $$C \geq 0$$. Since $$\kernel(A) = \span((1, 0)') \supset \kernel(C)$$, either $$\kernel(C) = \span((1, 0)')$$ or $$\kernel(C) = \{0\}$$. If $$\kernel(C) = \{0\}$$, then $$\kernel(B) = \{0\}$$, which implies $$\rank A = 2$$, a contradiction. Hence $$\kernel(B) = \kernel(C) = \span((1, 0)')$$.

Since $$C \geq 0$$ and $$\rank C = 1$$, we may write the spectral decomposition of $$C$$ as $$C = O\diag(\lambda, 0)O'$$, where $$\lambda \neq 0$$ and $$O$$ is an orthogonal matrix of order $$2$$. By solving $$O\diag(\lambda, 0)O'(1, 0)' = (0, 0)'$$ and noting that $$O$$ is orthogonal, it follows that $$O = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}$$, and therefore $$C = \begin{pmatrix} 0 & 0 \\ 0 & \lambda \end{pmatrix}$$.

Since $$B$$ is partially orthogonal and $$\rank B = 1$$, there exists an orthogonal matrix $$P$$ of order $$2$$ such that $$B = P\diag(1, 0)P'$$. By the same argument as above, $$P = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}$$, yielding $$B = \begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix}$$.

As a result, $$A = BC = \begin{pmatrix} 0 & 0 \\ 0 & \lambda \end{pmatrix}$$, which is a contradiction.
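The endpoint of this argument is easy to check numerically. The sketch below is my own illustration ($$\lambda = 2$$ is an arbitrary nonzero choice): with the forced $$B$$ and $$C$$ from above, the product $$BC$$ is $$\diag(0, \lambda)$$, which can never equal $$A$$.

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
lam = 2.0
B = np.array([[0.0, 0.0],
              [0.0, 1.0]])    # the forced partially orthogonal B with Ker B = span((1,0)')
C = np.array([[0.0, 0.0],
              [0.0, lam]])    # the forced PSD C with Ker C = span((1,0)')

print(B @ C)                  # diag(0, lam)
print(np.allclose(B @ C, A))  # False: the contradiction
```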

Either your understanding of the question is wrong, or the author has suggested something that isn't true. In either case, there exist operators $$\mathscr A$$ for which such a decomposition does not exist.

For example, take $$A = \begin{pmatrix}0 & 1\\0 & 0\end{pmatrix}.$$ We note that if $$\ker \mathscr B = \ker \mathscr C$$, then both operators must have kernel equal to that of $$A$$, i.e. the span of $$(1,0)$$. This means that the image of $$\mathscr B$$ must be the orthogonal complement of $$(1,0)$$, which would mean (assuming that the subspace on which $$\|\mathscr B\alpha\| = \|\alpha\|$$ is invariant) that the image of $$\mathscr A$$ is a subspace of the span of $$(0,1)$$.

However, this is not the case.

Here is an answer if we drop the requirement that $$\ker \mathscr B = \ker \mathscr C$$, but assume that your definition of "partially orthogonal" is accurate.

Using the singular value decomposition, under a suitable choice of orthonormal basis $$\mathscr A$$ has the form $$A = \begin{pmatrix}\Sigma & 0\\0 & 0\end{pmatrix} U,$$ where $$U$$ is an orthogonal matrix and $$\Sigma$$ is diagonal with positive entries. Partition the matrix $$U$$ conformally to get $$A = \begin{pmatrix}\Sigma & 0\\0 & 0\end{pmatrix}\begin{pmatrix}U_{11} & U_{12}\\ U_{21} & U_{22}\end{pmatrix} = \begin{pmatrix}\Sigma U_{11} & \Sigma U_{12}\\ 0 & 0\end{pmatrix}.$$ Note that $$\mathscr B$$, whose image contains the image of $$\mathscr A$$, must be such that the restriction of $$\mathscr B$$ to its image is an orthogonal operator. In other words, the matrix $$B$$ of $$\mathscr B$$ must be block diagonal. With that in mind, define $$B = \begin{pmatrix}U_{11} & 0\\0 & 0\end{pmatrix}, \qquad C = \begin{pmatrix}U_{11}^T \Sigma U_{11} & U_{11}^T \Sigma U_{12}\\ U_{12}^T \Sigma U_{11} & U_{12}^T \Sigma U_{12}\end{pmatrix} = \begin{pmatrix}U_{11} & 0\\0 & U_{12}\end{pmatrix}^T \begin{pmatrix}\Sigma & \Sigma\\ \Sigma & \Sigma\end{pmatrix} \begin{pmatrix}U_{11} & 0\\0 & U_{12}\end{pmatrix}.$$
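The opening change of basis can be verified numerically. In the sketch below (my own illustration; the names `W` and `Uo` and the rank-2 test matrix are arbitrary choices, not from the answer), the columns of the left singular factor $$W$$ give the orthonormal basis, and $$U = V^T W$$ is the orthogonal factor: in that basis $$A$$ becomes $$\diag(\Sigma, 0)\,U$$.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(4, 2)) @ rng.normal(size=(2, 4))  # a rank-2 operator on R^4
W, s, Vt = np.linalg.svd(A)                            # A = W diag(s) V^T
S = np.diag(s)                                         # = diag(Sigma, 0): trailing entries of s vanish
Uo = Vt @ W                                            # orthogonal, being a product of orthogonals

print(np.allclose(W.T @ A @ W, S @ Uo))   # True: in basis W, A = diag(Sigma, 0) U
print(np.allclose(Uo @ Uo.T, np.eye(4)))  # True: U is orthogonal
```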

Answered by Ben Grossmann on January 7, 2022
