Efficient projection of a vector onto matrix kernel

Computational Science Asked on October 23, 2021

Given an $m \times n$ matrix $A$ and a vector $x \in \mathbb{R}^n$, with $m < n$, what's an efficient way of computing the projection of $x$ onto the kernel of $A$?

One Answer

One part of the fundamental theorem of linear algebra is that the kernel/nullspace of $\mathbf{A}$ is the orthogonal complement of the range of $\mathbf{A}^T$. By applying the QR decomposition to $\mathbf{A}^T$, you can form the orthogonal projector $\mathbf{P} = \mathbf{I} - \mathbf{Q}\mathbf{Q}^T$ (assuming $\mathbf{A}$ has full row rank, so that the columns of $\mathbf{Q}$ span exactly the range of $\mathbf{A}^T$). The vector $\mathbf{P}\mathbf{x}$ is what you're looking for. A brief MATLAB demo follows:

clear all
close all

% Form random A and x.
m = 23;
n = 39;
A = rand(m,n);
x = rand(n,1);

% Orthonormal basis Q for range(A') via economy-size QR.
[Q,~] = qr(A',0);

% Decompose x = Qx + Px
Qx = Q*(Q'*x);
Px = x-Qx;
norm_Px = norm(Px)
norm_Qx = norm(Qx)
error_x = norm(x-Px-Qx)

% Verify Px is in nullspace of A.
error_APx = norm(A*Px)
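
For readers working outside MATLAB, here is an equivalent sketch in Python with NumPy (my translation of the demo above, not part of the original answer; the dimensions and random data are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 23, 39
A = rng.random((m, n))
x = rng.random(n)

# Economy-size QR of A^T: the columns of Q are an orthonormal
# basis for range(A^T) when A has full row rank.
Q, _ = np.linalg.qr(A.T)  # Q is n x m

# Decompose x into its component in range(A^T) and its
# component in the kernel of A.
Qx = Q @ (Q.T @ x)
Px = x - Qx

# Px should be annihilated by A, up to rounding error.
print(np.linalg.norm(A @ Px))
```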

If $\mathbf{A}$ is too large but has exploitable structure (sparsity? some kind of H-matrix-like rank deficiency?), you might be better off using randomized sampling / Krylov ideas instead of a dense QR decomposition.
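One matrix-free option along these lines: the projection of $x$ onto the range of $A^T$ is $A^T y$, where $y$ minimizes $\|A^T y - x\|_2$, so a Krylov least-squares solver such as SciPy's LSQR recovers the kernel component using only matrix-vector products. A sketch, assuming SciPy is available and using an arbitrary sparse test matrix:

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import lsqr

rng = np.random.default_rng(1)
m, n = 200, 1000
A = sp.random(m, n, density=0.01, random_state=1, format="csr")
x = rng.random(n)

# The orthogonal projection of x onto range(A^T) is A^T y,
# where y minimizes ||A^T y - x||_2. LSQR needs only products
# with A^T and A, so A is never densified.
y = lsqr(A.T, x, atol=1e-12, btol=1e-12)[0]
Px = x - A.T @ y  # component of x in the kernel of A

# Should be tiny relative to ||x||.
print(np.linalg.norm(A @ Px))
```

This trades the $O(nm^2)$ dense QR cost for a handful of sparse matrix-vector products per iteration, at the price of solving the least-squares problem only to the requested tolerance.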

Answered by rchilton1980 on October 23, 2021
