And now, block 2-by-2 linear systems: a thread for a jet-lagged Sunday afternoon.
A block 2-by-2 linear system is a system involving a matrix that looks like

M = [A, B;
C, D]

where A and D are square matrices (though B and C may be rectangular).
Lots of things that don't initially look like block 2-by-2 linear systems can be turned into such systems. As a standard example: consider solving a linear system

(A+b c^T) x = f

If we introduce a new variable z = c^T x, we get a block 2-by-2 system in x and z.
That is, we get M*u = h where

M = [A, b; c', -1]
u = [x; z]
h = [f; 0]
We can do Gaussian elimination on block 2-by-2 systems the same as we would without blocks, as long as we're careful about not assuming things commute. Eliminating x in the above gives

(1+c' inv(A) b) z = c' inv(A) f

Back-substitute z to get the Sherman-Morrison update.
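Here's a small NumPy sketch of that elimination (random test data, purely illustrative): solving (A + b c') x = f directly should match the answer obtained by first solving the scalar equation for z and then back-substituting.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n)) + n * np.eye(n)  # well-conditioned base matrix
b = rng.standard_normal(n)
c = rng.standard_normal(n)
f = rng.standard_normal(n)

# Direct solve of the rank-1-updated system (A + b c') x = f
x_direct = np.linalg.solve(A + np.outer(b, c), f)

# Block elimination: (1 + c' inv(A) b) z = c' inv(A) f, then x = inv(A) (f - b z)
Ainv_f = np.linalg.solve(A, f)
Ainv_b = np.linalg.solve(A, b)
z = (c @ Ainv_f) / (1 + c @ Ainv_b)
x_block = Ainv_f - Ainv_b * z

assert np.allclose(x_direct, x_block)
```

Note that only solves with A are needed, never with the updated matrix; that's the whole point of the Sherman-Morrison trick.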
One can also use borders to solve a piece of a linear system. For example, to solve all but row i of

Ax = b

with a zero in the jth entry of x, we solve Mu = h with

M = [A, ei; ej', 0]
u = [x; y]
h = [b; 0]

where ei, ej are the ith / jth columns of the identity.
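A quick numerical sanity check of the bordered system (NumPy, with made-up data): the extra unknown y soaks up the residual in equation i, while the extra equation pins x_j to zero.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6
i, j = 2, 4  # skip equation i, force entry j of x to zero (0-based here)
A = rng.standard_normal((n, n)) + n * np.eye(n)
b = rng.standard_normal(n)

ei = np.zeros(n); ei[i] = 1.0
ej = np.zeros(n); ej[j] = 1.0

# Bordered system: [A, ei; ej', 0] [x; y] = [b; 0]
M = np.block([[A, ei[:, None]], [ej[None, :], np.zeros((1, 1))]])
u = np.linalg.solve(M, np.concatenate([b, [0.0]]))
x, y = u[:n], u[n]

residual = A @ x - b
assert abs(x[j]) < 1e-10                        # x_j is pinned to zero
assert np.allclose(np.delete(residual, i), 0)   # every row except i is satisfied
```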
That one's useful for deriving fast formulas for things like leave-one-out cross-validation (LOOCV) for linear least squares. Also good for fast iteratively reweighted least squares with Huber loss (or related) for dealing with outliers.
The 2-by-2 linear system Mu = h with

M = [I, A; A', -d^2*I]
u = [r; x]
h = [b; 0]

corresponds to a regularized least squares problem

min norm(Ax-b)^2 + d^2 norm(x)^2

where r = b-Ax is the residual. Eliminating r gives the usual normal equations.
This one is also great because in the limit as d goes to zero, we recover a minimal norm solution to a (possibly) under-determined problem. And we can replace the diagonal blocks by something positive definite to get least squares with other norms.
Because this single linear system can be used to deal with (regularized) least squares problems or with minimal norm problems, it's great for thinking about connections between the two.
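A sketch of that augmented system in NumPy (random data, names illustrative): using -d^2*I in the (2,2) block, eliminating r from the block system should reproduce the Tikhonov-regularized normal equations (A'A + d^2 I) x = A'b.

```python
import numpy as np

rng = np.random.default_rng(2)
m, n, d = 8, 3, 0.1
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

# Augmented system: [I, A; A', -d^2*I] [r; x] = [b; 0]
M = np.block([[np.eye(m), A], [A.T, -d**2 * np.eye(n)]])
u = np.linalg.solve(M, np.concatenate([b, np.zeros(n)]))
r, x = u[:m], u[m:]

# Eliminating r gives the normal equations (A'A + d^2 I) x = A'b
x_normal = np.linalg.solve(A.T @ A + d**2 * np.eye(n), A.T @ b)
assert np.allclose(x, x_normal)
assert np.allclose(r, b - A @ x)  # first block row says r is the residual
```

One practical reason to prefer the augmented form: it avoids explicitly forming A'A, whose conditioning is the square of A's.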
Combine the idea of "think about least squares / minimal norm via block 2-by-2" and "add border to eliminate an equation and unknown," and you get lots of fun linear algebra tricks relevant for fast Gaussian process computations (or other kernel problems).
Of course, the block 2-by-2 least squares formulation is closely connected to the matrix

M = [0, A; A', 0],

which I think of as the Golub-Kahan matrix; the eigenvalues / vectors of this matrix encode the singular values / vectors of A.
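This is easy to check numerically (a NumPy sketch with a random rectangular A): the nonzero eigenvalues of the Golub-Kahan matrix come in +/- pairs whose magnitudes are the singular values of A.

```python
import numpy as np

rng = np.random.default_rng(3)
m, n = 5, 3
A = rng.standard_normal((m, n))

# Golub-Kahan matrix: symmetric, (m+n)-by-(m+n)
M = np.block([[np.zeros((m, m)), A], [A.T, np.zeros((n, n))]])
eigvals = np.linalg.eigvalsh(M)
svals = np.linalg.svd(A, compute_uv=False)

# Positive eigenvalues match the singular values; the spectrum also
# contains their negatives plus |m - n| zeros.
assert np.allclose(np.sort(eigvals[eigvals > 1e-10]), np.sort(svals))
```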
The KKT matrix for constrained quadratic optimization is another good example:

minimize f(x) = x' A x / 2 - b' x subject to c' x = d

The minimizer (assuming A is positive definite) satisfies Mu = h where

M = [A, c; c', 0]
u = [x; y]
h = [b; d]
It's easy to explain a lot about the primal-dual relationships of convex optimization in this type of setting: just do Gaussian elimination in different orders.
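Here's a minimal NumPy sketch of the KKT system (random positive definite A, illustrative data): solving the bordered system gives a point satisfying both the constraint and stationarity of the Lagrangian, with y playing the role of the multiplier.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 4
G = rng.standard_normal((n, n))
A = G @ G.T + np.eye(n)   # symmetric positive definite
b = rng.standard_normal(n)
c = rng.standard_normal(n)
d = 1.5

# KKT system: [A, c; c', 0] [x; y] = [b; d]
M = np.block([[A, c[:, None]], [c[None, :], np.zeros((1, 1))]])
u = np.linalg.solve(M, np.concatenate([b, [d]]))
x, y = u[:n], u[n]

assert np.isclose(c @ x, d)                # constraint c'x = d holds
assert np.allclose(A @ x - b + y * c, 0)   # stationarity: grad_x L = 0
```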
Block Gaussian elimination on M = [A, B; C, D] can be written as a block LU factorization

M = [I, 0; C*inv(A), I] * [A, B; 0, S]

where S = D-C*inv(A)*B is the Schur complement. If we let P = inv(M), then the (2,2) submatrix of P (partitioned as with M) is P22 = inv(S).
This relationship is useful for thinking about the precision and covariance for jointly distributed random variables. It's where the formula for predictive variance for a Gaussian process comes from -- or the formula for the power function in RBF interpolation theory.
One can also understand an awful lot about linear boundary integral equations by understanding how Schur complements work (matrices get replaced by operators, but the block 2-by-2 idea remains the same).
Really, every time I teach matrix computations, I feel at the end like I've spent a fair fraction of the course talking about some manipulation on block 2-by-2 matrices.
Thread by David Bindel.