Projections and Orthogonality
Dot Products
Dot products are rather simple. Let's say you have a vector $\mathbf{u}$ with elements $u_1, u_2, \dots, u_n$ and a vector $\mathbf{v}$ with elements $v_1, v_2, \dots, v_n$. Then the dot product is as follows.

$$\mathbf{u} \cdot \mathbf{v} = u_1 v_1 + u_2 v_2 + \dots + u_n v_n$$
Here are some properties of dot products, for vectors $\mathbf{u}$, $\mathbf{v}$, $\mathbf{w}$ in $\mathbb{R}^n$ and a scalar $c$.

$$\mathbf{u} \cdot \mathbf{v} = \mathbf{v} \cdot \mathbf{u}$$

$$(\mathbf{u} + \mathbf{v}) \cdot \mathbf{w} = \mathbf{u} \cdot \mathbf{w} + \mathbf{v} \cdot \mathbf{w}$$

$$(c\mathbf{u}) \cdot \mathbf{v} = c(\mathbf{u} \cdot \mathbf{v}) = \mathbf{u} \cdot (c\mathbf{v})$$

$$\mathbf{u} \cdot \mathbf{u} \geq 0, \text{ with } \mathbf{u} \cdot \mathbf{u} = 0 \text{ only when } \mathbf{u} = \mathbf{0}$$
It's a somewhat uncommon way of expressing it, but you can also express the length (or norm) of a vector using dot products.

$$\|\mathbf{v}\| = \sqrt{\mathbf{v} \cdot \mathbf{v}}$$
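As a quick sanity check, here's a small NumPy sketch of the dot product and the norm identity above. The specific vectors are just made-up example values.

```python
import numpy as np

# Two example vectors in R^3 (arbitrary values chosen for illustration)
u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, -5.0, 6.0])

# Dot product: u1*v1 + u2*v2 + u3*v3
print(np.dot(u, v))           # 1*4 + 2*(-5) + 3*6 = 12

# Length (norm) expressed with a dot product: ||v|| = sqrt(v . v)
print(np.sqrt(np.dot(v, v)))  # matches np.linalg.norm(v)
print(np.linalg.norm(v))
```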
Orthogonal Vectors
Two vectors are orthogonal if their dot product equals zero.
The above concept is an important one to understand. It comes from the law of cosines, which gives $\mathbf{u} \cdot \mathbf{v} = \|\mathbf{u}\|\,\|\mathbf{v}\|\cos\theta$, so the dot product is zero exactly when the angle between the two vectors is $90°$ (I can't be bothered to format the full proof in LaTeX).
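A quick numerical check of that claim, using a pair of hand-picked orthogonal vectors (the specific values are just for illustration):

```python
import numpy as np

# Two vectors chosen so that their dot product is zero
u = np.array([3.0, 1.0])
v = np.array([-1.0, 3.0])

print(np.dot(u, v))  # 3*(-1) + 1*3 = 0, so u and v are orthogonal

# The angle between them should be 90 degrees: cos(theta) = (u . v) / (||u|| ||v||)
cos_theta = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
print(np.degrees(np.arccos(cos_theta)))  # 90.0

# Pythagorean check: ||u + v||^2 equals ||u||^2 + ||v||^2 when u . v = 0
print(np.isclose(np.linalg.norm(u + v)**2,
                 np.linalg.norm(u)**2 + np.linalg.norm(v)**2))  # True
```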
Let's talk about vector spaces $W$ and $W^\perp$.

We're gonna assume that $W$ is a subspace of $\mathbb{R}^n$.

$W^\perp$ is the vector space made of vectors where every vector in $W^\perp$ is orthogonal to every vector in $W$.

$W^\perp$ is a subspace of $\mathbb{R}^n$.

$W^\perp$ is referred to as the orthogonal complement of $W$.
There are a few important orthogonal complements to take note of, based on the subspaces we have learned. For an $m \times n$ matrix $A$:

$$(\operatorname{Row} A)^\perp = \operatorname{Nul} A \qquad (\operatorname{Col} A)^\perp = \operatorname{Nul} A^T$$

In words, the first statement says that every vector in $\operatorname{Row} A$ is orthogonal to every vector in $\operatorname{Nul} A$. The second statement says that every vector in $\operatorname{Col} A$ is orthogonal to every vector in $\operatorname{Nul} A^T$.

The proof of the first is simply that the definition of the null space of $A$ is that $A\mathbf{x} = \mathbf{0}$, meaning that the dot product of each row of $A$ with $\mathbf{x}$ is $0$, fitting the definition of orthogonal vectors. Since the rows of $A$ span $\operatorname{Row} A$, $\mathbf{x}$ is orthogonal to everything in $\operatorname{Row} A$. Applying the same argument to $A^T$ gives the second statement.
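Here's a small numerical illustration of the first statement, with a matrix and a null-space vector picked by hand for the example:

```python
import numpy as np

# A 2x3 example matrix (chosen by hand for illustration)
A = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 2.0]])

# x is in Nul A: check that A @ x = 0
x = np.array([1.0, -2.0, 1.0])
print(A @ x)  # [0. 0.]

# Each row of A is therefore orthogonal to x (every dot product is 0)
for row in A:
    print(np.dot(row, x))  # 0.0

# Any vector in Row A (a linear combination of the rows) is also orthogonal to x
row_vec = 2 * A[0] - 5 * A[1]
print(np.dot(row_vec, x))  # 0.0
```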
Orthogonal Sets
An orthogonal set is a set of vectors in which each vector in the set is orthogonal to every other vector in the set. Note that if the set consists entirely of nonzero vectors, then the vectors in the set are linearly independent.
An orthogonal basis for a subspace $W$ of $\mathbb{R}^n$ is exactly what it sounds like: a basis for $W$ that is also an orthogonal set.
Here is a theorem about orthogonal sets. Assuming $\{\mathbf{u}_1, \dots, \mathbf{u}_p\}$ is an orthogonal basis for $W$, then for each $\mathbf{y}$ in $W$, the weights in the linear combination

$$\mathbf{y} = c_1\mathbf{u}_1 + c_2\mathbf{u}_2 + \dots + c_p\mathbf{u}_p$$

are given by

$$c_j = \frac{\mathbf{y} \cdot \mathbf{u}_j}{\mathbf{u}_j \cdot \mathbf{u}_j}$$
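A short sketch of that theorem in NumPy, using a hand-picked orthogonal basis of $\mathbb{R}^2$ (the basis and the target vector are just example values):

```python
import numpy as np

# An orthogonal basis for R^2 (u1 . u2 = 0)
u1 = np.array([2.0, 1.0])
u2 = np.array([-1.0, 2.0])

# Some vector y we want to express in terms of u1 and u2
y = np.array([3.0, 7.0])

# Weights from the theorem: c_j = (y . u_j) / (u_j . u_j)
c1 = np.dot(y, u1) / np.dot(u1, u1)
c2 = np.dot(y, u2) / np.dot(u2, u2)

# Reconstruct y from the weights; this matches the original y
print(c1 * u1 + c2 * u2)  # [3. 7.]
print(c1, c2)             # 2.6, 2.2
```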
Decomposition and Projection
Let's say we had a vector $\mathbf{y}$ in $\mathbb{R}^n$. We can decompose, or break up, that vector into two vectors: one a multiple of some nonzero vector $\mathbf{u}$, and some other vector $\mathbf{z}$ orthogonal to it. In other words,

$$\mathbf{y} = \hat{\mathbf{y}} + \mathbf{z} = \alpha\mathbf{u} + \mathbf{z}$$

The vector along $\mathbf{u}$ is called the orthogonal projection of $\mathbf{y}$ onto $\mathbf{u}$, and it is written as $\hat{\mathbf{y}} = \operatorname{proj}_{\mathbf{u}} \mathbf{y}$; the other piece is $\mathbf{z} = \mathbf{y} - \hat{\mathbf{y}}$. In accordance with our original goal of decomposing the vector into the sum of two vectors, we find that $\mathbf{y} = \alpha\mathbf{u} + (\mathbf{y} - \alpha\mathbf{u})$. However, this form is useless insofar as we don't know the constant $\alpha$. Here is a more useful form that doesn't involve calculating the constant at all:

$$\hat{\mathbf{y}} = \operatorname{proj}_{\mathbf{u}} \mathbf{y} = \frac{\mathbf{y} \cdot \mathbf{u}}{\mathbf{u} \cdot \mathbf{u}}\,\mathbf{u}$$
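Here's what that projection formula looks like in NumPy, with example vectors chosen arbitrarily:

```python
import numpy as np

# Project y onto the line spanned by the nonzero vector u
y = np.array([7.0, 6.0])
u = np.array([4.0, 2.0])

# y_hat = proj_u(y) = ((y . u) / (u . u)) * u
y_hat = (np.dot(y, u) / np.dot(u, u)) * u
z = y - y_hat  # the component of y orthogonal to u

print(y_hat)         # [8. 4.]
print(z)             # [-1. 2.]
print(np.dot(z, u))  # 0.0 -- z really is orthogonal to u
print(y_hat + z)     # [7. 6.] -- the decomposition recovers y
```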
The Orthogonal Decomposition Theorem
Let $W$ be a subspace of $\mathbb{R}^n$. Then each $\mathbf{y}$ in $\mathbb{R}^n$ can be expressed uniquely in the following form.

$$\mathbf{y} = \hat{\mathbf{y}} + \mathbf{z}$$

where $\hat{\mathbf{y}}$ is in $W$ and $\mathbf{z}$ is in $W^\perp$. Keeping in mind that $\{\mathbf{u}_1, \dots, \mathbf{u}_p\}$ is an orthogonal basis for $W$, the following holds true.

$$\hat{\mathbf{y}} = \operatorname{proj}_W \mathbf{y} = \frac{\mathbf{y} \cdot \mathbf{u}_1}{\mathbf{u}_1 \cdot \mathbf{u}_1}\mathbf{u}_1 + \dots + \frac{\mathbf{y} \cdot \mathbf{u}_p}{\mathbf{u}_p \cdot \mathbf{u}_p}\mathbf{u}_p$$

And of course,

$$\mathbf{z} = \mathbf{y} - \hat{\mathbf{y}}$$
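A sketch of the decomposition in NumPy, projecting onto a plane $W$ in $\mathbb{R}^3$ spanned by an orthogonal basis picked for the example:

```python
import numpy as np

# Orthogonal basis for a plane W in R^3 (u1 . u2 = 0)
u1 = np.array([2.0, 5.0, -1.0])
u2 = np.array([-2.0, 1.0, 1.0])

y = np.array([1.0, 2.0, 3.0])

# y_hat = proj_W(y): sum of the projections onto each orthogonal basis vector
y_hat = (np.dot(y, u1) / np.dot(u1, u1)) * u1 \
      + (np.dot(y, u2) / np.dot(u2, u2)) * u2
z = y - y_hat

# z lies in W-perp: it is orthogonal to both basis vectors of W
print(np.dot(z, u1), np.dot(z, u2))  # both (numerically) 0
print(y_hat + z)                     # recovers y
```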
The Best Approximation Theorem
Given that $W$ is a subspace of $\mathbb{R}^n$, $\mathbf{y}$ is any vector in $\mathbb{R}^n$, and $\hat{\mathbf{y}}$ is the orthogonal projection of $\mathbf{y}$ onto $W$, then $\hat{\mathbf{y}}$ is the closest point in $W$ to $\mathbf{y}$. That is, for every $\mathbf{v}$ in $W$ other than $\hat{\mathbf{y}}$,

$$\|\mathbf{y} - \hat{\mathbf{y}}\| < \|\mathbf{y} - \mathbf{v}\|$$
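Continuing the example above, a quick check that the projection really is the closest point in $W$. The competing points of $W$ are random, purely for comparison.

```python
import numpy as np

# Same setup as before: orthogonal basis for W and a vector y (example values)
u1 = np.array([2.0, 5.0, -1.0])
u2 = np.array([-2.0, 1.0, 1.0])
y = np.array([1.0, 2.0, 3.0])

y_hat = (np.dot(y, u1) / np.dot(u1, u1)) * u1 \
      + (np.dot(y, u2) / np.dot(u2, u2)) * u2

# Compare ||y - y_hat|| against ||y - v|| for a few random points v in W
rng = np.random.default_rng(0)
best = np.linalg.norm(y - y_hat)
for _ in range(5):
    a, b = rng.normal(size=2)
    v = a * u1 + b * u2                   # an arbitrary point of W
    print(best <= np.linalg.norm(y - v))  # True every time
```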
Properties of Orthonormal Matrices
An orthonormal matrix is a matrix whose columns form an orthogonal set in $\mathbb{R}^n$ and are all unit vectors (meaning that each vector's magnitude is one). Let's look at some theorems about such matrices, given that $U$ is an orthonormal matrix and $\mathbf{x}$ and $\mathbf{y}$ are vectors.

$$U^T U = I$$

$$\|U\mathbf{x}\| = \|\mathbf{x}\|$$

$$(U\mathbf{x}) \cdot (U\mathbf{y}) = \mathbf{x} \cdot \mathbf{y}$$

$$(U\mathbf{x}) \cdot (U\mathbf{y}) = 0 \text{ if and only if } \mathbf{x} \cdot \mathbf{y} = 0$$
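A quick NumPy check of those properties, using a small matrix with orthonormal columns built by hand for the example:

```python
import numpy as np

# A 3x2 matrix whose columns are orthonormal (orthogonal unit vectors)
U = np.array([[1/np.sqrt(2),  2/3],
              [1/np.sqrt(2), -2/3],
              [0.0,           1/3]])

x = np.array([2.0, 3.0])
y = np.array([-1.0, 4.0])

print(np.allclose(U.T @ U, np.eye(2)))                       # U^T U = I
print(np.isclose(np.linalg.norm(U @ x), np.linalg.norm(x)))  # lengths preserved
print(np.isclose((U @ x) @ (U @ y), x @ y))                  # dot products preserved
```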
The Gram-Schmidt Process
The Gram-Schmidt process is a painful method of finding an orthogonal (and, after normalizing, orthonormal) basis for a subspace. Given any basis $\{\mathbf{x}_1, \dots, \mathbf{x}_p\}$ for a nonzero subspace $W$ of $\mathbb{R}^n$, the following holds true.

$$\mathbf{v}_1 = \mathbf{x}_1$$

$$\mathbf{v}_2 = \mathbf{x}_2 - \frac{\mathbf{x}_2 \cdot \mathbf{v}_1}{\mathbf{v}_1 \cdot \mathbf{v}_1}\mathbf{v}_1$$

$$\vdots$$

$$\mathbf{v}_p = \mathbf{x}_p - \frac{\mathbf{x}_p \cdot \mathbf{v}_1}{\mathbf{v}_1 \cdot \mathbf{v}_1}\mathbf{v}_1 - \dots - \frac{\mathbf{x}_p \cdot \mathbf{v}_{p-1}}{\mathbf{v}_{p-1} \cdot \mathbf{v}_{p-1}}\mathbf{v}_{p-1}$$

Then $\{\mathbf{v}_1, \dots, \mathbf{v}_p\}$ is an orthogonal basis for $W$. Dividing each $\mathbf{v}_i$ by its length gives an orthonormal basis.
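A minimal sketch of the process in NumPy: a straightforward, unoptimized implementation of the formulas above, not the numerically stable variant that libraries use. The `gram_schmidt` helper and the example basis are my own names and values.

```python
import numpy as np

def gram_schmidt(basis):
    """Turn a list of linearly independent vectors into an orthonormal basis."""
    ortho = []
    for x in basis:
        v = x.astype(float)
        # Subtract the projection of x onto each previously built vector
        for u in ortho:
            v -= (np.dot(x, u) / np.dot(u, u)) * u
        ortho.append(v)
    # Normalize each vector to get an orthonormal basis
    return [v / np.linalg.norm(v) for v in ortho]

# Example: a (non-orthogonal) basis for a subspace of R^3
basis = [np.array([1.0, 1.0, 1.0]), np.array([0.0, 1.0, 1.0])]
q1, q2 = gram_schmidt(basis)
print(np.dot(q1, q2))                          # ~0: orthogonal
print(np.linalg.norm(q1), np.linalg.norm(q2))  # both 1: unit length
```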
QR Decomposition
A matrix $A$ with linearly independent columns can be decomposed into a product $A = QR$ between an orthonormal matrix $Q$ and an upper triangular matrix $R$. The columns of $Q$ can be found by applying the Gram-Schmidt process to the columns of $A$ and turning the results into unit vectors.

To find $R$, start from $A = QR$. Well, we know $Q$ is orthonormal, so we can rewrite this. Remember a property mentioned earlier, that $Q^T Q = I$ where $Q$ is orthonormal. Therefore, we can rewrite the above equation by multiplying both sides by $Q^T$ like so.

$$Q^T A = Q^T Q R$$

Which simplifies to

$$R = Q^T A$$
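Putting it together in NumPy: build $Q$ by Gram-Schmidt on the columns of an example matrix (inlined here so the block stands alone), compute $R = Q^T A$, and confirm that $QR$ rebuilds $A$. NumPy's built-in `np.linalg.qr` is shown for comparison; its factors may differ by signs.

```python
import numpy as np

# Example matrix with linearly independent columns
A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])

# Gram-Schmidt on the columns of A, then normalize, to get Q
Q_cols = []
for x in A.T:
    v = x.copy()
    for q in Q_cols:
        v -= np.dot(x, q) * q  # q is already a unit vector
    Q_cols.append(v / np.linalg.norm(v))
Q = np.column_stack(Q_cols)

# R = Q^T A, which comes out upper triangular
R = Q.T @ A
print(np.round(R, 6))
print(np.allclose(Q @ R, A))        # True: A = QR

# Compare with NumPy's built-in QR (may differ in the signs of columns/rows)
Q_np, R_np = np.linalg.qr(A)
print(np.allclose(Q_np @ R_np, A))  # True
```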