Matrix Algebra
If A is an m x n matrix and B is an n x p matrix, AB will result in an m x p matrix.
Here is an example using a 2 x 2 matrix A and a 2 x 3 matrix B.
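The original example is a worked product of this shape; as a sketch, here is that 2 x 2 by 2 x 3 multiplication in pure Python (the entries are chosen arbitrarily for illustration):

```python
def matmul(A, B):
    """Multiply an m x n matrix A by an n x p matrix B (lists of rows)."""
    m, n, p = len(A), len(B), len(B[0])
    assert all(len(row) == n for row in A), "inner dimensions must match"
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(p)]
            for i in range(m)]

A = [[1, 2],
     [3, 4]]           # 2 x 2
B = [[5, 6, 7],
     [8, 9, 10]]       # 2 x 3
C = matmul(A, B)       # 2 x 3, as expected
# C == [[21, 24, 27], [47, 54, 61]]
```

Each entry of C is the dot product of a row of A with a column of B, which is why the inner dimensions (n) must match and why the result is m x p.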
Given that A is an m x n matrix and that B and C are matrices with sizes for which the following sums and products are defined, the following are true.
A(BC) = (AB)C
A(B + C) = AB + AC
(B + C)A = BA + CA
r(AB) = (rA)B = A(rB)
However, the following are typically not true.
AB = BA
AB = AC implies B = C
AB = 0 implies A = 0 or B = 0
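Concrete counterexamples make these failures easy to see; a minimal sketch, with matrices picked arbitrarily to break each rule:

```python
def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

# AB != BA in general:
A, B = [[1, 1], [0, 1]], [[1, 0], [1, 1]]
assert matmul(A, B) != matmul(B, A)   # [[2, 1], [1, 1]] vs [[1, 1], [1, 2]]

# AB can be the zero matrix even though A != 0 and B != 0:
C, D = [[1, 0], [0, 0]], [[0, 0], [0, 1]]
assert matmul(C, D) == [[0, 0], [0, 0]]
```

The second pair also serves as a counterexample to cancellation: matmul(C, D) equals matmul(C, [[0, 0], [0, 0]]) even though D is not the zero matrix.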
The transpose of A, denoted A^T, is the matrix whose columns are formed from the corresponding rows of A.
Of course, there are a few theorems associated with transposing as well.
(A^T)^T = A
(A + B)^T = A^T + B^T
(rA)^T = rA^T
(AB)^T = B^T A^T
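These theorems can be checked numerically; a small sketch with arbitrary 2 x 2 matrices, paying particular attention to the reversed order in (AB)^T = B^T A^T:

```python
def transpose(M):
    """Columns of the result are the rows of M."""
    return [list(col) for col in zip(*M)]

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

A = [[1, 2], [3, 4]]
B = [[0, 5], [6, 7]]

assert transpose(transpose(A)) == A                                   # (A^T)^T = A
assert transpose(matmul(A, B)) == matmul(transpose(B), transpose(A))  # (AB)^T = B^T A^T
```

Note that matmul(transpose(A), transpose(B)) would generally give a different matrix, which is the "order matters" point.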
A square matrix A is invertible if there exists a matrix C such that CA = I and AC = I.
The matrix C is the inverse matrix of A, and C is unique. Inverse matrices are denoted by ^-1, so C = A^-1. We can use the determinant to find out whether a matrix has an inverse and what the inverse is.
The determinant of A is denoted det A. For a 2 x 2 matrix with rows [a b] and [c d], det A = ad - bc.
If the determinant does not equal zero, there exists an inverse.
If A has an inverse, then for each b in R^n, the equation Ax = b has a unique solution, x = A^-1b.
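For the 2 x 2 case this is easy to carry out by hand: the inverse is 1/(ad - bc) times the matrix with rows [d -b] and [-c a]. A sketch, with A and b chosen arbitrarily:

```python
def inv2(A):
    """Inverse of a 2x2 matrix via the determinant formula."""
    (a, b), (c, d) = A
    det = a * d - b * c
    if det == 0:
        raise ValueError("determinant is zero: no inverse")
    return [[d / det, -b / det], [-c / det, a / det]]

A = [[4, 7], [2, 6]]    # det = 4*6 - 7*2 = 10, so A is invertible
b = [1, 2]
Ainv = inv2(A)
# unique solution of Ax = b is x = A^-1 b
x = [Ainv[0][0] * b[0] + Ainv[0][1] * b[1],
     Ainv[1][0] * b[0] + Ainv[1][1] * b[1]]   # x is approximately [-0.8, 0.6]
```

Multiplying A by x reproduces b, confirming uniqueness of the solution for this b.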
Also note the effect of inversion on a product of invertible matrices: (AB)^-1 = B^-1 A^-1.
The order on the right-hand side of the above theorem is important. It cannot be swapped.
The inverse of a square n x n matrix A can be found by row-reducing the augmented matrix [A I_n].
Reminder that I_n is the n x n identity matrix.
By row-reducing the A half to I_n, you are left with the matrix [I_n A^-1].
If A does not reduce to I_n, A is not invertible.
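The [A I_n] row-reduction procedure can be sketched directly in pure Python; this version adds partial pivoting for numerical stability, and the test matrix is chosen arbitrarily:

```python
def inverse(A):
    """Invert a square matrix by row-reducing [A | I_n] to [I_n | A^-1]."""
    n = len(A)
    # build the augmented matrix [A | I_n]
    M = [list(map(float, A[i])) + [float(i == j) for j in range(n)]
         for i in range(n)]
    for col in range(n):
        # partial pivoting: pick the row with the largest entry in this column
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        if abs(M[piv][col]) < 1e-12:
            raise ValueError("A does not reduce to I_n: not invertible")
        M[col], M[piv] = M[piv], M[col]
        # scale the pivot row to 1, then clear the column everywhere else
        p = M[col][col]
        M[col] = [x / p for x in M[col]]
        for r in range(n):
            if r != col and M[r][col]:
                f = M[r][col]
                M[r] = [x - f * y for x, y in zip(M[r], M[col])]
    # the right half is now A^-1
    return [row[n:] for row in M]

A = [[2, 1], [5, 3]]        # det = 1
Ainv = inverse(A)           # approximately [[3.0, -1.0], [-5.0, 2.0]]
```

A singular input such as [[1, 2], [2, 4]] raises the ValueError, matching the "does not reduce to I_n" case.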
The invertible matrix theorem collects a slew of statements that are all logically equivalent to the statement that A is an invertible n x n matrix.
If A^-1 exists, then the following also hold true.
A is row equivalent to I_n.
A has n pivot positions.
Ax = 0 only has the trivial solution.
Columns of A are linearly independent.
The transformation T: x -> Ax is one-to-one.
The equation Ax = b has at least one solution for each b in R^n (the unique solution, A^-1b).
The columns of A span all of R^n.
T: x-> Ax is onto R^n
There exists a matrix D such that AD = I_n.
The transpose of A, A^T, is invertible.
A subspace of R^n is defined as a set H in R^n that satisfies the following conditions:
The zero vector 0 is in H.
H is closed under vector addition — meaning that if u and v are in H, then u + v must be in H.
H is closed under scalar multiplication — meaning that if u is in H, then cu must be in H for every scalar c.
The column space of a matrix A (with the notation Col A) is the set of all linear combinations of the columns of A. If A is m x n, Col A is a set of vectors that live in R^m.
The null space of a matrix A is the set of all vectors x such that Ax = 0. Notation is Nul A.
If A is m x n, Nul A is a set of vectors that live in R^n.
A basis for a subspace H of R^n is a set of vectors that does two things.
The set of vectors is linearly independent in the subspace H.
Those vectors span all of the subspace H.
The pivot columns of A form a basis for Col A. Use the pivot columns of A itself, NOT the columns of a row-reduced version of A — row reduction only identifies which columns are the pivot columns.
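This last point can be sketched in pure Python: row-reduce a copy of A to find the pivot column indices, then read the basis off the original A. The matrix is chosen arbitrarily:

```python
def pivot_cols(A):
    """Row-reduce a copy of A and return the indices of its pivot columns."""
    M = [list(map(float, row)) for row in A]
    rows, cols = len(M), len(M[0])
    pivots, r = [], 0
    for c in range(cols):
        if r == rows:
            break
        piv = max(range(r, rows), key=lambda i: abs(M[i][c]))
        if abs(M[piv][c]) < 1e-12:
            continue                      # no pivot in this column
        M[r], M[piv] = M[piv], M[r]
        M[r] = [x / M[r][c] for x in M[r]]
        for i in range(rows):
            if i != r:
                f = M[i][c]
                M[i] = [x - f * y for x, y in zip(M[i], M[r])]
        pivots.append(c)
        r += 1
    return pivots

A = [[1, 2, 3],
     [2, 4, 7]]
cols = pivot_cols(A)                            # columns 0 and 2 are pivots
basis = [[row[c] for row in A] for c in cols]   # columns of the ORIGINAL A
# basis == [[1, 2], [3, 7]], a basis for Col A
```

The row-reduced matrix has rows [1 2 0] and [0 0 1]; its columns (1, 0) and (0, 1) span a different set than the original columns (1, 2) and (3, 7), which is exactly why the basis must come from A itself.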