About
The traditional notion of a matrix is a two-dimensional array (a rectangular table) of known or unknown numbers.
One simple role for a matrix: packing together a bunch of columns or rows
Example
Matrix <math>A_{i,j}</math> or <math>A[i,j]</math>
 | <math>j=@</math> | <math>j=\#</math> | <math>j=\%</math> |
---|---|---|---|
<math>i=a</math> | 1 | 2 | 3 |
<math>i=b</math> | 4 | 5 | 6 |
where:
Index | Type | In Matrix A |
---|---|---|
i | rows | 2 rows: [1,2,3] and [4,5,6] |
j | columns | 3 columns: [1,4], [2,5] and [3,6] |
Rows and columns are vectors:
- Row 'a' is the vector [1,2,3]
- Column '%' is the vector [3,6]
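As a rough sketch (using plain Python dictionaries, not any particular matrix library), the labeled matrix A above can be stored and sliced into its row and column vectors like this:
<code python>
# The example matrix A, keyed by (row label, column label) pairs
A = {('a', '@'): 1, ('a', '#'): 2, ('a', '%'): 3,
     ('b', '@'): 4, ('b', '#'): 5, ('b', '%'): 6}

row_a = [A[('a', j)] for j in ['@', '#', '%']]   # the vector [1, 2, 3]
col_percent = [A[(i, '%')] for i in ['a', 'b']]  # the vector [3, 6]
print(row_a, col_percent)
</code>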
Order
The size, or order, of a matrix is given by identifying the number of rows and columns.
The order of matrix A is 2×3
Function
Matrix as a function: an R x C matrix over the field F is a function from R x C to F, where:
- R is a finite set of unique row labels: {a, b}
- C is a finite set of unique column labels: {@, #, %}
Since a matrix is such a function, it can be interpreted as an R x C-vector: a vector whose entries are labeled by the pairs in R x C.
From matrix to function: <math>f (x) = M * x</math> where M is a matrix
If M is an R x C matrix over <math>\mathbb{F}</math> then
- domain of f is <math>\mathbb{F}^C</math>
- co-domain of f is <math>\mathbb{F}^R</math>
Examples of such functions:
- Rotation
- Translation
Example
Example 1
Let M be the matrix <math> \begin{array}{r|rrr} & \# & @ & ? \\ \hline a & 1 & 2 & 3 \\ b & 10 & 20 & 30 \end{array} </math> and define <math>f ({\bf x}) = M * {\bf x}</math>. Then:
- domain of f is <math>\mathbb{F}^{\{\#,@,?\}}</math>
- co-domain of f is <math>\mathbb{F}^{\{a,b\}}</math>
f maps <math> \begin{array}{rrr} \# & @ & ? \\ \hline 2 & 2 & -2 \\ \end{array} </math> to <math> \begin{array}{rr} a & b \\ \hline 0 & 0 \\ \end{array} </math>
Example 2
Define <math> f({\bf x}) = \begin{bmatrix} 1 & 2 & 3 \\ 10 & 20 & 30 \end{bmatrix} * {\bf x} </math> :
- domain of f is <math>\mathbb{R}^3</math>
- co-domain of f is <math>\mathbb{R}^2</math>
f maps [2, 2, -2] to [0, 0]
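A quick numeric check of Example 2 (assuming numpy is available):
<code python>
import numpy as np

M = np.array([[1, 2, 3],
              [10, 20, 30]])
x = np.array([2, 2, -2])
print(M @ x)  # [0 0]: f maps [2, 2, -2] to [0, 0]
</code>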
Dimension
For the following function
- <math>f : \mathbb{F}^A \rightarrow \mathbb{F}^B</math>
- also written as <math>f : \mathbb{F}^n \rightarrow \mathbb{F}^m</math>
- <math>f ({\bf x}) = M * {\bf x}</math>
we know:
- Since the domain is <math>\mathbb{F}^A</math>, the input x must be an A-vector.
- Since the co-domain is <math>\mathbb{F}^B</math>, the output must be a B-vector.
- Therefore M must be a B x A matrix, as the shape check below illustrates.
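The same shape reasoning, sketched with numpy (an assumption; any matrix library behaves the same way):
<code python>
import numpy as np

M = np.array([[1, 2, 3],
              [10, 20, 30]])  # a B x A matrix: |B| = 2 rows, |A| = 3 columns
x = np.array([2, 2, -2])      # an A-vector: 3 entries
y = M @ x                     # a B-vector: 2 entries
print(x.shape, y.shape)       # (3,) (2,)
</code>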
Operations
Transpose
Transpose swaps rows and columns. See Dimensional Data Operation - (Pivot|Transpose|Cross-tab|Matrix)
<MATH> \begin{bmatrix} \begin{array}{rrr} 4 & 1 & -3 \\ 2 & 2 & -2 \end{array} \end{bmatrix}^T = \begin{bmatrix} \begin{array}{rr} 4 & 2 \\ 1 & 2 \\ -3 & -2 \end{array} \end{bmatrix} </MATH>
<MATH> \begin{bmatrix} \begin{array}{r} 4 \\ 1 \\ -3 \end{array} \end{bmatrix}^T = \begin{bmatrix} \begin{array}{rrr} 4 & 1 & -3 \end{array} \end{bmatrix} </MATH>
<MATH> \begin{bmatrix} \begin{array}{rr} 4 & 2 \\ 1 & 2 \\ -3 & -2 \end{array} \end{bmatrix}^T = \begin{bmatrix} \begin{array}{rrr} 4 & 1 & -3 \\ 2 & 2 & -2 \\ \end{array} \end{bmatrix} </MATH>
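The first and third identities above, checked with numpy (an assumed dependency):
<code python>
import numpy as np

A = np.array([[4, 1, -3],
              [2, 2, -2]])
print(A.T)    # the 3 x 2 transpose shown above
print(A.T.T)  # transposing twice gives back the original 2 x 3 matrix
</code>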
Inverting
Only square matrices can be inverted, and square matrices are not guaranteed to have an inverse. If the inverse exists, then multiplying a matrix by its inverse will produce the identity matrix.
Theorem: The transpose of an invertible matrix is invertible.
If <math>A</math> has an inverse <math>A^{-1}</math>, then <math>AA^{-1}</math> is the identity matrix.
Converse: if BA is an identity matrix, are A and B necessarily inverses of each other? Not in general, but it does hold for square matrices:
Theorem: Suppose A and B are square matrices such that BA is an identity matrix. Then A and B are inverses of each other.
Matrices A and B are inverses of each other if and only if both AB and BA are identity matrices.
<MATH> A B = B A = I_n </MATH>
An invertible matrix is a matrix that has an inverse.
Corollary
Let A be an R x C matrix. Then A is invertible if and only if |R| = |C| and the columns of A are linearly independent.
Proof: Let <math>\mathbb{F}</math> be the field. Define <math>f : \mathbb{F}^C \rightarrow \mathbb{F}^R</math> by <math>f (x) = Ax</math>. Then A is an invertible matrix if and only if f is an invertible function.
The function f is invertible
- iff dim Ker f = 0 and dim <math>\mathbb{F}^C</math> = dim <math>\mathbb{F}^R</math>
- iff nullity A = 0 and |C| = |R|.
nullity A = 0
- iff dim Null A = 0
- iff Null A = {0}
- iff the only vector x such that Ax = 0 is x = 0
- iff the columns of A are linearly independent
Example:
- <math>\begin{bmatrix}1 & 2 & 3 \\ 4 & 5 & 6\end{bmatrix}</math> is not square and so cannot be invertible.
- <math>\begin{bmatrix}1 & 2 \\ 3 & 4 \end{bmatrix}</math> is square and its columns are linearly independent so it is invertible.
- <math>\begin{bmatrix}1 & 1 & 2 \\ 2 & 1 & 3 \\ 3 & 1 & 4 \end{bmatrix}</math> is square but its columns are not linearly independent (the third column is the sum of the first two), so it is not invertible.
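A minimal sketch of the corollary, assuming numpy; is_invertible is a hypothetical helper, not a library function:
<code python>
import numpy as np

def is_invertible(A):
    # Square with linearly independent columns <=> rank equals column count
    rows, cols = A.shape
    return rows == cols and np.linalg.matrix_rank(A) == cols

print(is_invertible(np.array([[1, 2, 3], [4, 5, 6]])))             # False: not square
print(is_invertible(np.array([[1, 2], [3, 4]])))                   # True
print(is_invertible(np.array([[1, 1, 2], [2, 1, 3], [3, 1, 4]])))  # False: dependent columns
</code>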
Null Space
The null space of a matrix A (written Null A) is: <MATH> \{u : A * u = 0\} </MATH>
Nullity
The nullity of matrix A is the dimension of its null space, written:
- dim Null A
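Since (by the rank-nullity theorem) nullity A = number of columns - rank A, the nullity can be sketched with numpy's rank routine (numpy being an assumption here):
<code python>
import numpy as np

def nullity(A):
    # rank-nullity: dim Null A = |columns| - rank A
    return A.shape[1] - np.linalg.matrix_rank(A)

A = np.array([[1, 1, 2],
              [2, 1, 3],
              [3, 1, 4]])
print(nullity(A))  # 1: the third column is the sum of the first two
</code>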
Type
One-column / One-row
A vector can be:
- a one-column matrix called a column vector
- a one-row matrix called a row vector
Identity
The D x D identity matrix is the matrix <math>1_D</math> such that <math>1_D [k, k] = 1 \text{ for all } k \in D</math> and zero elsewhere.
Often the letter I (for “identity”) is used instead of 1.
<MATH> \mathbf{I_n} = \begin{bmatrix} 1 & 0 & 0 & \dots & 0 \\ 0 & 1 & 0 & \dots & 0 \\ 0 & 0 & 1 & \dots & 0 \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & 0 & \dots & 1 \end{bmatrix} </MATH>
<code python>
# Assumes a Mat((row_labels, col_labels), entries) matrix class: a 1 at each (k, k)
def identity(D): return Mat((D, D), {(k, k): 1 for k in D})
</code>
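For concrete numeric matrices (with labels 0..n-1), numpy ships the same thing:
<code python>
import numpy as np
print(np.identity(3))  # the 3 x 3 identity matrix I_3
</code>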
Diagonal
Let <math>d_1, \dots , d_n</math> be real numbers, and let <math>f : \mathbb{R}^n \rightarrow \mathbb{R}^n </math> be the function such that <math>f ([x_1, \dots , x_n]) = [d_1*x_1, \dots , d_n*x_n]</math>. The matrix corresponding to f is the diagonal matrix <math> \begin{bmatrix} d_1 & & \\ & \ddots & \\ & & d_n \end{bmatrix} </math>.
For a domain D, a D x D matrix M is a diagonal matrix if M[r , c] = 0 for every pair <math>r, c \in D</math> such that <math>r \neq c</math> .
A matrix is called a diagonal matrix when the only entries allowed to be nonzero form a diagonal
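A small numpy illustration (numpy being an assumption here): np.diag builds the diagonal matrix from d_1, ..., d_n, and multiplying by it scales each entry of x by the matching d_i:
<code python>
import numpy as np

d = [2.0, 3.0, 0.5]
D = np.diag(d)                 # the diagonal matrix with d on its diagonal
x = np.array([1.0, 1.0, 4.0])
print(D @ x)                   # [2. 3. 2.] = [d_1*x_1, d_2*x_2, d_3*x_3]
</code>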
Triangular
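A matrix is upper triangular when all entries below the diagonal are zero, and lower triangular when all entries above the diagonal are zero.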
Column-orthogonal
If the columns of a matrix are orthonormal, it is a column-orthogonal matrix (which, arguably, should be called column-orthonormal instead).
Orthogonal
If a matrix is square and column-orthogonal, it's an orthogonal matrix
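Equivalently, a square matrix Q is orthogonal when <math>Q^T Q</math> is the identity matrix. A quick check with numpy (an assumed dependency), using a rotation matrix as the example:
<code python>
import numpy as np

theta = np.pi / 4
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])  # rotation by 45 degrees
print(np.allclose(Q.T @ Q, np.identity(2)))      # True: Q is orthogonal
</code>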
Property
Rank
For a matrix M:
- the row rank of M is the rank of its rows,
- and the column rank of M is the rank of its columns.
Equivalently, the row rank of M is the dimension of Row M, and the column rank of M is the dimension of Col M.
Consider the matrix <MATH> M = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 2 & 0 \\ 2 & 4 & 0 \end{bmatrix} </MATH> where:
- The rows are the vectors [1, 0, 0], [0, 2, 0], [2, 4, 0]. The third row is twice the first plus twice the second, so the set of these vectors has rank two, and the row rank of M is two.
- The columns of M are [1, 0, 2], [0, 2, 4], and [0, 0, 0]. Since the third vector is the zero vector, it is not needed for spanning the column space. Since each of the first two vectors has a nonzero where the other has a zero, these two are linearly independent, so the column rank is two.
Rank Theorem: For every matrix M, row rank equals column rank.
Lemma: For any matrix A, the row rank of A is less than or equal to the column rank of A. (Applying the lemma to <math>A^T</math> yields the reverse inequality, which proves the theorem.)
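The Rank Theorem on the example matrix above, checked with numpy (an assumption):
<code python>
import numpy as np

M = np.array([[1, 0, 0],
              [0, 2, 0],
              [2, 4, 0]])
print(np.linalg.matrix_rank(M))    # 2: the row rank
print(np.linalg.matrix_rank(M.T))  # 2: the column rank, equal as the theorem promises
</code>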
Sparse
A sparse matrix has many positions with a value of zero.
Systems designed to efficiently support sparse matrices look a lot like databases: They represent each cell as a record (i,j,value).
The benefit is that you only need one record for every non-zero element of a matrix.
For example, the matrix
<MATH> \begin{bmatrix} 0 & 2 & -1 \\ 1 & 0 & 0 \\ 0 & 0 & -3 \\ 0 & 0 & 0 \end{bmatrix} </MATH>
can be represented as a table
row # (i) | column # (j) | value |
---|---|---|
0 | 1 | 2 |
0 | 2 | -1 |
1 | 0 | 1 |
2 | 2 | -3 |
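A minimal sketch of this record-per-nonzero scheme with a plain Python dict keyed by (i, j) (no particular sparse library assumed):
<code python>
sparse = {(0, 1): 2, (0, 2): -1, (1, 0): 1, (2, 2): -3}

def entry(M, i, j):
    return M.get((i, j), 0)  # missing records are implicitly zero

print(entry(sparse, 0, 1))  # 2
print(entry(sparse, 3, 0))  # 0: the all-zero fourth row stores no records
</code>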
Operations
Multiplication
Addition/Subtraction
Two matrices may be added or subtracted only if they are of the same order; the operation is then performed entry by entry, as sketched below.
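A short entry-wise sketch with numpy (an assumed dependency); mismatched orders raise an error instead:
<code python>
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[10, 20], [30, 40]])
print(A + B)  # [[11 22], [33 44]]
print(A - B)  # [[-9 -18], [-27 -36]]
</code>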