Linear Algebra - Orthogonal complement Vector Space

Definition

Let U be a Linear Algebra - Vector Space (set of vector) that is a subspace of W. Each vector b in W can be written as the sum of two projections: <MATH>b = b^{||U} + b^{\perp U}</MATH> where <math>b^{||U}</math> is the projection of b onto U (a vector in U) and <math>b^{\perp U}</math> is the projection of b orthogonal to U (a vector orthogonal to every vector in U).

Let V be the set <math>V = \{b^{\perp U} : b \in W\}</math>. V is called the orthogonal complement of U in W. Every vector in V is orthogonal to every vector in U.
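The following is a small numerical sketch of this decomposition (not part of the original definition; it assumes NumPy, and the basis of U and the vector b are made-up example data). The projection of b onto U is obtained with a least-squares solve, and the remaining part is the orthogonal component.

<code python>
# Sketch: decompose b into b^{||U} (in U) and b^{\perp U} (orthogonal to U).
import numpy as np

# The columns of A form a basis of the subspace U of W = R^4 (example data).
A = np.array([[1.0, 0.0],
              [1.0, 0.0],
              [0.0, 1.0],
              [0.0, 1.0]])
b = np.array([3.0, 1.0, 4.0, 2.0])

# Least-squares coordinates of the projection of b onto U.
x, *_ = np.linalg.lstsq(A, b, rcond=None)
b_parallel = A @ x        # b^{||U}: the component of b lying in U
b_perp = b - b_parallel   # b^{\perp U}: the component of b orthogonal to U

print(b_parallel, b_perp)   # [2. 2. 3. 3.] [ 1. -1.  1. -1.]
print(A.T @ b_perp)         # ~[0. 0.]: b_perp is orthogonal to every basis vector of U
</code>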

Direct sum

Every vector b in W can be written as the sum of a vector in U and a vector in V; in other words, W is the direct sum of U and V: <MATH>U \oplus V = W</MATH>

Proof: To show that the direct sum of U and V is defined, we need to show that the only vector that is in both U and V is the zero vector. Any vector w in both U and V is orthogonal to itself. Thus <math>0 = \langle w,w \rangle = \|w\|^2</math>. By Property N2 of norms, that means w = 0.
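As a quick numerical check of this claim (a sketch assuming NumPy, using the bases from the R4 example below): stacking a basis of U on top of a basis of V gives a matrix of rank 4, so the four vectors are linearly independent, dim U + dim V = dim W, and the only vector in both U and V is the zero vector.

<code python>
# Sketch: check that U and V from the R^4 example intersect only in the zero vector.
import numpy as np

basis_U = np.array([[1, 1, 0, 0],
                    [0, 0, 1, 1]], dtype=float)
basis_V = np.array([[1, -1, 0, 0],
                    [0, 0, 1, -1]], dtype=float)

stacked = np.vstack([basis_U, basis_V])
print(np.linalg.matrix_rank(stacked))   # 4: the four vectors are linearly independent
print(basis_U @ basis_V.T)              # all zeros: every basis vector of V is orthogonal to U
</code>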

Example

R4

Let U = Span {[1, 1, 0, 0], [0, 0, 1, 1]}. Let V denote the orthogonal complement of U in R4. What vectors form a basis for V?

Every vector in U has the form [a, a, b, b]. Any vector of the form [c,−c, d,−d] has dot product <math>ac - ac + bd - bd = 0</math> with such a vector, so it is orthogonal to every vector in U.

Every vector in Span {[1,−1, 0, 0], [0, 0, 1,−1]} is orthogonal to every vector in U, so Span {[1,−1, 0, 0], [0, 0, 1,−1]} is a subspace of V, the orthogonal complement of U in R4.

And since U has dimension 2 and R4 has dimension 4, the Direct-Sum Dimension Lemma gives dim V = 4 - 2 = 2. Span {[1,−1, 0, 0], [0, 0, 1,−1]} is therefore a 2-dimensional subspace of the 2-dimensional space V, so it equals V, and [1,−1, 0, 0], [0, 0, 1,−1] form a basis for V.
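The same basis can be found numerically. The sketch below (which assumes SciPy is available) computes an orthonormal basis of V as the null space of the matrix whose rows are the basis vectors of U, and checks that it spans the same plane as [1,−1, 0, 0], [0, 0, 1,−1].

<code python>
# Sketch: the orthogonal complement of U is the null space of the matrix whose rows span U.
import numpy as np
from scipy.linalg import null_space

U_basis = np.array([[1, 1, 0, 0],
                    [0, 0, 1, 1]], dtype=float)
V_basis = null_space(U_basis)   # 4x2 matrix; its columns are an orthonormal basis of V

claimed = np.array([[1, -1, 0, 0],
                    [0, 0, 1, -1]], dtype=float)
print(np.linalg.matrix_rank(np.vstack([V_basis.T, claimed])))   # 2: same span as the claimed basis
print(U_basis @ claimed.T)                                      # zeros: orthogonal to every vector of U
</code>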

Basis for a null space

The null space of a matrix is the orthogonal complement of its row space, so a basis for the null space of a matrix can be found by finding a basis for the orthogonal complement of its row space.
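The sketch below illustrates this relationship with a made-up matrix A (it assumes NumPy and SciPy): every null-space basis vector returned by scipy.linalg.null_space is orthogonal to every row of A, and the dimensions of the row space and the null space add up to the number of columns.

<code python>
# Sketch: the null space of A is the orthogonal complement of the row space of A.
import numpy as np
from scipy.linalg import null_space

A = np.array([[1, 2, 0, 1],
              [0, 1, 1, 0]], dtype=float)

N = null_space(A)   # columns form a basis of the null space of A
print(np.allclose(A @ N, 0))                                  # True: null-space vectors are orthogonal to every row
print(np.linalg.matrix_rank(A) + N.shape[1] == A.shape[1])    # True: dim(row space) + dim(null space) = number of columns
</code>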

Computation

We have: a basis <math>u_1, \dots, u_k</math> for U and a basis <math>w_1, \dots, w_n</math> for W (so W has dimension n). We want a basis for the orthogonal complement of U in W.

One way

Apply Gram–Schmidt orthogonalization to the list <math>u_1, \dots, u_k, w_1, \dots, w_n</math>, producing output vectors <math>u^*_1, \dots, u^*_k, w^*_1, \dots, w^*_n</math>.

These output vectors span the same space as the input vectors <math>u_1, \dots, u_k, w_1, \dots, w_n</math>, namely W, which has dimension n.

Therefore exactly n of the output vectors <math>u^*_1, \dots, u^*_k, w^*_1, \dots, w^*_n</math> are nonzero.

The vectors <math>u^*_1, \dots, u^*_k</math> have the same span as <math>u_1, \dots, u_k</math> and are all nonzero since <math>u_1, \dots, u_k</math> are linearly independent.

Therefore exactly n - k of the remaining output vectors <math>w^*_1, \dots, w^*_n</math> are nonzero.

Each nonzero <math>w^*_i</math> is orthogonal to <math>u^*_1, \dots, u^*_k</math>, hence to every vector of U, so it lies in the orthogonal complement of U. By the Direct-Sum Dimension Lemma, the orthogonal complement has dimension n - k, so these n - k nonzero vectors form a basis for the orthogonal complement.
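The sketch below implements this procedure with NumPy. The orthogonalize helper is written here only for illustration (this page does not define one): it runs Gram–Schmidt and keeps zero vectors in place so that output positions line up with the input positions.

<code python>
# Sketch: basis for the orthogonal complement of U in W via Gram-Schmidt.
import numpy as np

def orthogonalize(vlist, tol=1e-12):
    """Gram-Schmidt: return v*_1, ..., v*_m, each made orthogonal to all previous outputs."""
    starred = []
    for v in vlist:
        v = np.array(v, dtype=float)
        for s in starred:
            if np.dot(s, s) > tol:              # skip zero outputs
                v = v - (np.dot(v, s) / np.dot(s, s)) * s
        starred.append(v)
    return starred

def orthogonal_complement_basis(u_basis, w_basis):
    """Basis for the orthogonal complement of Span(u_basis) in Span(w_basis)."""
    k = len(u_basis)
    starred = orthogonalize(list(u_basis) + list(w_basis))
    # The nonzero vectors among w*_1, ..., w*_n form the basis (exactly n - k of them).
    return [w for w in starred[k:] if np.linalg.norm(w) > 1e-10]

u_basis = [[1, 1, 0, 0], [0, 0, 1, 1]]   # basis of U
w_basis = np.eye(4)                      # standard basis of W = R^4
for v in orthogonal_complement_basis(u_basis, w_basis):
    print(v)   # [0.5 -0.5 0 0] and [0 0 0.5 -0.5]: multiples of [1,-1,0,0] and [0,0,1,-1]
</code>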