Section 18 — Linear Algebra (Axler)

📐 Today’s Lesson from Magic Internet Math Academy: Linear Algebra (Axler), Section 18

Chapter 6.C: Orthogonal Complements and Minimization

The orthogonal complement of a subspace and the orthogonal projection onto it are among the most powerful tools in applied mathematics. They solve the fundamental minimization problem: find the point in a subspace closest to a given vector. This section develops these ideas and connects them to the geometry of inner product spaces.

Orthogonal Complements

📖 Definition — Orthogonal Complement

If U is a subset of an inner product space V, the orthogonal complement of U is

U^⊥ = {v ∈ V : ⟨ v, u ⟩ = 0 for every u ∈ U}.

In words, U^⊥ consists of all vectors orthogonal to every vector in U.
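
Numerically, with the standard dot product on R^n, if the rows of a matrix A span U, then U^⊥ is exactly the null space of A, since ⟨ v, u ⟩ = 0 for every u ∈ U iff Av = 0. A minimal NumPy/SciPy sketch (the spanning vectors are an illustrative choice):

    import numpy as np
    from scipy.linalg import null_space

    # Rows of A span U; U-perp is the null space of A.
    A = np.array([[1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0]])  # U = the xy-plane in R^3

    U_perp = null_space(A)  # columns: an orthonormal basis of U-perp
    print(U_perp)           # approximately [[0], [0], [1]] (up to sign)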

📐 Theorem — Properties of the Orthogonal Complement

Let U be a subspace of V. Then:

• U^⊥ is a subspace of V.
• {0}^⊥ = V and V^⊥ = {0}.
• U ∩ U^⊥ = {0}.
• If U₁ ⊆ U₂, then U₂^⊥ ⊆ U₁^⊥.

Proof sketch: That U^⊥ is a subspace follows from linearity of the inner product in the first slot. For U ∩ U^⊥ = {0}: if v ∈ U ∩ U^⊥, then ⟨ v, v ⟩ = 0, so v = 0 by definiteness. The reversal property holds because a vector orthogonal to everything in U₂ is in particular orthogonal to everything in U₁.

Orthogonal Decomposition

📐 Theorem — Direct Sum with Orthogonal Complement

Suppose U is a finite-dimensional subspace of V. Then

V = U ⊕ U^⊥.

In particular, if V is finite-dimensional, then dim V = dim U + dim U^⊥.

Proof sketch: Let e₁, …, eₘ be an orthonormal basis of U (obtained via Gram-Schmidt). For any v ∈ V, write

v = (⟨ v, e₁ ⟩ e₁ + ⋯ + ⟨ v, eₘ ⟩ eₘ) + (v - ⟨ v, e₁ ⟩ e₁ - ⋯ - ⟨ v, eₘ ⟩ eₘ),

where the first summand lies in U and the second lies in U^⊥. The second term is in U^⊥ because for each eₖ:

⟨ v - Σⱼ ⟨ v, eⱼ ⟩ eⱼ, eₖ ⟩ = ⟨ v, eₖ ⟩ - ⟨ v, eₖ ⟩ = 0.

This shows V = U + U^⊥, and the sum is direct because U ∩ U^⊥ = {0}. The dimension formula then follows from the direct sum.
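
A quick numerical illustration of the decomposition (a sketch, with an illustrative choice of subspace; QR factorization supplies the orthonormal basis that Gram-Schmidt produces in the proof):

    import numpy as np

    # Columns of M span U; QR gives an orthonormal basis of U.
    M = np.array([[1.0, 0.0],
                  [1.0, 1.0],
                  [0.0, 1.0]])
    Q, _ = np.linalg.qr(M)   # columns of Q: e_1, ..., e_m

    v = np.array([1.0, 0.0, 0.0])
    u = Q @ (Q.T @ v)        # sum of <v, e_k> e_k -- the U component
    w = v - u                # the U-perp component

    print(np.allclose(Q.T @ w, 0.0))  # True: w is orthogonal to each e_k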

Double complement: An immediate consequence is that (U^⊥)^⊥ = U whenever U is finite-dimensional; in particular, in a finite-dimensional V, taking the orthogonal complement is an involution on the lattice of subspaces.
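
The double complement can also be checked numerically (same sketch style as above; the subspace is again an arbitrary choice):

    import numpy as np
    from scipy.linalg import null_space

    A = np.array([[1.0, 1.0, 0.0],
                  [0.0, 1.0, 1.0]])  # rows span U

    B = null_space(A).T      # rows span U-perp
    U2 = null_space(B)       # columns span (U-perp)-perp

    # Each row of A should lie in (U-perp)-perp: projecting it
    # onto that space changes nothing.
    proj = U2 @ (U2.T @ A.T)
    print(np.allclose(proj, A.T))  # True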

Orthogonal Projection

📖 Definition — Orthogonal Projection

Suppose U is a finite-dimensional subspace of V. The orthogonal projection of V onto U, denoted P_U, is the operator defined by: for v = u + w with u ∈ U and w ∈ U^⊥,

P_U v = u.

📐 Theorem — Properties of Orthogonal Projection

Let U be a finite-dimensional subspace of V. Then:

• P_U ∈ L(V) and P_U² = P_U.
• range P_U = U and null P_U = U^⊥.
• v - P_U v ∈ U^⊥ for every v ∈ V.
• ‖P_U v‖ ≤ ‖v‖ for every v ∈ V.

Proof sketch: If e₁, …, eₘ is an orthonormal basis for U, then from the decomposition in the previous theorem:

P_U v = ⟨ v, e₁ ⟩ e₁ + ⋯ + ⟨ v, eₘ ⟩ eₘ.

This shows range P_U = U and null P_U = U^⊥. Also P_U² = P_U, because projecting a vector already in U gives it back. Next, v - P_U v ∈ U^⊥ by construction. Finally, since P_U v ⊥ (v - P_U v), the Pythagorean theorem gives ‖v‖² = ‖P_U v‖² + ‖v - P_U v‖² ≥ ‖P_U v‖².
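
In matrix terms, if the columns of Q form an orthonormal basis of U, then P_U has matrix QQᵀ, and the properties above can be checked directly (a sketch with an illustrative subspace):

    import numpy as np

    M = np.array([[1.0, 0.0],
                  [1.0, 1.0],
                  [0.0, 1.0]])
    Q, _ = np.linalg.qr(M)   # orthonormal basis of U, as columns
    P = Q @ Q.T              # matrix of P_U in the standard basis

    print(np.allclose(P @ P, P))  # True: P_U^2 = P_U

    rng = np.random.default_rng(0)
    v = rng.standard_normal(3)
    print(np.linalg.norm(P @ v) <= np.linalg.norm(v))  # True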

✏️ Example — Projection in R^3

Let U = span{(1,0,0), (0,1,0)} (the xy-plane in R³); the two spanning vectors already form an orthonormal basis e₁, e₂. For v = (3, 4, 5):

P_U v = ⟨ v, e₁ ⟩ e₁ + ⟨ v, e₂ ⟩ e₂ = 3(1,0,0) + 4(0,1,0) = (3,4,0).

The residual v - P_U v = (0, 0, 5) ∈ U^⊥.

Minimization: Closest Point in a Subspace

📐 Theorem — Minimization Principle

Suppose U is a finite-dimensional subspace of V and v ∈ V. Then

‖v - P_U v‖ ≤ ‖v - u‖ for every u ∈ U.

Furthermore, equality holds if and only if u = P_U v.

Proof sketch: Let u ∈ U. Then

‖v - u‖² = ‖v - P_U v + P_U v - u‖².

Since v - P_U v ∈ U^⊥ and P_U v - u ∈ U, the Pythagorean theorem gives

‖v - u‖² = ‖v - P_U v‖² + ‖P_U v - u‖² ≥ ‖v - P_U v‖².

Equality holds iff ‖P_U v - u‖ = 0, i.e., u = P_U v.

The geometry: The orthogonal projection P_U v is the unique point in U closest to v. The “error” vector v - P_U v is perpendicular to the subspace. This is the geometric essence of least-squares approximation.
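
One way to see the principle in action: compare the distance from v to P_U v against the distance to randomly sampled points of U (a sketch, with the same illustrative subspace used above):

    import numpy as np

    M = np.array([[1.0, 0.0],
                  [1.0, 1.0],
                  [0.0, 1.0]])       # columns span U
    Q, _ = np.linalg.qr(M)
    v = np.array([1.0, 0.0, 0.0])
    best = np.linalg.norm(v - Q @ (Q.T @ v))  # distance to projection

    rng = np.random.default_rng(1)
    for _ in range(1000):
        u = M @ rng.standard_normal(2)        # a random point of U
        assert np.linalg.norm(v - u) >= best - 1e-12
    print("no sampled point of U beats the projection")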

✏️ Example — Least-Squares Approximation

Find the point in U = span{(1,1,0), (0,1,1)} closest to v = (1, 0, 0). First apply Gram-Schmidt to get an orthonormal basis of U:

e₁ = (1/√2)(1,1,0), e₂ = (1/√6)(-1, 1, 2).

Then compute the projection:

P_U v = ⟨ v, e₁ ⟩ e₁ + ⟨ v, e₂ ⟩ e₂ = (1/√2)(1/√2)(1,1,0) + (-1/√6)(1/√6)(-1,1,2)
= (1/2)(1,1,0) + (1/6)(1,-1,-2) = (2/3, 1/3, -1/3).

One can verify: v - P_U v = (1/3, -1/3, 1/3) is orthogonal to both (1,1,0) and (0,1,1).
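
The same answer drops out of a least-squares solver: minimizing ‖Ax - v‖ over x, where the columns of A span U, yields Ax = P_U v. A sketch checking the example above:

    import numpy as np

    A = np.array([[1.0, 0.0],
                  [1.0, 1.0],
                  [0.0, 1.0]])  # columns: (1,1,0) and (0,1,1) span U
    v = np.array([1.0, 0.0, 0.0])

    x, *_ = np.linalg.lstsq(A, v, rcond=None)
    print(A @ x)       # approximately [ 2/3  1/3 -1/3] = P_U v
    print(v - A @ x)   # approximately [ 1/3 -1/3  1/3], the residual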

Key Takeaways

1. U^⊥ is the set of all vectors orthogonal to every vector in U. It is always a subspace.

2. Orthogonal decomposition: V = U ⊕ U^⊥. Every vector splits uniquely into a component in U and a component perpendicular to U.

3. The orthogonal projection P_U extracts the U-component. It satisfies P_U² = P_U.

4. Minimization: P_U v is the unique closest point in U to v. The error is perpendicular to the subspace.

5. This is the mathematical foundation of least-squares methods used throughout science, engineering, and data science.


🔗 Interactive version: https://magicinternetmath.com

🏴‍☠️ Subscribe to the Pioneers Club — free courses from high school algebra to elliptic curve cryptography.


