Orthogonal complements of vector subspaces


Definition of the orthogonal complement

Let’s remember the relationship between perpendicularity and orthogonality. We usually use the word “perpendicular” when we’re talking about two-dimensional space.

If two vectors are perpendicular, that means they sit at a 90^\circ angle to one another.


This idea of “perpendicular” gets a little fuzzy when we try to transition it into three-dimensional space or n-dimensional space, but the same idea still exists in higher dimensions. So to capture the same idea, but for higher dimensions, we use the word “orthogonal” instead of “perpendicular.” So two vectors (or planes, etc.) can be orthogonal to one another in three-dimensional or n-dimensional space.

The orthogonal complement

With a refresher on orthogonality out of the way, let’s talk about the orthogonal complement. If a set of vectors V is a subspace of \mathbb{R}^n, then the orthogonal complement of V, called V^{\perp}, is a set of vectors where every vector in V^{\perp} is orthogonal to every vector in V.

The \perp symbol means “perpendicular,” so you read V^{\perp} as “v perpendicular,” or just “v perp.”

So if we’re saying that V is a set of vectors \vec{v}, and V^\perp is a set of vectors \vec{x}, then every \vec{v} will be orthogonal to every \vec{x} (or equivalently, every \vec{x} will be orthogonal to every \vec{v}), which means that the dot product of any \vec{v} with any \vec{x} will be 0.

So we could express the set of vectors V^{\perp} as

V^{\perp}=\{\vec{x}\in \mathbb{R}^n\ | \ \vec{x}\cdot\vec{v}=0\quad\text{for every}\quad\vec{v}\in V\}

This tells us that V^{\perp} is all of the \vec{x} in \mathbb{R}^n that satisfy \vec{x}\cdot\vec{v}=0 for every vector \vec{v} in V (and V, in turn, is V^{\perp}’s orthogonal complement).

And this should make some sense to us. We learned in the past that two vectors were orthogonal to one another when their dot product was 0. For instance, if \vec{x}\cdot\vec{v}=0, that tells us that the vector \vec{x} is orthogonal to the vector \vec{v}.

We want to realize that defining the orthogonal complement really just expands this idea of orthogonality from individual vectors to entire subspaces of vectors. So two individual vectors are orthogonal when \vec{x}\cdot\vec{v}=0, but two subspaces are orthogonal complements when every vector in one subspace is orthogonal to every vector in the other subspace.
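
We can see this numerically. Here’s a quick sketch in Python (using numpy, which isn’t part of the original lesson; the specific vectors are just for illustration) that tests orthogonality by checking whether the dot product is 0.

```python
import numpy as np

# Two vectors in R^3; they're orthogonal exactly when their dot product is 0
v = np.array([1, -3, 2])
x = np.array([-5, -1, 1])

# 1*(-5) + (-3)*(-1) + 2*1 = -5 + 3 + 2 = 0, so v and x are orthogonal
print(np.dot(v, x))  # 0
```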

 
 


Building the orthogonal complement of a subspace

Example

Describe the orthogonal complement of V, V^\perp.

V=\text{Span}\Big(\begin{bmatrix}1\\ -3\\ 2\end{bmatrix},\begin{bmatrix}0\\ 1\\ 1\end{bmatrix}\Big)

The subspace V is a plane in \mathbb{R}^3, spanned by the two vectors \vec{v}_1=(1,-3,2) and \vec{v}_2=(0,1,1). Therefore, its orthogonal complement V^\perp is the set of vectors which are orthogonal to both \vec{v}_1=(1,-3,2) and \vec{v}_2=(0,1,1).

V^{\perp}=\{\vec{x}\in \mathbb{R}^3\ | \ \vec{x}\cdot\begin{bmatrix}1\\ -3\\ 2\end{bmatrix}=0\quad\text{and}\quad\vec{x}\cdot\begin{bmatrix}0\\ 1\\ 1\end{bmatrix}=0\}

If we let \vec{x}=(x_1,x_2,x_3), we get two equations from these dot products.

x_1-3x_2+2x_3=0

x_2+x_3=0

Put these equations into an augmented matrix,

\left[\begin{array}{ccc|c}1&-3&2&0\\ 0&1&1&0\end{array}\right]

then put it into reduced row-echelon form.

\left[\begin{array}{ccc|c}1&0&5&0\\ 0&1&1&0\end{array}\right]

The rref form gives the system of equations

x_1+5x_3=0

x_2+x_3=0

and we can solve the system for the pivot variables. The pivot entries we found were for x_1 and x_2, so we’ll solve the system for x_1 and x_2.

x_1=-5x_3

x_2=-x_3

So we could also express the system as

\begin{bmatrix}x_1\\ x_2\\ x_3\end{bmatrix}=x_3\begin{bmatrix}-5\\ -1\\ 1\end{bmatrix}

Which means the orthogonal complement is

V^{\perp}=\text{Span}\Big(\begin{bmatrix}-5\\ -1\\ 1\end{bmatrix}\Big)
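
If you want to check this kind of computation, here’s a short Python sketch (using sympy, not part of the original lesson) that reproduces the steps above. It relies on the fact that V^\perp is exactly the null space of the matrix whose rows are the spanning vectors of V.

```python
import sympy as sp

# Rows are the spanning vectors of V, so solving Ax = 0 finds V-perp
A = sp.Matrix([[1, -3, 2],
               [0,  1, 1]])

# Reduced row-echelon form reproduces x1 + 5x3 = 0 and x2 + x3 = 0
rref_matrix, pivot_columns = A.rref()
print(rref_matrix)    # Matrix([[1, 0, 5], [0, 1, 1]])
print(pivot_columns)  # (0, 1), i.e. pivots in the x1 and x2 columns

# The null space of A is V-perp
print(A.nullspace())  # [Matrix([[-5], [-1], [1]])]
```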



V^\perp is a subspace

We’ve already assumed that V is a subspace. If we’re given any subspace V, then we know that its orthogonal complement V^{\perp} is also a subspace. Of course, that means there must be some way to show that V^{\perp} is closed under addition and closed under scalar multiplication. (V^{\perp} certainly contains the zero vector, since \vec{0}\cdot\vec{v}=0 for every \vec{v} in V.)

We know V^{\perp} is closed under addition because, if we say that \vec{v} is in V and \vec{x}_1 and \vec{x}_2 are in V^{\perp}, then

\vec{x}_1\cdot \vec{v}=0

\vec{x}_2\cdot \vec{v}=0

because every vector in V^{\perp} is orthogonal to every vector in V. If we add these equations, we get

\vec{x}_1\cdot \vec{v}+\vec{x}_2\cdot \vec{v}=0+0

(\vec{x}_1+\vec{x}_2)\cdot \vec{v}=0

This shows us that the vector \vec{x}_1+\vec{x}_2 will also be orthogonal to \vec{v}, which means \vec{x}_1+\vec{x}_2 is also a member of V^{\perp}, which tells us that V^{\perp} is closed under addition.

And we know that V^{\perp} is closed under scalar multiplication because, if \vec{v} is in V and \vec{x}_1 is in V^{\perp}, then it must also be true that

(c\vec{x}_1)\cdot \vec{v}=c(\vec{x}_1\cdot \vec{v})=c(0)=0

This shows us that the vector c\vec{x}_1 will also be orthogonal to \vec{v}, which means c\vec{x}_1 is also a member of V^{\perp}, which tells us that V^{\perp} is closed under scalar multiplication.
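
Here’s a tiny numeric illustration of both closure arguments in Python (again using numpy; the particular vectors just reuse the example above and are otherwise arbitrary).

```python
import numpy as np

v  = np.array([1, -3, 2])     # a vector in V
x1 = np.array([-5, -1, 1])    # a vector in V-perp (from the example above)
x2 = np.array([-10, -2, 2])   # another vector in V-perp

# Closed under addition: the sum is still orthogonal to v
print(np.dot(x1 + x2, v))     # 0

# Closed under scalar multiplication: any multiple is still orthogonal to v
print(np.dot(4 * x1, v))      # 0
```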

Complement of the complement

In the same way that transposing a transpose gets you back to the original matrix, (A^T)^T=A, the orthogonal complement of the orthogonal complement is the original subspace. So if V^\perp is the orthogonal complement of V, then

(V^\perp)^\perp=V

Intuitively, this makes sense. If all the vectors in V^\perp are orthogonal to all the vectors in V, then all the vectors in V will be orthogonal to all the vectors in V^\perp, so every vector in V belongs to the orthogonal complement of V^\perp. And because the dimensions of V and V^\perp always add up to n, the orthogonal complement of V^\perp can’t contain anything beyond V, so it’s exactly V.
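
We can sanity-check this for the example subspace with another short sympy sketch (again, not part of the original lesson): take the complement of V^\perp the same way we took the complement of V, and confirm that every basis vector we get back lies in V.

```python
import sympy as sp

v1 = sp.Matrix([1, -3, 2])
v2 = sp.Matrix([0, 1, 1])

# V-perp = Span((-5, -1, 1)); its complement is the null space of the
# matrix whose single row is that spanning vector
basis_of_double_complement = sp.Matrix([[-5, -1, 1]]).nullspace()

# Each basis vector of (V-perp)-perp should be a combination of v1 and v2:
# appending it to [v1 v2] should leave the rank at 2
for w in basis_of_double_complement:
    print(sp.Matrix.hstack(v1, v2, w).rank())  # 2, so w is in V
```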
