Orthogonal complements of vector subspaces

 
 
 
 
 

Definition of the orthogonal complement

Let’s remember the relationship between perpendicularity and orthogonality. We usually use the word “perpendicular” when we’re talking about two-dimensional space.

If two vectors are perpendicular, that means they sit at a ???90^\circ??? angle to one another.


 

This idea of “perpendicular” gets a little fuzzy when we try to carry it into three-dimensional space or ???n???-dimensional space, but the same idea still exists in higher dimensions. To capture that same idea in higher dimensions, we use the word “orthogonal” instead of “perpendicular.” So two vectors (or planes, etc.) can be orthogonal to one another in three-dimensional or ???n???-dimensional space.

The orthogonal complement

With a refresher on orthogonality out of the way, let’s talk about the orthogonal complement. If a set of vectors ???V??? is a subspace of ???\mathbb{R}^n???, then the orthogonal complement of ???V???, called ???V^{\perp}???, is a set of vectors where every vector in ???V^{\perp}??? is orthogonal to every vector in ???V???.

The ???\perp??? symbol means “perpendicular,” so you read ???V^{\perp}??? as “v perpendicular,” or just “v perp.”

So if we’re saying that ???V??? is a set of vectors ???\vec{v}???, and ???V^\perp??? is a set of vectors ???\vec{x}???, then every ???\vec{v}??? will be orthogonal to every ???\vec{x}??? (or equivalently, every ???\vec{x}??? will be orthogonal to every ???\vec{v}???), which means that the dot product of any ???\vec{v}??? with any ???\vec{x}??? will be ???0???.

So we could express the set of vectors ???V^{\perp}??? as

???V^{\perp}=\{\vec{x}\in \mathbb{R}^n\ | \ \vec{x}\cdot\vec{v}=0\quad\text{for every}\quad\vec{v}\in V\}???

This tells us that ???V^{\perp}??? is all of the ???\vec{x}??? in ???\mathbb{R}^n??? that satisfy ???\vec{x}\cdot\vec{v}=0??? for every vector ???\vec{v}??? in ???V???.

And this should make some sense to us. We learned in the past that two vectors were orthogonal to one another when their dot product was ???0???. For instance, if ???\vec{x}\cdot\vec{v}=0???, that tells us that the vector ???\vec{x}??? is orthogonal to the vector ???\vec{v}???.

We want to realize that defining the orthogonal complement really just expands this idea of orthogonality from individual vectors to entire subspaces of vectors. So two individual vectors are orthogonal when ???\vec{x}\cdot\vec{v}=0???, but two subspaces are orthogonal complements when every vector in one subspace is orthogonal to every vector in the other subspace.
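
To make the definition concrete, here's a minimal sketch in Python with NumPy (NumPy isn't part of this lesson, and the vectors are just ones I picked for illustration). It checks whether a candidate vector belongs to the orthogonal complement by testing its dot product against the subspace's spanning vectors.

```python
import numpy as np

# Illustrative spanning vectors for a subspace V of R^3
v1 = np.array([1, 0, 0])
v2 = np.array([0, 1, 0])

# A candidate vector x belongs to V-perp exactly when x . v = 0 for
# every v in V. Because the dot product is linear, it's enough to
# check x against the vectors that span V.
x = np.array([0, 0, 3])

print(np.dot(x, v1))  # 0, so x is orthogonal to v1
print(np.dot(x, v2))  # 0, so x is orthogonal to v2
```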

 
 

How to find the orthogonal complement of a vector space


 
 


Building the orthogonal complement of a subspace

Example

Describe the orthogonal complement of ???V???, ???V^\perp???.

???V=\text{Span}\Big(\begin{bmatrix}1\\ -3\\ 2\end{bmatrix},\begin{bmatrix}0\\ 1\\ 1\end{bmatrix}\Big)???

The subspace ???V??? is a plane in ???\mathbb{R}^3???, spanned by the two vectors ???\vec{v}_1=(1,-3,2)??? and ???\vec{v}_2=(0,1,1)???. Therefore, its orthogonal complement ???V^\perp??? is the set of vectors which are orthogonal to both ???\vec{v}_1=(1,-3,2)??? and ???\vec{v}_2=(0,1,1)???.

???V^{\perp}=\{\vec{x}\in \mathbb{R}^3\ | \ \vec{x}\cdot\begin{bmatrix}1\\ -3\\ 2\end{bmatrix}=0\quad\text{and}\quad\vec{x}\cdot\begin{bmatrix}0\\ 1\\ 1\end{bmatrix}=0\}???

If we let ???\vec{x}=(x_1,x_2,x_3)???, we get two equations from these dot products.

???x_1-3x_2+2x_3=0???

???x_2+x_3=0???

Put these equations into an augmented matrix,

???\begin{bmatrix}1 & -3 & 2 & | & 0\\ 0 & 1 & 1 & | & 0\end{bmatrix}???

then put it into reduced row-echelon form.

???\begin{bmatrix}1 & 0 & 5 & | & 0\\ 0 & 1 & 1 & | & 0\end{bmatrix}???

The rref form gives the system of equations

???x_1+5x_3=0???

???x_2+x_3=0???

and we can solve the system for the pivot variables. The pivot entries we found were for ???x_1??? and ???x_2???, so we’ll solve the system for ???x_1??? and ???x_2???. 

???x_1=-5x_3???

???x_2=-x_3???

So we could also express the solution to the system as

???\begin{bmatrix}x_1\\ x_2\\ x_3\end{bmatrix}=x_3\begin{bmatrix}-5\\ -1\\ 1\end{bmatrix}???

Which means the orthogonal complement is

???V^{\perp}=\text{Span}\Big(\begin{bmatrix}-5\\ -1\\ 1\end{bmatrix}\Big)???
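
If we want to double-check a result like this, one option is a quick SymPy sketch (SymPy isn't used in the lesson itself, so this is just a sanity check): the orthogonal complement of the span of a matrix's rows is that matrix's null space, so rref and nullspace reproduce the steps above.

```python
import sympy as sp

# Rows are the spanning vectors of V from the example; the system of
# dot-product equations is homogeneous, so the augmented column of
# zeros can be left off.
A = sp.Matrix([[1, -3, 2],
               [0,  1, 1]])

# Reduced row-echelon form, matching x1 + 5x3 = 0 and x2 + x3 = 0
print(A.rref())       # (Matrix([[1, 0, 5], [0, 1, 1]]), (0, 1))

# The null space of A is V-perp: every solution of Ax = 0 is
# orthogonal to both rows, i.e. to both spanning vectors of V.
print(A.nullspace())  # [Matrix([[-5], [-1], [1]])]
```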



???V^\perp??? is a subspace

We’ve already assumed that ???V??? is a subspace. It turns out that, whenever ???V??? is a subspace, its orthogonal complement ???V^{\perp}??? is also a subspace. To see why, we need to check that ???V^{\perp}??? contains the zero vector and is closed under addition and under scalar multiplication. The zero vector is in ???V^{\perp}??? because ???\vec{0}\cdot\vec{v}=0??? for every ???\vec{v}??? in ???V???.

We know ???V^{\perp}??? is closed under addition because, if we say that ???\vec{v}??? is in ???V??? and ???\vec{x}_1??? and ???\vec{x}_2??? are in ???V^{\perp}???, then

???\vec{x}_1\cdot \vec{v}=0???

???\vec{x}_2\cdot \vec{v}=0???

because every vector in ???V^{\perp}??? is orthogonal to every vector in ???V???. If we add these equations, we get

???\vec{x}_1\cdot \vec{v}+\vec{x}_2\cdot \vec{v}=0+0???

???(\vec{x}_1+\vec{x}_2)\cdot \vec{v}=0???

This shows us that the vector ???\vec{x}_1+\vec{x}_2??? will also be orthogonal to ???\vec{v}???, which means ???\vec{x}_1+\vec{x}_2??? is also a member of ???V^{\perp}???, which tells us that ???V^{\perp}??? is closed under addition.

And we know that ???V^{\perp}??? is closed under scalar multiplication because, if ???\vec{v}??? is in ???V???, ???\vec{x}_1??? is in ???V^{\perp}???, and ???c??? is any scalar, then it must also be true that

???(c\vec{x}_1)\cdot \vec{v}=c(\vec{x}_1\cdot \vec{v})=c(0)=0???

This shows us that the vector ???c\vec{x}_1??? will also be orthogonal to ???\vec{v}???, which means ???c\vec{x}_1??? is also a member of ???V^{\perp}???, which tells us that ???V^{\perp}??? is closed under scalar multiplication.
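
Here's a small numerical illustration of both closure properties, again in Python with NumPy (the specific vectors and scalar are just examples chosen for illustration, not part of the lesson):

```python
import numpy as np

v = np.array([1, -3, 2])     # a vector in V (illustrative)
x1 = np.array([-5, -1, 1])   # two vectors orthogonal to v
x2 = np.array([3, 1, 0])
c = 7                        # any scalar

# Both x1 and x2 are orthogonal to v
print(np.dot(x1, v), np.dot(x2, v))   # 0 0

# Closure under addition: (x1 + x2) . v = x1 . v + x2 . v = 0
print(np.dot(x1 + x2, v))             # 0

# Closure under scalar multiplication: (c x1) . v = c (x1 . v) = 0
print(np.dot(c * x1, v))              # 0
```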

Complement of the complement

In the same way that transposing a transpose gets you back to the original matrix, ???(A^T)^T=A???, the orthogonal complement of the orthogonal complement is the original subspace. So if ???V^\perp??? is the orthogonal complement of ???V???, then

???(V^\perp)^\perp=V???

Intuitively, this makes sense. If all the vectors in ???V^\perp??? are orthogonal to all the vectors in ???V???, then all the vectors in ???V??? will be orthogonal to all the vectors in ???V^\perp???, so the orthogonal complement of ???V^\perp??? will be ???V???.
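
We can also see this numerically with a short SymPy sketch (again just a sanity check built on the earlier example, not part of the lesson): taking the null space twice gets us back to a set of vectors that spans the same plane as ???V???.

```python
import sympy as sp

# Spanning vectors of V from the example, as rows
A = sp.Matrix([[1, -3, 2],
               [0,  1, 1]])

# A basis of V-perp is the null space of A
perp_basis = A.nullspace()              # [Matrix([-5, -1, 1])]

# Take the complement again: the null space of the matrix whose rows
# span V-perp is a basis of (V-perp)-perp
B = sp.Matrix.vstack(*[v.T for v in perp_basis])
perp_perp_basis = B.nullspace()

# Stacking V's spanning vectors next to the (V-perp)-perp basis and
# checking the rank shows they span the same 2-dimensional subspace.
combined = sp.Matrix.hstack(A.T, *perp_perp_basis)
print(combined.rank())                  # 2, so (V-perp)-perp is V again
```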

 
 
