Data Science: What are eigenvectors and eigenvalues?

Salman Arif
4 min read · Feb 9, 2019

The mathematics behind Machine Learning is very interesting, and Linear Algebra is at the heart of Data Science. There is nothing daunting about mathematics if the building blocks are laid out correctly. My motivation for writing this article is to build an intuition for eigenvectors and eigenvalues, which are used frequently in machine learning. You can find them in a whole swath of supervised and unsupervised learning algorithms.

We might have all learnt these concepts in high school, but with very little understanding, as most of the effort was spent on getting the calculations right.

So what is an eigenvector? Before we touch upon this, let's quickly brush up on what we mean by a vector and a matrix. A vector is nothing but an object that moves around a space. This space can be physical space or a space of data. Vectors can be used to define a coordinate space, and the vectors defining a coordinate space are known as basis vectors.

The picture below shows two basis vectors, i and j, both of unit length, which form the coordinate space. The vector r’ in that coordinate space is represented as 3 i’s along the x-axis and 2 j’s along the y-axis, i.e. r’ = 3i + 2j.

Coordinate space

This is the coordinate system that defines r’. There is no reason we could not have defined r’ in another coordinate space, one whose basis vectors were neither unit length nor perpendicular to each other.

A matrix is just a transformation that operates on a vector and nothing more. It can stretch, rotate and shear our vector. A matrix just tells us where our basis vectors go.
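A quick way to see this in code: the columns of a matrix are exactly where the basis vectors land. (This numpy sketch, and the matrix in it, are my own illustration, not from the article.)

```python
import numpy as np

# An example matrix; its columns are exactly where the
# basis vectors i and j end up after the transformation.
A = np.array([[3, 1],
              [0, 2]])

i_hat = np.array([1, 0])
j_hat = np.array([0, 1])

print(A @ i_hat)  # [3 0] -- the first column of A
print(A @ j_hat)  # [1 2] -- the second column of A
```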

Eigenvalues & Eigenvectors

Taking the above concept one step further: when we apply a transformation, we look for the vectors that stay on their own span; these are known as eigenvectors. We measure how much they have been stretched, and these amounts are known as eigenvalues.

or

Eigenvectors are vectors that lie along the same span before and after the transformation, and eigenvalues are just the amount by which the eigenvectors have been stretched.
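In symbols, an eigenvector v of a matrix A satisfies A·v = λ·v, where the scalar λ is the eigenvalue. Here's a quick numpy check of that equation; the matrix is my own example, not one from the figures:

```python
import numpy as np

# Any square matrix A; its eigenvectors v satisfy A @ v == lam * v
A = np.array([[2, 1],
              [1, 2]])

eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)  # [3. 1.]

# Each column of `eigenvectors` is one eigenvector
for lam, v in zip(eigenvalues, eigenvectors.T):
    print(np.allclose(A @ v, lam * v))  # True, True
```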

When applying a matrix transformation we often think of it being applied to a single vector, but it is really applied to every vector in the space, as the following figure illustrates.

Horizontal scaling transformation

The square represents a vector space. I am only highlighting 3 vectors in the vector space to explain the concept; however, the transformation applies to each and every vector in that vector space (the square box). When we apply a horizontal scaling transformation, the vector space is stretched horizontally. Now we can see that the green and red vectors stay on their spans, but the orange vector moves away from its span. The red and green vectors are therefore eigenvectors. In fact, it's not hard to see that green and red are the only vectors that won't move away from their spans. The red vector, although it has remained on its span, has changed its size. The factor by which an eigenvector's size changes is known as its eigenvalue. In the above figure, a horizontal stretch by a factor of 2 was applied, so the red vector has an eigenvalue of 2 while the green vector has an eigenvalue of 1.
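You can verify this with numpy, assuming the horizontal stretch above is the matrix [[2, 0], [0, 1]] (my reading of the figure):

```python
import numpy as np

# Horizontal stretch by 2: the x-direction is doubled, y is untouched
S = np.array([[2, 0],
              [0, 1]])

eigenvalues, eigenvectors = np.linalg.eig(S)
print(eigenvalues)   # [2. 1.]
print(eigenvectors)  # columns [1, 0] and [0, 1]: the x- and y-axis directions
```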

Let’s look at 2 more classic examples to make sure we can generalize what we have learnt.

Let’s apply a pure shear and see what happens. I hope you have spotted that only the green horizontal vector has remained on its span and all the others have moved. Therefore this transformation has only one eigenvector direction, as the sketch after the figure confirms.

Shear Transformation
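A quick numpy sketch, assuming the pure shear in the figure is the matrix [[1, 1], [0, 1]]:

```python
import numpy as np

# Pure shear: i stays put, j gets pushed sideways
SH = np.array([[1, 1],
               [0, 1]])

eigenvalues, eigenvectors = np.linalg.eig(SH)
print(eigenvalues)   # [1. 1.] -- a repeated eigenvalue
print(eigenvectors)  # both columns are (numerically) parallel to [1, 0]:
                     # the horizontal axis is the only eigenvector direction
```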

Finally, let’s look at rotation. As you can see, this transformation does not have any (real) eigenvectors: every vector moves off its span.

Rotation Transformation
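numpy agrees: a 90-degree rotation (my example) has complex eigenvalues, which means no real eigenvector survives:

```python
import numpy as np

# Rotation by 90 degrees: no vector stays on its span
R = np.array([[0, -1],
              [1,  0]])

eigenvalues, _ = np.linalg.eig(R)
print(eigenvalues)  # [0.+1.j 0.-1.j] -- complex, so no real eigenvectors
```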

Let’s look at two special types of transformation where all the vectors are eigenvectors.

Uniform Scaling: This is just a linear transformation in which vectors are scaled by the same amount in all directions. As you can see, it's not just the 3 highlighted vectors that are eigenvectors; every vector in the space is an eigenvector.

Uniform Scaling Transformation

Rotation of 180: If we rotate by 180 degrees, we again see that all the vectors are eigenvectors (each stays on its span before and after the transformation), but each has an eigenvalue of -1, as they all end up pointing in the opposite direction.

Rotation Transformation of 180 degrees
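Both special cases are easy to check in numpy, since uniform scaling by 2 is just 2I and a 180-degree rotation is just -I:

```python
import numpy as np

# Uniform scaling by 2 is 2I: every vector is an eigenvector, eigenvalue 2
U = 2 * np.eye(2)
print(np.linalg.eig(U)[0])     # [2. 2.]

# A 180-degree rotation is -I: every vector stays on its span but flips,
# so every vector is an eigenvector with eigenvalue -1
R180 = -np.eye(2)
print(np.linalg.eig(R180)[0])  # [-1. -1.]
```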

That's all, folks! That's all you need to know about eigenvalues and eigenvectors. In the next post, I will talk about their applications and how we can employ eigenvectors and eigenvalues to speed up computation.
