You do not need to learn linear algebra before you get started in machine learning, but at some point you may wish to dive deeper.
In fact, if there was one area of mathematics I would suggest improving before the others, it would be linear algebra. It will give you the tools to help you with the other areas of mathematics required to understand and build better intuitions for machine learning algorithms.
In this post we take a closer look at linear algebra and why you should make the time to improve your skills and knowledge in linear algebra if you want to get more out of machine learning.
If you already know your way around eigenvectors and the SVD, this post is probably not for you.
Kick-start your project with my new book Linear Algebra for Machine Learning, including step-by-step tutorials and the Python source code files for all examples.
Let’s get started.
What is Linear Algebra?
Linear Algebra is a branch of mathematics that lets you concisely describe coordinates and interactions of planes in higher dimensions and perform operations on them.
Think of it as an extension of algebra (dealing with unknowns) into an arbitrary number of dimensions. Linear Algebra is about working on linear systems of equations (linear regression is an example: y = Ax). Rather than working with scalars, we start working with matrices and vectors (vectors are really just a special type of matrix).
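To make this concrete, here is a minimal sketch (using NumPy, with made-up numbers) of a small linear system written as y = Ax and solved for the unknown vector x all at once:

```python
import numpy as np

# A small made-up linear system y = Ax:
#   1*x0 + 2*x1 = 5
#   3*x0 + 4*x1 = 6
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
y = np.array([5.0, 6.0])

# Solve for the whole unknown vector x at once,
# rather than juggling the scalars one equation at a time.
x = np.linalg.solve(A, y)
print(x)  # [-4.   4.5]
```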
Broadly speaking, in linear algebra data is represented in the form of linear equations. These linear equations are in turn represented in the form of matrices and vectors.
— Vignesh Natarajan in answer to the question “How is Linear Algebra used in Machine Learning?”
As a field, it’s useful to you because you can describe (and even execute with the right libraries) complex operations used in machine learning using the notation and formalisms from linear algebra.
Linear algebra finds widespread application because it generally parallelizes extremely well. Further to that, most linear algebra operations can be implemented without message passing, which makes them amenable to MapReduce implementations.
— Raphael Cendrillon in answer to the question “Why is Linear Algebra a prerequisite behind modern scientific/computational research?”
More information on Linear Algebra from Wikipedia:
- Linear Algebra on Wikipedia
- Linear Algebra Category on Wikipedia
- Linear Algebra List of Topics on Wikipedia
Minimum Linear Algebra for Machine Learning
Linear Algebra is a foundational field. By this I mean that its notation and formalisms are used by other branches of mathematics to express concepts that are also relevant to machine learning.
For example, matrices and vectors are used in calculus, needed when you want to talk about function derivatives when optimizing a loss function. They are also used in probability when you want to talk about statistical inference.
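As a small illustration of that point (a sketch with made-up data, not taken from any particular textbook), the gradient of a squared-error loss written in matrix notation is a short vector expression, and it can be checked numerically:

```python
import numpy as np

# Made-up data and parameters for a linear model y ~ X b
rng = np.random.default_rng(1)
X = rng.normal(size=(6, 3))
y = rng.normal(size=6)
b = rng.normal(size=3)

# Loss L(b) = ||y - X b||^2 and its gradient in matrix notation:
# grad L = -2 X^T (y - X b)
grad = -2.0 * X.T @ (y - X @ b)

# Finite-difference check of the first component of the gradient
eps = 1e-6
e0 = np.zeros(3); e0[0] = eps
loss = lambda b: np.sum((y - X @ b) ** 2)
approx = (loss(b + e0) - loss(b - e0)) / (2 * eps)
print(np.allclose(grad[0], approx))  # True
```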
…it’s used everywhere in mathematics, so you’ll find it used wherever math is used…
— David Joyce, in answer to the question “What is the point of linear algebra?”
If I were to convince you to learn a minimum of linear algebra to improve your capabilities in machine learning, it would be the following 3 topics:
- Notation: Knowing the notation will let you read algorithm descriptions in papers, books and websites to get an idea of what is going on. Even if you use for-loops rather than matrix operations, at least you will be able to piece things together.
- Operations: Working at the next level of abstraction, in vectors and matrices, can make things clearer. This can apply to descriptions, to code and even to thinking. Learn how to do or apply simple operations such as adding, multiplying, inverting and transposing matrices and vectors.
- Matrix Factorization: If there were one deeper area I would recommend diving into over any other, it would be matrix factorization, specifically matrix decomposition methods like SVD and QR. The numerical precision of computers is limited, and working with decomposed matrices allows you to sidestep a lot of the overflow/underflow madness that can result. Also, a quick LU, SVD or QR decomposition using a library will give you an ordinary least squares solution for your regression problem, a bedrock of machine learning and stats (see the short sketch after this list).
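To give a flavor of that last point, here is a minimal sketch (assuming NumPy and made-up data, not an example from any particular library's documentation) of solving an ordinary least squares regression via a QR decomposition instead of inverting X^T X directly:

```python
import numpy as np

# Made-up regression data: 5 samples, an intercept column plus 2 features
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(5), rng.normal(size=(5, 2))])
y = rng.normal(size=5)

# QR decomposition: X = Q R, with Q orthonormal and R upper triangular.
# Solving R b = Q^T y avoids forming X^T X, whose condition number is
# the square of that of X.
Q, R = np.linalg.qr(X)
b = np.linalg.solve(R, Q.T @ y)

# np.linalg.lstsq uses an SVD-based routine and should agree closely
b_check, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.allclose(b, b_check))  # True
```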
If you know some linear algebra and disagree with my minimum list, please leave a comment. I’d love to hear your 3 minimum topics.
If you want to get into the theory of it all, you need to know linear algebra. If you want to read white papers and consider cutting edge new algorithms and systems, you need to know a lot of math.
— Jesse Reiss in answer to the question “How important is linear algebra in computer science?”
5 Reasons To Improve Your Linear Algebra
Of course, I don’t want you to stop at the minimum. I want you to go deeper.
If your need to know more and get better doesn’t motivate you down the path, here are five reasons that might give you that push.
- Building Block: Let me state it again. Linear algebra is absolutely key to understanding the calculus and statistics you need in machine learning. Better linear algebra will lift your game across the board. Seriously.
- Deeper Intuition: If you can understand machine learning methods at the level of vectors and matrices you will improve your intuition for how and when they work.
- Get More From Algorithms: A deeper understanding of the algorithm and its constraints will allow you to customize its application and better understand the impact of tuning parameters on the results.
- Implement Algorithms From Scratch: You require an understanding of linear algebra to implement machine learning algorithms from scratch. At the very least to read the algorithm descriptions and at best to effectively use the libraries that provide the vector and matrix operations.
- Devise New Algorithms: The notation and tools of linear algebra can be used directly in environments like Octave and MATLAB allowing you to prototype modifications to existing algorithms and entirely new approaches very quickly.
Linear Algebra will feature heavily in your machine learning journey whether you like it or not.
3 Video Courses To Learn Linear Algebra
If you are looking to beef up your linear algebra, there are three options that you could start with.
These are video courses and lectures I found and went through recently in preparation for this post. I found each decent and suited to a different audience.
I watch all videos on double time, and definitely recommend it with all of these sources. Also, take notes.
1. Linear Algebra Refresher
This is a quick whip around the topics in linear algebra you should be familiar with. This is for those who took linear algebra in college and are looking for a reminder rather than an education.
https://www.youtube.com/watch?v=ZumgfOei0Ak
The video is titled “Linear Algebra for machine learning” and was created by Patrick van der Smagt using slides from University College London.
2. Linear Algebra Crash Course
The second option is the Linear Algebra crash course presented as an optional module in Week 1 of Andrew Ng’s Coursera Machine Learning course.
This is suited to the engineer or programmer who is perhaps less or not at all familiar with linear algebra and is looking for a first bootstrap into the topic.
It contains 6 short videos and you can access a YouTube playlist here titled “Machine Learning – 03. Linear Algebra Review“.
https://www.youtube.com/playlist?list=PLnnr1O8OWc6boN4WHeuisJWmeQHH9D_Vg
The topics covered include the following (a short code sketch of these operations follows the list):
- Matrices and Vectors
- Addition and Scalar Multiplication
- Matrix Vector Multiplication
- Matrix Matrix Multiplication
- Matrix Multiplication Properties
- Inverse and Transpose
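For a rough feel of what those operations look like in code, here is a minimal NumPy sketch (with made-up matrices) covering the same ground:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[0.0, 1.0],
              [1.0, 0.0]])
v = np.array([1.0, -1.0])

print(A + B)             # matrix addition
print(3 * A)             # scalar multiplication
print(A @ v)             # matrix-vector multiplication
print(A @ B)             # matrix-matrix multiplication
print(np.allclose(A @ B, B @ A))  # False: multiplication is not commutative in general
print(np.linalg.inv(A))  # inverse
print(A.T)               # transpose
```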
3. Linear Algebra Course
The third option is to take a complete introductory course into Linear Algebra. A slow grind that puts the whole field into your head.
I recommend the Linear Algebra stream on Khan Academy.
It’s amazing. Not only is the breadth impressive, with spot-check questions throughout, but Sal is a great communicator who cuts straight to the applied side of the material. Much better than any university course I took.
Sal’s course is divided into 3 main modules:
- Vector Spaces
- Matrix Transformations
- Alternative Coordinate Systems (bases)
Each module contains 5-7 sub-modules, and each sub-module contains 2-7 videos or question sets that range from 5-25 minutes (faster on double time!).
It’s great material and a slow burn, and I would recommend doing all of it, perhaps in weekend binges.
More Resources To Learn Linear Algebra
If you are looking for more general advice, check out the answers to the question “How can I self study Linear Algebra?“. There are some real gems in here.
Programming Linear Algebra
As a programmer or engineer, you likely learn best by doing. I know I do.
As such, you may wish to grab a programming environment or library and start coding up matrix multiplication, SVD and QR decompositions with test data (a short sketch follows the list of options below).
Below are some options you might like to consider.
- Octave: Octave is an open source alternative to MATLAB, and for most operations they are equivalent. These platforms were built for linear algebra. This is what they do, and they do it very well. They are a joy to use.
- R: It can do it, but it’s less beautiful than Octave. Check out this handy report: “Introduction to linear algebra with R” (PDF)
- NumPy/SciPy: numpy.linalg (and scipy.linalg) are easy and fun if you are a Python programmer, with clean syntax and access to all the operations you need.
- BLAS: Basic Linear Algebra Subprograms, low-level routines for operations like vector and matrix multiplication. Ported to or available in most programming languages.
- LAPACK: the Linear Algebra Package, successor to LINPACK. The place to go for various matrix factorizations and the like. Like BLAS, ported to or available in most programming languages.
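As a starting point, here is a minimal sketch using numpy.linalg (with random test data) that computes the SVD and QR decompositions and checks that the factors reconstruct the original matrix:

```python
import numpy as np

# Random test matrix, 4 rows by 3 columns
rng = np.random.default_rng(42)
A = rng.normal(size=(4, 3))

# Singular Value Decomposition: A = U * diag(s) * V^T
U, s, Vt = np.linalg.svd(A, full_matrices=False)
print(np.allclose(A, U @ np.diag(s) @ Vt))  # True

# QR decomposition: A = Q * R
Q, R = np.linalg.qr(A)
print(np.allclose(A, Q @ R))  # True
```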
There’s also a new Coursera course titled “Coding the Matrix: Linear Algebra through Computer Science Applications” by Philip Klein that also has an accompanying book by the same name “Coding the Matrix: Linear Algebra through Applications to Computer Science“. This may be worth a look if you are a Python programmer and looking to beef up your linear algebra.
Linear Algebra Books
I learn best from applied examples, but I also read a lot. If you’re anything like me, you’ll want a good textbook on the shelf, just in case.
This section lists some of the top textbooks on Linear Algebra for beginners.
Foundations
These are beginner textbooks that cover the foundations of linear algebra. Either would be a good complement to taking the course on Khan Academy.
- Introduction to Linear Algebra by Serge Lang.
- Introduction to Linear Algebra by Gilbert Strang
Applied
These are books that lean more towards the application of linear algebra.
- Numerical Linear Algebra by Lloyd Trefethen.
- Linear Algebra and Its Applications by Gilbert Strang.
- Matrix Computations by Gene Golub and Charles Van Loan
I really like the last book, “Matrix Computations”, because it gives you snippets of theory and algorithm pseudocode. Very cool for the math guy and the programming guy in me. If you want to implement the procedures yourself from scratch (rather than use a library), this may be the book for you.
For more suggestions of good beginner books on Linear Algebra, check out: What is the best book for learning Linear Algebra?
Summary
In this post you have taken a look at Linear Algebra and the important role it plays in Machine Learning (and really broader mathematics). We also outlined a minimum set of linear algebra topics to look at.
We touched on three options that you can use to learn linear algebra: a refresher, a crash course, or a deeper video course, all available to you now for free. We also looked at the top textbooks on the topic in case you wanted to go deeper.
I hope this has sparked your interest in the importance and power of getting better at linear algebra. Pick one resource and read/watch it to completion. Take that next step and improve your understanding of machine learning.
Update: Two additional high quality resources mentioned in the Reddit discussion of this post are the book Linear Algebra Done Right by Sheldon Axler and the MIT OpenCourseWare course on Linear Algebra taught by Gilbert Strang (author of some of the books mentioned above).
Comments
Good stuff, as always!
The more I dig into ML, the more I see just how indispensable linear algebra is. There’s no getting around it.
BTW, I’ve been looking for a text on linear algebra for R that does for the topic what Philip Klein’s “Coding the Matrix” does for Python.
I think I found it!
Hrishikesh Vinod’s “Hands-On Matrix Algebra Using R: Active and Motivated Learning with Applications”
http://www.amazon.com/Hands-On-Matrix-Algebra-Using-Applications/dp/9814313696
Pretty good so far!
Great post, thanks.
Have you similar ideas & hints regarding calculus?
These 3 looked promising as well –
Linear Algebra – Gilbert Strang (of the ubiquitous textbook)
http://ocw.mit.edu/courses/mathematics/18-06-linear-algebra-spring-2010/
Coding the Matrix – Philip Klein (linear algebra for people who can code)
https://class.coursera.org/matrix-002
Linear Algebra – Foundations to Frontiers
https://www.edx.org/course/linear-algebra-foundations-frontiers-utaustinx-ut-5-03x
I took the Linear Algebra – Foundations to Frontiers course. It’s very good, and also very interesting because they also discuss the frontier, i.e. what current (well, from the past few years) research has focused on.
(I guess you mentioned the first 2 and have a more recent link, somehow I missed it)
Jason, I HIGHLY RECOMMEND EVERYONE watch these videos, here, called the “Essence of Linear Algebra”. It is simply the most beautiful thing I’ve ever seen. Honestly.
https://www.youtube.com/playlist?list=PLZHQObOWTQDPD3MizzM2xVFitgF8hE_ab
Thanks for sharing John.
Did you watch some, why do you like them?
Absolutely brilliant videos. It’s very intuitive. I recommend you watch it alongside a course which gets into the notations.
Thanks for the link to the video. The insight that the columns of a matrix are the images of the unit basis vectors in the transformed space seems like something I should have learned but do not remember.
These are beautiful, intuitive videos
I want to do research in machine learning (neural networks), so do you think the Khan Academy Linear Algebra course would be enough?
It will be more than you need.
Thanks a lot !
You’re welcome.
Is the book available for purchase as a hard copy?
If so, please provide me with the publisher.
My books are only available in PDF.
How can an Indian purchase your linear algebra book?
You can use PayPal or Credit Card.
Strang’s paper “The Fundamental Theorem of Linear Algebra” (1993):
http://www.souravsengupta.com/cds2016/lectures/Strang_Paper1.pdf
is an excellent exposition of the fundamental subspaces and the SVD. If you can grok it, you’ll understand a lot.
Great, thanks Brian.
Hey Jason, would you recommend your book for the following problem?
Sorry to bother you, but I was wondering if you could recommend a book to learn from.
Show that an SVM using the polynomial kernel of degree two, K(u, v) = (1 + u · v)^2, is equivalent to a linear SVM in the feature space (1, x1, x2, x1^2, x2^2, x1x2), and hence SVMs with this kernel can separate any elliptic region from the rest of the plane.
Nice.
No, I don’t show any derivations of SVM kernels. Perhaps pickup a book on SVM theory?
When learning the maths needed to do great machine learning, would you say it’s best to start with linear algebra? I did well in calculus, stats, and linear algebra in college but I’m 15 years out of practice, so I’m trying to figure out the best place to start re-learning.
Thanks!
I recommend not starting with the math at all:
https://machinelearningmastery.com/faq/single-faq/what-mathematical-background-do-i-need-for-machine-learning
Thank you! It helps me a lot to start my journey towards linear algebra.
Good to hear that!