### What is Linear Algebra?

Linear algebra is a field of mathematics that is widely agreed to be a prerequisite for a deeper understanding of machine learning.

Although linear algebra is a large field with many esoteric theories and findings, the nuts-and-bolts tools and notations taken from the field are practical for machine learning practitioners. With a solid foundation of what linear algebra is, it is possible to focus on just the good or relevant parts.

In this tutorial, you will discover what exactly linear algebra is from a machine learning perspective.

After completing this tutorial, you will know:

- Linear algebra is the mathematics of data.
- Linear algebra has had a marked impact on the field of statistics.
- Linear algebra underlies many practical mathematical tools, such as Fourier series and computer graphics.

Let’s get started.

## Tutorial Overview

This tutorial is divided into 4 parts; they are:

- Linear Algebra
- Numerical Linear Algebra
- Linear Algebra and Statistics
- Applications of Linear Algebra


## Linear Algebra

Linear algebra is a branch of mathematics, but the truth of it is that linear algebra is the mathematics of data. Matrices and vectors are the language of data.

Linear algebra is about linear combinations. That is, it uses arithmetic on columns of numbers called vectors and rectangular arrays of numbers called matrices to create new vectors and matrices. Linear algebra is the study of lines and planes, vector spaces, and the mappings between them that are required for linear transforms.
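To make this concrete, here is a minimal sketch using NumPy (the values are just for illustration): scaling two vectors and adding them produces a new vector, which is exactly a linear combination.

```python
import numpy as np

# Two column vectors.
u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])

# A linear combination: scale each vector, then add the results element-wise.
w = 2.0 * u + 0.5 * v
print(w)  # [4.  6.5 9. ]
```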

It is a relatively young field of study, having initially been formalized in the 1800s in order to find unknowns in systems of linear equations. A linear equation is just a series of terms and mathematical operations where some terms are unknown; for example:

```
y = 4 * x + 1
```

Equations like this are linear in that they describe a line on a two-dimensional graph. The line comes from plugging different values into the unknown x to find out what the equation or model does to the value of y.
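We can sketch this plugging-in process in a few lines of Python: each value of x produces a value of y, and the resulting (x, y) pairs trace out the line.

```python
# Plug different values of x into the equation y = 4 * x + 1
# and observe what it does to y; the (x, y) pairs trace a straight line.
for x in [0, 1, 2, 3]:
    y = 4 * x + 1
    print(x, y)
```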

We can line up a system of equations with the same form with two or more unknowns; for example:

```
y = 0.1 * x1 + 0.4 * x2
y = 0.3 * x1 + 0.9 * x2
y = 0.2 * x1 + 0.3 * x2
...
```

The column of y values can be taken as a column vector of outputs from the equation. The two columns of floating-point values are the data columns, say a1 and a2, and can be taken as a matrix A. The two unknown values x1 and x2 can be taken as the coefficients of the equation; together they form a vector of unknowns, b, to be solved for. We can write this compactly using linear algebra notation as:

```
y = A . b
```

Problems of this form are generally challenging to solve because there are more equations to satisfy (here we have 3) than there are unknowns (here we have 2). Further, there is often no single set of coefficients that satisfies all of the equations without error. Systems describing problems we are often interested in (such as a linear regression) can instead have an infinite number of solutions when the situation is reversed and there are more unknowns than equations.
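As a concrete sketch of solving such a system, NumPy's least squares solver can estimate the coefficient vector b that minimizes the total squared error across all three equations. Note that the y values below are assumed example outputs, invented purely for illustration; they are not from the text above.

```python
import numpy as np

# The three equations from above as a 3 x 2 data matrix A
# and a column vector y of outputs (example values, assumed).
A = np.array([[0.1, 0.4],
              [0.3, 0.9],
              [0.2, 0.3]])
y = np.array([0.5, 1.2, 0.5])

# Find the coefficients b that minimize the squared error ||A . b - y||.
b, residuals, rank, singular_values = np.linalg.lstsq(A, y, rcond=None)
print(b)
```

Because the system is overdetermined, `lstsq` returns the best-fitting coefficients rather than an exact solution.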

This gives a small taste of the very core of linear algebra that interests us as machine learning practitioners. Much of the rest of the operations are about making this problem and problems like it easier to understand and solve.

## Numerical Linear Algebra

The application of linear algebra in computers is often called numerical linear algebra.

> “numerical” linear algebra is really applied linear algebra.
>
> — Page ix, Numerical Linear Algebra, 1997.

It is more than just the implementation of linear algebra operations in code libraries; it also includes the careful handling of the problems of applied mathematics, such as working with the limited floating point precision of digital computers.
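A quick way to see why this careful handling matters: many decimal fractions cannot be represented exactly in binary floating point, so exact equality comparisons are unreliable.

```python
import math

# Floating point arithmetic on a computer is approximate:
# decimal fractions like 0.1 have no exact binary representation.
print(0.1 + 0.2)          # 0.30000000000000004
print(0.1 + 0.2 == 0.3)   # False

# Numerical code therefore compares with a tolerance, not exact equality.
print(math.isclose(0.1 + 0.2, 0.3))  # True
```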

Computers are good at performing linear algebra calculations, and much of the dependence on Graphics Processing Units (GPUs) by modern machine learning methods such as deep learning is because of their ability to compute linear algebra operations quickly.

Efficient vector and matrix operations were originally implemented in the FORTRAN programming language in the 1970s and 1980s, and much of that code, or code ported from it, underlies the linear algebra performed in modern programming languages, such as Python.

Three popular open source numerical linear algebra libraries that implement these functions are:

- Linear Algebra Package, or LAPACK.
- Basic Linear Algebra Subprograms, or BLAS (a standard for linear algebra libraries).
- Automatically Tuned Linear Algebra Software, or ATLAS.

Often, when you are calculating linear algebra operations directly or indirectly via higher-order algorithms, your code is very likely dipping down to use one of these, or similar, linear algebra libraries. The name of one or more of these underlying libraries may be familiar to you if you have installed or compiled any of Python’s numerical libraries, such as SciPy and NumPy.
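For example, NumPy can report which BLAS/LAPACK build it is linked against, and a matrix product is dispatched down to the underlying BLAS routine rather than computed in Python.

```python
import numpy as np

# Report the BLAS/LAPACK configuration NumPy was built against
# (the output depends on how your NumPy was installed).
np.show_config()

# A matrix product like this is handed off to the underlying
# BLAS routine for fast computation.
A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[5.0, 6.0], [7.0, 8.0]])
print(A @ B)
```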

## Linear Algebra and Statistics

Linear algebra is a valuable tool in other branches of mathematics, especially statistics.

> Usually students studying statistics are expected to have seen at least one semester of linear algebra (or applied algebra) at the undergraduate level.
>
> — Page xv, Linear Algebra and Matrix Analysis for Statistics, 2014.

The impact of linear algebra is important to consider, given the foundational relationship both fields have with the field of applied machine learning.

Some clear fingerprints of linear algebra on statistics and statistical methods include:

- Use of vector and matrix notation, especially with multivariate statistics.
- Solutions to least squares and weighted least squares, such as for linear regression.
- Estimates of mean and variance of data matrices.
- The covariance matrix that plays a key role in multivariate Gaussian distributions.
- Principal component analysis for data reduction that draws many of these elements together.
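Several of these fingerprints can be seen together in a few lines of NumPy (with a small, made-up data matrix for illustration): column means, the covariance matrix, and the eigendecomposition that underlies principal component analysis.

```python
import numpy as np

# A small data matrix: 5 observations (rows) of 2 variables (columns).
X = np.array([[2.0, 1.0],
              [3.0, 4.0],
              [5.0, 3.0],
              [6.0, 7.0],
              [8.0, 6.0]])

# Column means and the covariance matrix, both matrix computations.
mean = X.mean(axis=0)
C = np.cov(X, rowvar=False)
print(mean)
print(C)

# Principal component analysis rests on the eigendecomposition
# of the (symmetric) covariance matrix.
eigenvalues, eigenvectors = np.linalg.eigh(C)
print(eigenvalues)
```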

As you can see, modern statistics and data analysis, at least as far as the interests of a machine learning practitioner are concerned, depend on the understanding and tools of linear algebra.

## Applications of Linear Algebra

As linear algebra is the mathematics of data, the tools of linear algebra are used in many domains.

In his classic book on the topic, “Introduction to Linear Algebra,” Gilbert Strang provides a chapter dedicated to the applications of linear algebra. In it, he demonstrates specific mathematical tools rooted in linear algebra. Briefly, they are:

- Matrices in Engineering, such as a line of springs.
- Graphs and Networks, such as analyzing networks.
- Markov Matrices, Population, and Economics, such as population growth.
- Linear Programming, the simplex optimization method.
- Fourier Series: Linear Algebra for functions, used widely in signal processing.
- Linear Algebra for statistics and probability, such as least squares for regression.
- Computer Graphics, such as the translation, rescaling, and rotation of images.
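The computer graphics entry is easy to sketch: rotating a 2-D point is just a matrix-vector product with a rotation matrix.

```python
import numpy as np

# Rotating a 2-D point by an angle theta is a matrix-vector product.
theta = np.pi / 2  # 90 degrees
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

point = np.array([1.0, 0.0])
rotated = R @ point
print(rotated)  # approximately [0, 1]
```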

Another interesting application of linear algebra is that it is the type of mathematics used by Albert Einstein in parts of his theory of relativity, specifically tensors and tensor calculus. He also introduced to physics a new type of linear algebra notation called Einstein notation, or the Einstein summation convention.
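The summation convention lives on in numerical computing: NumPy's `einsum` function expresses operations in exactly this notation, where a repeated index implies summation.

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])

# In Einstein notation a repeated index implies summation:
# the dot product a_i b_i sums over the repeated index i.
dot = np.einsum('i,i->', a, b)
print(dot)  # 32.0

# Matrix multiplication C_ik = A_ij B_jk sums over the repeated index j.
A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[5.0, 6.0], [7.0, 8.0]])
C = np.einsum('ij,jk->ik', A, B)
print(C)
```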

## Extensions

This section lists some ideas for extending the tutorial that you may wish to explore.

- Search books and the web for 5 quotations defining the field of linear algebra.
- Research and list 5 more applications or uses of linear algebra in the field of probability and statistics.
- List and write short definitions for 10 terms used in the description of linear algebra.

If you explore any of these extensions, I’d love to know.

## Further Reading

This section provides more resources on the topic if you are looking to go deeper.

### Books

- Introduction to Linear Algebra, 2016.
- Numerical Linear Algebra, 1997.
- Linear Algebra and Matrix Analysis for Statistics, 2014.

### Articles

- Linear Algebra on Wikipedia
- Linear Algebra Category on Wikipedia
- Linear Algebra List of Topics on Wikipedia
- LAPACK on Wikipedia
- Basic Linear Algebra Subprograms on Wikipedia
- Automatically Tuned Linear Algebra Software on Wikipedia
- Einstein notation on Wikipedia
- Mathematics of general relativity on Wikipedia

## Summary

In this tutorial, you discovered a gentle introduction to linear algebra from a machine learning perspective.

Specifically, you learned:

- Linear algebra is the mathematics of data.
- Linear algebra has had a marked impact on the field of statistics.
- Linear algebra underlies many practical mathematical tools, such as Fourier series and computer graphics.

Do you have any questions?

Ask your questions in the comments below and I will do my best to answer.
