Linear Algebra for Machine Learning (7-Day Mini-Course)

Last Updated on August 9, 2019

Linear Algebra for Machine Learning Crash Course.

Get on top of the linear algebra used in machine learning in 7 Days.

Linear algebra is a field of mathematics that is universally agreed to be a prerequisite for a deeper understanding of machine learning.

Although linear algebra is a large field with many esoteric theories and findings, the nuts and bolts tools and notations taken from the field are required for machine learning practitioners. With a solid foundation of what linear algebra is, it is possible to focus on just the good or relevant parts.

In this crash course, you will discover how you can get started and confidently read and implement linear algebra notation used in machine learning with Python in 7 days.

This is a big and important post. You might want to bookmark it.

Kick-start your project with my new book Linear Algebra for Machine Learning, including step-by-step tutorials and the Python source code files for all examples.

Let’s get started.

  • Update Mar/2018: Fixed a small typo in the SVD lesson.
Linear Algebra for Machine Learning (7-Day Mini-Course)
Photo by Jeff Kubina, some rights reserved.

Who Is This Crash-Course For?

Before we get started, let’s make sure you are in the right place.

This course is for developers that may know some applied machine learning. Maybe you know how to work through a predictive modeling problem end-to-end, or at least most of the main steps, with popular tools.

The lessons in this course do assume a few things about you, such as:

  • You know your way around basic Python for programming.
  • You may know some basic NumPy for array manipulation.
  • You want to learn linear algebra to deepen your understanding and application of machine learning.

You do NOT need:

  • You do not need to be a math wiz!
  • You do not need to be a machine learning expert!

This crash course will take you from a developer who knows a little machine learning to a developer who can navigate the basics of linear algebra.

Note: This crash course assumes you have a working Python3 SciPy environment with at least NumPy installed. If you need help with your environment, you can follow the step-by-step tutorial here:

Crash-Course Overview

This crash course is broken down into 7 lessons.

You could complete one lesson per day (recommended) or complete all of the lessons in one day (hardcore). It really depends on the time you have available and your level of enthusiasm.

Below is a list of the 7 lessons that will get you started and productive with linear algebra for machine learning in Python:

  • Lesson 01: Linear Algebra for Machine Learning
  • Lesson 02: Linear Algebra
  • Lesson 03: Vectors
  • Lesson 04: Matrices
  • Lesson 05: Matrix Types and Operations
  • Lesson 06: Matrix Factorization
  • Lesson 07: Singular-Value Decomposition

Each lesson could take you 60 seconds or up to 30 minutes. Take your time and complete the lessons at your own pace. Ask questions and even post results in the comments below.

The lessons expect you to go off and find out how to do things. I will give you hints, but part of the point of each lesson is to force you to learn where to go to look for help on linear algebra, the NumPy API, and the best-of-breed tools in Python (hint: I have all of the answers directly on this blog; use the search box).

I do provide more help in the form of links to related posts because I want you to build up some confidence and inertia.

Post your results in the comments; I’ll cheer you on!

Hang in there; don’t give up.

Note: This is just a crash course. For a lot more detail and fleshed out tutorials, see my book on the topic titled “Basics of Linear Algebra for Machine Learning“.

Need help with Linear Algebra for Machine Learning?

Take my free 7-day email crash course now (with sample code).

Click to sign-up and also get a free PDF Ebook version of the course.

Lesson 01: Linear Algebra for Machine Learning

In this lesson, you will discover the 5 reasons why a machine learning practitioner should deepen their understanding of linear algebra.

1. You Need to Learn Linear Algebra Notation

You need to be able to read and write vector and matrix notation. Algorithms are described in books, papers, and on websites using vector and matrix notation.

2. You Need to Learn Linear Algebra Arithmetic

In partnership with the notation of linear algebra are the arithmetic operations performed. You need to know how to add, subtract, and multiply scalars, vectors, and matrices.

3. You Need to Learn Linear Algebra for Statistics

You must learn linear algebra in order to be able to learn statistics, especially multivariate statistics. In order to read and interpret statistics, you must learn the notation and operations of linear algebra. Modern statistics uses both the notation and tools of linear algebra to describe statistical methods, from vectors for the means and variances of data to covariance matrices that describe the relationships between multiple Gaussian variables.

4. You Need to Learn Matrix Factorization

Building on notation and arithmetic is the idea of matrix factorization, also called matrix decomposition. You need to know how to factorize a matrix and what it means. Matrix factorization is a key tool in linear algebra and is used widely as an element of many more complex operations in both linear algebra (such as the matrix inverse) and machine learning (such as least squares).

5. You Need to Learn Linear Least Squares

You need to know how to use matrix factorization to solve linear least squares. Problems of this type can be framed as the minimization of squared error, called least squares, and can be recast in the language of linear algebra, called linear least squares. Linear least squares problems can be solved efficiently on computers using matrix operations such as matrix factorization.
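As a sketch of what this looks like in practice, the snippet below fits a straight line with NumPy's lstsq() function, which solves linear least squares using matrix factorization internally. The data values here are made up for illustration and are not from the lesson.

```python
from numpy import array
from numpy.linalg import lstsq

# Illustrative data: points lying exactly on the line y = 1 + 2*x
X = array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0], [1.0, 4.0]])  # column of ones, then x
y = array([3.0, 5.0, 7.0, 9.0])

# solve the linear least squares problem min ||Xb - y||^2
b, residuals, rank, singular_values = lstsq(X, y, rcond=None)
print(b)  # coefficients close to [1. 2.]
```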

One More Reason

If I could give one more reason, it would be: because it is fun. Seriously.

Your Task

For this lesson, you must list 3 reasons why you, personally, want to learn linear algebra.

Post your answer in the comments below. I would love to see what you come up with.

In the next lesson, you will discover a concise definition of linear algebra.

Lesson 02: Linear Algebra

In this lesson, you will discover a concise definition of linear algebra.

Linear Algebra

Linear algebra is a branch of mathematics, but the truth of it is that linear algebra is the mathematics of data. Matrices and vectors are the language of data.

Linear algebra is about linear combinations. That is, using arithmetic on columns of numbers called vectors and 2D arrays of numbers called matrices, to create new columns and arrays of numbers.

Numerical Linear Algebra

The application of linear algebra in computers is often called numerical linear algebra.

It is more than just the implementation of linear algebra operations in code libraries; it also includes the careful handling of the problems of applied mathematics, such as working with the limited floating point precision of digital computers.

Applications of Linear Algebra

As linear algebra is the mathematics of data, the tools of linear algebra are used in many domains.

  • Matrices in Engineering, such as a line of springs.
  • Graphs and Networks, such as analyzing networks.
  • Markov Matrices, Population, and Economics, such as population growth.
  • Linear Programming, the simplex optimization method.
  • Fourier Series: Linear Algebra for Functions, used widely in signal processing.
  • Linear Algebra for Statistics and Probability, such as least squares for regression.
  • Computer Graphics, such as the various translation, scaling and rotation of images.

Your Task

For this lesson, you must find five quotes from research papers, blogs, or books that define the field of linear algebra.

Post your answer in the comments below. I would love to see what you discover.

In the next lesson, you will discover vectors and simple vector arithmetic.

Lesson 03: Vectors

In this lesson, you will discover vectors and simple vector arithmetic.

What is a Vector?

A vector is a tuple of one or more values called scalars.

Vectors are often represented using a lowercase character such as “v”; for example:

v = (v1, v2, v3)

Where v1, v2, v3 are scalar values, often real values.

Defining a Vector

We can represent a vector in Python as a NumPy array.

A NumPy array can be created from a list of numbers. For example, below we define a vector with the length of 3 and the integer values 1, 2, and 3.
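A minimal sketch of that definition (the exact listing from the original post is not shown here, but the lesson fully specifies it):

```python
from numpy import array

# define a vector of length 3 with the integer values 1, 2, and 3
v = array([1, 2, 3])
print(v)       # [1 2 3]
print(len(v))  # 3
```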

Vector Multiplication

Two vectors of equal length can be multiplied together.

As with addition and subtraction, this operation is performed element-wise to result in a new vector of the same length.

We can perform this operation directly in NumPy.
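A minimal sketch of element-wise vector multiplication in NumPy (the values are illustrative):

```python
from numpy import array

a = array([1, 2, 3])
b = array([4, 5, 6])
c = a * b  # element-wise multiplication of two equal-length vectors
print(c)   # [ 4 10 18]
```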

Your Task

For this lesson, you must implement other vector arithmetic operations such as addition, division, subtraction, and the vector dot product.

Post your answer in the comments below. I would love to see what you discover.

In the next lesson, you will discover matrices and simple matrix arithmetic.

Lesson 04: Matrices

In this lesson, you will discover matrices and simple matrix arithmetic.

What is a Matrix?

A matrix is a two-dimensional array of scalars with one or more columns and one or more rows.

The notation for a matrix is often an uppercase letter, such as A, and entries are referred to by their two-dimensional subscript of row (i) and column (j), such as aij. For example:

A = ((a11, a12), (a21, a22))

Defining a Matrix

We can represent a matrix in Python using a two-dimensional NumPy array.

A NumPy array can be constructed given a list of lists. For example, below is a 2 row, 3 column matrix.
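A minimal sketch of such a matrix (the exact listing from the original post is not shown here; the values are illustrative):

```python
from numpy import array

# a matrix with 2 rows and 3 columns
A = array([[1, 2, 3], [4, 5, 6]])
print(A)
print(A.shape)  # (2, 3)
```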

Matrix Addition

Two matrices with the same dimensions can be added together to create a new third matrix.

The scalar elements in the resulting matrix are calculated as the addition of the elements in each of the matrices being added.

We can implement this in Python using the plus operator directly on the two NumPy arrays.
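For example, a minimal sketch with illustrative values:

```python
from numpy import array

A = array([[1, 2, 3], [4, 5, 6]])
B = array([[1, 2, 3], [4, 5, 6]])
C = A + B  # each element of C is the sum of the corresponding elements of A and B
print(C)
```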

Matrix Dot Product

Matrix multiplication, also called the matrix dot product, is more complicated than the previous operations and involves a rule, as not all matrices can be multiplied together.

The rule for matrix multiplication is as follows: The number of columns (n) in the first matrix (A) must equal the number of rows (m) in the second matrix (B).

For example, matrix A has the dimensions m rows and n columns and matrix B has the dimensions n rows and k columns. The n columns in A and the n rows in B are equal. The result is a new matrix with m rows and k columns.

The intuition for matrix multiplication is that we are calculating the dot product between each row in matrix A and each column in matrix B. For example, we can step down the rows of matrix A and multiply each with column 1 in B to give the scalar values in column 1 of C.

The matrix multiplication operation can be implemented in NumPy using the dot() function.
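A minimal sketch using dot(), with an illustrative 3x2 matrix A and 2x2 matrix B (so the 2 columns of A match the 2 rows of B):

```python
from numpy import array

A = array([[1, 2], [3, 4], [5, 6]])  # 3 rows, 2 columns
B = array([[1, 2], [3, 4]])          # 2 rows, 2 columns
C = A.dot(B)                         # result has 3 rows, 2 columns
print(C)
```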

Your Task

For this lesson, you must implement more matrix arithmetic operations such as subtraction, division, the Hadamard product, and vector-matrix multiplication.

Post your answer in the comments below. I would love to see what you come up with.

In the next lesson, you will discover the different types of matrices and matrix operations.

Lesson 05: Matrix Types and Operations

In this lesson, you will discover the different types of matrices and matrix operations.

Transpose

A defined matrix can be transposed, which creates a new matrix with the number of columns and rows flipped.

This is denoted by the superscript “T” next to the matrix.

We can transpose a matrix in NumPy by calling the T attribute.
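A minimal sketch using the T attribute (illustrative values):

```python
from numpy import array

A = array([[1, 2], [3, 4], [5, 6]])  # 3 rows, 2 columns
print(A.T)        # the transpose: rows and columns flipped
print(A.T.shape)  # (2, 3)
```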

Inverse

The operation of inverting a matrix is indicated by a -1 superscript next to the matrix; for example, A^-1. The result of the operation is referred to as the inverse of the original matrix; for example, B is the inverse of A.

Not all matrices are invertible.

A matrix can be inverted in NumPy using the inv() function.
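A minimal sketch using inv() from numpy.linalg (the 2x2 matrix here is an illustrative, invertible example):

```python
from numpy import array
from numpy.linalg import inv

A = array([[1.0, 2.0], [3.0, 4.0]])
B = inv(A)
# multiplying a matrix by its inverse gives (approximately) the identity matrix
print(A.dot(B))
```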

Square Matrix

A square matrix is a matrix where the number of rows equals the number of columns.

The square matrix is contrasted with the rectangular matrix where the number of rows and columns are not equal.

Symmetric Matrix

A symmetric matrix is a type of square matrix where the top-right triangle is the same as the bottom-left triangle.

The axis of symmetry is always the main diagonal of the matrix, from the top left to the bottom right.

A symmetric matrix is always square and equal to its own transpose.

Triangular Matrix

A triangular matrix is a type of square matrix where all values are in the upper-right or lower-left triangle of the matrix, and the remaining elements are filled with zero values.

A triangular matrix with values only on and above the main diagonal is called an upper triangular matrix, whereas a triangular matrix with values only on and below the main diagonal is called a lower triangular matrix.

Diagonal Matrix

A diagonal matrix is one where values outside of the main diagonal have a zero value, where the main diagonal is taken from the top left of the matrix to the bottom right.

A diagonal matrix is often denoted with the variable D and may be represented as a full matrix or as a vector of values on the main diagonal.
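These matrix types can be illustrated with NumPy's tril(), triu(), and diag() functions (a sketch with an illustrative 3x3 matrix; these functions are not named in the lesson itself):

```python
from numpy import array, tril, triu, diag

M = array([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
print(tril(M))        # lower triangular part of M (zeros above the diagonal)
print(triu(M))        # upper triangular part of M (zeros below the diagonal)
print(diag(M))        # main diagonal as a vector: [1 5 9]
print(diag(diag(M)))  # that vector expanded into a diagonal matrix
```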

Your Task

For this lesson, you must develop examples for other matrix operations such as the determinant, trace, and rank.

Post your answer in the comments below. I would love to see what you come up with.

In the next lesson, you will discover matrix factorization.

Lesson 06: Matrix Factorization

In this lesson, you will discover the basics of matrix factorization, also called matrix decomposition.

What is a Matrix Decomposition?

A matrix decomposition is a way of reducing a matrix into its constituent parts.

It is an approach that can simplify more complex matrix operations by performing them on the decomposed parts rather than on the original matrix itself.

A common analogy for matrix decomposition is the factoring of numbers, such as the factoring of 25 into 5 x 5. For this reason, matrix decomposition is also called matrix factorization. Like factoring real values, there are many ways to decompose a matrix; hence there is a range of different matrix decomposition techniques.

LU Matrix Decomposition

The LU decomposition is for square matrices and decomposes a matrix into L and U components.

A = L . U

Where A is the square matrix that we wish to decompose, L is the lower triangular matrix, and U is the upper triangular matrix. A variation of this decomposition that is numerically more stable to solve in practice is called the LUP decomposition, or the LU decomposition with partial pivoting.

The rows of the parent matrix are re-ordered to simplify the decomposition process and the additional P matrix specifies a way to permute the result or return the result to the original order. There are also other variations of the LU.

The LU decomposition is often used to simplify the solving of systems of linear equations, such as finding the coefficients in a linear regression.

The LU decomposition can be implemented in Python with the lu() function. More specifically, this function calculates a PLU decomposition.
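A minimal sketch using lu() from scipy.linalg (the matrix values are illustrative):

```python
from numpy import array
from scipy.linalg import lu

A = array([[1, 2, 3], [4, 5, 6], [7, 8, 10]])
# P is the permutation matrix, L is lower triangular, U is upper triangular
P, L, U = lu(A)
print(P.dot(L).dot(U))  # multiplying the parts back together reconstructs A
```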

Your Task

For this lesson, you must implement small examples of other simple methods for matrix factorization, such as the QR decomposition, the Cholesky decomposition, and the eigendecomposition.

Post your answer in the comments below. I would love to see what you come up with.

In the next lesson, you will discover the Singular-Value Decomposition method for matrix factorization.

Lesson 07: Singular-Value Decomposition

In this lesson, you will discover the Singular-Value Decomposition method for matrix factorization.

Singular-Value Decomposition

The Singular-Value Decomposition, or SVD for short, is a matrix decomposition method for reducing a matrix to its constituent parts in order to make certain subsequent matrix calculations simpler.

A = U . Sigma . V^T

Where A is the real m x n matrix that we wish to decompose, U is an m x m matrix, Sigma (often represented by the uppercase Greek letter Sigma) is an m x n diagonal matrix, and V^T is the transpose of an n x n matrix, where T is a superscript.

Calculate Singular-Value Decomposition

The SVD can be calculated by calling the svd() function.

The function takes a matrix and returns the U, Sigma, and V^T elements. The Sigma diagonal matrix is returned as a vector of singular values. The V matrix is returned in a transposed form, e.g. V.T.
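A minimal sketch (using numpy.linalg.svd() here, which behaves the same way; the matrix values are illustrative):

```python
from numpy import array, diag, zeros
from numpy.linalg import svd

A = array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])  # 3 x 2
U, s, VT = svd(A)
# rebuild the m x n Sigma matrix from the returned vector of singular values
Sigma = zeros((A.shape[0], A.shape[1]))
Sigma[:A.shape[1], :A.shape[1]] = diag(s)
print(U.dot(Sigma).dot(VT))  # multiplying the parts back together reconstructs A
```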

Your Task

For this lesson, you must list 5 applications of the SVD.

Bonus points if you can demonstrate each with a small example in Python.

Post your answer in the comments below. I would love to see what you discover.

This was the final lesson in the mini-course.

The End!
(Look How Far You Have Come)

You made it. Well done!

Take a moment and look back at how far you have come.

You discovered:

  • The importance of linear algebra to applied machine learning.
  • What linear algebra is all about.
  • What a vector is and how to perform vector arithmetic.
  • What a matrix is and how to perform matrix arithmetic, including matrix multiplication.
  • A suite of types of matrices, their properties, and advanced operations involving matrices.
  • Matrix factorization methods and the LU decomposition method in detail.
  • The popular Singular-Value decomposition method used in machine learning.

This is just the beginning of your journey with linear algebra for machine learning. Keep practicing and developing your skills.

Take the next step and check out my book on Linear Algebra for Machine Learning.


How Did You Do with The Mini-Course?
Did you enjoy this crash course?

Do you have any questions? Were there any sticking points?
Let me know. Leave a comment below.

Get a Handle on Linear Algebra for Machine Learning!

Linear Algebra for Machine Learning

Develop a working understanding of linear algebra by writing lines of code in Python

Discover how in my new Ebook:
Linear Algebra for Machine Learning

It provides self-study tutorials on topics like:
Vector Norms, Matrix Multiplication, Tensors, Eigendecomposition, SVD, PCA and much more...

Finally Understand the Mathematics of Data

Skip the Academics. Just Results.

See What's Inside

114 Responses to Linear Algebra for Machine Learning (7-Day Mini-Course)

  1. Brandon, March 23, 2018 at 5:38 am

    As a software developer, building my skills to attain a role in data science, I am interested in learning more about linear algebra because:

    – I am intrigued to understand the math underpinning various ML algorithms
    – I would like to generally improve my mathematics skills
    – I want to improve my fluency in mathematics more generally so I can better understand published academic papers.

  2. Kalyan Banga, March 23, 2018 at 4:43 pm

    Hi Jason, I am founder of Fusion Analytics World, the Leading Digital Platform for News, Industry Analysis, Jobs, Courses, Events & much more. Covering Research & Analytics across Industries.

    I would be happy to feature your course(s) for free on our website and help you reach out to our targeted research intelligence and analytics focussed readers.

    We would be happy to feature your articles on machine learning as well. Let me know your thoughts.

  3. Fati, March 24, 2018 at 6:52 pm


    Since I started to learn about machine learning, I found the importance of math specially linear algebra.
    This 7 mini lessons can help to find:
    -The most important notation and method which you need as a data scientist or ML developer.
    -Better understanding of ML and the math behind it


  4. Prabhjot, March 25, 2018 at 1:55 pm

    I find your lessons very useful. Thanks for sharing this knowledge.
    I have been looking for good resources on a good way to import my own data into Python (data could be images or excel file, etc.)
    I am quite familiar with MATLAB and fairly new to Python.. I can’t seem to find a good way to import things in Python. I would appreciate if you could please point me to some good resources. Thanks.!

  5. Susensio, March 30, 2018 at 6:50 am

    Great crash course man! I’m having a great time implementing these things from zero in python, I needed this linear algebra foundation refresher!

    • Jason Brownlee, March 31, 2018 at 6:26 am

      Thanks, I’m glad it helped.

      • Susensio, April 8, 2018 at 3:57 am

        BTW, there is a small typo in LU Matrix Decomposition section, where you mention ‘…calculates an LPU decomposition…’ I think it should be PLU.
        I drove myself crazy searching the difference between LPU and PLU lol

  6. Amit Mukherjee, July 16, 2018 at 2:58 pm

    I am learning linear algebra because it is a prerequisite for deep learning for solving computer vision problems.

  7. Amit Mukherjee, July 16, 2018 at 11:09 pm

    # dot product of vectors. Both vectors must be of the same size
    from numpy import array, dot
    a = array([1, 2])
    b = array([13, 14])
    c = dot(a,b)
    # c = 41

  8. Amit Mukherjee, July 17, 2018 at 1:56 am

    # multiply a matrix with a vector
    from numpy import array, dot
    A = array([[1, 2, 3], [3, 4, 5], [5, 6, 7]])
    b = array([7, 8, 9])
    C = dot(A,b)

  9. Amit Mukherjee, July 17, 2018 at 5:49 pm

    Lower triangular matrix
    [[1 0 0]
    [0 2 0]
    [5 6 3]]
    Upper triangular matrix
    [[1 2 3]
    [0 4 0]
    [0 0 6]]
    Diagonal matrix
    [[1 0 0]
    [0 2 0]
    [0 0 3]]

  10. Amit Mukherjee, July 17, 2018 at 5:50 pm

    A symmetric matrix
    [[1 2 3]
    [2 4 6]
    [3 6 5]]

  11. Ramesh Gupta, February 12, 2019 at 7:03 am

    Triangular Matrix


  12. Abid Rizvi, March 22, 2019 at 4:03 am

    I am inspired by you a lot and this is my first comment after constantly viewing your website for one and a half year(almost). I want to work with you remotely. Is it possible in some way?

    • Jason Brownlee, March 22, 2019 at 8:36 am

      Thanks, I’m glad that the tutorials are helpful.

A great way to work together/contribute is for you to go through some of the tutorials and report your results as comments.

  13. Natasa, January 28, 2020 at 2:49 am

    Hi Jason,

    First thank you for this opportunity. My reasons are following:

    1. To recall&recover my university knowledge on some parts of Linear Algebra.
    2. To understand what is behind formulas deployed in python, so I will able to understand reasons for getting “strange” results
    3. Deep understanding of how Linear Algebra “tools” can help us in investigating patterns in data. This is exciting.


  14. Rui Antunes, February 14, 2020 at 8:37 pm

    As a student I’m interested in learning linear algebra because I want to have a greater understanding on mathematics, statistics, probability theory, and machine learning.
    Thank you for your tutorials

  15. Aimen Shahid, February 18, 2020 at 3:41 pm

    1. During my Bachelor’s Degree I never paid attention to the Linear Algebra course and so I barely passed. This time around I want to properly learn it.
    2. I am starting to get into ML and Computer Vision and I’ve been told I need to have a good understanding of Linear Algebra for that.
    3. I’ve been a bit out of practice with maths and would like to get back into it.

  16. Dustin, February 26, 2020 at 4:35 am

    My 3 reasons for taking this course:

    1. I enjoy learning new aspects to computer programming.
    2. I want to build a system that analyzes the use of the English language to assist in improving my students’ writing.
    3. I have never been confident in my math skills and shied away from to subject all through my schooling – I’d like to prove to myself that even higher level math theory is something I can grasp.

  17. Ajay Kumar, March 18, 2020 at 8:43 pm

    I wanted to learn the equation of PCA and SVM where Linear Algebra is used. I am more enthusiastic to go through each and every steps of this study

  18. Venkata Ramana Mantravadi, April 4, 2020 at 3:54 am


    from numpy import array
    from numpy import linalg
    A = array([[1.0, 2.0], [3.0, 4.0]])

    print(linalg.det(A))
    output : -2.0

    print(A.trace())
    output : 5.0

  19. Venkata Ramana Mantravadi, April 4, 2020 at 4:41 am

    If A is matrix then
    if A^(T)=A, (transpose)
    then A is called a symmetric matrix

  20. Travis Carter, April 29, 2020 at 2:57 pm

    7 Day Course: Day 1.
    Reasons for Learning Linear Algebra
    1. To clarify the language of machine learning notation, right now it’s practically gibberish.
    2. In hopes of not only knowing the words and notation, but understanding what the operations do.
    3. To gain enough knowledge of linear algebra to put it into practice. What I learned in school was only retained for the test, and has never transitioned into my toolbox.

  21. Vishwanath Salokye, May 8, 2020 at 9:45 pm

    7 day course completed in one day in fact few hours..
    I enjoyed recollecting my engg days and able to create my own examples and solve them

    Thanks a lot Jason

  22. K. Siva Senthil (Siva), May 11, 2020 at 7:37 pm

    Hi Jason,

    My first request to you; before I perform my first task is –

    Please move this section “Leave a Reply” right to the top of comment section. I could avoid scrolling all other comments to post mine. I guess if a reader is interested, scrolling down and reading the comments would still be viable.

    Now to the task; This is my first day of the crash course.
    I personally want to learn linear algebra;
    1. This will help me appreciate many nuances involved in ML algorithms.
    2. I will refresh my earlier formal training on this topic during my under graduate studies but have not used since many years.
    3. I will be able read research papers on algorithms a bit more fluently.

  23. K. Siva Senthil (Siva), May 13, 2020 at 2:49 am

    Hi Jason,

    My day 2 task –

    Jason Brownlee – “[L]inear algebra is the mathematics of data. Matrices and vectors are language of data.” is the best definition I have read. The other definitions I have encountered are –

    Wikipedia – Linear algebra is the branch of mathematics concerning equations which are linear.

    Introduction to Linear Algebra, 2nd edition By T.A Whitelaw – [Linear algebra] solves systems of simultaneous linear equations and rectangular arrays (matrices, as we call them) of coefficients occurring in such systems. It is also true that many ideas of importance in linear algebra could be traced to geometrical sources.

    Byju’s Learning – Linear algebra is the study of linear combinations. It is the study of vector spaces, lines and planes, and some mappings that are required to perform the linear transformations. It includes vectors, matrices and linear functions. It is the study of linear sets of equations and its transformation properties.

    I also noticed much of text books treat linear algebra directly by starting with vectors without much discourse on attempting to describe the topic of linear algebra. E.g.

    1. An introduction to linear algebra by L. Mirsky.
    2. Linear algebra: A course for physicsts and engineers by Arak M. Mithai, Hans J Haubold
    3. Introduction to linear algebra by Gilbert Strang
    4. Linear algebra in 25 lectures –
    5. Linear algebra by Jeff Hefferon 3rd edition

    Retrospectively; vectors, matrices are like axes of linear algebra as x, y and z axes are to space. i.e. Vectors and matrices are elementary and inseparable concepts of linear algebra.

    Some applications that I discovered are in the research of –
    1. Spectral clustering –
    2. Adaptive filters used in wireless communicaion –
    3. Latent semantic analysis in NLP :
    4. Information retrieval –
    5. Philosophy of science – Linear algebra applied to linear metatheory –

  24. K. Siva Senthil (Siva), May 20, 2020 at 1:02 am

    My day 3 activity following is the code –

    from numpy import array
    from numpy import multiply
    from numpy import divide

    def dotProduct(multiplier, multiplicand):
        return multiply(multiplier, multiplicand)

    def add(augend, addend):
        return augend + addend

    def division(dividend, divisor):
        return divide(dividend, divisor)

    def subtraction(minuend, subtrahend):
        return minuend - subtrahend

    subtrahend = divisor = multiplier = addend = array([2, 3])
    minuend = dividend = multiplicand = augend = array([6, 9])

    print("Vector dot product - " + str(dotProduct(multiplier, multiplicand)))

    print("Vector addition - " + str(add(augend, addend)))

    print("Vector division - " + str(division(dividend, divisor)))

    print("Vector subtraction - " + str(subtraction(minuend, subtrahend)))

  25. Anoop Nayak, July 13, 2020 at 9:31 pm

    Lesson1: Reasons why I want to learn Linear algebra

    1) All the available data is in the form of some matrices, vectors. I hope that learning linear algebra may help we work with this format of available data.

    2) I have got well enough grasp on trying to relate two-three variables and small codes can work out these relations are enough. But when working with many variables, I hope that there are many better parameters that can help me better address the problems.

    3) It like to work around numbers and try another tool to interpret them.

  26. Ayesha Ayub, August 7, 2020 at 1:50 am

    I have just started my research on text summarization. So, I believe that learning linear algebra will surely be helpful while understanding the mathematical foundations of the deep learning models. Thanks.

  27. Sayam Kumar Das, October 3, 2020 at 5:24 pm

    I think rather than importing like
    from numpy import array, dot

    just use the syntax

    import numpy as np

    then use them as

    is much easier, for you don’t have to remember them at the beginning rather use them according to your code. Correct me if I made any mistake.

    • Jason Brownlee, October 4, 2020 at 6:50 am

      Thanks for your suggestion.

      I don’t use that idiom intentionally, I don’t like it.

  28. Sayam Kumar Das, October 3, 2020 at 5:29 pm

  29. Sayam Kumar Das, October 3, 2020 at 7:35 pm

    import numpy as np


  30. Sayam Kumar Das, October 3, 2020 at 9:14 pm

    import numpy.linalg as np
    import scipy.linalg as sp


  31. SAYAM KUMAR DAS, October 3, 2020 at 9:50 pm

    import numpy.linalg as np
    import scipy.linalg as sp


    Applications of SVD are:-
    1.Low Rank Approximation
    2.Total Least Square minimization
    3. Pseudoinverse
    4.Solution of Homogeneous Linear Equations
    5. Finding Range, null space and Rank.

  32. thawatchai, December 9, 2020 at 4:11 pm

    my 3 reason for take this course

    1. I need to know about tools for Machine Learning and Data Science.
    2. I need to see, how to use linear algebra in Machine Learning and Data Science.
    3. I need to know what is Machine Learning and Data Science.

  33. Anil Vaidya, December 23, 2020 at 3:19 pm

    Very nice 7 day course, have done 3 days so far
    Seems lot of learning

  34. Nisha Varghese, January 18, 2021 at 5:15 pm

    Thank you, I joined for learning, as a prerequisite for my research in Machine Learning and Natural Language Processing

  35. Cassandra Augustin, February 8, 2021 at 1:19 pm

    Thank you for sharing this information! I am totally new to the tech world.

    1. I need to learn this information so that I can understand what my professor is saying, lol!
    2. This will help me with Machine Learning and Data Science.
    3. This will help me do well in my major.

  36. Avatar
    Beni Utomo February 16, 2021 at 9:26 am #

    Thank you for the course. I am interested in learning Linear Algebra (or Elementary Linear Algebra) because:
    1. I would like to know about the applications of Linear Algebra,
    2. This course helps me use basic mathematical operations on numbers, especially matrices,
    3. This will help me look at Linear Algebra from another point of view, that of applications.

  37. Avatar
    Raj February 20, 2021 at 3:10 pm #


    Linear Algebra is the basic foundation for ML.
    It helps to develop logical thinking.
    Linear Algebra helps to evaluate various applications in our day-to-day life, e.g. a mobile plan.

  38. Avatar
    Manuel February 22, 2021 at 7:23 pm #

    1. Because most of the correlations between variables can be expressed using matrices, so their manipulation is important.

    2. Simplify hardcore math problems.

    3. Curiosity.

  39. Avatar
    Marek W. March 9, 2021 at 8:12 am #

    Why linear algebra? Because…
    1. I was not bad at it during my engineering studies
    2. It was a long time ago…
    3. I am curious how linear algebra could be fun (your “one more reason”).

  40. Avatar
    Jeffrey Sparks April 28, 2021 at 11:44 pm #

    – Appear credible when discussing machine learning with other professionals.
    – Understand the basics so that I can build on that foundation.
    – Support my efforts to introduce machine learning in industrial controllers.

  41. Avatar
    Viji May 25, 2021 at 6:06 am #

    I’ve returned to study (after 18.5 years) and am doing a PhD. I wanted to get deeper knowledge of machine learning. As machine learning is tightly tied up with math, I’m working to refresh my math knowledge and to understand how machine learning and math algorithms are tied together.

  42. Avatar
    Viji May 25, 2021 at 9:41 pm #

    Lesson 02: Applications:

    It looks like what I found are not quotes; I’ll need to search more to get the quotes. But I came across interesting books, and am sharing some content from them. I’ll gradually go through these books.

    1. Management Science – Management decisions often involve making choices between a number of alternatives, assuming that the choices are to be made with a fixed goal in mind and should be based on a set of evaluation criteria. These decisions often involve a number of human judgments that may not always be completely consistent. The analytic hierarchy process is a technique for rating the various alternatives based on a chart consisting of weighted criteria and ratings that measure how well each alternative satisfies each of the criteria. Set up such a chart or decision tree for the process; then, once weights and ratings have been assigned to each entry in the chart, an overall ranking of the alternatives is calculated using simple matrix-vector operations. The book also discusses how to use advanced matrix techniques to determine appropriate weights and ratings for the decision process, and then presents a numerical algorithm for computing the weight vectors used in the decision process.

    2. Suppose a nation’s economy is divided into many sectors, such as various manufacturing, communication, entertainment, and service industries. Suppose that for each sector we know its total output for one year, and we know exactly how this output is divided or “exchanged” among the other sectors of the economy. Let the total dollar value of a sector’s output be called the price of that output. Leontief proved the following result: there exist equilibrium prices that can be assigned to the total outputs of the various sectors in such a way that the income of each sector exactly balances its expenses. E.g. find equilibrium prices that make each sector’s income match its expenditures. Leontief’s system of 500 equations in 500 variables is now known as a Leontief “input–output” (or “production”) model.

    3. Computer Graphics: Mathematics used to manipulate and display graphical images such as a wire-frame model of an airplane. Such an image (or picture) consists of a number of points, connecting lines or curves, and information about how to fill in closed regions bounded by the lines and curves. Often, curved lines are approximated by short straight-line segments, and a figure is defined mathematically by a list of points. Among the simplest 2D graphics symbols are letters used for labels on the screen. Some letters are stored as wire-frame objects; others that have curved portions are stored with additional mathematical formulas for the curves.

    4. Now that powerful computers are widely available, more and more scientific and
    engineering problems are being treated in a way that uses discrete, or digital, data rather
    than continuous data. Difference equations are often the appropriate tool to analyze
    such data. Even when a differential equation is used to model a continuous process, a
    numerical solution is often produced from a related difference equation.
    Eg. Discrete-Time Signals, Linear Difference Equations etc.,
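    The exchange model in point 2 can be sketched in NumPy. The three-sector table below is made up for illustration: column j gives the fractions of sector j’s output bought by each sector, so equilibrium prices p satisfy E @ p = p, i.e. p is an eigenvector of E for eigenvalue 1:

```python
import numpy as np

# Hypothetical exchange table: each column sums to 1
E = np.array([[0.0, 0.5, 0.5],
              [0.5, 0.0, 0.5],
              [0.5, 0.5, 0.0]])

# Equilibrium prices solve E @ p = p: take the eigenvector for the
# largest eigenvalue (which is 1 for a column-stochastic matrix)
vals, vecs = np.linalg.eig(E)
p = vecs[:, np.argmax(vals.real)].real
p = p / p.sum()  # normalize so prices sum to 1

print(np.allclose(E @ p, p))  # True: each sector's income equals its expenses
print(p)
```

    With this symmetric toy table the equilibrium prices come out equal; a lopsided table would assign higher prices to sectors whose output the others depend on more.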

  43. Avatar
    Viji May 26, 2021 at 11:01 am #

    Lesson 03: Vector operations

    from numpy import array

    # Vector addition
    a1 = array([2,4,6,8])
    a2 = array([3,4,5,6])
    print(a1 + a2)

    # Vector multiplication
    b1 = array([2,4,6,8])
    b2 = array([3,4,5,6])
    print(b1 * b2)

    # Vector subtraction
    c1 = array([2,4,6,8])
    c2 = array([3,4,5,6])
    print(c1 - c2)

    # Vector dot product
    d1 = array([2,4,6,8])
    d2 = array([3,4,5,6])
    print(d1.dot(d2))

    [ 5 8 11 14]
    [ 6 16 30 48]
    [-1 0 1 2]
    100

  44. Avatar
    Abu Shamim Mohammad Arif July 7, 2021 at 5:56 pm #

    I want to learn Linear Algebra because

    1. Linear Algebra deals with the vectors and matrices which are heavily used in Machine Learning

    2. Linear Algebra deals with statistics, which is an important element of Machine Learning

    3. Linear Algebra is a key foundation of the field of Machine Learning, as the notations and operations of Linear Algebra are used to describe the operation of algorithms and their implementation in code

  45. Avatar
    David August 22, 2021 at 5:04 am #

    I’ve been wanting to learn since I was a kid
    I already watched videos on it
    I’m 20 years old

    • Avatar
      Adrian Tam August 23, 2021 at 5:14 am #

      Thanks. Hope you enjoyed.

  46. Avatar
    Amir Bahmanyari November 5, 2021 at 5:08 am #

    To strengthen the in-flight mental mapping of the optimization process details to the actual data entities, i.e. vectors, matrices, tensors etc., and how to accurately code the hypothesis’ hyperparameter calculations for the best accuracy. As simple as it is presumed, when it gets to coding it, it becomes tricky, but Linear Algebra’s shortcuts for calculations between data entities make the coding easier.

  47. Avatar
    Brian November 5, 2021 at 7:52 am #

    1. Had Linear Algebra in college from a good instructor – sometimes confusing
    2. Always wanted to clarify the concepts since then
    3. I have slept several times since then

  48. Avatar
    J. K. Gbang November 26, 2021 at 3:32 pm #

    I am a Statistician and data is my bed-fellow. Therefore learning the mathematics of data is a must. It will prepare me for a better understanding of the applications in topics such as mixed models, generalised linear models, Bayesian models, etc.

    • Avatar
      Adrian Tam November 29, 2021 at 8:32 am #

      As a statistician, you will find machine learning models resonate with the statistical models you learned before. Hope you enjoy the learning journey!

  49. Avatar
    ANA January 12, 2022 at 9:28 am #


  50. Avatar
    April January 26, 2022 at 5:05 am #

    Lesson 2:

    The algebra of vectors and matrices, as distinct from the ordinary algebra of real numbers and the abstract algebra of unspecified entities. – Webster’s New World College Dictionary. Copyright © 2014 by Houghton Mifflin Harcourt Publishing Company

    Linear Algebra is a continuous form of Mathematics … it allows you to model natural phenomena and to compute them efficiently.

    Linear Algebra is the branch of mathematics aimed at solving systems of linear equations with a finite number of unknowns. In particular, one would like to obtain answers to the following questions: 1) Characterization of solutions: Are there solutions to a given system of linear equations? How many solutions are there? 2) Finding solutions: How does the solution set look? What are the solutions?

    • Avatar
      James Carmichael January 26, 2022 at 11:04 am #

      Great work April!

  51. Avatar
    April January 26, 2022 at 5:46 am #

    Lesson 3:
    v = np.array([1,2,3])
    w = np.array([1,2,3])
    v+w = [2,4,6]
    v-w = [0,0,0]
    2*v = [2,4,6]
    w*v = [1,4,9],v) = 14
    w/v = [1,1,1]
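    For anyone following along, the shorthand above can be run as real NumPy code; note that `w * v` is the element-wise product while `np.dot(w, v)` is the dot product:

```python
import numpy as np

v = np.array([1, 2, 3])
w = np.array([1, 2, 3])

print(v + w)         # [2 4 6]
print(v - w)         # [0 0 0]
print(2 * v)         # [2 4 6]
print(w * v)         # [1 4 9]  element-wise product
print(np.dot(w, v))  # 14      dot product: 1*1 + 2*2 + 3*3
print(w / v)         # [1. 1. 1.]
```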

    • Avatar
      James Carmichael January 26, 2022 at 11:00 am #

      Thank you for your feedback, April! Keep up the great work!

  52. Avatar
    Viktor August 7, 2022 at 12:45 am #

    I just want to learn how things work since I’m new to AI)

  53. Avatar
    Monika Malanoski September 20, 2022 at 2:15 am #

    Very helpful, thank you very much! 👍

    • Avatar
      James Carmichael September 20, 2022 at 9:36 am #

      You are very welcome Monika! We appreciate the feedback and support!

  54. Avatar
    Jake November 5, 2022 at 6:58 am #

    Not sure if my previous comment made it… but here are my responses to days 1–2 of the crash course:

    I want to learn linear algebra to…

    1) gain practical competence in machine learning/data science to have meaningful work
    2) increase my capacity to understand the natural world
    3) play my role in passing on knowledge and helping others to achieve their goals

    Five definitions of linear algebra, according to:

    my college textbook (from which I have forgotten everything… :/): linear algebra “… is the art of solving systems of linear equations” – Linear Algebra with Applications by Otto Bretscher

    Wikipedia: “Linear algebra is the branch of mathematics concerned with the study of vectors, vector spaces (also called linear spaces), linear maps (also called linear transformations), and systems of linear equations.”

    Merriam-Webster: “a branch of mathematics that is concerned with mathematical structures closed under the operations of addition and scalar multiplication and that includes the theory of systems of linear equations, matrices, determinants, vector spaces, and linear transformations” “Linear algebra is the study of vectors and linear functions.” … also the study of the mathematical structures of information: “Linear algebra uses the tools and methods of vector and matrix operations to determine the properties of linear systems.”

    • Avatar
      James Carmichael November 5, 2022 at 8:02 am #

      Outstanding feedback Jake! Keep up the great work!

  55. Avatar
    Aashi Goel April 21, 2023 at 12:49 am #

    I want to learn linear algebra because:
    1. I find it interesting!
    2. I wish to know the details about calculations that need to be done for various machine learning algorithms so that I can better understand various hyperparameters and how to fine-tune them.
    3. I just like to learn as much as I can about… Everything.

    • Avatar
      James Carmichael April 21, 2023 at 9:28 am #

      Thank you Aashi! Keep up the great work and let us know if we can help answer any questions regarding our content.

  56. Avatar
    Manuel Adigun April 30, 2023 at 8:55 am #

    Day 1
    Linear algebra is the study of linear combinations, ranging over mappings that deal with vectors, matrices and linear functions – Byju’s

    Linear algebra is a branch of mathematics concerning linear equations- Wikipedia

    Linear algebra deals with mathematical equations and their representations in vector space-

    Linear Algebra is a systematic theory regarding the solutions of systems of linear equations-

    Linear algebra is the branch of mathematics concerning vector spaces, often finite- or countably infinite-dimensional-

    • Avatar
      James Carmichael April 30, 2023 at 10:40 am #

      Outstanding feedback Manuel! Let us know if we can help answer any questions as you work through the material.

  57. Avatar
    Manuel Adigun April 30, 2023 at 8:56 am #

    I want to learn linear algebra to augment my foundational knowledge of Machine learning models and to understand them at a fundamental level

  58. Avatar
    Sid Peck June 13, 2023 at 11:30 pm #

    I want to learn linear algebra and machine learning because I am a Neuroscience PhD student. It could help me in the following ways:

    1. Automating multivariate analysis for behavioral data. This would allow us to very quickly analyze data sets and reveal “hidden” secondary and tertiary variables resulting from event sequences in the behavior that could provide more information about how the animals in the tasks learn.

    2. Creating simulations of behavioral tasks that are accurate, based on data that currently exists. This allows us to make predictions about the outcome of behavioral data and may save us resources by accurately predicting the results prior to the expenditure of animals and time.

    3. AI learning and neural networks. I do not know as much about these topics and their direct applications to my research, but I know they are worth learning, and machine learning and linear algebra are prerequisites.

Leave a Reply