Linear Algebra for Machine Learning (7-Day Mini-Course)

Last Updated on August 9, 2019

Linear Algebra for Machine Learning Crash Course.

Get on top of the linear algebra used in machine learning in 7 Days.

Linear algebra is a field of mathematics that is universally agreed to be a prerequisite for a deeper understanding of machine learning.

Although linear algebra is a large field with many esoteric theories and findings, the nuts-and-bolts tools and notation taken from the field are required by machine learning practitioners. With a solid foundation of what linear algebra is, it is possible to focus on just the good or relevant parts.

In this crash course, you will discover how you can get started and confidently read and implement linear algebra notation used in machine learning with Python in 7 days.

This is a big and important post. You might want to bookmark it.

Kick-start your project with my new book Linear Algebra for Machine Learning, including step-by-step tutorials and the Python source code files for all examples.

Let’s get started.

  • Update Mar/2018: Fixed a small typo in the SVD lesson.
Linear Algebra for Machine Learning (7-Day Mini-Course)
Photo by Jeff Kubina, some rights reserved.

Who Is This Crash-Course For?

Before we get started, let’s make sure you are in the right place.

This course is for developers that may know some applied machine learning. Maybe you know how to work through a predictive modeling problem end-to-end, or at least most of the main steps, with popular tools.

The lessons in this course do assume a few things about you, such as:

  • You know your way around basic Python for programming.
  • You may know some basic NumPy for array manipulation.
  • You want to learn linear algebra to deepen your understanding and application of machine learning.

You do NOT need to be:

  • A math wiz!
  • A machine learning expert!

This crash course will take you from a developer who knows a little machine learning to a developer who can navigate the basics of linear algebra.

Note: This crash course assumes you have a working Python3 SciPy environment with at least NumPy installed. If you need help with your environment, you can follow the step-by-step tutorial here:

Crash-Course Overview

This crash course is broken down into 7 lessons.

You could complete one lesson per day (recommended) or complete all of the lessons in one day (hardcore). It really depends on the time you have available and your level of enthusiasm.

Below is a list of the 7 lessons that will get you started and productive with linear algebra for machine learning in Python:

  • Lesson 01: Linear Algebra for Machine Learning
  • Lesson 02: Linear Algebra
  • Lesson 03: Vectors
  • Lesson 04: Matrices
  • Lesson 05: Matrix Types and Operations
  • Lesson 06: Matrix Factorization
  • Lesson 07: Singular-Value Decomposition

Each lesson could take you 60 seconds or up to 30 minutes. Take your time and complete the lessons at your own pace. Ask questions and even post results in the comments below.

The lessons expect you to go off and find out how to do things. I will give you hints, but part of the point of each lesson is to force you to learn where to go to look for help on linear algebra, the NumPy API, and the best-of-breed tools in Python (hint: I have all of the answers directly on this blog; use the search box).

I do provide more help in the form of links to related posts because I want you to build up some confidence and inertia.

Post your results in the comments; I’ll cheer you on!

Hang in there; don’t give up.

Note: This is just a crash course. For a lot more detail and fleshed out tutorials, see my book on the topic titled “Basics of Linear Algebra for Machine Learning“.

Need help with Linear Algebra for Machine Learning?

Take my free 7-day email crash course now (with sample code).

Click to sign-up and also get a free PDF Ebook version of the course.

Download Your FREE Mini-Course

Lesson 01: Linear Algebra for Machine Learning

In this lesson, you will discover the 5 reasons why a machine learning practitioner should deepen their understanding of linear algebra.

1. You Need to Learn Linear Algebra Notation

You need to be able to read and write vector and matrix notation. Algorithms are described in books, papers, and on websites using vector and matrix notation.

2. You Need to Learn Linear Algebra Arithmetic

In partnership with the notation of linear algebra are the arithmetic operations performed. You need to know how to add, subtract, and multiply scalars, vectors, and matrices.

3. You Need to Learn Linear Algebra for Statistics

You must learn linear algebra in order to be able to learn statistics, especially multivariate statistics. In order to read and interpret statistics, you must learn the notation and operations of linear algebra. Modern statistics uses both the notation and tools of linear algebra to describe statistical methods, from vectors for the means and variances of data to covariance matrices that describe the relationships between multiple Gaussian variables.

4. You Need to Learn Matrix Factorization

Building on notation and arithmetic is the idea of matrix factorization, also called matrix decomposition. You need to know how to factorize a matrix and what it means. Matrix factorization is a key tool in linear algebra and is used widely as an element of many more complex operations in both linear algebra (such as the matrix inverse) and machine learning (such as least squares).

5. You Need to Learn Linear Least Squares

You need to know how to use matrix factorization to solve linear least squares. Problems of this type can be framed as the minimization of squared error, called least squares, and can be recast in the language of linear algebra, called linear least squares. Linear least squares problems can be solved efficiently on computers using matrix operations such as matrix factorization.
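As a minimal sketch of what this looks like in code (the data values and the use of NumPy's lstsq() function are illustrative, not from the original lesson):

```python
from numpy import array, column_stack, ones
from numpy.linalg import lstsq

# Fit y = b0 + b1*x by linear least squares.
x = array([1.0, 2.0, 3.0, 4.0])
y = array([3.0, 5.0, 7.0, 9.0])  # exactly y = 1 + 2*x
# Design matrix: a column of ones for the intercept, then x.
X = column_stack([ones(len(x)), x])
coef, residuals, rank, sv = lstsq(X, y, rcond=None)
print(coef)  # approximately [1. 2.]
```

Under the covers, lstsq() solves the minimization of squared error using a matrix factorization (the SVD).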

One More Reason

If I could give one more reason, it would be: because it is fun. Seriously.

Your Task

For this lesson, you must list 3 reasons why you, personally, want to learn linear algebra.

Post your answer in the comments below. I would love to see what you come up with.

In the next lesson, you will discover a concise definition of linear algebra.

Lesson 02: Linear Algebra

In this lesson, you will discover a concise definition of linear algebra.

Linear Algebra

Linear algebra is a branch of mathematics, but the truth of it is that linear algebra is the mathematics of data. Matrices and vectors are the language of data.

Linear algebra is about linear combinations. That is, using arithmetic on columns of numbers called vectors and 2D arrays of numbers called matrices, to create new columns and arrays of numbers.

Numerical Linear Algebra

The application of linear algebra in computers is often called numerical linear algebra.

It is more than just the implementation of linear algebra operations in code libraries; it also includes the careful handling of the problems of applied mathematics, such as working with the limited floating point precision of digital computers.
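As a quick illustration of the limited precision problem (a minimal sketch using plain Python floats):

```python
# Floating point arithmetic is approximate: 0.1 + 0.2 is not exactly 0.3.
a = 0.1 + 0.2
print(a)         # 0.30000000000000004
print(a == 0.3)  # False
# Numerical code therefore compares with a tolerance rather than equality.
print(abs(a - 0.3) < 1e-9)  # True
```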

Applications of Linear Algebra

As linear algebra is the mathematics of data, the tools of linear algebra are used in many domains.

  • Matrices in Engineering, such as a line of springs.
  • Graphs and Networks, such as analyzing networks.
  • Markov Matrices, Population, and Economics, such as population growth.
  • Linear Programming, the simplex optimization method.
  • Fourier Series: Linear Algebra for Functions, used widely in signal processing.
  • Linear Algebra for Statistics and Probability, such as least squares for regression.
  • Computer Graphics, such as the various translation, scaling and rotation of images.

Your Task

For this lesson, you must find five quotes from research papers, blogs, or books that define the field of linear algebra.

Post your answer in the comments below. I would love to see what you discover.

In the next lesson, you will discover vectors and simple vector arithmetic.

Lesson 03: Vectors

In this lesson, you will discover vectors and simple vector arithmetic.

What is a Vector?

A vector is a tuple of one or more values called scalars.

Vectors are often represented using a lowercase character such as “v”; for example:

v = (v1, v2, v3)

Where v1, v2, v3 are scalar values, often real values.

Defining a Vector

We can represent a vector in Python as a NumPy array.

A NumPy array can be created from a list of numbers. For example, below we define a vector with the length of 3 and the integer values 1, 2, and 3.
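A minimal sketch of that definition:

```python
from numpy import array

# Define a vector of length 3 with the integer values 1, 2, and 3.
v = array([1, 2, 3])
print(v)        # [1 2 3]
print(v.shape)  # (3,)
```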

Vector Multiplication

Two vectors of equal length can be multiplied together.

As with addition and subtraction, this operation is performed element-wise to result in a new vector of the same length.

We can perform this operation directly in NumPy.
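For example (the specific values are illustrative):

```python
from numpy import array

# Element-wise multiplication of two equal-length vectors.
a = array([1, 2, 3])
b = array([1, 2, 3])
c = a * b
print(c)  # [1 4 9]
```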

Your Task

For this lesson, you must implement other vector arithmetic operations such as addition, division, subtraction, and the vector dot product.

Post your answer in the comments below. I would love to see what you discover.

In the next lesson, you will discover matrices and simple matrix arithmetic.

Lesson 04: Matrices

In this lesson, you will discover matrices and simple matrix arithmetic.

What is a Matrix?

A matrix is a two-dimensional array of scalars with one or more columns and one or more rows.

The notation for a matrix is often an uppercase letter, such as A, and entries are referred to by their two-dimensional subscript of row (i) and column (j), such as aij. For example:

A = ((a11, a12), (a21, a22))

Defining a Matrix

We can represent a matrix in Python using a two-dimensional NumPy array.

A NumPy array can be constructed given a list of lists. For example, below is a 2 row, 3 column matrix.
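A minimal sketch of that definition (the values are illustrative):

```python
from numpy import array

# A 2-row, 3-column matrix defined from a list of lists.
A = array([[1, 2, 3], [4, 5, 6]])
print(A)
print(A.shape)  # (2, 3)
```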

Matrix Addition

Two matrices with the same dimensions can be added together to create a new third matrix.

The scalar elements in the resulting matrix are calculated as the addition of the elements in each of the matrices being added.

We can implement this in Python using the plus operator directly on the two NumPy arrays.
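For example (the values are illustrative):

```python
from numpy import array

# Two matrices with the same dimensions are added element-wise.
A = array([[1, 2, 3], [4, 5, 6]])
B = array([[1, 2, 3], [4, 5, 6]])
C = A + B
print(C.tolist())  # [[2, 4, 6], [8, 10, 12]]
```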

Matrix Dot Product

Matrix multiplication, also called the matrix dot product, is more complicated than the previous operations and involves a rule, as not all matrices can be multiplied together.

The rule for matrix multiplication is as follows: The number of columns (n) in the first matrix (A) must equal the number of rows (m) in the second matrix (B).

For example, matrix A has the dimensions m rows and n columns and matrix B has the dimensions n rows and k columns. The n columns in A match the n rows in B. The result is a new matrix with m rows and k columns.

The intuition for matrix multiplication is that we are calculating the dot product between each row in matrix A and each column in matrix B. For example, we can step down the rows of matrix A and multiply each with column 1 in B to give the scalar values in column 1 of C.

The matrix multiplication operation can be implemented in NumPy using the dot() function.
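A minimal sketch (the values are illustrative):

```python
from numpy import array

# A is 2x3 and B is 3x2, so the product C = A.dot(B) is 2x2.
A = array([[1, 2, 3], [4, 5, 6]])
B = array([[1, 2], [3, 4], [5, 6]])
C = A.dot(B)
print(C.tolist())  # [[22, 28], [49, 64]]
```

For example, the top-left entry of C is the dot product of row 1 of A with column 1 of B: 1*1 + 2*3 + 3*5 = 22.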

Your Task

For this lesson, you must implement more matrix arithmetic operations such as subtraction, division, the Hadamard product, and vector-matrix multiplication.

Post your answer in the comments below. I would love to see what you come up with.

In the next lesson, you will discover the different types of matrices and matrix operations.

Lesson 05: Matrix Types and Operations

In this lesson, you will discover the different types of matrices and matrix operations.


Transpose

A defined matrix can be transposed, which creates a new matrix with the number of columns and rows flipped.

This is denoted by the superscript “T” next to the matrix.

We can transpose a matrix in NumPy by calling the T attribute.
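For example (the values are illustrative):

```python
from numpy import array

# Transpose a 3x2 matrix into a 2x3 matrix via the T attribute.
A = array([[1, 2], [3, 4], [5, 6]])
print(A.T)
print(A.T.shape)  # (2, 3)
```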


Inverse

The operation of inverting a matrix is indicated by a -1 superscript next to the matrix; for example, A^-1. The result of the operation is referred to as the inverse of the original matrix; for example, B is the inverse of A.

Not all matrices are invertible.

A matrix can be inverted in NumPy using the inv() function.
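A minimal sketch (the values are illustrative; multiplying a matrix by its inverse gives the identity matrix):

```python
from numpy import array
from numpy.linalg import inv

A = array([[1.0, 2.0], [3.0, 4.0]])
B = inv(A)
print(A.dot(B))  # approximately the 2x2 identity matrix
```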

Square Matrix

A square matrix is a matrix where the number of rows (m) equals the number of columns (n).

The square matrix is contrasted with the rectangular matrix where the number of rows and columns are not equal.

Symmetric Matrix

A symmetric matrix is a type of square matrix where the top-right triangle is the same as the bottom-left triangle.

To be symmetric, the axis of symmetry is always the main diagonal of the matrix, from the top left to the bottom right.

A symmetric matrix is always square and equal to its own transpose.

Triangular Matrix

A triangular matrix is a type of square matrix that has all values in the upper-right or lower-left of the matrix with the remaining elements filled with zero values.

A triangular matrix with values only above the main diagonal is called an upper triangular matrix. Whereas, a triangular matrix with values only below the main diagonal is called a lower triangular matrix.
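NumPy provides the tril() and triu() functions for extracting the lower and upper triangular parts of a matrix (a minimal sketch; the values are illustrative):

```python
from numpy import array, tril, triu

M = array([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
print(tril(M))  # lower triangular: zeros above the main diagonal
print(triu(M))  # upper triangular: zeros below the main diagonal
```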

Diagonal Matrix

A diagonal matrix is one where values outside of the main diagonal have a zero value, where the main diagonal is taken from the top left of the matrix to the bottom right.

A diagonal matrix is often denoted with the variable D and may be represented as a full matrix or as a vector of values on the main diagonal.
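NumPy's diag() function moves between the two representations (a minimal sketch; the values are illustrative):

```python
from numpy import array, diag

M = array([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
d = diag(M)  # extract the main diagonal as a vector
print(d)     # [1 5 9]
D = diag(d)  # expand the vector back into a diagonal matrix
print(D)
```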

Your Task

For this lesson, you must develop examples for other matrix operations such as the determinant, trace, and rank.

Post your answer in the comments below. I would love to see what you come up with.

In the next lesson, you will discover matrix factorization.

Lesson 06: Matrix Factorization

In this lesson, you will discover the basics of matrix factorization, also called matrix decomposition.

What is a Matrix Decomposition?

A matrix decomposition is a way of reducing a matrix into its constituent parts.

It is an approach that can simplify more complex matrix operations that can be performed on the decomposed matrix rather than on the original matrix itself.

A common analogy for matrix decomposition is the factoring of numbers, such as the factoring of 25 into 5 x 5. For this reason, matrix decomposition is also called matrix factorization. Like factoring real values, there are many ways to decompose a matrix, hence there are a range of different matrix decomposition techniques.

LU Matrix Decomposition

The LU decomposition is for square matrices and decomposes a matrix into L and U components.

A = L . U

Where A is the square matrix that we wish to decompose, L is the lower triangle matrix, and U is the upper triangle matrix. A variation of this decomposition that is numerically more stable to solve in practice is called the LUP decomposition, or the LU decomposition with partial pivoting.

The rows of the parent matrix are re-ordered to simplify the decomposition process and the additional P matrix specifies a way to permute the result or return the result to the original order. There are also other variations of the LU.

The LU decomposition is often used to simplify the solving of systems of linear equations, such as finding the coefficients in a linear regression.

The LU decomposition can be implemented in Python with the lu() function from SciPy. More specifically, this function calculates a PLU decomposition.
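A minimal sketch using SciPy (the values are illustrative):

```python
from numpy import array
from scipy.linalg import lu

A = array([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0], [7.0, 8.0, 9.0]])
# lu() returns a permutation matrix P and the L and U factors, with A = P . L . U
P, L, U = lu(A)
print(L)  # lower triangular
print(U)  # upper triangular
print(P.dot(L).dot(U))  # reconstructs A
```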

Your Task

For this lesson, you must implement small examples of other simple methods for matrix factorization, such as the QR decomposition, the Cholesky decomposition, and the eigendecomposition.

Post your answer in the comments below. I would love to see what you come up with.

In the next lesson, you will discover the Singular-Value Decomposition method for matrix factorization.

Lesson 07: Singular-Value Decomposition

In this lesson, you will discover the Singular-Value Decomposition method for matrix factorization.

Singular-Value Decomposition

The Singular-Value Decomposition, or SVD for short, is a matrix decomposition method for reducing a matrix to its constituent parts in order to make certain subsequent matrix calculations simpler.

A = U . Sigma . V^T

Where A is the real m x n matrix that we wish to decompose, U is an m x m matrix, Sigma (often represented by the uppercase Greek letter Sigma) is an m x n diagonal matrix, and V^T is the transpose of an n x n matrix where T is a superscript.

Calculate Singular-Value Decomposition

The SVD can be calculated by calling the svd() function.

The function takes a matrix and returns the U, Sigma, and V^T elements. The Sigma diagonal matrix is returned as a vector of singular values. The V matrix is returned in a transposed form, e.g. V.T.
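A minimal sketch using NumPy's svd() function (the values are illustrative; rebuilding Sigma as a full m x n matrix is my addition, needed to reconstruct A):

```python
from numpy import array, diag, zeros
from numpy.linalg import svd

A = array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])  # 3x2
U, s, VT = svd(A)
print(U.shape, s.shape, VT.shape)  # (3, 3) (2,) (2, 2)
# Sigma is returned as a vector of singular values; rebuild the m x n matrix.
Sigma = zeros(A.shape)
Sigma[:A.shape[1], :A.shape[1]] = diag(s)
print(U.dot(Sigma).dot(VT))  # reconstructs A
```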

Your Task

For this lesson, you must list 5 applications of the SVD.

Bonus points if you can demonstrate each with a small example in Python.

Post your answer in the comments below. I would love to see what you discover.

This was the final lesson in the mini-course.

The End!
(Look How Far You Have Come)

You made it. Well done!

Take a moment and look back at how far you have come.

You discovered:

  • The importance of linear algebra to applied machine learning.
  • What linear algebra is all about.
  • What a vector is and how to perform vector arithmetic.
  • What a matrix is and how to perform matrix arithmetic, including matrix multiplication.
  • A suite of types of matrices, their properties, and advanced operations involving matrices.
  • Matrix factorization methods and the LU decomposition method in detail.
  • The popular Singular-Value decomposition method used in machine learning.

This is just the beginning of your journey with linear algebra for machine learning. Keep practicing and developing your skills.

Take the next step and check out my book on Linear Algebra for Machine Learning.


How Did You Do with The Mini-Course?
Did you enjoy this crash course?

Do you have any questions? Were there any sticking points?
Let me know. Leave a comment below.

Get a Handle on Linear Algebra for Machine Learning!

Linear Algebra for Machine Learning

Develop a working understanding of linear algebra by writing lines of code in Python

Discover how in my new Ebook:
Linear Algebra for Machine Learning

It provides self-study tutorials on topics like:
Vector Norms, Matrix Multiplication, Tensors, Eigendecomposition, SVD, PCA and much more...

Finally Understand the Mathematics of Data

Skip the Academics. Just Results.

See What's Inside

54 Responses to Linear Algebra for Machine Learning (7-Day Mini-Course)

  1. Brandon March 23, 2018 at 5:38 am #

    As a software developer, building my skills to attain a role in data science, I am interested in learning more about linear algebra because:

    – I am intrigued to understand the math underpinning various ML algorithms
    – I would like to generally improve my mathematics skills
    – I want to improve my fluency in mathematics more generally so I can better understand published academic papers.

  2. Kalyan Banga March 23, 2018 at 4:43 pm #

    Hi Jason, I am founder of Fusion Analytics World, the Leading Digital Platform for News, Industry Analysis, Jobs, Courses, Events & much more. Covering Research & Analytics across Industries.

    I would be happy to feature your course(s) for free on our website and help you reach out to our targeted research intelligence and analytics focussed readers.

    We would be happy to feature your articles on machine learning as well. Let me know your thoughts.

  3. Fati March 24, 2018 at 6:52 pm #


    Since I started to learn about machine learning, I found the importance of math specially linear algebra.
    This 7 mini lessons can help to find:
    -The most important notation and method which you need as a data scientist or ML developer.
    -Better understanding of ML and the math behind it


  4. Prabhjot March 25, 2018 at 1:55 pm #

    I find your lessons very useful. Thanks for sharing this knowledge.
    I have been looking for good resources on a good way to import my own data into Python (data could be images or excel file, etc.)
    I am quite familiar with MATLAB and fairly new to Python.. I can’t seem to find a good way to import things in Python. I would appreciate if you could please point me to some good resources. Thanks.!

  5. Susensio March 30, 2018 at 6:50 am #

    Great crash course man! I’m having a great time implementing these things from zero in python, I needed this linear algebra foundation refresher!

    • Jason Brownlee March 31, 2018 at 6:26 am #

      Thanks, I’m glad it helped.

      • Susensio April 8, 2018 at 3:57 am #

        BTW, there is a small typo in LU Matrix Decomposition section, where you mention ‘…calculates an LPU decomposition…’ I think it should be PLU.
        I drove myself crazy searching the difference between LPU and PLU lol

  6. Amit Mukherjee July 16, 2018 at 2:58 pm #

    I am learning linear algebra because it is a prerequisite for deep learning for solving computer vision problems.

  7. Amit Mukherjee July 16, 2018 at 11:09 pm #

    # dot product of vectors. Both vectors must be of the same size
    from numpy import array, dot
    a = array([1, 2])
    b = array([13, 14])
    c = dot(a,b)
    # c = 41

  8. Amit Mukherjee July 17, 2018 at 1:56 am #

    # multiply a matrix with a vector
    from numpy import array, dot
    A = array([[1, 2, 3], [3, 4, 5], [5, 6, 7]])
    b = array([7, 8, 9])
    C = dot(A,b)

  9. Amit Mukherjee July 17, 2018 at 5:49 pm #

    Lower triangular matrix
    [[1 0 0]
    [0 2 0]
    [5 6 3]]
    Upper triangular matrix
    [[1 2 3]
    [0 4 0]
    [0 0 6]]
    Diagonal matrix
    [[1 0 0]
    [0 2 0]
    [0 0 3]]

  10. Amit Mukherjee July 17, 2018 at 5:50 pm #

    A symmetric matrix
    [[1 2 3]
    [2 4 6]
    [3 6 5]]

  11. Ramesh Gupta February 12, 2019 at 7:03 am #

    Triangular Matrix


  12. Abid Rizvi March 22, 2019 at 4:03 am #

    I am inspired by you a lot and this is my first comment after constantly viewing your website for one and a half year(almost). I want to work with you remotely. Is it possible in some way?

    • Jason Brownlee March 22, 2019 at 8:36 am #

      Thanks, I’m glad that the tutorials are helpful.

A great way to work together/contribute is for you to go through some of the tutorials and report your results as comments.

  13. Natasa January 28, 2020 at 2:49 am #

    Hi Jason,

    First thank you for this opportunity. My reasons are following:

    1. To recall&recover my university knowledge on some parts of Linear Algebra.
    2. To understand what is behind formulas deployed in python, so I will able to understand reasons for getting “strange” results
    3. Deep understanding of how Linear Algebra “tools” can help us in investigating patterns in data. This is exciting.


  14. Rui Antunes February 14, 2020 at 8:37 pm #

    As a student I’m interested in learning linear algebra because I want to have a greater understanding on mathematics, statistics, probability theory, and machine learning.
    Thank you for your tutorials

  15. Aimen Shahid February 18, 2020 at 3:41 pm #

    1. During my Bachelor’s Degree I never paid attention to the Linear Algebra course and so I barely passed. This time around I want to properly learn it.
    2. I am starting to get into ML and Computer Vision and I’ve been told I need to have a good understanding of Linear Algebra for that.
    3. I’ve been a bit out of practice with maths and would like to get back into it.

  16. Dustin February 26, 2020 at 4:35 am #

    My 3 reasons for taking this course:

    1. I enjoy learning new aspects to computer programming.
    2. I want to build a system that analyzes the use of the English language to assist in improving my students’ writing.
    3. I have never been confident in my math skills and shied away from to subject all through my schooling – I’d like to prove to myself that even higher level math theory is something I can grasp.

  17. Ajay Kumar March 18, 2020 at 8:43 pm #

    I wanted to learn the equation of PCA and SVM where Linear Algebra is used. I am more enthusiastic to go through each and every steps of this study

  18. Venkata Ramana Mantravadi April 4, 2020 at 3:54 am #


    from numpy import array
    from numpy import linalg
    A = array([[1.0, 2.0], [3.0, 4.0]])

    output : -2.0



    output : 5.0

  19. Venkata Ramana Mantravadi April 4, 2020 at 4:41 am #

    If A is matrix then
    if A^(T)=A, (transpose)
    then A is called a symmetric matrix

  20. Travis Carter April 29, 2020 at 2:57 pm #

    7 Day Course: Day 1.
    Reasons for Learning Linear Algebra
    1. To clarify the language of machine learning notation, right now it’s practically gibberish.
    2. In hopes of not only knowing the words and notation, but understanding what the operations do.
    3. To gain enough knowledge of linear algebra to put it into practice. What I learned in school was only retained for the test, and has never transitioned into my toolbox.

  21. Vishwanath Salokye May 8, 2020 at 9:45 pm #

    7 day course completed in one day in fact few hours..
    I enjoyed recollecting my engg days and able to create my own examples and solve them

    Thanks a lot Jason

  22. K. Siva Senthil (Siva) May 11, 2020 at 7:37 pm #

    Hi Jason,

    My first request to you; before I perform my first task is –

    Please move this section “Leave a Reply” right to the top of comment section. I could avoid scrolling all other comments to post mine. I guess if a reader is interested, scrolling down and reading the comments would still be viable.

    Now to the task; This is my first day of the crash course.
    I personally want to learn linear algebra;
    1. This will help me appreciate many nuances involved in ML algorithms.
    2. I will refresh my earlier formal training on this topic during my under graduate studies but have not used since many years.
    3. I will be able read research papers on algorithms a bit more fluently.

  23. K. Siva Senthil (Siva) May 13, 2020 at 2:49 am #

    Hi Jason,

    My day 2 task –

    Jason Brownlee – “[L]inear algebra is the mathematics of data. Matrices and vectors are language of data.” is the best definition I have read. The other definitions I have encountered are –

    Wikipedia – Linear algebra is the branch of mathematics concerning equations which are linear.

    Introduction to Linear Algebra, 2nd edition By T.A Whitelaw – [Linear algebra] solves systems of simultaneous linear equations and rectangular arrays (matrices, as we call them) of coefficients occurring in such systems. It is also true that many ideas of importance in linear algebra could be traced to geometrical sources.

    Byju’s Learning ( – Linear algebra is the study of linear combinations. It is the study of vector spaces, lines and planes, and some mappings that are required to perform the linear transformations. It includes vectors, matrices and linear functions. It is the study of linear sets of equations and its transformation properties.

    I also noticed much of text books treat linear algebra directly by starting with vectors without much discourse on attempting to describe the topic of linear algebra. E.g.

    1. An introduction to linear algebra by L. Mirsky.
    2. Linear algebra: A course for physicsts and engineers by Arak M. Mithai, Hans J Haubold
    3. Introduction to linear algebra by Gilbert Strang
    4. Linear algebra in 25 lectures –
    5. Linear algebra by Jeff Hefferon 3rd edition

    Retrospectively; vectors, matrices are like axes of linear algebra as x, y and z axes are to space. i.e. Vectors and matrices are elementary and inseparable concepts of linear algebra.

    Some applications that I discovered are in the research of –
    1. Spectral clustering –
    2. Adaptive filters used in wireless communicaion –
    3. Latent semantic analysis in NLP :
    4. Information retrieval –
    5. Philosophy of science – Linear algebra applied to linear metatheory –

  24. K. Siva Senthil (Siva) May 20, 2020 at 1:02 am #

    My day 3 activity following is the code –

    from numpy import array
    from numpy import multiply
    from numpy import divide

    def dotProduct(multiplier, multiplicand):
        return multiply(multiplier, multiplicand)

    def add(augend, addend):
        return augend + addend

    def division(dividend, divisor):
        return divide(dividend, divisor)

    def subtraction(minuend, subtrahend):
        return minuend - subtrahend

    subtrahend = divisor = multiplier = addend = array([2, 3])
    minuend = dividend = multiplicand = augend = array([6, 9])

    print("Vector dot product - " + str(dotProduct(multiplier, multiplicand)))

    print("Vector addition - " + str(add(augend, addend)))

    print("Vector division - " + str(division(dividend, divisor)))

    print("Vector subtraction - " + str(subtraction(minuend, subtrahend)))

  25. Anoop Nayak July 13, 2020 at 9:31 pm #

    Lesson1: Reasons why I want to learn Linear algebra

    1) All the available data is in the form of some matrices, vectors. I hope that learning linear algebra may help we work with this format of available data.

    2) I have got well enough grasp on trying to relate two-three variables and small codes can work out these relations are enough. But when working with many variables, I hope that there are many better parameters that can help me better address the problems.

    3) It like to work around numbers and try another tool to interpret them.

  26. Ayesha Ayub August 7, 2020 at 1:50 am #

    I have just started my research on text summarization. So, I believe that learning linear algebra will surely be helpful while understanding the mathematical foundations of the deep learning models. Thanks.

Leave a Reply