Last Updated on August 9, 2019

### Linear Algebra for Machine Learning Crash Course.

#### Get on top of the linear algebra used in machine learning in 7 Days.

Linear algebra is a field of mathematics that is universally agreed to be a prerequisite for a deeper understanding of machine learning.

Although linear algebra is a large field with many esoteric theories and findings, the nuts-and-bolts tools and notation taken from the field are required for machine learning practitioners. With a solid foundation of what linear algebra is, it is possible to focus on just the good or relevant parts.

In this crash course, you will discover how you can get started and confidently read and implement linear algebra notation used in machine learning with Python in 7 days.

This is a big and important post. You might want to bookmark it.

**Kick-start your project** with my new book Linear Algebra for Machine Learning, including *step-by-step tutorials* and the *Python source code* files for all examples.

Let’s get started.

**Update Mar/2018**: Fixed a small typo in the SVD lesson.

## Who Is This Crash-Course For?

Before we get started, let’s make sure you are in the right place.

This course is for developers that may know some applied machine learning. Maybe you know how to work through a predictive modeling problem end-to-end, or at least most of the main steps, with popular tools.

The lessons in this course do assume a few things about you, such as:

- You know your way around basic Python for programming.
- You may know some basic NumPy for array manipulation.
- You want to learn linear algebra to deepen your understanding and application of machine learning.

You do NOT need to know:

- You do not need to be a math wiz!
- You do not need to be a machine learning expert!

This crash course will take you from a developer that knows a little machine learning to a developer who can navigate the basics of linear algebra.

Note: This crash course assumes you have a working Python 3 SciPy environment with at least NumPy installed. If you need help with your environment, you can follow the step-by-step environment setup tutorial on this blog.

## Crash-Course Overview

This crash course is broken down into 7 lessons.

You could complete one lesson per day (recommended) or complete all of the lessons in one day (hardcore). It really depends on the time you have available and your level of enthusiasm.

Below is a list of the 7 lessons that will get you started and productive with linear algebra for machine learning in Python:

- **Lesson 01**: Linear Algebra for Machine Learning
- **Lesson 02**: Linear Algebra
- **Lesson 03**: Vectors
- **Lesson 04**: Matrices
- **Lesson 05**: Matrix Types and Operations
- **Lesson 06**: Matrix Factorization
- **Lesson 07**: Singular-Value Decomposition

Each lesson could take you 60 seconds or up to 30 minutes. Take your time and complete the lessons at your own pace. Ask questions and even post results in the comments below.

The lessons expect you to go off and find out how to do things. I will give you hints, but part of the point of each lesson is to force you to learn where to look for help on linear algebra, the NumPy API, and the best-of-breed tools in Python (hint: I have all of the answers directly on this blog; use the search box).

I do provide more help in the form of links to related posts because I want you to build up some confidence and inertia.

Post your results in the comments; I’ll cheer you on!

Hang in there; don’t give up.

Note: This is just a crash course. For a lot more detail and fleshed out tutorials, see my book on the topic titled “Basics of Linear Algebra for Machine Learning“.

### Need help with Linear Algebra for Machine Learning?

Take my free 7-day email crash course now (with sample code).

Click to sign-up and also get a free PDF Ebook version of the course.

## Lesson 01: Linear Algebra for Machine Learning

In this lesson, you will discover the 5 reasons why a machine learning practitioner should deepen their understanding of linear algebra.

### 1. You Need to Learn Linear Algebra Notation

You need to be able to read and write vector and matrix notation. Algorithms are described in books, papers, and on websites using vector and matrix notation.

### 2. You Need to Learn Linear Algebra Arithmetic

In partnership with the notation of linear algebra are the arithmetic operations performed. You need to know how to add, subtract, and multiply scalars, vectors, and matrices.

### 3. You Need to Learn Linear Algebra for Statistics

You must learn linear algebra in order to be able to learn statistics, especially multivariate statistics. Modern statistics uses both the notation and the tools of linear algebra to describe statistical methods, from vectors for the means and variances of data to covariance matrices that describe the relationships between multiple Gaussian variables. In order to read and interpret these descriptions, you must learn the notation and operations of linear algebra.
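As a small sketch of this connection (the tiny dataset below is just my own illustration), NumPy's mean() and cov() functions calculate exactly these objects:

```python
# mean vector and covariance matrix of a small dataset
from numpy import array, mean, cov

# each row is an observation, each column a variable (illustrative data)
X = array([[1, 5], [2, 4], [3, 6], [4, 7]])
# the mean of each column, collected as a vector
mu = mean(X, axis=0)
print(mu)
# the covariance matrix describing how the two variables vary together
C = cov(X.T)
print(C)
```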

### 4. You Need to Learn Matrix Factorization

Building on notation and arithmetic is the idea of matrix factorization, also called matrix decomposition. You need to know how to factorize a matrix and what it means. Matrix factorization is a key tool in linear algebra and is used widely as an element of many more complex operations in both linear algebra (such as the matrix inverse) and machine learning (such as least squares).

### 5. You Need to Learn Linear Least Squares

You need to know how to use matrix factorization to solve linear least squares. Problems of this type can be framed as the minimization of squared error, called least squares, and can be recast in the language of linear algebra, called linear least squares. Linear least squares problems can be solved efficiently on computers using matrix operations such as matrix factorization.
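As a quick sketch, NumPy's lstsq() function solves a linear least squares problem framed as a matrix equation; the small line-fitting problem below is just an illustration:

```python
# solve a simple linear least squares problem
from numpy import array
from numpy.linalg import lstsq

# overdetermined system: fit a line y = b0 + b1*x to 4 points
X = array([[1, 1], [1, 2], [1, 3], [1, 4]])  # column of ones, then x values
y = array([6, 8, 10, 12])
# lstsq minimizes the squared error ||X . b - y||^2
b, residuals, rank, s = lstsq(X, y, rcond=None)
print(b)
```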

### One More Reason

If I could give one more reason, it would be: because it is fun. Seriously.

### Your Task

For this lesson, you must list 3 reasons why you, personally, want to learn linear algebra.

Post your answer in the comments below. I would love to see what you come up with.

In the next lesson, you will discover a concise definition of linear algebra.

## Lesson 02: Linear Algebra

In this lesson, you will discover a concise definition of linear algebra.

### Linear Algebra

Linear algebra is a branch of mathematics, but the truth of it is that linear algebra is the mathematics of data. Matrices and vectors are the language of data.

Linear algebra is about linear combinations. That is, using arithmetic on columns of numbers called vectors and 2D arrays of numbers called matrices, to create new columns and arrays of numbers.

### Numerical Linear Algebra

The application of linear algebra in computers is often called numerical linear algebra.

It is more than just the implementation of linear algebra operations in code libraries; it also includes the careful handling of the problems of applied mathematics, such as working with the limited floating point precision of digital computers.
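For example, a classic demonstration of limited floating point precision (not specific to linear algebra, but the same issue underlies numerical matrix code):

```python
# floating point arithmetic has limited precision
print(0.1 + 0.2)         # not exactly 0.3
print(0.1 + 0.2 == 0.3)  # False
# numerical code compares within a tolerance instead of testing equality
from numpy import isclose
print(isclose(0.1 + 0.2, 0.3))  # True
```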

### Applications of Linear Algebra

As linear algebra is the mathematics of data, the tools of linear algebra are used in many domains.

- Matrices in Engineering, such as a line of springs.
- Graphs and Networks, such as analyzing networks.
- Markov Matrices, Population, and Economics, such as population growth.
- Linear Programming, the simplex optimization method.
- Fourier Series: Linear Algebra for Functions, used widely in signal processing.
- Linear Algebra for Statistics and Probability, such as least squares for regression.
- Computer Graphics, such as the various translation, scaling and rotation of images.

### Your Task

For this lesson, you must find five quotes from research papers, blogs, or books that define the field of linear algebra.

Post your answer in the comments below. I would love to see what you discover.

In the next lesson, you will discover vectors and simple vector arithmetic.

## Lesson 03: Vectors

In this lesson, you will discover vectors and simple vector arithmetic.

### What is a Vector?

A vector is a tuple of one or more values called scalars.

Vectors are often represented using a lowercase character such as “v”; for example:

```
v = (v1, v2, v3)
```

Where v1, v2, v3 are scalar values, often real values.

### Defining a Vector

We can represent a vector in Python as a NumPy array.

A NumPy array can be created from a list of numbers. For example, below we define a vector with the length of 3 and the integer values 1, 2, and 3.

```python
# create a vector
from numpy import array
v = array([1, 2, 3])
print(v)
```

### Vector Multiplication

Two vectors of equal length can be multiplied together.

```
c = a * b
```

As with addition and subtraction, this operation is performed element-wise to result in a new vector of the same length.

```
a * b = (a1 * b1, a2 * b2, a3 * b3)
```

We can perform this operation directly in NumPy.

```python
# multiply vectors
from numpy import array
a = array([1, 2, 3])
print(a)
b = array([1, 2, 3])
print(b)
c = a * b
print(c)
```

### Your Task

For this lesson, you must implement other vector arithmetic operations such as addition, division, subtraction, and the vector dot product.

Post your answer in the comments below. I would love to see what you discover.

In the next lesson, you will discover matrices and simple matrix arithmetic.

## Lesson 04: Matrices

In this lesson, you will discover matrices and simple matrix arithmetic.

### What is a Matrix?

A matrix is a two-dimensional array of scalars with one or more columns and one or more rows.

The notation for a matrix is often an uppercase letter, such as A, and entries are referred to by their two-dimensional subscript of row (i) and column (j), such as aij. For example:

```
A = ((a11, a12), (a21, a22), (a31, a32))
```

### Defining a Matrix

We can represent a matrix in Python using a two-dimensional NumPy array.

A NumPy array can be constructed given a list of lists. For example, below is a 2 row, 3 column matrix.

```python
# create matrix
from numpy import array
A = array([[1, 2, 3], [4, 5, 6]])
print(A)
```

### Matrix Addition

Two matrices with the same dimensions can be added together to create a new third matrix.

```
C = A + B
```

The scalar elements in the resulting matrix are calculated as the addition of the elements in each of the matrices being added.

We can implement this in Python using the plus operator directly on the two NumPy arrays.

```python
# add matrices
from numpy import array
A = array([[1, 2, 3], [4, 5, 6]])
print(A)
B = array([[1, 2, 3], [4, 5, 6]])
print(B)
C = A + B
print(C)
```

### Matrix Dot Product

Matrix multiplication, also called the matrix dot product, is more complicated than the previous operations and involves a rule, as not all matrices can be multiplied together.

```
C = A * B
```

The rule for matrix multiplication is as follows: The number of columns (n) in the first matrix (A) must equal the number of rows (m) in the second matrix (B).

For example, matrix A has the dimensions m rows and n columns and matrix B has the dimensions n rows and k columns. The n columns in A and the n rows in B are equal. The result is a new matrix with m rows and k columns.

```
C(m,k) = A(m,n) * B(n,k)
```

The intuition for matrix multiplication is that we are calculating the dot product between each row in matrix A and each column in matrix B. For example, we can step down the rows of matrix A and multiply each with column 1 in B to give the scalar values in column 1 of C.

The matrix multiplication operation can be implemented in NumPy using the dot() function.

```python
# matrix dot product
from numpy import array
A = array([[1, 2], [3, 4], [5, 6]])
print(A)
B = array([[1, 2], [3, 4]])
print(B)
C = A.dot(B)
print(C)
```

### Your Task

For this lesson, you must implement more matrix arithmetic operations such as subtraction, division, the Hadamard product, and vector-matrix multiplication.

Post your answer in the comments below. I would love to see what you come up with.

In the next lesson, you will discover the different types of matrices and matrix operations.

## Lesson 05: Matrix Types and Operations

In this lesson, you will discover the different types of matrices and matrix operations.

### Transpose

A defined matrix can be transposed, which creates a new matrix with the number of columns and rows flipped.

This is denoted by the superscript “T” next to the matrix.

```
C = A^T
```

We can transpose a matrix in NumPy by calling the T attribute.

```python
# transpose matrix
from numpy import array
A = array([[1, 2], [3, 4], [5, 6]])
print(A)
C = A.T
print(C)
```

### Inversion

The operation of inverting a matrix is indicated by a -1 superscript next to the matrix; for example, A^-1. The result of the operation is referred to as the inverse of the original matrix; for example, B is the inverse of A.

```
B = A^-1
```

Not all matrices are invertible.

A matrix can be inverted in NumPy using the inv() function.

```python
# invert matrix
from numpy import array
from numpy.linalg import inv
# define matrix
A = array([[1.0, 2.0], [3.0, 4.0]])
print(A)
# invert matrix
B = inv(A)
print(B)
```

### Square Matrix

A square matrix is a matrix where the number of rows (m) equals the number of columns (n).

```
m = n
```

The square matrix is contrasted with the rectangular matrix where the number of rows and columns are not equal.

### Symmetric Matrix

A symmetric matrix is a type of square matrix where the top-right triangle is the same as the bottom-left triangle.

To be symmetric, the axis of symmetry is always the main diagonal of the matrix, from the top left to the bottom right.

A symmetric matrix is always square and equal to its own transpose.

```
M = M^T
```
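We can check this property directly in NumPy by comparing a matrix to its own transpose (the example matrix is just an illustration):

```python
# check that a matrix is symmetric: M equals its own transpose
from numpy import array, array_equal

M = array([[1, 2, 3], [2, 4, 6], [3, 6, 5]])
print(array_equal(M, M.T))  # True for a symmetric matrix
```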

### Triangular Matrix

A triangular matrix is a type of square matrix where all of the values are in the upper-right or lower-left triangle of the matrix, with the remaining elements filled with zero values.

A triangular matrix with values only above the main diagonal is called an upper triangular matrix, whereas a triangular matrix with values only below the main diagonal is called a lower triangular matrix.
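NumPy provides the tril() and triu() functions for extracting the lower and upper triangular parts of a matrix; for example:

```python
# extract the upper and lower triangular parts of a matrix
from numpy import array, tril, triu

M = array([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
print(tril(M))  # lower triangular: zeros above the main diagonal
print(triu(M))  # upper triangular: zeros below the main diagonal
```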

### Diagonal Matrix

A diagonal matrix is one where values outside of the main diagonal have a zero value, where the main diagonal is taken from the top left of the matrix to the bottom right.

A diagonal matrix is often denoted with the variable D and may be represented as a full matrix or as a vector of values on the main diagonal.
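The diag() function in NumPy converts between these two representations; for example:

```python
# diag() converts between the two representations of a diagonal matrix
from numpy import array, diag

M = array([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
d = diag(M)  # extract the main diagonal as a vector
print(d)
D = diag(d)  # expand the vector back into a full diagonal matrix
print(D)
```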

### Your Task

For this lesson, you must develop examples for other matrix operations such as the determinant, trace, and rank.

Post your answer in the comments below. I would love to see what you come up with.

In the next lesson, you will discover matrix factorization.

## Lesson 06: Matrix Factorization

In this lesson, you will discover the basics of matrix factorization, also called matrix decomposition.

### What is a Matrix Decomposition?

A matrix decomposition is a way of reducing a matrix into its constituent parts.

It is an approach that can simplify more complex matrix operations that can be performed on the decomposed matrix rather than on the original matrix itself.

A common analogy for matrix decomposition is the factoring of numbers, such as the factoring of 25 into 5 x 5. For this reason, matrix decomposition is also called matrix factorization. Like factoring real values, there are many ways to decompose a matrix, hence there are a range of different matrix decomposition techniques.

### LU Matrix Decomposition

The LU decomposition is for square matrices and decomposes a matrix into L and U components.

```
A = L . U
```

Where A is the square matrix that we wish to decompose, L is the lower triangle matrix, and U is the upper triangle matrix. A variation of this decomposition that is numerically more stable to solve in practice is called the LUP decomposition, or the LU decomposition with partial pivoting.

```
A = P . L . U
```

The rows of the parent matrix are re-ordered to simplify the decomposition process and the additional P matrix specifies a way to permute the result or return the result to the original order. There are also other variations of the LU.

The LU decomposition is often used to simplify the solving of systems of linear equations, such as finding the coefficients in a linear regression.

The LU decomposition can be implemented in Python with the lu() function. More specifically, this function calculates the LU decomposition with partial pivoting, i.e. A = P . L . U.

```python
# LU decomposition
from numpy import array
from scipy.linalg import lu
# define a square matrix
A = array([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
print(A)
# LU decomposition
P, L, U = lu(A)
print(P)
print(L)
print(U)
# reconstruct
B = P.dot(L).dot(U)
print(B)
```

### Your Task

For this lesson, you must implement small examples of other simple methods for matrix factorization, such as the QR decomposition, the Cholesky decomposition, and the eigendecomposition.

Post your answer in the comments below. I would love to see what you come up with.

In the next lesson, you will discover the Singular-Value Decomposition method for matrix factorization.

## Lesson 07: Singular-Value Decomposition

In this lesson, you will discover the Singular-Value Decomposition method for matrix factorization.

### Singular-Value Decomposition

The Singular-Value Decomposition, or SVD for short, is a matrix decomposition method for reducing a matrix to its constituent parts in order to make certain subsequent matrix calculations simpler.

```
A = U . Sigma . V^T
```

Where A is the real m x n matrix that we wish to decompose, U is an m x m matrix, Sigma (often represented by the uppercase Greek letter sigma) is an m x n diagonal matrix, and V^T is the transpose of an n x n matrix V, where T is a superscript.

### Calculate Singular-Value Decomposition

The SVD can be calculated by calling the svd() function.

The function takes a matrix and returns the U, Sigma, and V^T elements. The Sigma diagonal matrix is returned as a vector of singular values. The V matrix is returned in a transposed form, e.g. V.T.

```python
# singular-value decomposition
from numpy import array
from scipy.linalg import svd
# define a matrix
A = array([[1, 2], [3, 4], [5, 6]])
print(A)
# SVD
U, s, V = svd(A)
print(U)
print(s)
print(V)
```
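Because the Sigma diagonal matrix comes back as a vector of singular values, a little extra work is needed to reconstruct the original matrix; one way is the diagsvd() helper from SciPy, which expands the vector into the m x n Sigma matrix:

```python
# reconstruct the original matrix from the SVD elements
from numpy import array
from scipy.linalg import svd, diagsvd

A = array([[1, 2], [3, 4], [5, 6]])
U, s, VT = svd(A)
# expand the vector of singular values into the m x n Sigma matrix
Sigma = diagsvd(s, A.shape[0], A.shape[1])
# B equals the original matrix A
B = U.dot(Sigma).dot(VT)
print(B)
```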

### Your Task

For this lesson, you must list 5 applications of the SVD.

Bonus points if you can demonstrate each with a small example in Python.

Post your answer in the comments below. I would love to see what you discover.

This was the final lesson in the mini-course.

## The End!

(*Look How Far You Have Come*)

You made it. Well done!

Take a moment and look back at how far you have come.

You discovered:

- The importance of linear algebra to applied machine learning.
- What linear algebra is all about.
- What a vector is and how to perform vector arithmetic.
- What a matrix is and how to perform matrix arithmetic, including matrix multiplication.
- A suite of types of matrices, their properties, and advanced operations involving matrices.
- Matrix factorization methods and the LU decomposition method in detail.
- The popular Singular-Value decomposition method used in machine learning.

This is just the beginning of your journey with linear algebra for machine learning. Keep practicing and developing your skills.

Take the next step and check out my book on Linear Algebra for Machine Learning.

## Summary

*How Did You Do with The Mini-Course?*

Did you enjoy this crash course?

*Do you have any questions? Were there any sticking points?*

Let me know. Leave a comment below.

As a software developer, building my skills to attain a role in data science, I am interested in learning more about linear algebra because:

– I am intrigued to understand the math underpinning various ML algorithms

– I would like to generally improve my mathematics skills

– I want to improve my fluency in mathematics more generally so I can better understand published academic papers.

Thanks Brandon.

Hi Jason, I am founder of Fusion Analytics World, the Leading Digital Platform for News, Industry Analysis, Jobs, Courses, Events & much more. Covering Research & Analytics across Industries.

I would be happy to feature your course(s) for free on our website and help you reach out to our targeted research intelligence and analytics focussed readers.

We would be happy to feature your articles on machine learning as well. Let me know your thoughts.

No thanks.

Hi,

Since I started to learn about machine learning, I have found the importance of math, especially linear algebra.

These 7 mini-lessons can help you find:

- The most important notation and methods which you need as a data scientist or ML developer.

- A better understanding of ML and the math behind it.

Thanks

Thanks Fati.

I find your lessons very useful. Thanks for sharing this knowledge.

I have been looking for good resources on a good way to import my own data into Python (data could be images or excel file, etc.)

I am quite familiar with MATLAB and fairly new to Python. I can’t seem to find a good way to import things in Python. I would appreciate it if you could point me to some good resources. Thanks!

See this post:

https://machinelearningmastery.com/load-machine-learning-data-python/

Great crash course man! I’m having a great time implementing these things from zero in python, I needed this linear algebra foundation refresher!

Thanks!

Thanks, I’m glad it helped.

BTW, there is a small typo in LU Matrix Decomposition section, where you mention ‘…calculates an LPU decomposition…’ I think it should be PLU.

I drove myself crazy searching the difference between LPU and PLU lol

LUP is correct, it is LU factorization with Partial Pivoting (LUP). From:

https://en.wikipedia.org/wiki/LU_decomposition

I changed the order of the terms to match the reconstruction. Thanks.

I am learning linear algebra because it is a prerequisite for deep learning for solving computer vision problems.

Thanks.

```python
# dot product of vectors. Both vectors must be of the same size
from numpy import array, dot

a = array([1, 2])
print(a)
b = array([13, 14])
print(b)
c = dot(a, b)
# c = a.dot(b)
print(c)
```

Nice.

```python
# multiply a matrix with a vector
from numpy import array, dot

A = array([[1, 2, 3], [3, 4, 5], [5, 6, 7]])
print(A)
b = array([7, 8, 9])
print(b)
C = dot(A, b)
print(C)
```

Well done!

Lower triangular matrix

```
[[1 0 0]
 [0 2 0]
 [5 6 3]]
```

Upper triangular matrix

```
[[1 2 3]
 [0 4 0]
 [0 0 6]]
```

Diagonal matrix

```
[[1 0 0]
 [0 2 0]
 [0 0 3]]
```

Nice.

A symmetric matrix

```
[[1 2 3]
 [2 4 6]
 [3 6 5]]
```

Well done.

Triangular Matrix

A=[[1,0,2],[2,1,0],[3,0,1]]

Nice work.

@Jason

I am inspired by you a lot and this is my first comment after constantly viewing your website for one and a half year(almost). I want to work with you remotely. Is it possible in some way?

Thanks, I’m glad that the tutorials are helpful.

A great way to work together/contribute is for you to go through some of the tutorials and report your results as comments.

Hi Jason,

First thank you for this opportunity. My reasons are following:

1. To recall&recover my university knowledge on some parts of Linear Algebra.

2. To understand what is behind formulas deployed in python, so I will able to understand reasons for getting “strange” results

3. Deep understanding of how Linear Algebra “tools” can help us in investigating patterns in data. This is exciting.

BR,

Natasa

Thanks for sharing Natasa!

As a student I’m interested in learning linear algebra because I want to have a greater understanding on mathematics, statistics, probability theory, and machine learning.

Thank you for your tutorials

Thanks!

1. During my Bachelor’s Degree I never paid attention to the Linear Algebra course and so I barely passed. This time around I want to properly learn it.

2. I am starting to get into ML and Computer Vision and I’ve been told I need to have a good understanding of Linear Algebra for that.

3. I’ve been a bit out of practice with maths and would like to get back into it.

Thanks for sharing!

My 3 reasons for taking this course:

1. I enjoy learning new aspects to computer programming.

2. I want to build a system that analyzes the use of the English language to assist in improving my students’ writing.

3. I have never been confident in my math skills and shied away from to subject all through my schooling – I’d like to prove to myself that even higher level math theory is something I can grasp.

Thanks Dustin!

I wanted to learn the equation of PCA and SVM where Linear Algebra is used. I am more enthusiastic to go through each and every steps of this study

Thanks!

Determinant

```python
from numpy import array
from numpy import linalg

A = array([[1.0, 2.0], [3.0, 4.0]])
linalg.det(A)
# output : -2.0
```

Trace

```python
A.trace()
# output : 5.0
```

Nice work!

If A is matrix then

if A^(T)=A, (transpose)

then A is called a symmetric matrix

Nice work!

7 Day Course: Day 1.

Reasons for Learning Linear Algebra

1. To clarify the language of machine learning notation, right now it’s practically gibberish.

2. In hopes of not only knowing the words and notation, but understanding what the operations do.

3. To gain enough knowledge of linear algebra to put it into practice. What I learned in school was only retained for the test, and has never transitioned into my toolbox.

Well done!

7 day course completed in one day, in fact a few hours.

I enjoyed recollecting my engineering days and was able to create my own examples and solve them.

Thanks a lot Jason

Well done on your progress!

Hi Jason,

My first request to you; before I perform my first task is –

Please move this section “Leave a Reply” right to the top of comment section. I could avoid scrolling all other comments to post mine. I guess if a reader is interested, scrolling down and reading the comments would still be viable.

Now to the task; This is my first day of the crash course.

I personally want to learn linear algebra;

1. This will help me appreciate many nuances involved in ML algorithms.

2. I will refresh my earlier formal training on this topic during my under graduate studies but have not used since many years.

3. I will be able read research papers on algorithms a bit more fluently.

Thanks for the suggestion.

Hi Jason,

My day 2 task –

Jason Brownlee – “[L]inear algebra is the mathematics of data. Matrices and vectors are language of data.” is the best definition I have read. The other definitions I have encountered are –

Wikipedia – Linear algebra is the branch of mathematics concerning equations which are linear.

Introduction to Linear Algebra, 2nd edition By T.A Whitelaw – [Linear algebra] solves systems of simultaneous linear equations and rectangular arrays (matrices, as we call them) of coefficients occurring in such systems. It is also true that many ideas of importance in linear algebra could be traced to geometrical sources.

Byju’s Learning (https://byjus.com/maths/linear-algebra/) – Linear algebra is the study of linear combinations. It is the study of vector spaces, lines and planes, and some mappings that are required to perform the linear transformations. It includes vectors, matrices and linear functions. It is the study of linear sets of equations and its transformation properties.

I also noticed much of text books treat linear algebra directly by starting with vectors without much discourse on attempting to describe the topic of linear algebra. E.g.

1. An introduction to linear algebra by L. Mirsky.

2. Linear algebra: A course for physicsts and engineers by Arak M. Mithai, Hans J Haubold

3. Introduction to linear algebra by Gilbert Strang

4. Linear algebra in 25 lectures – https://www.math.ucdavis.edu/~linear/linear.pdf

5. Linear algebra by Jeff Hefferon 3rd edition http://joshua.smcvt.edu/linearalgebra/book.pdf

Retrospectively; vectors, matrices are like axes of linear algebra as x, y and z axes are to space. i.e. Vectors and matrices are elementary and inseparable concepts of linear algebra.

Some applications that I discovered are in the research of –

1. Spectral clustering – https://academic.microsoft.com/paper/2132914434/reference

2. Adaptive filters used in wireless communicaion – https://academic.microsoft.com/paper/2610805269/reference

3. Latent semantic analysis in NLP : https://academic.microsoft.com/paper/1612003148/reference

4. Information retrieval – https://academic.microsoft.com/paper/2072773380/reference

5. Philosophy of science – Linear algebra applied to linear metatheory – https://arxiv.org/abs/2005.02247

Well done!

My day 3 activity; following is the code:

```python
from numpy import array
from numpy import multiply
from numpy import divide

def dotProduct(multiplier, multiplicand):
    return multiply(multiplier, multiplicand)

def add(augend, addend):
    return augend + addend

def division(dividend, divisor):
    return divide(dividend, divisor)

def subtraction(minuend, subtrahend):
    return minuend - subtrahend

subtrahend = divisor = multiplier = addend = array([2, 3])
minuend = dividend = multiplicand = augend = array([6, 9])

print("Vector dot product - " + str(dotProduct(multiplier, multiplicand)))
print("Vector addition - " + str(add(augend, addend)))
print("Vector division - " + str(division(dividend, divisor)))
print("Vector subtraction - " + str(subtraction(minuend, subtrahend)))
```

Well done!

Lesson1: Reasons why I want to learn Linear algebra

1) All the available data is in the form of matrices and vectors. I hope that learning linear algebra may help me work with this format of data.

2) I have a good enough grasp on relating two or three variables, and small pieces of code can work out these relations. But when working with many variables, I hope there are better parameters that can help me address the problems.

3) I like to work with numbers and want another tool to interpret them.

Nice work!

I have just started my research on text summarization. So, I believe that learning linear algebra will surely be helpful while understanding the mathematical foundations of the deep learning models. Thanks.

Thanks!

I think rather than importing like

```python
from numpy import array, dot
```

just use the syntax

```python
import numpy as np
```

then use them as np.array, np.dot, np.multiply, np.cross. It is much easier, for you don’t have to remember them at the beginning; rather, use them according to your code. Correct me if I made any mistake.

Thanks for your suggestion.

I don’t use that idiom intentionally, I don’t like it.

Nice work.

```python
import numpy as np

A = np.array([[1, 2], [3, 4], [5, 6]])
F = np.array([[1, 2], [3, 4]])
C = A.T
D = np.linalg.inv(F)
E = np.linalg.det(F)
G = np.trace(F)
H = np.linalg.matrix_rank(A)
print(A)
print(C)
print(D)
print(E)
print(G)
print(H)
```

Nice work!

```python
import numpy as np
import scipy.linalg as sp

A = np.array([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
V = np.array([[1, 2 - 2j], [2j, 5 + 8j]])
print(A)

P, L, U = sp.lu(A)
print(P)
print(L)
print(U)

B, D = np.linalg.qr(A)
print(B)
print(D)

Y = np.linalg.cholesky(V)
print(Y)

K = np.linalg.eig(A)
print(K)

# reconstruction
Recon_2 = np.dot(B, D)
Recon_1 = P.dot(L).dot(U)
print(Recon_1)
print(Recon_2)
```

Well done!

```python
import numpy as np
import scipy.linalg as sp

A = np.array([[1, 2], [3, 4], [5, 6]])
print(A)

U, S, V = sp.svd(A)
print(U)
print(S)
print(V)
```

Applications of SVD are:

1. Low rank approximation
2. Total least squares minimization
3. Pseudoinverse
4. Solution of homogeneous linear equations
5. Finding range, null space, and rank

My 3 reasons for taking this course:

1. I need to know about tools for Machine Learning and Data Science.

2. I need to see how to use linear algebra in Machine Learning and Data Science.

3. I need to know what Machine Learning and Data Science are.

Thank you!

Very nice 7-day course; I have done 3 days so far. It seems like a lot of learning.

Thanks!

Thank you, I joined for learning, as a prerequisite for my research in Machine Learning and Natural Language Processing

Thanks.

Thank you for sharing this information! I am totally new to the tech world.

1. I need to learn this information so that I can understand what my professor is saying, lol!

2. This will help me with Machine Learning and Data Science.

3. This will help me do well in my major.

Thanks!

Thank you for the course. I am interested in learning Linear Algebra (or Elementary Linear Algebra) because:

1. I would like to know about the applications of Linear Algebra,

2. This course helps me use basic mathematical operations on numbers, especially matrices,

3. This will help me look at Linear Algebra from another point of view, for application.

Thanks!

Hi,

Linear Algebra is the basic foundation for ML.

It helps to develop logical thinking.

Linear Algebra helps to evaluate various applications in our day-to-day life, e.g. a mobile plan.

Great work!

1. Because most of the correlations between variables can be expressed using matrices, so their manipulation is important.

2. Simplify hardcore math problems.

3. Curiosity.

Well done!

Why linear algebra? Because…

1. I was not bad at it during my engineering studies

2. It was a long time ago…

3. I am curious how linear algebra could be funny (your “one more reason”).

Thanks!

– Appear credible when discussing machine learning with other professionals.

– Understand the basics so that I can build on that foundation.

– Support my efforts to introduce machine learning in industrial controllers.

Well done!

I’ve returned to study (after 18.5 years) and am doing a PhD. I wanted to get deeper knowledge in machine learning. As machine learning is tightly tied up with math, I’m working to refresh my math knowledge and to understand how machine learning and math algorithms are tied together.

Lesson 02: Applications:

It looks like what I found are not quotes; I’ll need to search more to find the quotes. But I came across interesting books, and I am sharing some content from them. I’ll gradually go through these books.

1. Management Science – Management decisions often involve making choices among a number of alternatives, assuming that the choices are to be made with a fixed goal in mind and based on a set of evaluation criteria. These decisions often involve a number of human judgments that may not always be completely consistent. The analytic hierarchy process is a technique for rating the various alternatives based on a chart consisting of weighted criteria and ratings that measure how well each alternative satisfies each of the criteria. Once such a chart (or decision tree) has been set up and weights and ratings have been assigned to each entry, an overall ranking of the alternatives is calculated using simple matrix-vector operations. The book also discusses how to use advanced matrix techniques to determine appropriate weights and ratings for the decision process, and then presents a numerical algorithm for computing the weight vectors used in the decision process.

Ref: http://ndl.ethernet.edu.et/bitstream/123456789/24509/1/Linear%20Algebra%20with%20Applications%202.pdf

2. Suppose a nation’s economy is divided into many sectors, such as various manufacturing, communication, entertainment, and service industries. Suppose that for each sector we know its total output for one year, and we know exactly how this output is divided or “exchanged” among the other sectors of the economy. Let the total dollar value of a sector’s output be called the price of that output. Leontief proved the following result: there exist equilibrium prices that can be assigned to the total outputs of the various sectors in such a way that the income of each sector exactly balances its expenses. E.g., find equilibrium prices that make each sector’s income match its expenditures. Leontief’s system of 500 equations in 500 variables is now known as a Leontief “input–output” (or “production”) model.

Ref: https://math.berkeley.edu/~yonah/files/Linear%20Algebra.pdf
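The Leontief model described above can be sketched at toy scale: with a hypothetical exchange matrix C whose columns sum to 1 (my own 3-sector example, not from the reference), the equilibrium prices form an eigenvector of C for eigenvalue 1:

```python
import numpy as np

# Hypothetical 3-sector exchange table: column j shows how sector j's
# output is distributed among the sectors (each column sums to 1).
C = np.array([[0.2, 0.3, 0.5],
              [0.5, 0.2, 0.3],
              [0.3, 0.5, 0.2]])

# Equilibrium prices satisfy C p = p: an eigenvector for eigenvalue 1.
vals, vecs = np.linalg.eig(C)
p = np.real(vecs[:, np.argmax(np.real(vals))])
p = p / p.sum()  # normalize so the prices sum to 1

print(np.allclose(C @ p, p))  # True
```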

3. Computer Graphics: Mathematics used to manipulate and display graphical images such as a wire-frame model of an airplane. Such an image (or picture) consists of a number of points, connecting lines or curves, and information about how to fill in closed regions bounded by the lines and curves. Often, curved lines are approximated by short straight-line segments, and a figure is defined mathematically by a list of points. Among the simplest 2D graphics symbols are letters used for labels on the screen. Some letters are stored as wire-frame objects; others that have curved portions are stored with additional mathematical formulas for the curves.

Ref: https://math.berkeley.edu/~yonah/files/Linear%20Algebra.pdf

4. Now that powerful computers are widely available, more and more scientific and engineering problems are being treated in a way that uses discrete, or digital, data rather than continuous data. Difference equations are often the appropriate tool to analyze such data. Even when a differential equation is used to model a continuous process, a numerical solution is often produced from a related difference equation. E.g., discrete-time signals, linear difference equations, etc.

Ref: https://math.berkeley.edu/~yonah/files/Linear%20Algebra.pdf
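As a toy illustration of point 4 (my own example, not from the reference), a linear difference equation such as the Fibonacci recurrence x[n+1] = x[n] + x[n-1] can be advanced with a 2×2 companion matrix:

```python
import numpy as np

# Companion matrix of the recurrence x[n+1] = x[n] + x[n-1]
M = np.array([[1, 1],
              [1, 0]])

state = np.array([1, 0])  # (x1, x0) = (1, 0)
for _ in range(9):        # each multiplication advances one step
    state = M.dot(state)

print(state[0])  # x10 = 55
```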

Well done.

Lesson 03: Vector operations

import numpy as np

from numpy import array

# Vector addition

a1 = array([2,4,6,8])

a2 = array([3,4,5,6])

print(a1+a2)

# Vector multiplication

b1 = array([2,4,6,8])

b2 = array([3,4,5,6])

print(b1 * b2)

# Vector subtraction

c1 = array([2,4,6,8])

c2 = array([3,4,5,6])

print(c1-c2)

# Vector dot product

d1 = array([2,4,6,8])

d2 = array([3,4,5,6])

print(d1 @ d2)

print(np.dot(d1,d2))

Output:

[ 5 8 11 14]

[ 6 16 30 48]

[-1 0 1 2]

100

100

Well done.

I want to learn Linear Algebra because:

1. Linear Algebra deals with the vectors and matrices that are heavily used in Machine Learning.

2. Linear Algebra underpins much of statistics, which is an important element of Machine Learning.

3. Linear Algebra is a key foundation of the field of Machine Learning, as its notation and operations are used to describe algorithms and to implement them in code.

Nice work!

I’ve been wanting to learn this since I was a kid.

I have already watched videos on it.

I’m 20 years old.

Thanks. Hope you enjoyed.

To strengthen the in-flight mental mapping of the optimization process details to the actual data entities, i.e. vectors, matrices, tensors, etc., and to accurately code the hypothesis’ hyperparameter calculations for the best accuracy. As simple as it is presumed, it becomes tricky when it gets to coding, but Linear Algebra’s shortcuts for calculations between data entities make the coding easier.

1. Had Linear Algebra in college from a good instructor – sometimes confusing

2. Always wanted to clarify the concepts since then

3. I have slept several times since then

I am a Statistician and data is my bed-fellow. Therefore learning the mathematics of data is a must. It will prepare me for a better understanding of the applications in topics such as mixed models, generalised linear models, Bayesian models, etc.

As a statistician, you will find machine learning models resonate with the statistical models you learned before. Hope you enjoy the learning journey!

In SVD, how do I recover where the original data were, that is, their positions? For example, if it is a matrix that represents concepts of a text.

Hi ANA…The following is an excellent resource for understanding the fundamentals of SVD.

https://machinelearningmastery.com/singular-value-decomposition-for-machine-learning/
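To add a minimal sketch for this question (hypothetical data, not from the linked article): the full reconstruction U·diag(s)·Vᵀ returns every entry of the matrix to its original row and column, so positions are preserved exactly:

```python
import numpy as np

# Hypothetical term-document count matrix (rows: terms, cols: documents)
A = np.array([[1, 0, 2],
              [0, 3, 1],
              [4, 0, 0],
              [1, 1, 1]], dtype=float)

U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Full reconstruction: every entry returns to its original position
A_rec = U @ np.diag(s) @ Vt
print(np.allclose(A, A_rec))  # True
```

Position information is only lost if you deliberately truncate the factors for a low-rank approximation, and even then the approximation has the same shape and indexing as the original matrix.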

Lesson 2:

The algebra of vectors and matrices, as distinct from the ordinary algebra of real numbers and the abstract algebra of unspecified entities. – Webster’s New World College Dictionary. Copyright © 2014 by Houghton Mifflin Harcourt Publishing Company

Linear Algebra is a continuous form of Mathematics … it allows you to model natural phenomena and to compute them efficiently. https://towardsdatascience.com/linear-algebra-for-deep-learning-f21d7e7d7f23

Linear Algebra is the branch of mathematics aimed at solving systems of linear equations with a ﬁnite number of unknowns. In particular, one would like to obtain answers to the following questions: 1) Characterization of solutions: Are there solutions to a given system of linear equations? How many solutions are there? 2) Finding solutions: How does the solution set look? What are the solutions?

https://math.libretexts.org/Bookshelves/Linear_Algebra/Book%3A_Linear_Algebra_(Schilling_Nachtergaele_and_Lankham)/01%3A_What_is_linear_algebra
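The “finding solutions” question in the definition above maps directly onto np.linalg.solve; a minimal sketch for a two-equation system (my own example):

```python
import numpy as np

# Solve the system: 3x + y = 9,  x + 2y = 8
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([9.0, 8.0])

x = np.linalg.solve(A, b)
print(x)  # [2. 3.]
```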

Great work April!

Lesson 3:

import numpy as np

v = np.array([1, 2, 3])
w = np.array([1, 2, 3])

print(v + w)         # [2 4 6]
print(v - w)         # [0 0 0]
print(2 * v)         # [2 4 6]
print(w * v)         # [1 4 9]
print(np.dot(w, v))  # 14
print(w / v)         # [1. 1. 1.]

Thank you for your feedback, April! Keep up the great work!

I just want to learn how things work, since I’m new to AI.

Hi Viktor…The following resource is a great starting point for your machine learning journey!

https://machinelearningmastery.com/start-here/

Very helpful, thank you very much :thumbsup:

You are very welcome Monika! We appreciate the feedback and support!