### Linear Algebra for Machine Learning Crash Course

#### Get on top of the linear algebra used in machine learning in 7 Days.

Linear algebra is a field of mathematics that is universally agreed to be a prerequisite for a deeper understanding of machine learning.

Although linear algebra is a large field with many esoteric theories and findings, the nuts-and-bolts tools and notations taken from the field are required for machine learning practitioners. With a solid foundation of what linear algebra is, it is possible to focus on just the good or relevant parts.

In this crash course, you will discover how you can get started and confidently read and implement linear algebra notation used in machine learning with Python in 7 days.

This is a big and important post. You might want to bookmark it.

Discover vectors, matrices, tensors, matrix types, matrix factorization, PCA, SVD and much more in my new book, with 19 step-by-step tutorials and full source code.

Let’s get started.

**Update Mar/2018**: Fixed a small typo in the SVD lesson.

## Who Is This Crash-Course For?

Before we get started, let’s make sure you are in the right place.

This course is for developers who may know some applied machine learning. Maybe you know how to work through a predictive modeling problem end-to-end, or at least most of the main steps, with popular tools.

The lessons in this course do assume a few things about you, such as:

- You know your way around basic Python for programming.
- You may know some basic NumPy for array manipulation.
- You want to learn linear algebra to deepen your understanding and application of machine learning.

You do NOT need to be:

- A math whiz!
- A machine learning expert!

This crash course will take you from a developer that knows a little machine learning to a developer who can navigate the basics of linear algebra.

Note: This crash course assumes you have a working Python 3 SciPy environment with at least NumPy installed. If you need help with your environment, you can follow the step-by-step tutorial here:

## Crash-Course Overview

This crash course is broken down into 7 lessons.

You could complete one lesson per day (recommended) or complete all of the lessons in one day (hardcore). It really depends on the time you have available and your level of enthusiasm.

Below is a list of the 7 lessons that will get you started and productive with linear algebra for machine learning in Python:

- **Lesson 01**: Linear Algebra for Machine Learning
- **Lesson 02**: Linear Algebra
- **Lesson 03**: Vectors
- **Lesson 04**: Matrices
- **Lesson 05**: Matrix Types and Operations
- **Lesson 06**: Matrix Factorization
- **Lesson 07**: Singular-Value Decomposition

Each lesson could take you 60 seconds or up to 30 minutes. Take your time and complete the lessons at your own pace. Ask questions and even post results in the comments below.

The lessons expect you to go off and find out how to do things. I will give you hints, but part of the point of each lesson is to force you to learn where to go to look for help on linear algebra, the NumPy API, and the best-of-breed tools in Python (hint: I have all of the answers directly on this blog; use the search box).

I do provide more help in the form of links to related posts because I want you to build up some confidence and inertia.

Post your results in the comments; I’ll cheer you on!

Hang in there; don’t give up.

Note: This is just a crash course. For a lot more detail and fleshed out tutorials, see my book on the topic titled “Basics of Linear Algebra for Machine Learning“.

### Need help with Linear Algebra for Machine Learning?

Take my free 7-day email crash course now (with sample code).

Click to sign-up and also get a free PDF Ebook version of the course.

## Lesson 01: Linear Algebra for Machine Learning

In this lesson, you will discover the 5 reasons why a machine learning practitioner should deepen their understanding of linear algebra.

### 1. You Need to Learn Linear Algebra Notation

You need to be able to read and write vector and matrix notation. Algorithms are described in books, papers, and on websites using vector and matrix notation.

### 2. You Need to Learn Linear Algebra Arithmetic

In partnership with the notation of linear algebra are the arithmetic operations performed. You need to know how to add, subtract, and multiply scalars, vectors, and matrices.

### 3. You Need to Learn Linear Algebra for Statistics

You must learn linear algebra in order to be able to learn statistics, especially multivariate statistics. In order to read and interpret statistics, you must learn the notation and operations of linear algebra. Modern statistics uses both the notation and the tools of linear algebra to describe statistical methods, from vectors for the means and variances of data to covariance matrices that describe the relationships between multiple Gaussian variables.
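
For example, NumPy exposes these statistical tools directly. Below is a minimal sketch (the data is made up for illustration) that calculates a mean and a covariance matrix:

```python
# calculate a mean and a covariance matrix with NumPy
from numpy import array, cov, mean

# two made-up variables with four observations each
x = array([1.0, 2.0, 3.0, 4.0])
y = array([2.0, 4.0, 6.0, 8.0])
print(mean(x))   # mean of x
S = cov(x, y)    # 2x2 covariance matrix of x and y
print(S)
```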

### 4. You Need to Learn Matrix Factorization

Building on notation and arithmetic is the idea of matrix factorization, also called matrix decomposition. You need to know how to factorize a matrix and what it means. Matrix factorization is a key tool in linear algebra and is used widely as an element of many more complex operations in both linear algebra (such as the matrix inverse) and machine learning (such as least squares).

### 5. You Need to Learn Linear Least Squares

You need to know how to use matrix factorization to solve linear least squares. Problems of this type can be framed as the minimization of squared error, called least squares, and can be recast in the language of linear algebra, called linear least squares. Linear least squares problems can be solved efficiently on computers using matrix operations such as matrix factorization.
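
As a sketch of what this looks like in practice, NumPy's lstsq() function solves a linear least squares problem directly (the system below is made up for illustration):

```python
# solve a linear least squares problem
from numpy import array
from numpy.linalg import lstsq

# overdetermined system: three equations, two unknowns
A = array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
b = array([1.0, 2.0, 3.0])
# find x that minimizes the squared error ||Ax - b||^2
x, residuals, rank, singular_values = lstsq(A, b, rcond=None)
print(x)
```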

### One More Reason

If I could give one more reason, it would be: because it is fun. Seriously.

### Your Task

For this lesson, you must list 3 reasons why you, personally, want to learn linear algebra.

Post your answer in the comments below. I would love to see what you come up with.

In the next lesson, you will discover a concise definition of linear algebra.

## Lesson 02: Linear Algebra

In this lesson, you will discover a concise definition of linear algebra.

### Linear Algebra

Linear algebra is a branch of mathematics, but the truth of it is that linear algebra is the mathematics of data. Matrices and vectors are the language of data.

Linear algebra is about linear combinations. That is, using arithmetic on columns of numbers called vectors and 2D arrays of numbers called matrices, to create new columns and arrays of numbers.
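
For example, a linear combination scales vectors by scalars and adds the results; a minimal sketch in NumPy:

```python
# a linear combination of two vectors
from numpy import array

a = array([1, 2, 3])
b = array([4, 5, 6])
# scale each vector by a scalar and add the results
c = 2 * a + 3 * b
print(c)
```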

### Numerical Linear Algebra

The application of linear algebra in computers is often called numerical linear algebra.

It is more than just the implementation of linear algebra operations in code libraries; it also includes the careful handling of the problems of applied mathematics, such as working with the limited floating point precision of digital computers.
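
One small example of such a problem: real numbers can only be represented approximately in floating point, so numerical code compares values with a tolerance rather than testing exact equality.

```python
# floating point precision requires careful handling
from numpy import isclose

# exact equality fails due to limited floating point precision
print(0.1 + 0.2 == 0.3)         # False
# numerical code compares with a tolerance instead
print(isclose(0.1 + 0.2, 0.3))  # True
```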

### Applications of Linear Algebra

As linear algebra is the mathematics of data, the tools of linear algebra are used in many domains.

- Matrices in Engineering, such as a line of springs.
- Graphs and Networks, such as analyzing networks.
- Markov Matrices, Population, and Economics, such as population growth.
- Linear Programming, the simplex optimization method.
- Fourier Series: Linear Algebra for Functions, used widely in signal processing.
- Linear Algebra for Statistics and Probability, such as least squares for regression.
- Computer Graphics, such as the various translation, scaling and rotation of images.

### Your Task

For this lesson, you must find five quotes from research papers, blogs, or books that define the field of linear algebra.

Post your answer in the comments below. I would love to see what you discover.

In the next lesson, you will discover vectors and simple vector arithmetic.

## Lesson 03: Vectors

In this lesson, you will discover vectors and simple vector arithmetic.

### What is a Vector?

A vector is a tuple of one or more values called scalars.

Vectors are often represented using a lowercase character such as “v”; for example:

```
v = (v1, v2, v3)
```

Where v1, v2, v3 are scalar values, often real values.

### Defining a Vector

We can represent a vector in Python as a NumPy array.

A NumPy array can be created from a list of numbers. For example, below we define a vector with the length of 3 and the integer values 1, 2, and 3.

```python
# create a vector
from numpy import array
v = array([1, 2, 3])
print(v)
```

### Vector Multiplication

Two vectors of equal length can be multiplied together.

```
c = a * b
```

As with addition and subtraction, this operation is performed element-wise to result in a new vector of the same length.

```
a * b = (a1 * b1, a2 * b2, a3 * b3)
```

We can perform this operation directly in NumPy.

```python
# multiply vectors
from numpy import array
a = array([1, 2, 3])
print(a)
b = array([1, 2, 3])
print(b)
c = a * b
print(c)
```

### Your Task

For this lesson, you must implement other vector arithmetic operations such as addition, division, subtraction, and the vector dot product.

Post your answer in the comments below. I would love to see what you discover.

In the next lesson, you will discover matrices and simple matrix arithmetic.

## Lesson 04: Matrices

In this lesson, you will discover matrices and simple matrix arithmetic.

### What is a Matrix?

A matrix is a two-dimensional array of scalars with one or more columns and one or more rows.

The notation for a matrix is often an uppercase letter, such as A, and entries are referred to by their two-dimensional subscript of row (i) and column (j), such as aij. For example:

```
A = ((a11, a12), (a21, a22), (a31, a32))
```

### Defining a Matrix

We can represent a matrix in Python using a two-dimensional NumPy array.

A NumPy array can be constructed given a list of lists. For example, below is a 2 row, 3 column matrix.

```python
# create matrix
from numpy import array
A = array([[1, 2, 3], [4, 5, 6]])
print(A)
```

### Matrix Addition

Two matrices with the same dimensions can be added together to create a new third matrix.

```
C = A + B
```

The scalar elements in the resulting matrix are calculated as the addition of the elements in each of the matrices being added.

We can implement this in Python using the plus operator directly on the two NumPy arrays.

```python
# add matrices
from numpy import array
A = array([[1, 2, 3], [4, 5, 6]])
print(A)
B = array([[1, 2, 3], [4, 5, 6]])
print(B)
C = A + B
print(C)
```

### Matrix Dot Product

Matrix multiplication, also called the matrix dot product, is more complicated than the previous operations and involves a rule, as not all matrices can be multiplied together.

```
C = A * B
```

The rule for matrix multiplication is as follows: The number of columns (n) in the first matrix (A) must equal the number of rows (m) in the second matrix (B).

For example, matrix A has the dimensions m rows and n columns and matrix B has the dimensions n rows and k columns. The n columns in A match the n rows in B. The result is a new matrix with m rows and k columns.

```
C(m,k) = A(m,n) * B(n,k)
```

The intuition for the matrix multiplication is that we are calculating the dot product between each row in matrix A and each column in matrix B. For example, we can step down the rows of matrix A and multiply each with column 1 in B to give the scalar values in column 1 of C.

The matrix multiplication operation can be implemented in NumPy using the dot() function.

```python
# matrix dot product
from numpy import array
A = array([[1, 2], [3, 4], [5, 6]])
print(A)
B = array([[1, 2], [3, 4]])
print(B)
C = A.dot(B)
print(C)
```

### Your Task

For this lesson, you must implement more matrix arithmetic operations such as subtraction, division, the Hadamard product, and vector-matrix multiplication.

Post your answer in the comments below. I would love to see what you come up with.

In the next lesson, you will discover the different types of matrices and matrix operations.

## Lesson 05: Matrix Types and Operations

In this lesson, you will discover the different types of matrices and matrix operations.

### Transpose

A defined matrix can be transposed, which creates a new matrix with the number of columns and rows flipped.

This is denoted by the superscript “T” next to the matrix.

```
C = A^T
```

We can transpose a matrix in NumPy by calling the T attribute.

```python
# transpose matrix
from numpy import array
A = array([[1, 2], [3, 4], [5, 6]])
print(A)
C = A.T
print(C)
```

### Inversion

The operation of inverting a matrix is indicated by a -1 superscript next to the matrix; for example, A^-1. The result of the operation is referred to as the inverse of the original matrix; for example, B is the inverse of A.

```
B = A^-1
```

Not all matrices are invertible.

A matrix can be inverted in NumPy using the inv() function.

```python
# invert matrix
from numpy import array
from numpy.linalg import inv
# define matrix
A = array([[1.0, 2.0], [3.0, 4.0]])
print(A)
# invert matrix
B = inv(A)
print(B)
```

### Square Matrix

A square matrix is a matrix where the number of rows (n) equals the number of columns (m).

```
n = m
```

The square matrix is contrasted with the rectangular matrix where the number of rows and columns are not equal.

### Symmetric Matrix

A symmetric matrix is a type of square matrix where the top-right triangle is the same as the bottom-left triangle.

To be symmetric, the axis of symmetry is always the main diagonal of the matrix, from the top left to the bottom right.

A symmetric matrix is always square and equal to its own transpose.

```
M = M^T
```
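
We can check this property in NumPy by comparing a matrix to its transpose; a minimal sketch:

```python
# check that a matrix is symmetric
from numpy import array, array_equal

M = array([[1, 2, 3], [2, 4, 6], [3, 6, 5]])
# a symmetric matrix equals its own transpose
print(array_equal(M, M.T))  # True
```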

### Triangular Matrix

A triangular matrix is a type of square matrix where all values are in the upper-right or lower-left of the matrix, with the remaining elements filled with zero values.

A triangular matrix with values only on and above the main diagonal is called an upper triangular matrix, whereas a triangular matrix with values only on and below the main diagonal is called a lower triangular matrix.
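
NumPy can extract the lower and upper triangular parts of a matrix with the tril() and triu() functions; a minimal sketch:

```python
# lower and upper triangular matrices
from numpy import array, tril, triu

M = array([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
lower = tril(M)  # keep the main diagonal and below, zero the rest
upper = triu(M)  # keep the main diagonal and above, zero the rest
print(lower)
print(upper)
```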

### Diagonal Matrix

A diagonal matrix is one where values outside of the main diagonal have a zero value, where the main diagonal is taken from the top left of the matrix to the bottom right.

A diagonal matrix is often denoted with the variable D and may be represented as a full matrix or as a vector of values on the main diagonal.
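
The diag() function in NumPy moves between the two representations; a minimal sketch:

```python
# diagonal matrix as a full matrix or a vector
from numpy import array, diag

M = array([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
d = diag(M)  # extract the main diagonal as a vector
D = diag(d)  # expand the vector into a full diagonal matrix
print(d)
print(D)
```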

### Your Task

For this lesson, you must develop examples for other matrix operations such as the determinant, trace, and rank.

Post your answer in the comments below. I would love to see what you come up with.

In the next lesson, you will discover matrix factorization.

## Lesson 06: Matrix Factorization

In this lesson, you will discover the basics of matrix factorization, also called matrix decomposition.

### What is a Matrix Decomposition?

A matrix decomposition is a way of reducing a matrix into its constituent parts.

It is an approach that can simplify more complex matrix operations that can be performed on the decomposed matrix rather than on the original matrix itself.

A common analogy for matrix decomposition is the factoring of numbers, such as the factoring of 25 into 5 x 5. For this reason, matrix decomposition is also called matrix factorization. Like factoring real values, there are many ways to decompose a matrix, hence there are a range of different matrix decomposition techniques.

### LU Matrix Decomposition

The LU decomposition is for square matrices and decomposes a matrix into L and U components.

```
A = L . U
```

Where A is the square matrix that we wish to decompose, L is the lower triangular matrix, and U is the upper triangular matrix. A variation of this decomposition that is numerically more stable to solve in practice is called the LUP decomposition, or the LU decomposition with partial pivoting.

```
A = P . L . U
```

The rows of the parent matrix are re-ordered to simplify the decomposition process and the additional P matrix specifies a way to permute the result or return the result to the original order. There are also other variations of the LU.

The LU decomposition is often used to simplify the solving of systems of linear equations, such as finding the coefficients in a linear regression.

The LU decomposition can be implemented in Python with the lu() function. More specifically, this function calculates an LU decomposition with partial pivoting, returning the P, L, and U components.

```python
# LU decomposition
from numpy import array
from scipy.linalg import lu
# define a square matrix
A = array([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
print(A)
# LU decomposition
P, L, U = lu(A)
print(P)
print(L)
print(U)
# reconstruct
B = P.dot(L).dot(U)
print(B)
```

### Your Task

For this lesson, you must implement small examples of other simple methods for matrix factorization, such as the QR decomposition, the Cholesky decomposition, and the eigendecomposition.

Post your answer in the comments below. I would love to see what you come up with.

In the next lesson, you will discover the Singular-Value Decomposition method for matrix factorization.

## Lesson 07: Singular-Value Decomposition

In this lesson, you will discover the Singular-Value Decomposition method for matrix factorization.

### Singular-Value Decomposition

The Singular-Value Decomposition, or SVD for short, is a matrix decomposition method for reducing a matrix to its constituent parts in order to make certain subsequent matrix calculations simpler.

```
A = U . Sigma . V^T
```

Where A is the real m x n matrix that we wish to decompose, U is an m x m matrix, Sigma (the uppercase Greek letter sigma) is an m x n diagonal matrix, and V^T is the transpose of an n x n matrix V, where T is a superscript.

### Calculate Singular-Value Decomposition

The SVD can be calculated by calling the svd() function.

The function takes a matrix and returns the U, Sigma, and V^T elements. The Sigma diagonal matrix is returned as a vector of singular values. The V matrix is returned in a transposed form, e.g. V.T.

```python
# Singular-value decomposition
from numpy import array
from scipy.linalg import svd
# define a matrix
A = array([[1, 2], [3, 4], [5, 6]])
print(A)
# SVD (V is returned already transposed)
U, s, VT = svd(A)
print(U)
print(s)
print(VT)
```
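
As a further sketch, the original matrix can be reconstructed from the returned elements; note that the vector of singular values must first be expanded into an m x n diagonal matrix:

```python
# reconstruct a matrix from its SVD elements
from numpy import array, diag, zeros, allclose
from scipy.linalg import svd

A = array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
U, s, VT = svd(A)
# build the m x n Sigma matrix from the vector of singular values
Sigma = zeros(A.shape)
Sigma[:A.shape[1], :A.shape[1]] = diag(s)
# reconstruct: A = U . Sigma . V^T
B = U.dot(Sigma).dot(VT)
print(allclose(A, B))  # True
```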

### Your Task

For this lesson, you must list 5 applications of the SVD.

Bonus points if you can demonstrate each with a small example in Python.

Post your answer in the comments below. I would love to see what you discover.

This was the final lesson in the mini-course.

## The End!

(*Look How Far You Have Come*)

You made it. Well done!

Take a moment and look back at how far you have come.

You discovered:

- The importance of linear algebra to applied machine learning.
- What linear algebra is all about.
- What a vector is and how to perform vector arithmetic.
- What a matrix is and how to perform matrix arithmetic, including matrix multiplication.
- A suite of types of matrices, their properties, and advanced operations involving matrices.
- Matrix factorization methods and the LU decomposition method in detail.
- The popular Singular-Value Decomposition method used in machine learning.

This is just the beginning of your journey with linear algebra for machine learning. Keep practicing and developing your skills.

Take the next step and check out my book on Linear Algebra for Machine Learning.

## Summary

*How Did You Do with The Mini-Course?*

Did you enjoy this crash course?

*Do you have any questions? Were there any sticking points?*

Let me know. Leave a comment below.

As a software developer, building my skills to attain a role in data science, I am interested in learning more about linear algebra because:

– I am intrigued to understand the math underpinning various ML algorithms

– I would like to generally improve my mathematics skills

– I want to improve my fluency in mathematics more generally so I can better understand published academic papers.

Thanks Brandon.

Hi Jason, I am founder of Fusion Analytics World, the Leading Digital Platform for News, Industry Analysis, Jobs, Courses, Events & much more. Covering Research & Analytics across Industries.

I would be happy to feature your course(s) for free on our website and help you reach out to our targeted research intelligence and analytics focussed readers.

We would be happy to feature your articles on machine learning as well. Let me know your thoughts.

No thanks.

Hi,

Since I started to learn about machine learning, I found the importance of math specially linear algebra.

This 7 mini lessons can help to find:

-The most important notation and method which you need as a data scientist or ML developer.

-Better understanding of ML and the math behind it

Thanks

Thanks Fati.

I find your lessons very useful. Thanks for sharing this knowledge.

I have been looking for good resources on a good way to import my own data into Python (data could be images or excel file, etc.)

I am quite familiar with MATLAB and fairly new to Python.. I can’t seem to find a good way to import things in Python. I would appreciate if you could please point me to some good resources. Thanks.!

See this post:

http://machinelearningmastery.com/load-machine-learning-data-python/

Great crash course man! I’m having a great time implementing these things from zero in python, I needed this linear algebra foundation refresher!

Thanks!

Thanks, I’m glad it helped.

BTW, there is a small typo in LU Matrix Decomposition section, where you mention ‘…calculates an LPU decomposition…’ I think it should be PLU.

I drove myself crazy searching the difference between LPU and PLU lol

LUP is correct, it is LU factorization with Partial Pivoting (LUP). From:

https://en.wikipedia.org/wiki/LU_decomposition

I changed the order of the terms to match the reconstruction. Thanks.

I am learning linear algebra because it is a prerequisite for deep learning for solving computer vision problems.

Thanks.

```python
# dot product of vectors; both vectors must be of the same size
from numpy import array, dot

a = array([1, 2])
print(a)
b = array([13, 14])
print(b)
c = dot(a, b)
# c = a.dot(b)
print(c)
```

Nice.

```python
# multiply a matrix with a vector
from numpy import array, dot

A = array([[1, 2, 3], [3, 4, 5], [5, 6, 7]])
print(A)
b = array([7, 8, 9])
print(b)
C = dot(A, b)
print(C)
```

Well done!

Lower triangular matrix

```
[[1 0 0]
 [0 2 0]
 [5 6 3]]
```

Upper triangular matrix

```
[[1 2 3]
 [0 4 0]
 [0 0 6]]
```

Diagonal matrix

```
[[1 0 0]
 [0 2 0]
 [0 0 3]]
```

Nice.

A symmetric matrix

```
[[1 2 3]
 [2 4 6]
 [3 6 5]]
```

Well done.

Triangular Matrix

A=[[1,0,2],[2,1,0],[3,0,1]]

Nice work.

@Jason

I am inspired by you a lot and this is my first comment after constantly viewing your website for one and a half year(almost). I want to work with you remotely. Is it possible in some way?

Thanks, I’m glad that the tutorials are helpful.

A great way to work together/contribute is for you to go through some of the tutorials and report your results as comments.

Hi Jason,

First thank you for this opportunity. My reasons are following:

1. To recall&recover my university knowledge on some parts of Linear Algebra.

2. To understand what is behind formulas deployed in python, so I will able to understand reasons for getting “strange” results

3. Deep understanding of how Linear Algebra “tools” can help us in investigating patterns in data. This is exciting.

BR,

Natasa

Thanks for sharing Natasa!

As a student I’m interested in learning linear algebra because I want to have a greater understanding on mathematics, statistics, probability theory, and machine learning.

Thank you for your tutorials

Thanks!

1. During my Bachelor’s Degree I never paid attention to the Linear Algebra course and so I barely passed. This time around I want to properly learn it.

2. I am starting to get into ML and Computer Vision and I’ve been told I need to have a good understanding of Linear Algebra for that.

3. I’ve been a bit out of practice with maths and would like to get back into it.

Thanks for sharing!

My 3 reasons for taking this course:

1. I enjoy learning new aspects to computer programming.

2. I want to build a system that analyzes the use of the English language to assist in improving my students’ writing.

3. I have never been confident in my math skills and shied away from to subject all through my schooling – I’d like to prove to myself that even higher level math theory is something I can grasp.

Thanks Dustin!

I wanted to learn the equation of PCA and SVM where Linear Algebra is used. I am more enthusiastic to go through each and every steps of this study

Thanks!

Determinant

```python
from numpy import array
from numpy import linalg

A = array([[1.0, 2.0], [3.0, 4.0]])
linalg.det(A)
```

output: -2.0

Trace

```python
A.trace()
```

output: 5.0

Nice work!

If A is a matrix and A^T = A (transpose), then A is called a symmetric matrix.

Nice work!

7 Day Course: Day 1.

Reasons for Learning Linear Algebra

1. To clarify the language of machine learning notation, right now it’s practically gibberish.

2. In hopes of not only knowing the words and notation, but understanding what the operations do.

3. To gain enough knowledge of linear algebra to put it into practice. What I learned in school was only retained for the test, and has never transitioned into my toolbox.

Well done!

7 day course completed in one day in fact few hours..

I enjoyed recollecting my engg days and able to create my own examples and solve them

Thanks a lot Jason

Well done on your progress!

Hi Jason,

My first request to you; before I perform my first task is –

Please move this section “Leave a Reply” right to the top of comment section. I could avoid scrolling all other comments to post mine. I guess if a reader is interested, scrolling down and reading the comments would still be viable.

Now to the task; This is my first day of the crash course.

I personally want to learn linear algebra;

1. This will help me appreciate many nuances involved in ML algorithms.

2. I will refresh my earlier formal training on this topic during my under graduate studies but have not used since many years.

3. I will be able read research papers on algorithms a bit more fluently.

Thanks for the suggestion.

Hi Jason,

My day 2 task –

Jason Brownlee – “[L]inear algebra is the mathematics of data. Matrices and vectors are language of data.” is the best definition I have read. The other definitions I have encountered are –

Wikipedia – Linear algebra is the branch of mathematics concerning equations which are linear.

Introduction to Linear Algebra, 2nd edition By T.A Whitelaw – [Linear algebra] solves systems of simultaneous linear equations and rectangular arrays (matrices, as we call them) of coefficients occurring in such systems. It is also true that many ideas of importance in linear algebra could be traced to geometrical sources.

Byju’s Learning (https://byjus.com/maths/linear-algebra/) – Linear algebra is the study of linear combinations. It is the study of vector spaces, lines and planes, and some mappings that are required to perform the linear transformations. It includes vectors, matrices and linear functions. It is the study of linear sets of equations and its transformation properties.

I also noticed much of text books treat linear algebra directly by starting with vectors without much discourse on attempting to describe the topic of linear algebra. E.g.

1. An introduction to linear algebra by L. Mirsky.

2. Linear algebra: A course for physicsts and engineers by Arak M. Mithai, Hans J Haubold

3. Introduction to linear algebra by Gilbert Strang

4. Linear algebra in 25 lectures – https://www.math.ucdavis.edu/~linear/linear.pdf

5. Linear algebra by Jeff Hefferon 3rd edition http://joshua.smcvt.edu/linearalgebra/book.pdf

Retrospectively; vectors, matrices are like axes of linear algebra as x, y and z axes are to space. i.e. Vectors and matrices are elementary and inseparable concepts of linear algebra.

Some applications that I discovered are in the research of –

1. Spectral clustering – https://academic.microsoft.com/paper/2132914434/reference

2. Adaptive filters used in wireless communication – https://academic.microsoft.com/paper/2610805269/reference

3. Latent semantic analysis in NLP : https://academic.microsoft.com/paper/1612003148/reference

4. Information retrieval – https://academic.microsoft.com/paper/2072773380/reference

5. Philosophy of science – Linear algebra applied to linear metatheory – https://arxiv.org/abs/2005.02247

Well done!

My day 3 activity; following is the code:

```python
from numpy import array
from numpy import multiply
from numpy import divide

def dotProduct(multiplier, multiplicand):
    return multiply(multiplier, multiplicand)

def add(augend, addend):
    return augend + addend

def division(dividend, divisor):
    return divide(dividend, divisor)

def subtraction(minuend, subtrahend):
    return minuend - subtrahend

subtrahend = divisor = multiplier = addend = array([2, 3])
minuend = dividend = multiplicand = augend = array([6, 9])

print("Vector dot product - " + str(dotProduct(multiplier, multiplicand)))
print("Vector addition - " + str(add(augend, addend)))
print("Vector division - " + str(division(dividend, divisor)))
print("Vector subtraction - " + str(subtraction(minuend, subtrahend)))
```

Well done!