In deep learning, it is common to see a lot of discussion around tensors as the cornerstone data structure.

The word “tensor” even appears in the name of Google’s flagship machine learning library: “TensorFlow”.

Tensors are a type of data structure used in linear algebra, and like vectors and matrices, you can perform arithmetic operations with tensors.

In this tutorial, you will discover what tensors are and how to manipulate them in Python with NumPy.

After completing this tutorial, you will know:

- That tensors are a generalization of matrices and are represented using n-dimensional arrays.
- How to implement element-wise operations with tensors.
- How to perform the tensor product.

Let’s get started.

## Tutorial Overview

This tutorial is divided into 4 parts; they are:

- What are Tensors?
- Tensors in Python
- Element-Wise Tensor Operations
- Tensor Product


## What are Tensors?

A tensor is a generalization of vectors and matrices and is easily understood as a multidimensional array.

In the general case, an array of numbers arranged on a regular grid with a variable number of axes is known as a tensor.

— Page 33, Deep Learning, 2016.

A vector is a one-dimensional or first-order tensor, and a matrix is a two-dimensional or second-order tensor.

Tensor notation is much like matrix notation, with an uppercase letter representing a tensor and lowercase letters with subscript integers representing scalar values within the tensor. For example, a 3x3x3 tensor T can be written as:

```
     t111, t121, t131     t112, t122, t132     t113, t123, t133
T = (t211, t221, t231),  (t212, t222, t232),  (t213, t223, t233)
     t311, t321, t331     t312, t322, t332     t313, t323, t333
```

Many of the operations that can be performed with scalars, vectors, and matrices can be reformulated to be performed with tensors.

As a tool, tensors and tensor algebra are widely used in the fields of physics and engineering. In machine learning, the training and operation of deep learning models can often be described in terms of tensors.

## Tensors in Python

Like vectors and matrices, tensors can be represented in Python using the NumPy N-dimensional array (ndarray).

A tensor can be defined inline to the array() constructor as a list of lists.

The example below defines a 3x3x3 tensor as a NumPy ndarray. Three dimensions are easier to wrap your head around. Here, we first define rows, then stack rows into matrices, and finally stack matrices as levels in a cube.

```python
# create tensor
from numpy import array
T = array([
  [[1,2,3],    [4,5,6],    [7,8,9]],
  [[11,12,13], [14,15,16], [17,18,19]],
  [[21,22,23], [24,25,26], [27,28,29]],
])
print(T.shape)
print(T)
```

Running the example first prints the shape of the tensor, then the values of the tensor itself.

You can see that, at least in three dimensions, the tensor is printed as a series of matrices, one for each level. For this 3D tensor, axis 0 specifies the level, axis 1 specifies the row, and axis 2 specifies the column.

```
(3, 3, 3)
[[[ 1  2  3]
  [ 4  5  6]
  [ 7  8  9]]

 [[11 12 13]
  [14 15 16]
  [17 18 19]]

 [[21 22 23]
  [24 25 26]
  [27 28 29]]]
```

## Element-Wise Tensor Operations

As with matrices, we can perform element-wise arithmetic between tensors.

In this section, we will work through the four main arithmetic operations.

### Tensor Addition

The element-wise addition of two tensors with the same dimensions results in a new tensor with the same dimensions where each scalar value is the element-wise addition of the scalars in the parent tensors.

```
     a111, a121, a131     a112, a122, a132
A = (a211, a221, a231),  (a212, a222, a232)

     b111, b121, b131     b112, b122, b132
B = (b211, b221, b231),  (b212, b222, b232)

C = A + B

     a111 + b111, a121 + b121, a131 + b131     a112 + b112, a122 + b122, a132 + b132
C = (a211 + b211, a221 + b221, a231 + b231),  (a212 + b212, a222 + b222, a232 + b232)
```

In NumPy, we can add tensors directly by adding arrays.

```python
# tensor addition
from numpy import array
A = array([
  [[1,2,3],    [4,5,6],    [7,8,9]],
  [[11,12,13], [14,15,16], [17,18,19]],
  [[21,22,23], [24,25,26], [27,28,29]],
])
B = array([
  [[1,2,3],    [4,5,6],    [7,8,9]],
  [[11,12,13], [14,15,16], [17,18,19]],
  [[21,22,23], [24,25,26], [27,28,29]],
])
C = A + B
print(C)
```

Running the example prints the addition of the two parent tensors.

```
[[[ 2  4  6]
  [ 8 10 12]
  [14 16 18]]

 [[22 24 26]
  [28 30 32]
  [34 36 38]]

 [[42 44 46]
  [48 50 52]
  [54 56 58]]]
```

### Tensor Subtraction

The element-wise subtraction of one tensor from another tensor with the same dimensions results in a new tensor with the same dimensions where each scalar value is the element-wise subtraction of the scalars in the parent tensors.

```
     a111, a121, a131     a112, a122, a132
A = (a211, a221, a231),  (a212, a222, a232)

     b111, b121, b131     b112, b122, b132
B = (b211, b221, b231),  (b212, b222, b232)

C = A - B

     a111 - b111, a121 - b121, a131 - b131     a112 - b112, a122 - b122, a132 - b132
C = (a211 - b211, a221 - b221, a231 - b231),  (a212 - b212, a222 - b222, a232 - b232)
```

In NumPy, we can subtract tensors directly by subtracting arrays.

```python
# tensor subtraction
from numpy import array
A = array([
  [[1,2,3],    [4,5,6],    [7,8,9]],
  [[11,12,13], [14,15,16], [17,18,19]],
  [[21,22,23], [24,25,26], [27,28,29]],
])
B = array([
  [[1,2,3],    [4,5,6],    [7,8,9]],
  [[11,12,13], [14,15,16], [17,18,19]],
  [[21,22,23], [24,25,26], [27,28,29]],
])
C = A - B
print(C)
```

Running the example prints the result of subtracting the second tensor from the first.

```
[[[0 0 0]
  [0 0 0]
  [0 0 0]]

 [[0 0 0]
  [0 0 0]
  [0 0 0]]

 [[0 0 0]
  [0 0 0]
  [0 0 0]]]
```

### Tensor Hadamard Product

The element-wise multiplication of one tensor with another tensor with the same dimensions results in a new tensor with the same dimensions where each scalar value is the element-wise multiplication of the scalars in the parent tensors.

As with matrices, the operation is referred to as the Hadamard Product to differentiate it from tensor multiplication. Here, we will use the “o” operator to indicate the Hadamard product operation between tensors.

```
     a111, a121, a131     a112, a122, a132
A = (a211, a221, a231),  (a212, a222, a232)

     b111, b121, b131     b112, b122, b132
B = (b211, b221, b231),  (b212, b222, b232)

C = A o B

     a111 * b111, a121 * b121, a131 * b131     a112 * b112, a122 * b122, a132 * b132
C = (a211 * b211, a221 * b221, a231 * b231),  (a212 * b212, a222 * b222, a232 * b232)
```

In NumPy, we can multiply tensors directly by multiplying arrays.

```python
# tensor Hadamard product
from numpy import array
A = array([
  [[1,2,3],    [4,5,6],    [7,8,9]],
  [[11,12,13], [14,15,16], [17,18,19]],
  [[21,22,23], [24,25,26], [27,28,29]],
])
B = array([
  [[1,2,3],    [4,5,6],    [7,8,9]],
  [[11,12,13], [14,15,16], [17,18,19]],
  [[21,22,23], [24,25,26], [27,28,29]],
])
C = A * B
print(C)
```

Running the example prints the result of multiplying the tensors.

```
[[[  1   4   9]
  [ 16  25  36]
  [ 49  64  81]]

 [[121 144 169]
  [196 225 256]
  [289 324 361]]

 [[441 484 529]
  [576 625 676]
  [729 784 841]]]
```

### Tensor Division

The element-wise division of one tensor by another tensor with the same dimensions results in a new tensor with the same dimensions where each scalar value is the element-wise division of the scalars in the parent tensors.

```
     a111, a121, a131     a112, a122, a132
A = (a211, a221, a231),  (a212, a222, a232)

     b111, b121, b131     b112, b122, b132
B = (b211, b221, b231),  (b212, b222, b232)

C = A / B

     a111 / b111, a121 / b121, a131 / b131     a112 / b112, a122 / b122, a132 / b132
C = (a211 / b211, a221 / b221, a231 / b231),  (a212 / b212, a222 / b222, a232 / b232)
```

In NumPy, we can divide tensors directly by dividing arrays.

```python
# tensor division
from numpy import array
A = array([
  [[1,2,3],    [4,5,6],    [7,8,9]],
  [[11,12,13], [14,15,16], [17,18,19]],
  [[21,22,23], [24,25,26], [27,28,29]],
])
B = array([
  [[1,2,3],    [4,5,6],    [7,8,9]],
  [[11,12,13], [14,15,16], [17,18,19]],
  [[21,22,23], [24,25,26], [27,28,29]],
])
C = A / B
print(C)
```

Running the example prints the result of dividing the tensors.

```
[[[ 1.  1.  1.]
  [ 1.  1.  1.]
  [ 1.  1.  1.]]

 [[ 1.  1.  1.]
  [ 1.  1.  1.]
  [ 1.  1.  1.]]

 [[ 1.  1.  1.]
  [ 1.  1.  1.]
  [ 1.  1.  1.]]]
```

## Tensor Product

The tensor product operator is often denoted as a circle with a small x in the middle. We will denote it here as “(x)”.

Given a tensor A of order q and a tensor B of order r, the product of these tensors will be a new tensor of order q + r or, said another way, with q + r dimensions.

The tensor product is not limited to tensors, but can also be performed on matrices and vectors, which can be a good place to practice in order to develop the intuition for higher dimensions.
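A quick way to confirm the order rule in code is to check the ndim attribute before and after the product. This small sketch uses NumPy's tensordot() with axes=0, which is covered in more detail below:

```python
from numpy import array, tensordot

A = array([[1, 2], [3, 4]])    # order 2 (q = 2)
B = array([1, 2, 3])           # order 1 (r = 1)
C = tensordot(A, B, axes=0)    # tensor product: no axes are summed
print(A.ndim, B.ndim, C.ndim)  # 2 1 3, i.e. q + r
print(C.shape)                 # (2, 2, 3)
```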

Let’s take a look at the tensor product for vectors.

```
a = (a1, a2)

b = (b1, b2)

c = a (x) b

    a1 * [b1, b2]
c = (a2 * [b1, b2])
```

Or, unrolled:

```
    a1 * b1, a1 * b2
c = (a2 * b1, a2 * b2)
```

Let’s take a look at the tensor product for matrices.

```
     a11, a12
A = (a21, a22)

     b11, b12
B = (b21, b22)

C = A (x) B

           b11, b12           b11, b12
    a11 * (b21, b22),  a12 * (b21, b22)
C = [                                  ]
           b11, b12           b11, b12
    a21 * (b21, b22),  a22 * (b21, b22)
```

Or, unrolled:

```
     a11 * b11, a11 * b12, a12 * b11, a12 * b12
     a11 * b21, a11 * b22, a12 * b21, a12 * b22
C = (a21 * b11, a21 * b12, a22 * b11, a22 * b12)
     a21 * b21, a21 * b22, a22 * b21, a22 * b22
```

The tensor product can be implemented in NumPy using the tensordot() function.

The function takes as arguments the two tensors to be multiplied and the axes over which to sum the products, called the sum reduction. To calculate the tensor product, also called the tensor dot product in NumPy, the axes argument must be set to 0.

In the example below, we define two order-1 tensors (vectors) and calculate the tensor product.

```python
# tensor product
from numpy import array
from numpy import tensordot
A = array([1,2])
B = array([3,4])
C = tensordot(A, B, axes=0)
print(C)
```

Running the example prints the result of the tensor product.

The result is an order-2 tensor (matrix) with shape 2x2.

```
[[3 4]
 [6 8]]
```

The tensor product is the most common form of tensor multiplication that you may encounter, but many other types of tensor multiplication exist, such as the tensor dot product and the tensor contraction.
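For contrast with the tensor product (axes=0), setting axes=1 in tensordot() sums over one pair of axes: a tensor contraction. For two matrices this reduces to ordinary matrix multiplication, as this small sketch shows:

```python
from numpy import array, tensordot

A = array([[1, 2], [3, 4]])
B = array([[5, 6], [7, 8]])
# contract the last axis of A with the first axis of B
C = tensordot(A, B, axes=1)
print(C)                   # same result as A @ B
print((C == A @ B).all())  # True
```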

## Extensions

This section lists some ideas for extending the tutorial that you may wish to explore.

- Update each example using your own small contrived tensor data.
- Implement three other types of tensor multiplication not covered in this tutorial with small vector or matrix data.
- Write your own functions to implement each tensor operation.

If you explore any of these extensions, I’d love to know.

## Further Reading

This section provides more resources on the topic if you are looking to go deeper.

### Books

- A Student’s Guide to Vectors and Tensors, 2011.
- Chapter 12, Special Topics, Matrix Computations, 2012.
- Tensor Algebra and Tensor Analysis for Engineers, 2015.

### Articles

- Tensor algebra on Wikipedia
- Tensor on Wikipedia
- Tensor product on Wikipedia
- Outer product on Wikipedia

### Other

- Fundamental Tensor Operations for Large-Scale Data Analysis in Tensor Train Formats, 2016.
- Tensor product, direct sum, Quantum Mechanics I, 2006.
- Tensorphobia and the Outer Product, 2016.
- The Tensor Product, 2011.

## Summary

In this tutorial, you discovered what tensors are and how to manipulate them in Python with NumPy.

Specifically, you learned:

- That tensors are a generalization of matrices and are represented using n-dimensional arrays.
- How to implement element-wise operations with tensors.
- How to perform the tensor product.

Do you have any questions?

Ask your questions in the comments below and I will do my best to answer.
