Scientific Functions in NumPy and SciPy

Python is a general-purpose computing language, but it is widely used in scientific computing. It can replace R and MATLAB in many cases, thanks to libraries in the Python ecosystem. In machine learning, we use mathematical and statistical functions extensively, and we often find NumPy and SciPy useful. In the following, we will have a brief overview of what NumPy and SciPy provide and some tips for using them.

After finishing this tutorial, you will know:

  • What NumPy and SciPy provide for your project
  • How to quickly speed up NumPy code using numba

Kick-start your project with my new book Python for Machine Learning, including step-by-step tutorials and the Python source code files for all examples.

Let’s get started!

Photo by Nothing Ahead. Some rights reserved.

Overview

This tutorial is divided into three parts:

  • NumPy as a tensor library
  • Functions from SciPy
  • Speeding up with numba

NumPy as a Tensor Library

While lists and tuples are how we manage arrays natively in Python, NumPy provides array capabilities closer to those of C or Java, in the sense that we can enforce all elements to be of the same data type and, in the case of high-dimensional arrays, require a regular shape in each dimension. Moreover, carrying out the same operation on a NumPy array is usually faster than doing it natively in Python because the NumPy code is highly optimized.

NumPy provides a vast number of functions, and you should consult NumPy’s documentation for the details. Some common usage can be found in the following cheat sheet:

NumPy Cheat Sheet. Copyright 2022 MachineLearningMastery.com

There are some cool features from NumPy that are worth mentioning as they are helpful for machine learning projects.

For instance, if we want to plot a 3D surface, we would compute $z=f(x,y)$ for a range of $x$ and $y$ and then plot the result in the $xyz$-space. We can generate the ranges with:
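A minimal sketch using np.linspace(); the exact ranges and number of points are assumptions for illustration:

    import numpy as np

    # 100 evenly spaced points along each axis
    x = np.linspace(-1, 1, 100)
    y = np.linspace(-2, 2, 100)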

For $z=f(x,y)=\sqrt{1-x^2-(y/2)^2}$, we would normally need a nested for-loop to scan over each value in arrays x and y and do the computation. But in NumPy, we can use meshgrid() to expand the two 1D arrays into two 2D arrays such that, by matching indices, we get all the combinations, as follows:
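A sketch continuing from the ranges above; the clipping to zero is an addition here so the square root is not taken of negative values outside the ellipse:

    # expand the two 1D arrays into two 2D coordinate grids
    xx, yy = np.meshgrid(x, y)
    print(xx.shape, yy.shape)    # (100, 100) (100, 100)

    # evaluate z = f(x, y) over the whole grid in one shot
    zz = np.sqrt(np.clip(1 - xx**2 - (yy/2)**2, 0, None))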

In the above, the 2D array xx produced by meshgrid() has identical values on the same column, and yy has identical values on the same row. Hence element-wise operations on xx and yy are essentially operations on the $xy$-plane. This is why it works and why we can plot the ellipsoid above.

Another nice feature in NumPy is a function to expand the dimensions of an array. Convolutional layers in a neural network usually expect 3D images, namely, pixels in 2D plus the different color channels as the third dimension. This works for color images with RGB channels, but grayscale images have only one channel. For example, the digits dataset in scikit-learn:
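A quick sketch checking the shape of the images array returned by load_digits():

    from sklearn.datasets import load_digits

    images = load_digits().images
    print(images.shape)    # (1797, 8, 8)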

This shows that there are 1797 images in this dataset, each of 8×8 pixels. This is a grayscale dataset, in which each pixel value represents a level of darkness. We add a fourth axis to this array (i.e., convert the 3D array into a 4D array) so each image becomes 8×8×1 pixels:
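One way to do this is with np.expand_dims(); indexing with np.newaxis or calling reshape() would work equally well:

    import numpy as np

    # append a channel axis so each image becomes 8x8x1
    images = np.expand_dims(images, axis=3)
    print(images.shape)    # (1797, 8, 8, 1)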

Two handy features when working with NumPy arrays are Boolean indexing and fancy indexing. For example, if we have a 2D array:
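For illustration, assume a small 3×5 array; the values are made up so that only the first two columns are entirely positive:

    import numpy as np

    X = np.array([
        [ 1,  2, -3,  4,  -5],
        [ 6,  7,  8, -9,  10],
        [11, 12, 13, 14, -15],
    ])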

we can check if all values in a column are positive:
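Continuing the example, a sketch using a reduction along axis 0:

    # True for each column whose values are all positive
    positive = (X > 0).all(axis=0)
    print(positive)    # [ True  True False False False]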

This shows that only the first two columns are all positive. Note that the result is a length-5 one-dimensional array, the same size as axis 1 of array X. If we use this Boolean array as an index on axis 1, we select the subarray containing only the columns where the index is True:
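Continuing the example:

    # keep only the columns flagged True by the Boolean array
    print(X[:, positive])
    # [[ 1  2]
    #  [ 6  7]
    #  [11 12]]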

If a list of integers is used in lieu of the Boolean array above, we select columns from X according to the indices in the list. NumPy calls this fancy indexing. So below, we can select the first two columns twice and form a new array:
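Continuing the example, a sketch selecting columns 0 and 1 twice:

    print(X[:, [0, 1, 0, 1]])
    # [[ 1  2  1  2]
    #  [ 6  7  6  7]
    #  [11 12 11 12]]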

Functions from SciPy

SciPy is a sister project of NumPy. Hence, you will mostly see SciPy functions expecting NumPy arrays as arguments or returning one. SciPy provides a lot more functions that are less commonly used or more advanced.

SciPy functions are organized under submodules. Some common submodules are:

  • scipy.cluster.hierarchy: Hierarchical clustering
  • scipy.fft: Fast Fourier transform
  • scipy.integrate: Numerical integration
  • scipy.interpolate: Interpolation and spline functions
  • scipy.linalg: Linear algebra
  • scipy.optimize: Numerical optimization
  • scipy.signal: Signal processing
  • scipy.sparse: Sparse matrix representation
  • scipy.special: Some exotic mathematical functions
  • scipy.stats: Statistics, including probability distributions

But never assume SciPy can cover everything. For time series analysis, for example, it is better to depend on the statsmodels module instead.

We have covered a lot of examples using scipy.optimize in other posts. It is a great tool to find the minimum of a function using, for example, Newton’s method. Both NumPy and SciPy have a linalg submodule for linear algebra, but SciPy’s is more advanced, with functions such as QR decomposition with pivoting or the matrix exponential.
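As a small illustration of something SciPy’s linalg offers that NumPy’s does not, here is a sketch computing a matrix exponential with scipy.linalg.expm(); the matrix is made up:

    import numpy as np
    from scipy.linalg import expm

    A = np.array([[ 0.0, 1.0],
                  [-1.0, 0.0]])
    print(expm(A))    # matrix exponential of A; np.linalg has no equivalent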

Maybe the most used feature of SciPy is the stats module. In both NumPy and SciPy, we can generate multivariate Gaussian random numbers with non-zero correlation.
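A sketch of doing so with NumPy’s Generator API and with scipy.stats; the mean vector and covariance matrix here are assumptions for illustration:

    import numpy as np
    from scipy.stats import multivariate_normal

    mean = [0.0, 0.0]
    cov = [[1.0, 0.8],
           [0.8, 1.0]]    # correlated components

    # NumPy: the recommended Generator interface
    rng = np.random.default_rng()
    samples = rng.multivariate_normal(mean, cov, size=1000)
    print(np.corrcoef(samples.T))    # off-diagonal entries close to 0.8

    # SciPy: the same distribution as an object, which also exposes pdf(), cdf(), etc.
    samples2 = multivariate_normal(mean=mean, cov=cov).rvs(size=1000)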

But if we want to reference the distribution function itself, it is best to depend on SciPy. For example, the famous 68-95-99.7 rule refers to the standard normal distribution, and we can get the exact percentages from SciPy’s cumulative distribution function:
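For example, using scipy.stats.norm:

    from scipy.stats import norm

    # probability of falling within 1, 2, and 3 standard deviations of the mean
    for k in (1, 2, 3):
        print(norm.cdf(k) - norm.cdf(-k))
    # prints approximately 0.6827, 0.9545, and 0.9973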

So we see that we expect a 68.269% probability that values fall within one standard deviation from the mean in a normal distribution. Conversely, we have the percentage point function as the inverse function of the cumulative distribution function:
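Continuing with norm from above:

    # the 99th percentile of the standard normal distribution
    print(norm.ppf(0.99))    # approximately 2.326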

So this means if the values are in a normal distribution, we expect a 99% probability (one-tailed probability) that the value will not be more than 2.32 standard deviations beyond the mean.

These are examples of how SciPy can take you an extra mile beyond what NumPy gives you.


Speeding Up with numba

NumPy is faster than native Python because many of the operations are implemented in C and use optimized algorithms. But there are times when NumPy is still too slow for what we want to do.

It may help to ask numba to optimize the code further, by parallelizing it or moving the operation to a GPU if you have one. You need to install the numba module first:
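For example, using pip:

    pip install numba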

Installation may take a while if numba needs to be compiled from source. Afterward, if you have a function that consists purely of NumPy operations, you can add the numba decorator to speed it up:
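A minimal sketch; the function below is a made-up example just to show how the decorator is applied:

    import numba
    import numpy as np

    @numba.jit(nopython=True)
    def sum_of_squares(x):
        # a plain Python loop over a NumPy array, compiled to machine code by numba
        total = 0.0
        for i in range(x.shape[0]):
            total += x[i] * x[i]
        return total

    x = np.random.default_rng().normal(size=1_000_000)
    print(sum_of_squares(x))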

What it does is use a just-in-time compiler to translate the function into machine code so it can run faster. You will see the best performance improvement if your function runs many times in your program (e.g., the update function in gradient descent) because the overhead of running the compiler can be amortized.

For example, consider an implementation of the t-SNE algorithm that transforms 784-dimensional data into 2 dimensions. We are not going to explain the t-SNE algorithm in detail, but it needs many iterations to converge, so numba can be used to optimize its inner-loop functions (which also demonstrates some NumPy usage). Even with the optimization, such an implementation takes a few minutes to finish; if you remove the @numba.jit decorators, it will take considerably longer.
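As a sketch of the pattern, under the assumption of a made-up helper: the pairwise squared-distance function below (not the actual t-SNE code) is written with plain loops and decorated with @numba.jit:

    import numba
    import numpy as np

    @numba.jit(nopython=True)
    def pairwise_sq_dists(X):
        # squared Euclidean distance between every pair of rows in X
        n = X.shape[0]
        D = np.zeros((n, n))
        for i in range(n):
            for j in range(i + 1, n):
                d = 0.0
                for k in range(X.shape[1]):
                    diff = X[i, k] - X[j, k]
                    d += diff * diff
                D[i, j] = d
                D[j, i] = d
        return D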

Further Reading

This section provides more resources on the topic if you are looking to go deeper.

API documentation

Summary

In this tutorial, you saw a brief overview of the functions provided by NumPy and SciPy.

Specifically, you learned:

  • How to work with NumPy arrays
  • A few functions provided by SciPy to help
  • How to make NumPy code faster by using the JIT compiler from numba

2 Responses to Scientific Functions in NumPy and SciPy

  1. Pamphile Roy, May 26, 2022 at 9:19 am

     Nice article! Please update the NumPy part about random number generators. randint, randn, and the like should not be used in new code; np.random.Generator should be used instead 🙂

     • James Carmichael, May 27, 2022 at 9:24 am

       Thank you for the feedback, Pamphile!
