Search results for "Machine Learning"

Can I get a tax invoice for my purchase?

Yes. I can provide an invoice that you can use for reimbursement from your company or for tax purposes. Please contact me directly with your purchase details:
- The name of the book or bundle that you purchased.
- The email address that you used to make the purchase.
- Ideally, the order number in your purchase receipt email.
- Your full […]

Why don’t you have a post or book on ___?

I hope to get to it eventually. Until then, contact me and let me know which topic you would like me to cover.

How do I use LSTMs for time series forecasting?

You can get started using deep learning methods such as MLPs, CNNs and LSTMs for univariate, multivariate and multi-step time series forecasting here: Deep Learning for Time Series Forecasting

How do I prepare my data for an LSTM?

The LSTM expects data to be provided as a three-dimensional array with the dimensions [samples, time steps, features]. Learn more about how to reshape your data in this tutorial: How to Reshape Input Data for Long Short-Term Memory Networks in Keras. For a reusable function that you can use to transform a univariate or multivariate […]
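As a minimal sketch of that [samples, time steps, features] shape, the snippet below frames a toy univariate series into windowed samples and reshapes it for an LSTM (the series values and window size are illustrative, not from the tutorial):

```python
import numpy as np

# A toy univariate series of 10 observations (illustrative data)
series = np.arange(10, dtype=float)

# Frame the series into input/output samples with a window of 3 time steps
window = 3
X = np.array([series[i:i + window] for i in range(len(series) - window)])
y = series[window:]

# Add the features dimension: [samples, time steps, features]
X = X.reshape((X.shape[0], window, 1))
print(X.shape)  # (7, 3, 1)
```

Each sample is one window of 3 consecutive observations, and the single trailing dimension says there is one feature per time step; a multivariate series would simply have a larger features dimension.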

Do you have material on time series in R?

Sorry, I do not have material on time series forecasting in R, but I do have a book on time series forecasting in Python. There are already some great books on time series forecasting in R; for example, see this post: Top Books on Time Series Forecasting With R

A Gentle Introduction to Broadcasting with NumPy Arrays

Arrays with different sizes cannot be added, subtracted, or generally used in arithmetic. One way to overcome this is to duplicate the smaller array so that it has the same dimensionality and size as the larger array. This is called array broadcasting and is available in NumPy when performing array arithmetic, which can greatly reduce […]
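A small sketch of the idea: NumPy stretches the smaller operand across the larger one without you having to duplicate it yourself (the matrix and vector values here are illustrative):

```python
import numpy as np

# A 3x3 matrix and a length-3 vector of different sizes
A = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 9]])
b = np.array([10, 20, 30])

# b is broadcast across each row of A; no explicit tiling is needed
C = A + b
print(C)
# [[11 22 33]
#  [14 25 36]
#  [17 28 39]]
```

Conceptually, `b` behaves as if it were copied into three rows to match `A`'s shape, but NumPy performs the arithmetic without materializing those copies.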

A Standard Multivariate, Multi-Step, and Multi-Site Time Series Forecasting Problem

Real-world time series forecasting is challenging for a host of reasons, not least problem features such as multiple input variables, the requirement to predict multiple time steps, and the need to perform the same type of prediction for multiple physical sites. In this post, you will discover a standardized yet complex time […]

How to Implement a Beam Search Decoder for Natural Language Processing

Natural language processing tasks, such as caption generation and machine translation, involve generating sequences of words. Models developed for these problems often operate by generating probability distributions across the vocabulary of output words, and it is up to decoding algorithms to sample those distributions to generate the most likely sequences of words. In this […]
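A minimal sketch of such a decoder: given one probability distribution over the vocabulary per output step, beam search keeps only the `k` most likely partial sequences at each step, scored by summed negative log probabilities (the toy distributions and function name are illustrative):

```python
import math

def beam_search_decoder(probs, k):
    """Beam search over a sequence of probability distributions.

    probs: one row per output step, one column per vocabulary word.
    Returns the k best (token index sequence, cumulative cost) pairs,
    where cost is the summed negative log probability (lower is better).
    """
    sequences = [([], 0.0)]
    for step in probs:
        candidates = []
        for seq, score in sequences:
            for i, p in enumerate(step):
                # Extend every kept sequence by every possible next word
                candidates.append((seq + [i], score - math.log(p)))
        # Keep only the k lowest-cost (most likely) sequences
        sequences = sorted(candidates, key=lambda t: t[1])[:k]
    return sequences

# Two output steps over a 3-word vocabulary (illustrative numbers)
probs = [[0.1, 0.5, 0.4],
         [0.3, 0.4, 0.3]]
best = beam_search_decoder(probs, k=2)
print(best[0][0])  # [1, 1] — the most likely word at each step
```

With `k=1` this reduces to greedy decoding; a larger beam width trades computation for a better chance of finding the most likely overall sequence.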

Caption Generation with the Inject and Merge Encoder-Decoder Models

Caption generation is a challenging artificial intelligence problem that draws on both computer vision and natural language processing. The encoder-decoder recurrent neural network architecture has been shown to be effective at this problem. Implementations of this architecture can be distilled into inject-based and merge-based models, and the two make different assumptions about the role […]
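The distinction can be sketched framework-free: in the inject model the image features are fed into the RNN itself, while in the merge model the RNN encodes only the text and the image features are combined with its output afterwards. The recurrent update below is a toy stand-in, not a trained model, and all names are illustrative:

```python
import numpy as np

def rnn_step(state, x):
    # Toy recurrent update (illustrative stand-in for an LSTM cell)
    return np.tanh(state + x)

def inject_model(image_vec, word_vecs):
    # Inject: the image is presented to the RNN as just another input
    state = np.zeros_like(image_vec)
    state = rnn_step(state, image_vec)  # image enters the RNN first
    for w in word_vecs:
        state = rnn_step(state, w)
    return state  # the decoder reads the RNN state alone

def merge_model(image_vec, word_vecs):
    # Merge: the RNN encodes only the text; the image joins afterwards
    state = np.zeros_like(word_vecs[0])
    for w in word_vecs:
        state = rnn_step(state, w)
    return np.concatenate([image_vec, state])  # decoder sees both
```

The inject model asks the RNN to interpret visual and linguistic input jointly, whereas the merge model treats the RNN purely as a language encoder and defers combining the two modalities to the decoding layers.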

A Gentle Introduction to Exploding Gradients in Neural Networks

Exploding gradients are a problem in which large error gradients accumulate and result in very large updates to neural network weights during training. This can make your model unstable and unable to learn from your training data. In this post, you will discover the problem of exploding gradients with deep artificial neural […]
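A toy illustration of why this happens: backpropagating through many layers multiplies the gradient by each layer's weight, so weights larger than 1 compound exponentially with depth (the weight value and layer count are illustrative, and the final line shows gradient clipping, one common remedy):

```python
import numpy as np

# Repeated multiplication by a weight > 1 compounds across layers
grad = 1.0
weight = 1.5
history = []
for layer in range(20):
    grad *= weight
    history.append(grad)

print(history[-1])  # 1.5**20, roughly 3325 — a very large update

# Gradient clipping caps the update magnitude to a fixed range
clipped = np.clip(history[-1], -1.0, 1.0)
```

The same arithmetic with a weight below 1 shrinks the gradient toward zero instead, which is the mirror-image vanishing gradient problem.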