Search results for "Long Short Term Memory Network"

On the Suitability of Long Short-Term Memory Networks for Time Series Forecasting

Long Short-Term Memory (LSTM) is a type of recurrent neural network that can learn the order dependence between items in a sequence. LSTMs promise to learn the context required to make predictions in time series forecasting problems, rather than requiring that context to be pre-specified and fixed. Given the promise, there is […]

Continue Reading
A Gentle Introduction to Long Short-Term Memory Networks by the Experts

Long Short-Term Memory (LSTM) networks are a type of recurrent neural network capable of learning order dependence in sequence prediction problems. This behavior is required in complex problem domains such as machine translation and speech recognition. LSTMs are a complex area of deep learning, and it can be hard to get your head around what […]

Continue Reading
Demonstration of Memory with a Long Short-Term Memory Network in Python

Long Short-Term Memory (LSTM) networks are a type of recurrent neural network capable of learning over long sequences. This differentiates them from regular multilayer neural networks, which have no memory and can only learn a fixed mapping between input and output patterns. It is important to understand the capabilities of complex neural networks like LSTMs […] A minimal sketch of this memory behavior follows this entry.

Continue Reading
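
As a rough illustration of the memory behavior described in the entry above, the sketch below trains an LSTM to reproduce the first value of each input sequence, something a memoryless mapping from the final timestep alone could not recover. The task, layer sizes, and hyperparameters are illustrative assumptions, not the article's exact experiment.

```python
# Illustrative memory test (assumed setup, not the article's exact code):
# the LSTM must carry the first timestep's value across the whole sequence.
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, LSTM, Dense

rng = np.random.default_rng(1)
X = rng.random((1000, 10, 1))  # 1000 sequences of 10 random values
y = X[:, 0, 0]                 # target: recall the value seen at timestep 0

model = Sequential([
    Input(shape=(10, 1)),
    LSTM(16),
    Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=30, verbose=0)
print(model.evaluate(X, y, verbose=0))  # low MSE means the first value was remembered
```
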
Time Series Forecasting with the Long Short-Term Memory Network in Python

The Long Short-Term Memory recurrent neural network has the promise of learning long sequences of observations. It seems a perfect match for time series forecasting, and in fact, it may be. In this tutorial, you will discover how to develop an LSTM forecast model for a one-step univariate time series forecasting problem. After completing this […] A minimal forecasting sketch follows this entry.

Continue Reading
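
As a companion to the excerpt above, here is a minimal sketch of one-step univariate LSTM forecasting: frame the series as supervised learning with a lag window, fit the model, then predict from the most recent window. The toy series, window size, and model settings are assumptions for illustration; the tutorial itself also covers differencing, scaling, and proper evaluation.

```python
# Minimal one-step univariate LSTM forecast (illustrative assumptions throughout).
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, LSTM, Dense

series = np.sin(np.linspace(0, 20, 200))  # toy univariate series
window = 3                                # lag window framing the supervised task
X = np.array([series[i:i + window] for i in range(len(series) - window)])
y = series[window:]
X = X.reshape((X.shape[0], window, 1))    # [samples, timesteps, features]

model = Sequential([
    Input(shape=(window, 1)),
    LSTM(32),
    Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=20, batch_size=32, verbose=0)

# One-step forecast from the most recent window of observations.
last_window = series[-window:].reshape((1, window, 1))
print(model.predict(last_window, verbose=0))
```
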
An Introduction to Recurrent Neural Networks and the Math That Powers Them

When it comes to sequential or time series data, traditional feedforward networks are a poor fit for learning and prediction, because they have no mechanism to retain past or historical information when forecasting future values. Recurrent neural networks, or RNNs for short, extend conventional feedforward artificial neural networks so that they can deal with sequential […] A small sketch of the core recurrence follows this entry.

Continue Reading
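
The "mechanism to retain past information" mentioned above boils down, in the vanilla RNN, to the recurrence h_t = tanh(Wx x_t + Wh h_{t-1} + b). The NumPy sketch below uses random placeholder weights (a trained network would learn them) purely to show how the hidden state threads information from every earlier timestep into the current one.

```python
# The vanilla RNN recurrence h_t = tanh(Wx @ x_t + Wh @ h_{t-1} + b) in NumPy.
# Weights here are random placeholders; a trained network would learn them.
import numpy as np

rng = np.random.default_rng(0)
input_dim, hidden_dim = 4, 8
Wx = rng.normal(scale=0.1, size=(hidden_dim, input_dim))
Wh = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))
b = np.zeros(hidden_dim)

def rnn_forward(xs):
    """Run the recurrence over a sequence xs of shape [T, input_dim]."""
    h = np.zeros(hidden_dim)                # initial hidden state
    for x_t in xs:
        h = np.tanh(Wx @ x_t + Wh @ h + b)  # h now depends on all earlier inputs
    return h

sequence = rng.normal(size=(10, input_dim))
print(rnn_forward(sequence))  # final hidden state summarizing the whole sequence
```
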
How to Choose Loss Functions When Training Deep Learning Neural Networks

Deep learning neural networks are trained using the stochastic gradient descent optimization algorithm. As part of the optimization, the error for the current state of the model must be estimated repeatedly. This requires the choice of an error function, conventionally called a loss function, that can be used to estimate the loss of the […] A short sketch of wiring a loss function into training follows this entry.

Continue Reading
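
To make the excerpt concrete, the sketch below shows the loss as a compile-time choice in Keras on a small blobs-style multi-class problem, using the KL divergence loss from the article's blobs example. The dataset parameters and layer sizes are assumptions for illustration, not the article's exact configuration.

```python
# Choosing a loss function in Keras (assumed blobs-style setup for illustration;
# with one-hot targets, KL divergence behaves much like cross-entropy here).
from sklearn.datasets import make_blobs
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, Dense
from tensorflow.keras.losses import KLDivergence
from tensorflow.keras.utils import to_categorical

X, y = make_blobs(n_samples=500, centers=3, n_features=2, random_state=2)
y = to_categorical(y)  # one-hot targets, as the probabilistic losses expect

model = Sequential([
    Input(shape=(2,)),
    Dense(25, activation="relu"),
    Dense(3, activation="softmax"),
])
# The loss is a compile-time choice: swapping in, say, CategoricalCrossentropy()
# changes the error being minimized without touching the architecture.
model.compile(optimizer="sgd", loss=KLDivergence(), metrics=["accuracy"])
model.fit(X, y, epochs=50, verbose=0)
```
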
A Gentle Introduction to Dropout for Regularizing Deep Neural Networks

Deep learning neural networks are likely to quickly overfit a training dataset that has few examples. Ensembles of neural networks with different model configurations are known to reduce overfitting, but they require the additional computational expense of training and maintaining multiple models. A single model can be used to simulate having a large number of different network […] A brief sketch of the technique follows this entry.

Continue Reading
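
A minimal sketch of the mechanism the excerpt describes: Dropout layers randomly zero a fraction of activations on each training update, so every batch effectively trains a different thinned sub-network, approximating a large ensemble at the cost of a single model. The layer sizes and the 0.5 rate below are illustrative assumptions.

```python
# Dropout as cheap ensembling (illustrative sizes and rates, not a prescription).
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, Dense, Dropout

model = Sequential([
    Input(shape=(20,)),
    Dense(64, activation="relu"),
    Dropout(0.5),  # zeroes ~50% of activations, during training only
    Dense(64, activation="relu"),
    Dropout(0.5),
    Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
# At inference, dropout is a no-op (Keras rescales activations during training
# instead), so one forward pass approximates averaging the sub-networks.
```
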