Search results for "Long Short Term Memory Network"

How to Prepare Sequence Prediction for Truncated BPTT in Keras

Recurrent neural networks can learn temporal dependencies across multiple timesteps in sequence prediction problems. Modern recurrent neural networks like the Long Short-Term Memory, or LSTM, network are trained with a variation of the Backpropagation algorithm called Backpropagation Through Time. This algorithm has been modified further for efficiency on sequence prediction problems with […]
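The full post works through the details; as a quick illustration, Keras truncates backpropagation through time at the number of input timesteps, so preparing data for truncated BPTT amounts to splitting a long sequence into fixed-length subsequences. A minimal sketch, with a contrived sequence and an assumed truncation length of 10:

```python
# Minimal sketch: split one long sequence into fixed-length subsequences
# so that Keras truncates backpropagation at the subsequence boundary.
# The sequence values and sizes are contrived for illustration.
import numpy as np
from keras.models import Sequential
from keras.layers import LSTM, Dense

n_timesteps = 10  # assumed truncation length
seq = np.arange(100, dtype='float32') / 100.0  # contrived input sequence

# frame the sequence as (samples, timesteps, features)
X = seq.reshape(len(seq) // n_timesteps, n_timesteps, 1)
y = X[:, -1, 0]  # contrived target: last value of each subsequence

model = Sequential()
model.add(LSTM(10, input_shape=(n_timesteps, 1)))
model.add(Dense(1))
model.compile(loss='mean_squared_error', optimizer='adam')
model.fit(X, y, epochs=2, batch_size=5, verbose=0)
```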

Continue Reading

How to Develop a Bidirectional LSTM For Sequence Classification in Python with Keras

Bidirectional LSTMs are an extension of traditional LSTMs that can improve model performance on sequence classification problems. In problems where all timesteps of the input sequence are available, Bidirectional LSTMs train two LSTMs instead of one on the input sequence: the first on the input sequence as-is, and the second on a reversed copy of […]
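As a taste of what the post builds, here is a minimal sketch of a Bidirectional LSTM on a contrived cumulative-sum classification problem; the layer sizes, sequence length, and threshold here are illustrative assumptions:

```python
# Minimal sketch: a Bidirectional LSTM classifying each timestep of a
# random sequence by whether its cumulative sum has crossed a threshold.
import numpy as np
from keras.models import Sequential
from keras.layers import LSTM, Dense, Bidirectional, TimeDistributed

n_timesteps = 10  # assumed sequence length

model = Sequential()
model.add(Bidirectional(LSTM(20, return_sequences=True),
                        input_shape=(n_timesteps, 1)))
model.add(TimeDistributed(Dense(1, activation='sigmoid')))
model.compile(loss='binary_crossentropy', optimizer='adam',
              metrics=['accuracy'])

# contrived data: label a timestep 1 once the cumulative sum exceeds
# an assumed threshold of n_timesteps / 4
X = np.random.rand(100, n_timesteps, 1)
y = (np.cumsum(X, axis=1) > n_timesteps / 4.0).astype('float32')
model.fit(X, y, epochs=2, batch_size=10, verbose=0)
```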

Continue Reading

How to use an Encoder-Decoder LSTM to Echo Sequences of Random Integers

A powerful feature of Long Short-Term Memory (LSTM) recurrent neural networks is that they can remember observations over long sequence intervals. This can be demonstrated by contriving a simple sequence echo problem where the entire input sequence or partial contiguous blocks of the input sequence are echoed as an output sequence. Developing LSTM recurrent neural […]
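A minimal sketch of the kind of model the post develops, using the common encoder-decoder pattern (an LSTM encoder, a RepeatVector bridge, and an LSTM decoder); the sequence lengths, layer sizes, and training loop are assumptions for illustration, not the post's exact configuration:

```python
# Minimal sketch: an encoder-decoder LSTM that echoes the first few
# values of a one-hot encoded sequence of random integers.
import numpy as np
from keras.models import Sequential
from keras.layers import LSTM, Dense, RepeatVector, TimeDistributed

n_in, n_out, n_values = 5, 2, 10  # assumed: echo first 2 of 5 integers 0-9

def get_pair():
    # one contrived input/output pair, one-hot encoded
    seq = np.random.randint(n_values, size=n_in)
    X = np.eye(n_values)[seq].reshape(1, n_in, n_values)
    y = np.eye(n_values)[seq[:n_out]].reshape(1, n_out, n_values)
    return X, y

model = Sequential()
model.add(LSTM(50, input_shape=(n_in, n_values)))    # encoder
model.add(RepeatVector(n_out))                       # bridge
model.add(LSTM(50, return_sequences=True))           # decoder
model.add(TimeDistributed(Dense(n_values, activation='softmax')))
model.compile(loss='categorical_crossentropy', optimizer='adam')

for _ in range(100):  # train on one contrived pair at a time
    X, y = get_pair()
    model.fit(X, y, epochs=1, verbose=0)
```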

Continue Reading

How to use Different Batch Sizes when Training and Predicting with LSTMs

Keras uses fast symbolic mathematical libraries, such as TensorFlow and Theano, as a backend. A downside of using these libraries is that the shape and size of your data must be defined once up front and held constant, regardless of whether you are training your network or making predictions. On sequence prediction problems, it may […]
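One common workaround, which the post covers among others, is to train with one batch size and then copy the learned weights into an identically structured model defined with a batch size of 1 for prediction. A minimal sketch, with contrived data and assumed layer sizes:

```python
# Minimal sketch of the copy-weights approach: train a stateful LSTM
# with a fixed batch size, then transfer the weights to a matching
# model defined with a batch size of 1 for prediction.
import numpy as np
from keras.models import Sequential
from keras.layers import LSTM, Dense

n_timesteps, n_batch = 10, 5  # assumed sizes
X = np.random.rand(50, n_timesteps, 1)  # contrived data
y = np.random.rand(50, 1)

# training model: fixed batch size of 5
train_model = Sequential()
train_model.add(LSTM(10, batch_input_shape=(n_batch, n_timesteps, 1),
                     stateful=True))
train_model.add(Dense(1))
train_model.compile(loss='mean_squared_error', optimizer='adam')
train_model.fit(X, y, epochs=2, batch_size=n_batch, shuffle=False, verbose=0)

# prediction model: same structure, batch size of 1, copied weights
predict_model = Sequential()
predict_model.add(LSTM(10, batch_input_shape=(1, n_timesteps, 1),
                       stateful=True))
predict_model.add(Dense(1))
predict_model.set_weights(train_model.get_weights())
yhat = predict_model.predict(X[0:1], batch_size=1)
```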

Continue Reading

Instability of Online Learning for Stateful LSTM for Time Series Forecasting

Some neural network configurations can result in an unstable model. This can make them hard to characterize and compare to other model configurations on the same problem using descriptive statistics. One good example of a seemingly unstable model is the use of online learning (a batch size of 1) for a stateful Long Short-Term Memory […]
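A minimal sketch of the kind of stability check the post performs: repeat the same online-learning experiment several times and summarize the spread of the final error with descriptive statistics. The data, repeat count, and layer sizes here are contrived assumptions:

```python
# Minimal sketch: repeat an online-learning (batch size 1) stateful
# LSTM experiment several times and summarize the spread of the error.
import numpy as np
from keras.models import Sequential
from keras.layers import LSTM, Dense

X = np.random.rand(30, 1, 1)  # contrived series framed as samples
y = np.random.rand(30, 1)

scores = []
for _ in range(5):  # repeated experiments
    model = Sequential()
    model.add(LSTM(4, batch_input_shape=(1, 1, 1), stateful=True))
    model.add(Dense(1))
    model.compile(loss='mean_squared_error', optimizer='adam')
    for _ in range(3):  # manual epochs so state is reset between passes
        model.fit(X, y, epochs=1, batch_size=1, shuffle=False, verbose=0)
        model.reset_states()
    scores.append(model.evaluate(X, y, batch_size=1, verbose=0))

# spread of the final error across repeats
print('mean=%f, std=%f' % (np.mean(scores), np.std(scores)))
```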

Continue Reading

Stateful and Stateless LSTM for Time Series Forecasting with Python

The Keras Python deep learning library supports both stateful and stateless Long Short-Term Memory (LSTM) networks. When using stateful LSTM networks, we have fine-grained control over when the internal state of the LSTM network is reset. Therefore, it is important to understand different ways of managing this internal state when fitting and making predictions with […]
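As a minimal sketch of the difference (with contrived data and assumed layer sizes): a stateless LSTM resets its internal state after each batch automatically, while a stateful LSTM keeps its state across batches until reset_states() is called explicitly:

```python
# Minimal sketch contrasting stateless and stateful LSTM configurations.
import numpy as np
from keras.models import Sequential
from keras.layers import LSTM, Dense

X = np.random.rand(20, 1, 1)  # contrived data
y = np.random.rand(20, 1)

# stateless: internal state is reset automatically after each batch
stateless = Sequential()
stateless.add(LSTM(4, input_shape=(1, 1)))
stateless.add(Dense(1))
stateless.compile(loss='mean_squared_error', optimizer='adam')
stateless.fit(X, y, epochs=3, batch_size=1, verbose=0)

# stateful: state persists across batches; we decide when to reset
stateful = Sequential()
stateful.add(LSTM(4, batch_input_shape=(1, 1, 1), stateful=True))
stateful.add(Dense(1))
stateful.compile(loss='mean_squared_error', optimizer='adam')
for _ in range(3):  # one pass per "epoch"
    stateful.fit(X, y, epochs=1, batch_size=1, shuffle=False, verbose=0)
    stateful.reset_states()  # reset at the end of the full sequence
```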

Continue Reading