Archive | Deep Learning

How to Prepare Sequence Prediction for Truncated Backpropagation Through Time in Keras

Recurrent neural networks are able to learn the temporal dependence across multiple timesteps in sequence prediction problems. Modern recurrent neural networks like the Long Short-Term Memory, or LSTM, network are trained with a variation of the Backpropagation algorithm called Backpropagation Through Time. This algorithm has been modified further for efficiency on sequence prediction problems with […]
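
As a rough sketch of the idea (not the post's own code): because Keras only backpropagates across the timesteps within a single sample, splitting a long series into fixed-length subsequences sets the effective truncation length. The toy sine series, window length, and layer sizes below are assumptions for illustration.

```python
import numpy as np
from keras.models import Sequential
from keras.layers import LSTM, Dense

k = 25                                   # truncation length: timesteps per sample
series = np.sin(np.arange(1025) * 0.1)   # toy univariate series (assumption)

# Split into [samples, timesteps, features]; Keras only backpropagates
# across the k timesteps inside each window, which is the truncation.
X = np.array([series[i:i + k] for i in range(0, 1000, k)]).reshape(-1, k, 1)
y = np.array([series[i + k] for i in range(0, 1000, k)])  # next value after each window

model = Sequential()
model.add(LSTM(10, input_shape=(k, 1)))
model.add(Dense(1))
model.compile(loss='mse', optimizer='adam')
model.fit(X, y, epochs=5, batch_size=8, verbose=0)
```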

Continue Reading
A Gentle Introduction to Backpropagation Through Time

Backpropagation Through Time, or BPTT, is the training algorithm used to update weights in recurrent neural networks like LSTMs. To effectively frame sequence prediction problems for recurrent neural networks, you must have a strong conceptual understanding of what Backpropagation Through Time is doing and how configurable variations like Truncated Backpropagation Through Time will affect the […]
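
To make the "through time" part concrete, here is a toy numpy sketch of the unrolling that BPTT differentiates through; the sizes and random weights are arbitrary assumptions, and the backward pass itself is omitted.

```python
import numpy as np

T, n_in, n_h = 5, 3, 4                 # timesteps, input size, hidden size
rng = np.random.default_rng(1)
Wx = rng.normal(size=(n_in, n_h))      # input-to-hidden weights
Wh = rng.normal(size=(n_h, n_h))       # hidden-to-hidden weights, shared across time
x = rng.normal(size=(T, n_in))         # one toy input sequence
h = np.zeros(n_h)

for t in range(T):                     # unroll the recurrence over T steps
    h = np.tanh(x[t] @ Wx + h @ Wh)    # the same weights are reused at every step

# BPTT applies the chain rule backwards through these T copies of Wx and Wh;
# truncated BPTT simply stops the backward pass after a fixed number of steps.
```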

Continue Reading
Data Preparation for Variable Length Input Sequences

Deep learning libraries assume a vectorized representation of your data. In the case of variable length sequence prediction problems, this requires that your data be transformed such that each sequence has the same length. This vectorization allows code to efficiently perform the matrix operations in batch for your chosen deep learning algorithms. In this tutorial, […]
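
A minimal sketch of the core operation, assuming the Keras pad_sequences helper (its import path varies by Keras version; newer releases expose it as keras.utils.pad_sequences):

```python
from keras.preprocessing.sequence import pad_sequences

sequences = [[1, 2, 3, 4], [5, 6], [7]]           # variable-length toy input
padded = pad_sequences(sequences, padding='pre')  # zero-pad to the longest length
print(padded)
# [[1 2 3 4]
#  [0 0 5 6]
#  [0 0 0 7]]
```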

Continue Reading
How to Develop a Bidirectional LSTM For Sequence Classification in Python with Keras

Bidirectional LSTMs are an extension of traditional LSTMs that can improve model performance on sequence classification problems. In problems where all timesteps of the input sequence are available, Bidirectional LSTMs train two LSTMs instead of one on the input sequence: the first on the input sequence as-is and the second on a reversed copy of […]
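
As a rough sketch, wrapping an LSTM layer in Keras's Bidirectional wrapper looks like the following; the layer sizes and the per-timestep binary-label framing are illustrative assumptions.

```python
from keras.models import Sequential
from keras.layers import Bidirectional, LSTM, TimeDistributed, Dense

n_timesteps = 10
model = Sequential()
# the wrapper trains one LSTM on the sequence as-is and a second on a reversed copy
model.add(Bidirectional(LSTM(20, return_sequences=True),
                        input_shape=(n_timesteps, 1)))
model.add(TimeDistributed(Dense(1, activation='sigmoid')))  # one label per timestep
model.compile(loss='binary_crossentropy', optimizer='adam')
model.summary()
```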

Continue Reading
How to Get Reproducible Results with Keras

Neural network algorithms are stochastic. This means they make use of randomness, such as initializing with random weights, and in turn the same network trained on the same data can produce different results. This can be confusing to beginners, as the algorithm appears unstable, when in fact it is stochastic by design. The random initialization allows […]
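
The usual first step is to fix the random seeds before building the model, as sketched below; exact reproducibility can still depend on the backend and hardware, so treat this as a starting point rather than a guarantee (a TensorFlow backend is assumed for the third seed).

```python
import random
import numpy as np

random.seed(1)             # Python's built-in RNG
np.random.seed(1)          # numpy RNG, used for weight initialization

try:
    import tensorflow as tf
    tf.random.set_seed(1)  # backend RNG (TensorFlow assumed here)
except ImportError:
    pass

# ...build and fit the Keras model only after seeding...
```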

Continue Reading
How to use an Encoder-Decoder LSTM to Echo Sequences of Random Integers

A powerful feature of Long Short-Term Memory (LSTM) recurrent neural networks is that they can remember observations over long sequence intervals. This can be demonstrated by contriving a simple sequence echo problem where the entire input sequence or partial contiguous blocks of the input sequence are echoed as an output sequence. Developing LSTM recurrent neural […]
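
A minimal sketch of the architecture, using RepeatVector to bridge an encoder LSTM and a decoder LSTM; the vocabulary size, layer sizes, and whole-sequence echo framing are illustrative assumptions.

```python
import numpy as np
from keras.models import Sequential
from keras.layers import LSTM, Dense, RepeatVector, TimeDistributed

n_in, n_out, n_classes = 5, 5, 10                    # echo all 5 input steps
model = Sequential()
model.add(LSTM(50, input_shape=(n_in, n_classes)))   # encoder
model.add(RepeatVector(n_out))                       # repeat encoding per output step
model.add(LSTM(50, return_sequences=True))           # decoder
model.add(TimeDistributed(Dense(n_classes, activation='softmax')))
model.compile(loss='categorical_crossentropy', optimizer='adam')

# one random one-hot encoded sequence, echoed as its own target
seq = np.eye(n_classes)[np.random.randint(0, n_classes, n_in)]
X = seq.reshape(1, n_in, n_classes)
model.fit(X, X, epochs=1, verbose=0)
```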

Continue Reading
How to Learn to Echo Random Integers with Long Short-Term Memory Recurrent Neural Networks

Long Short-Term Memory (LSTM) Recurrent Neural Networks are able to learn the order dependence in long sequence data. They are a fundamental technique used in a range of state-of-the-art results, such as image captioning and machine translation. They can also be difficult to understand, specifically how to frame a problem to get the most out […]
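
One simple way to frame the problem is sketched below: the network reads a short sequence of one-hot encoded random integers and learns to emit one of them (the first, in this assumption); the sizes and the target position are illustrative.

```python
import numpy as np
from keras.models import Sequential
from keras.layers import LSTM, Dense

n_steps, n_classes = 5, 10
ints = np.random.randint(0, n_classes, size=(100, n_steps))  # 100 toy sequences
X = np.eye(n_classes)[ints]         # one-hot: [samples, timesteps, classes]
y = np.eye(n_classes)[ints[:, 0]]   # target: echo the first integer of each sequence

model = Sequential()
model.add(LSTM(25, input_shape=(n_steps, n_classes)))
model.add(Dense(n_classes, activation='softmax'))
model.compile(loss='categorical_crossentropy', optimizer='adam')
model.fit(X, y, epochs=5, verbose=0)
```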

Continue Reading