Archive | Long Short-Term Memory Networks

The 5 Step Life-Cycle for Long Short-Term Memory Models in Keras

Deep learning neural networks are very easy to create and evaluate in Python with Keras, but you must follow a strict model life-cycle. In this post, you will discover the step-by-step life-cycle for creating, training, and evaluating Long Short-Term Memory (LSTM) Recurrent Neural Networks in Keras and how to make predictions with a trained model. […]
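As a rough sketch of what those five steps (define, compile, fit, evaluate, predict) look like in Keras, the snippet below trains an LSTM on synthetic data; the layer sizes and toy dataset are illustrative assumptions, not taken from the post.

import numpy as np
from keras.models import Sequential
from keras.layers import LSTM, Dense

# toy sequence data: 100 samples, 10 time steps, 1 feature (purely illustrative)
X = np.random.rand(100, 10, 1)
y = np.random.rand(100, 1)

# 1. define the network
model = Sequential()
model.add(LSTM(20, input_shape=(10, 1)))
model.add(Dense(1))

# 2. compile the network
model.compile(loss='mean_squared_error', optimizer='adam')

# 3. fit the network
model.fit(X, y, epochs=5, batch_size=10, verbose=0)

# 4. evaluate the network
loss = model.evaluate(X, y, verbose=0)

# 5. make predictions
predictions = model.predict(X, verbose=0)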

Continue Reading
A Gentle Introduction to Long Short-Term Memory Networks by the Experts

Long Short-Term Memory (LSTM) networks are a type of recurrent neural network capable of learning order dependence in sequence prediction problems. This is a behavior required in complex problem domains like machine translation, speech recognition, and more. LSTMs are a complex area of deep learning. It can be hard to get your head around what […]

Continue Reading
Learn to Add Numbers with an Encoder-Decoder LSTM Recurrent Neural Network

Long Short-Term Memory (LSTM) networks are a type of Recurrent Neural Network (RNN) that are capable of learning the relationships between elements in an input sequence. A good demonstration of LSTMs is learning to combine multiple terms using a mathematical operation like addition and to output the result of the calculation. A […]
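As a rough sketch of the idea, the snippet below encodes an addition problem such as '12+34' as a sequence of one-hot characters and trains an encoder-decoder LSTM to emit the characters of the sum; the vocabulary, padding lengths, and layer sizes are illustrative assumptions rather than the post's exact configuration.

import numpy as np
from keras.models import Sequential
from keras.layers import LSTM, Dense, RepeatVector, TimeDistributed

chars = '0123456789+ '
char_to_int = {c: i for i, c in enumerate(chars)}
n_in, n_out = 5, 2                      # e.g. '12+34' -> '46'

def encode(s, length):
    s = s.rjust(length)                 # left-pad with spaces
    x = np.zeros((length, len(chars)))
    for i, c in enumerate(s):
        x[i, char_to_int[c]] = 1
    return x

# generate random addition problems as one-hot character sequences
X, y = [], []
for _ in range(1000):
    a, b = np.random.randint(1, 50, size=2)
    X.append(encode('%d+%d' % (a, b), n_in))
    y.append(encode('%d' % (a + b), n_out))
X, y = np.array(X), np.array(y)

# the encoder LSTM reads the input; RepeatVector hands its summary to a decoder
# LSTM that emits one character of the answer per output time step
model = Sequential()
model.add(LSTM(75, input_shape=(n_in, len(chars))))
model.add(RepeatVector(n_out))
model.add(LSTM(50, return_sequences=True))
model.add(TimeDistributed(Dense(len(chars), activation='softmax')))
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
model.fit(X, y, epochs=10, batch_size=32, verbose=0)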

Continue Reading
How to use Different Batch Sizes when Training and Predicting with LSTMs

Keras uses fast symbolic mathematical libraries as a backend, such as TensorFlow and Theano. A downside of using these libraries is that the shape and size of your data must be defined once up front and held constant regardless of whether you are training your network or making predictions. On sequence prediction problems, it may […]
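One common workaround, sketched below under the assumption of a stateful LSTM, is to train with one fixed batch size and then copy the learned weights into an identically shaped model whose batch size is fixed at 1 for prediction; the data and layer sizes are illustrative, not necessarily the post's exact solution.

import numpy as np
from keras.models import Sequential
from keras.layers import LSTM, Dense

n_batch, n_steps, n_features = 10, 5, 1
X = np.random.rand(100, n_steps, n_features)   # 100 samples, divisible by the batch size
y = np.random.rand(100, 1)

# training model: batch size fixed at 10 via batch_input_shape
model = Sequential()
model.add(LSTM(10, batch_input_shape=(n_batch, n_steps, n_features), stateful=True))
model.add(Dense(1))
model.compile(loss='mean_squared_error', optimizer='adam')
model.fit(X, y, epochs=5, batch_size=n_batch, shuffle=False, verbose=0)

# prediction model: same architecture, batch size fixed at 1
new_model = Sequential()
new_model.add(LSTM(10, batch_input_shape=(1, n_steps, n_features), stateful=True))
new_model.add(Dense(1))
new_model.set_weights(model.get_weights())     # reuse the trained weights
new_model.compile(loss='mean_squared_error', optimizer='adam')
yhat = new_model.predict(X[0:1], batch_size=1)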

Continue Reading
Demonstration of Memory with a Long Short-Term Memory Network in Python

Long Short-Term Memory (LSTM) networks are a type of recurrent neural network capable of learning over long sequences. This differentiates them from regular multilayer neural networks that do not have memory and can only learn a mapping between input and output patterns. It is important to understand the capabilities of complex neural networks like LSTMs […]
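In the spirit of that demonstration, the sketch below trains a stateful LSTM on two sequences that share their middle values (0, 1, 2), so predicting the value that follows 2 requires remembering how the sequence began; the sequences and layer sizes here are illustrative assumptions.

import numpy as np
from keras.models import Sequential
from keras.layers import LSTM, Dense
from keras.utils import to_categorical

seq1 = [3, 0, 1, 2, 3]
seq2 = [4, 0, 1, 2, 4]

def to_xy(seq):
    # one-hot encode, then pair each value with the value that follows it
    enc = to_categorical(seq, num_classes=5)
    return enc[:-1].reshape(len(seq) - 1, 1, 5), enc[1:]

model = Sequential()
model.add(LSTM(20, batch_input_shape=(1, 1, 5), stateful=True))
model.add(Dense(5, activation='softmax'))
model.compile(loss='categorical_crossentropy', optimizer='adam')

# show one sequence at a time, one step per sample, resetting state between sequences
for _ in range(250):
    for seq in (seq1, seq2):
        X, y = to_xy(seq)
        model.fit(X, y, epochs=1, batch_size=1, shuffle=False, verbose=0)
        model.reset_states()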

Continue Reading
Crash Course in Recurrent Neural Networks for Deep Learning

Another type of neural network is dominating difficult machine learning problems involving sequences of inputs: recurrent neural networks. Recurrent neural networks have connections that form loops, adding feedback and memory to the network over time. This memory allows this type of network to learn and generalize across sequences of inputs rather than individual patterns. A […]
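The bare-bones sketch below shows that recurrence directly: a hidden state h is fed back in at every time step, so the state after the last step depends on the whole input sequence; the weights are random and purely illustrative.

import numpy as np

n_features, n_hidden, n_steps = 3, 4, 6
W_x = np.random.randn(n_hidden, n_features) * 0.1   # input -> hidden
W_h = np.random.randn(n_hidden, n_hidden) * 0.1     # hidden -> hidden (the loop)
b = np.zeros(n_hidden)

inputs = np.random.randn(n_steps, n_features)
h = np.zeros(n_hidden)                               # initial state

for x_t in inputs:
    # the same weights are applied at every step; h carries the memory forward
    h = np.tanh(W_x @ x_t + W_h @ h + b)

print(h)  # the final state summarizes the entire input sequence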

Continue Reading