Long Short-Term Memory (LSTM) Recurrent Neural Networks are able to learn order dependence in long sequence data. They are a fundamental technique behind a range of state-of-the-art results, such as image captioning and machine translation. They can also be difficult to understand, specifically how to frame a problem to get the most out […]
The 5 Step Life-Cycle for Long Short-Term Memory Models in Keras
Deep learning neural networks are very easy to create and evaluate in Python with Keras, but you must follow a strict model life-cycle. In this post, you will discover the step-by-step life-cycle for creating, training, and evaluating Long Short-Term Memory (LSTM) Recurrent Neural Networks in Keras and how to make predictions with a trained model. […]
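As a rough sketch of those five steps (define, compile, fit, evaluate, predict), the snippet below runs the whole life-cycle on random toy data. The layer sizes, epochs, and dataset here are illustrative assumptions, not a recommended configuration.

```python
# A minimal sketch of the 5-step Keras life-cycle on toy data
# (layer sizes, epochs, and the random dataset are assumptions).
import numpy as np
from keras.models import Sequential
from keras.layers import LSTM, Dense

# Toy data: 100 sequences of 10 timesteps with 1 feature each.
X = np.random.rand(100, 10, 1)
y = np.random.rand(100, 1)

# Step 1: define the network.
model = Sequential()
model.add(LSTM(20, input_shape=(10, 1)))
model.add(Dense(1))

# Step 2: compile the network.
model.compile(loss='mean_squared_error', optimizer='adam')

# Step 3: fit the network.
model.fit(X, y, epochs=10, batch_size=32, verbose=0)

# Step 4: evaluate the network.
loss = model.evaluate(X, y, verbose=0)

# Step 5: make predictions.
yhat = model.predict(X, verbose=0)
```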
A Gentle Introduction to Long Short-Term Memory Networks by the Experts
Long Short-Term Memory (LSTM) networks are a type of recurrent neural network capable of learning order dependence in sequence prediction problems. This is a behavior required in complex problem domains like machine translation, speech recognition, and more. LSTMs are a complex area of deep learning. It can be hard to wrap your head around what […]
Learn to Add Numbers with an Encoder-Decoder LSTM Recurrent Neural Network
Long Short-Term Memory (LSTM) networks are a type of Recurrent Neural Network (RNN) capable of learning the relationships between elements in an input sequence. A good demonstration of LSTMs is learning to combine multiple terms with a mathematical operation like a sum and output the result of the calculation. A […]
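One plausible shape for such a model is an encoder-decoder: an encoder LSTM reads the one-hot encoded input characters into a fixed-length vector, which is then repeated and decoded into the output characters. The sketch below assumes the sequence lengths, vocabulary size, and layer widths; they are illustrative, not the post's exact values.

```python
# Sketch of an encoder-decoder LSTM for a sequence-to-sequence task
# such as adding numbers encoded as characters (sizes are assumptions).
from keras.models import Sequential
from keras.layers import LSTM, Dense, RepeatVector, TimeDistributed

n_in, n_out, n_vocab = 5, 2, 12  # input/output lengths and vocabulary size

model = Sequential()
# Encoder: read the input sequence into one fixed-length vector.
model.add(LSTM(75, input_shape=(n_in, n_vocab)))
# Repeat that vector once per output timestep for the decoder.
model.add(RepeatVector(n_out))
# Decoder: emit one output character distribution per timestep.
model.add(LSTM(50, return_sequences=True))
model.add(TimeDistributed(Dense(n_vocab, activation='softmax')))
model.compile(loss='categorical_crossentropy', optimizer='adam')
```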
How to Use the TimeDistributed Layer in Keras
Long Short-Term Memory networks, or LSTMs, are a popular and powerful type of Recurrent Neural Network, or RNN. They can be quite difficult to configure and apply to arbitrary sequence prediction problems, even with well-defined and “easy to use” interfaces like those provided in the Keras deep learning library in Python. One reason for this […]
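To make the layer's role concrete, here is a minimal sketch (shapes are assumed for illustration) in which TimeDistributed wraps a single Dense layer so the same weights are applied to the LSTM's output at every timestep.

```python
# Sketch: TimeDistributed applies one Dense layer to every timestep
# of the LSTM's sequence output (all sizes here are assumptions).
from keras.models import Sequential
from keras.layers import LSTM, Dense, TimeDistributed

model = Sequential()
# return_sequences=True emits one hidden vector per timestep...
model.add(LSTM(5, input_shape=(10, 1), return_sequences=True))
# ...and TimeDistributed reuses a single Dense(1) across all 10 timesteps,
# so the output shape is (batch, 10, 1) with one shared set of Dense weights.
model.add(TimeDistributed(Dense(1)))
model.compile(loss='mean_squared_error', optimizer='adam')
model.summary()
```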
How to use Different Batch Sizes when Training and Predicting with LSTMs
Keras uses fast symbolic mathematical libraries as a backend, such as TensorFlow and Theano. A downside of using these libraries is that the shape and size of your data must be defined once up front and held constant regardless of whether you are training your network or making predictions. On sequence prediction problems, it may […]
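One common workaround, sketched below with assumed shapes, is to fix the batch size with batch_input_shape, train one network, and then copy its learned weights into an otherwise identical network defined with a batch size of 1 for prediction.

```python
# Sketch: train with one batch size, predict with another by copying
# weights between identically structured networks (shapes are assumptions).
from keras.models import Sequential
from keras.layers import LSTM, Dense

n_batch, n_timesteps, n_features = 32, 10, 1

# Training network: batch size fixed at 32 via batch_input_shape.
model = Sequential()
model.add(LSTM(10, batch_input_shape=(n_batch, n_timesteps, n_features), stateful=True))
model.add(Dense(1))
model.compile(loss='mean_squared_error', optimizer='adam')
# ... fit the model here ...

# Prediction network: same architecture, batch size of 1.
new_model = Sequential()
new_model.add(LSTM(10, batch_input_shape=(1, n_timesteps, n_features), stateful=True))
new_model.add(Dense(1))
# Transfer the trained weights into the prediction network.
new_model.set_weights(model.get_weights())
```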
Demonstration of Memory with a Long Short-Term Memory Network in Python
Long Short-Term Memory (LSTM) networks are a type of recurrent neural network capable of learning over long sequences. This differentiates them from regular multilayer neural networks that do not have memory and can only learn a mapping between input and output patterns. It is important to understand the capabilities of complex neural networks like LSTMs […]
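One toy way to see that memory at work (a sketch of an assumed setup, not the post's exact experiment): two sequences share the same middle values, and the correct final output echoes each sequence's first symbol, so the network can only answer correctly by carrying that first symbol across the intervening timesteps.

```python
# Sketch of a memory test: the last symbol of each sequence echoes the
# first, so predicting it requires remembering across the shared middle.
import numpy as np
from keras.models import Sequential
from keras.layers import LSTM, Dense
from keras.utils import to_categorical

# Two sequences over 5 symbols: [3,0,1,2,3] and [4,0,1,2,4].
seqs = np.array([[3, 0, 1, 2, 3], [4, 0, 1, 2, 4]])
X = to_categorical(seqs[:, :-1], num_classes=5)  # inputs: first 4 steps
y = to_categorical(seqs[:, -1], num_classes=5)   # target: final symbol

model = Sequential()
model.add(LSTM(20, input_shape=(4, 5)))
model.add(Dense(5, activation='softmax'))
model.compile(loss='categorical_crossentropy', optimizer='adam')
model.fit(X, y, epochs=500, verbose=0)

# With enough training this typically prints [3 4]: the network has
# recalled each sequence's first symbol across three shared timesteps.
print(model.predict(X, verbose=0).argmax(axis=1))
```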
Crash Course in Recurrent Neural Networks for Deep Learning
Another type of neural network is dominating difficult machine learning problems involving sequences of inputs: recurrent neural networks. Recurrent neural networks have connections that form loops, adding feedback and memory to the network over time. This memory allows this type of network to learn and generalize across sequences of inputs rather than individual patterns. A […]
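That loop can be written in a few lines of plain NumPy (the dimensions below are arbitrary assumptions): the hidden state h computed at one timestep is fed back in at the next, which is exactly the memory described above.

```python
# Sketch of the core recurrence: h_t = tanh(x_t @ W_x + h_{t-1} @ W_h + b)
# (dimensions and random weights are assumptions for illustration).
import numpy as np

n_features, n_hidden, n_steps = 3, 4, 5
rng = np.random.default_rng(1)

W_x = rng.standard_normal((n_features, n_hidden))  # input weights
W_h = rng.standard_normal((n_hidden, n_hidden))    # recurrent (loop) weights
b = np.zeros(n_hidden)

x_seq = rng.standard_normal((n_steps, n_features))
h = np.zeros(n_hidden)  # the memory starts empty

for x_t in x_seq:
    # Feedback: h from the previous step enters the current step.
    h = np.tanh(x_t @ W_x + h @ W_h + b)

print(h)  # the final hidden state summarizes the whole sequence
```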