Search results for "translation"

How to Develop an Encoder-Decoder Model with Attention in Keras

The encoder-decoder architecture for recurrent neural networks is proving to be powerful on a host of sequence-to-sequence prediction problems in the field of natural language processing, such as machine translation and caption generation. Attention is a mechanism that addresses a limitation of the encoder-decoder architecture on long sequences, and that in general speeds up the […]
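As a rough illustration of the architecture this tutorial covers, the sketch below wires up an LSTM encoder-decoder in tf.keras and adds dot-product attention with the built-in layers.Attention layer. The layer choice, toy dimensions, and loss are assumptions made here for illustration; they are not the tutorial's own code, which may rely on a custom attention layer instead.

```python
from tensorflow.keras import layers, Model

# hypothetical toy dimensions, chosen only for illustration
n_features, n_steps_in, n_steps_out, n_units = 50, 6, 3, 128

# encoder: return the full sequence of hidden states plus the final states
enc_in = layers.Input(shape=(n_steps_in, n_features))
enc_seq, state_h, state_c = layers.LSTM(
    n_units, return_sequences=True, return_state=True)(enc_in)

# decoder: teacher-forced inputs, initialised with the encoder's final states
dec_in = layers.Input(shape=(n_steps_out, n_features))
dec_seq = layers.LSTM(n_units, return_sequences=True)(
    dec_in, initial_state=[state_h, state_c])

# dot-product attention: decoder states query the encoder states
context = layers.Attention()([dec_seq, enc_seq])
combined = layers.Concatenate()([context, dec_seq])
output = layers.TimeDistributed(
    layers.Dense(n_features, activation="softmax"))(combined)

model = Model([enc_in, dec_in], output)
model.compile(optimizer="adam", loss="categorical_crossentropy")
model.summary()
```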

Continue Reading

How Does Attention Work in Encoder-Decoder Recurrent Neural Networks

Attention is a mechanism that was developed to improve the performance of the Encoder-Decoder RNN on machine translation. In this tutorial, you will discover the attention mechanism for the Encoder-Decoder model. After completing this tutorial, you will know about the Encoder-Decoder model and attention mechanism for machine translation, and how to implement the attention mechanism step by step. […]
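To make the step-by-step idea concrete, here is a minimal NumPy sketch of a single attention step: score each encoder hidden state against the current decoder state, normalise the scores with a softmax, and take the weighted sum as the context vector. The dot-product scoring and toy dimensions are assumptions for illustration; the tutorial may use a different, learned alignment score.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - np.max(x))
    return e / e.sum()

# hypothetical toy values: 4 encoder hidden states of size 3, one decoder state
encoder_states = np.random.rand(4, 3)  # one row per input time step
decoder_state = np.random.rand(3)      # current decoder hidden state

# 1. score each encoder state against the decoder state (dot-product scoring)
scores = encoder_states @ decoder_state

# 2. normalise the scores into attention weights that sum to one
weights = softmax(scores)

# 3. form the context vector as the weighted sum of the encoder states
context = weights @ encoder_states

print("weights:", weights)
print("context:", context)
```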

Continue Reading

How to Develop Word Embeddings in Python with Gensim

Word embeddings are a modern approach for representing text in natural language processing. Word embedding algorithms like word2vec and GloVe are key to the state-of-the-art results achieved by neural network models on natural language processing problems like machine translation. In this tutorial, you will discover how to train and load word embedding models for natural […]
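As a taste of what the tutorial covers, the sketch below trains a small Word2Vec model with Gensim on a toy corpus, queries the learned vectors, and plots a 2-D PCA projection of them. The toy sentences and parameter values are illustrative assumptions, and the vector_size argument assumes Gensim 4.x (older releases call it size).

```python
from gensim.models import Word2Vec
from sklearn.decomposition import PCA
import matplotlib.pyplot as plt

# hypothetical toy corpus: one tokenised sentence per list
sentences = [
    ["this", "is", "the", "first", "sentence", "for", "word2vec"],
    ["this", "is", "the", "second", "sentence"],
    ["yet", "another", "sentence"],
    ["one", "more", "sentence"],
    ["and", "the", "final", "sentence"],
]

# train a small Word2Vec model (vector_size assumes Gensim 4.x)
model = Word2Vec(sentences, vector_size=100, window=5, min_count=1)

# query the learned embeddings
print(model.wv["sentence"].shape)
print(model.wv.most_similar("sentence", topn=3))

# project the vectors to 2-D with PCA and scatter-plot them
words = list(model.wv.index_to_key)
vectors = model.wv[words]
points = PCA(n_components=2).fit_transform(vectors)

plt.scatter(points[:, 0], points[:, 1])
for word, (x, y) in zip(words, points):
    plt.annotate(word, xy=(x, y))
plt.show()
```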

Continue Reading

Promise of Deep Learning for Natural Language Processing

The promise of deep learning in the field of natural language processing is better performance from models that may require more data but less linguistic expertise to train and operate. There is a lot of hype, and there are large claims, around deep learning methods, but beyond the hype, deep learning methods are achieving state-of-the-art results on […]

Continue Reading

7 Applications of Deep Learning for Natural Language Processing

The field of natural language processing is shifting from statistical methods to neural network methods. There are still many challenging problems to solve in natural language. Nevertheless, deep learning methods are achieving state-of-the-art results on some specific language problems. It is not just the performance of deep learning models on benchmark problems that is most […]

Continue Reading

Gentle Introduction to Transduction in Machine Learning

Transduction or transductive learning are terms you may come across in applied machine learning. The term is used in connection with some applications of recurrent neural networks to sequence prediction problems, such as some problems in the domain of natural language processing. In this post, you will discover what transduction is in machine learning. After reading this […]

Continue Reading

Review of Stanford Course on Deep Learning for Natural Language Processing

Natural Language Processing, or NLP, is a subfield of machine learning concerned with understanding speech and text data. Statistical methods and statistical machine learning dominate the field, and more recently deep learning methods have proven very effective on challenging NLP problems such as speech recognition and text translation. In this post, you will discover the Stanford […]

Continue Reading