Search results for "translation"

How to Develop Word Embeddings in Python with Gensim

Word embeddings are a modern approach for representing text in natural language processing. Word embedding algorithms like word2vec and GloVe are key to the state-of-the-art results achieved by neural network models on natural language processing problems like machine translation. In this tutorial, you will discover how to train and load word embedding models for natural […]
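As a taste of what the tutorial covers, below is a minimal sketch of training, saving, and visualising a word2vec model with Gensim. It assumes the Gensim 4.x API (vector_size rather than the older size argument), and the toy corpus and file name are placeholders rather than material from the tutorial itself.

# Minimal word2vec sketch with Gensim 4.x; toy corpus and file name are placeholders.
from gensim.models import Word2Vec
from sklearn.decomposition import PCA
import matplotlib.pyplot as plt

# toy corpus: a list of tokenised sentences
sentences = [
    ["machine", "translation", "maps", "text", "between", "languages"],
    ["word", "embeddings", "represent", "words", "as", "dense", "vectors"],
    ["this", "is", "a", "small", "example", "corpus"],
]

# train a small embedding model; min_count=1 keeps every word of this tiny corpus
model = Word2Vec(sentences, vector_size=100, window=5, min_count=1, workers=4)

# look up a learned vector, then save and reload the model
print(model.wv["translation"].shape)  # (100,)
model.save("model.bin")
model = Word2Vec.load("model.bin")

# project the embeddings to 2-D with PCA and scatter-plot them
words = list(model.wv.index_to_key)
coords = PCA(n_components=2).fit_transform(model.wv[words])
plt.scatter(coords[:, 0], coords[:, 1])
for word, (x, y) in zip(words, coords):
    plt.annotate(word, (x, y))
plt.show()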

Promise of Deep Learning for Natural Language Processing

The promise of deep learning in the field of natural language processing is better performance from models that may require more data but less linguistic expertise to train and operate. There is a lot of hype and many grand claims around deep learning methods, but beyond the hype, deep learning methods are achieving state-of-the-art results on […]

7 Applications of Deep Learning for Natural Language Processing

The field of natural language processing is shifting from statistical methods to neural network methods. There are still many challenging problems to solve in natural language. Nevertheless, deep learning methods are achieving state-of-the-art results on some specific language problems. It is not just the performance of deep learning models on benchmark problems that is most […]

Gentle Introduction to Transduction in Machine Learning

Transduction, or transductive learning, is a term you may come across in applied machine learning. The term is used in some applications of recurrent neural networks to sequence prediction problems, such as problems in the domain of natural language processing. In this post, you will discover what transduction is in machine learning. After reading this […]

Review of Stanford Course on Deep Learning for Natural Language Processing

Natural Language Processing, or NLP, is a subfield of machine learning concerned with understanding speech and text data. Statistical methods and statistical machine learning dominate the field, and more recently deep learning methods have proven very effective on challenging NLP problems like speech recognition and text translation. In this post, you will discover the Stanford […]

Making Predictions with Sequences

Sequence prediction is different from other types of supervised learning problems. The sequence imposes an order on the observations that must be preserved when training models and making predictions. Generally, prediction problems that involve sequence data are referred to as sequence prediction problems, although there is a suite of problems that differ based on the […]
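To make the idea of preserving order concrete, here is a small sketch (not from the post) that frames an ordered sequence as supervised learning samples: each input is a window of past observations and the target is the next observation. The window length of 3 is an arbitrary choice for illustration.

# Frame an ordered sequence as (input window -> next value) supervised samples.
sequence = [10, 20, 30, 40, 50, 60, 70, 80, 90]

def to_supervised(seq, n_in=3):
    # slide a window of n_in past values over the sequence, keeping the order intact
    X, y = [], []
    for i in range(len(seq) - n_in):
        X.append(seq[i:i + n_in])
        y.append(seq[i + n_in])
    return X, y

X, y = to_supervised(sequence)
for inputs, target in zip(X, y):
    print(inputs, "->", target)
# [10, 20, 30] -> 40
# [20, 30, 40] -> 50
# ...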

Encoder-Decoder Long Short-Term Memory Networks

A gentle introduction to the Encoder-Decoder LSTM for sequence-to-sequence prediction, with example Python code. The Encoder-Decoder LSTM is a recurrent neural network designed to address sequence-to-sequence problems, sometimes called seq2seq. Sequence-to-sequence prediction problems are challenging because the number of items in the input and output sequences can vary. For example, text translation and learning to execute […]
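As a rough illustration of the pattern described above, here is a minimal encoder-decoder LSTM sketch in Keras. The layer sizes, sequence lengths, and the RepeatVector/TimeDistributed formulation are assumptions chosen for illustration, not the exact model from the post.

# Minimal encoder-decoder LSTM sketch in Keras; sizes are arbitrary assumptions.
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, RepeatVector, TimeDistributed, Dense

n_steps_in, n_steps_out, n_features = 5, 3, 1

model = Sequential([
    # encoder: read the input sequence into a fixed-length internal representation
    LSTM(100, input_shape=(n_steps_in, n_features)),
    # repeat that representation once per output time step
    RepeatVector(n_steps_out),
    # decoder: unroll the representation into the output sequence
    LSTM(100, return_sequences=True),
    # one prediction per decoder time step
    TimeDistributed(Dense(n_features)),
])
model.compile(optimizer="adam", loss="mse")
model.summary()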
