Search results for "word embedding"


A Bird’s Eye View of Research on Attention

Attention is a concept that is scientifically studied across multiple disciplines, including psychology, neuroscience, and, more recently, machine learning. While each discipline may have produced its own definition of attention, one core quality they can all agree on is that attention is a mechanism for making both biological and artificial neural systems more flexible. In […]

Continue Reading

A Gentle Introduction to Vector Space Models

Vector space models represent data as vectors and consider the relationships between them. They are popular in information retrieval systems but are also useful for other purposes. Generally, they allow us to compare the similarity of two vectors from a geometric perspective. In this tutorial, we will see what a vector space model is […]
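As a quick illustration of the geometric comparison mentioned above, cosine similarity between two toy term-count vectors can be computed with NumPy; the vectors below are made up purely for the example.

```python
import numpy as np

# Two toy documents represented as term-count vectors over the same
# four-word vocabulary (made-up numbers for illustration only)
doc_a = np.array([2.0, 1.0, 0.0, 3.0])
doc_b = np.array([1.0, 0.0, 1.0, 2.0])

# Cosine similarity: dot product divided by the product of the L2 norms,
# i.e. the cosine of the angle between the two vectors
similarity = np.dot(doc_a, doc_b) / (np.linalg.norm(doc_a) * np.linalg.norm(doc_b))
print(similarity)  # values near 1.0 mean the vectors point in similar directions
```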

Continue Reading
Example of 100 Photos of Sneakers Generated by an AC-GAN

How to Develop an Auxiliary Classifier GAN (AC-GAN) From Scratch with Keras

Generative Adversarial Networks, or GANs, are an architecture for training generative models, such as deep convolutional neural networks for generating images. The conditional generative adversarial network, or cGAN for short, is a type of GAN that involves the conditional generation of images by a generator model. Image generation can be conditional on a class label, […]
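As a rough sketch of what conditioning on a class label can look like in Keras, the snippet below embeds the label and concatenates it with the generator's latent feature maps; the layer sizes target a 28x28 grayscale image and are illustrative, not the exact architecture developed in the tutorial.

```python
from tensorflow.keras.layers import (Concatenate, Conv2D, Conv2DTranspose,
                                     Dense, Embedding, Input, Reshape)
from tensorflow.keras.models import Model

latent_dim, n_classes = 100, 10  # illustrative sizes

# Class label input, embedded and reshaped into a single 7x7 feature map
label_in = Input(shape=(1,))
li = Embedding(n_classes, 50)(label_in)
li = Dense(7 * 7)(li)
li = Reshape((7, 7, 1))(li)

# Latent point input, projected to a stack of low-resolution feature maps
latent_in = Input(shape=(latent_dim,))
gen = Dense(128 * 7 * 7, activation='relu')(latent_in)
gen = Reshape((7, 7, 128))(gen)

# Concatenate the label channel with the latent feature maps, then upsample
x = Concatenate()([gen, li])
x = Conv2DTranspose(128, (4, 4), strides=(2, 2), padding='same', activation='relu')(x)
x = Conv2DTranspose(128, (4, 4), strides=(2, 2), padding='same', activation='relu')(x)
out = Conv2D(1, (7, 7), padding='same', activation='tanh')(x)  # 28x28x1 image

generator = Model([latent_in, label_in], out)
generator.summary()
```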

Continue Reading
Line Plot for Supervised Greedy Layer-Wise Pretraining Showing Model Layers vs Train and Test Set Classification Accuracy on the Blobs Classification Problem

How to Use Greedy Layer-Wise Pretraining in Deep Learning Neural Networks

Training deep neural networks was traditionally challenging, as the vanishing gradient problem meant that weights in layers close to the input layer were not updated in response to errors calculated on the training dataset. An innovation and important milestone in the field of deep learning was greedy layer-wise pretraining, which allowed very deep neural networks to […]
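A minimal sketch of the greedy procedure in Keras, assuming a toy blobs-style dataset: fit a shallow model, then repeatedly freeze the trained layers, insert a new hidden layer, re-attach an output layer, and refit. The hyperparameters are illustrative rather than those used in the tutorial.

```python
from sklearn.datasets import make_blobs
from tensorflow.keras.layers import Dense
from tensorflow.keras.models import Sequential
from tensorflow.keras.utils import to_categorical

# Toy three-class problem standing in for the blobs dataset used in the post
X, y = make_blobs(n_samples=1000, centers=3, n_features=2, random_state=1)
y = to_categorical(y)

# Base model: a single hidden layer plus an output layer
model = Sequential()
model.add(Dense(10, activation='relu', input_shape=(2,)))
model.add(Dense(3, activation='softmax'))
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
model.fit(X, y, epochs=100, verbose=0)

# Greedy steps: drop the old output layer, freeze what was already trained,
# insert a new hidden layer, re-attach a fresh output layer, and refit
for _ in range(3):
    model.pop()  # remove the current output layer
    for layer in model.layers:
        layer.trainable = False
    model.add(Dense(10, activation='relu'))
    model.add(Dense(3, activation='softmax'))
    model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
    model.fit(X, y, epochs=100, verbose=0)
    _, acc = model.evaluate(X, y, verbose=0)
    print('layers=%d, accuracy=%.3f' % (len(model.layers), acc))
```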

Continue Reading
Overview of Course Structure

Practical Deep Learning for Coders (Review)

Practical deep learning is a challenging subject in which to get started. It is often taught in a bottom-up manner, requiring that you first get familiar with linear algebra, calculus, and mathematical optimization before eventually learning the neural network techniques. This can take years, and most of the background theory will not help you to […]

Continue Reading

Promise of Deep Learning for Natural Language Processing

The promise of deep learning in the field of natural language processing is better performance from models that may require more data but less linguistic expertise to train and operate. There is a lot of hype and many large claims around deep learning methods, but beyond the hype, deep learning methods are achieving state-of-the-art results on […]

Continue Reading