Join Doug Turnbull’s ‘ML Powered Search’ Live Cohort

Sponsored Post    Sign up for Doug Turnbull’s exclusive live cohort, starting October 11. Previous Sphere cohorts have had students from Apple, Amazon, Spotify, Microsoft, Twitter, Shopify, Glassdoor, and more. Doug leads the entire Search Relevance practice at Shopify. He has spent the last 10+ years writing industry-leading books such as “Relevant Search” (2016) & […]

Continue Reading 0

A Tour of Attention-Based Architectures

As the popularity of attention in machine learning grows, so does the list of neural architectures that incorporate an attention mechanism. In this tutorial, you will discover the salient neural architectures that have been used in conjunction with attention. After completing this tutorial, you will gain a better understanding of how the attention mechanism is […]

Continue Reading 4

The Attention Mechanism from Scratch

The attention mechanism was introduced to improve the performance of the encoder-decoder model for machine translation. The idea behind the attention mechanism was to permit the decoder to utilize the most relevant parts of the input sequence in a flexible manner, by a weighted combination of all of the encoded input vectors, with the most […]
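The weighted combination described above can be sketched in a few lines of NumPy. This is a minimal, illustrative version of the idea (not the full encoder-decoder model from the tutorial): a decoder state acts as a query, scores each encoded input vector, and the softmax-normalized scores weight the combination. The toy vectors are made up for illustration.

```python
import numpy as np

def attention(query, encoder_states):
    # Alignment scores: dot product between the query and each encoded vector
    scores = encoder_states @ query
    # Softmax turns the scores into weights that sum to 1
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # Context vector: weighted combination of all the encoded input vectors
    context = weights @ encoder_states
    return context, weights

rng = np.random.default_rng(0)
states = rng.normal(size=(5, 4))   # 5 encoded input vectors of dimension 4
query = rng.normal(size=4)         # current decoder state
context, weights = attention(query, states)
print(weights.sum())   # 1.0 -- a proper probability distribution
print(context.shape)   # (4,)
```

The most relevant input positions receive the largest weights, so they dominate the context vector the decoder sees.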

Continue Reading 15

What is Attention?

Attention is becoming increasingly popular in machine learning, but what makes it such an attractive concept? What is the relationship between attention as applied in artificial neural networks, and its biological counterpart? What are the components that one would expect to form an attention-based system in machine learning? In this tutorial, you will discover an […]

Continue Reading 6

A Bird’s Eye View of Research on Attention

Attention is a concept that is scientifically studied across multiple disciplines, including psychology, neuroscience and, more recently, machine learning. While all disciplines may have produced their own definitions for attention, there is one core quality they can all agree on: attention is a mechanism for making both biological and artificial neural systems more flexible. In […]

Continue Reading 7

Last call: Stefan Krawczyk’s ‘Mastering MLOps’ Live Cohort

Sponsored Post   This is your last chance to sign up for Stefan Krawczyk’s exclusive live cohort, starting next week (August 22nd). We already have students enrolled from Apple, Amazon, Spotify, Nubank, Workfusion, Glassdoor, ServiceNow, and more. Stefan Krawczyk has spent the last 15+ years working on MLOps at companies like Stitch Fix, Nextdoor, and […]

Continue Reading 0

How to Make Predictions with Keras

Once you choose and fit a final deep learning model in Keras, you can use it to make predictions on new data instances. There is some confusion amongst beginners about how exactly to do this. I often see questions such as: How do I make predictions with my model in Keras? In this tutorial, you […]
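The basic pattern looks like the sketch below, assuming the TensorFlow/Keras API. The tiny model and random data are placeholders for your own final, fitted model and your new data instances.

```python
import numpy as np
from tensorflow import keras

# Stand-in for a final model you have already chosen and fit
model = keras.Sequential([
    keras.layers.Input(shape=(4,)),
    keras.layers.Dense(8, activation="relu"),
    keras.layers.Dense(3, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# New instances must have the same feature shape as the training data
X_new = np.random.random((2, 4))

# For classification, predict() returns class probabilities...
probs = model.predict(X_new, verbose=0)
# ...and argmax recovers the predicted class labels
labels = np.argmax(probs, axis=-1)
print(probs.shape)   # (2, 3)
print(labels.shape)  # (2,)
```

For a regression model the final layer would have a linear activation, and `predict()` would return the predicted values directly, with no argmax step.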

Continue Reading 219

Why Initialize a Neural Network with Random Weights?

The weights of artificial neural networks must be initialized to small random numbers because the stochastic optimization algorithm used to train the model, stochastic gradient descent, expects it. To understand this approach to problem solving, you must first understand the role of nondeterministic and randomized algorithms as well as […]
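A small illustration of one consequence, using plain NumPy rather than any particular framework: if every weight starts at the same value, every hidden unit computes the same output (and would receive the same gradient), so the units can never differentiate from one another; small random values break this symmetry from the start.

```python
import numpy as np

x = np.array([0.5, -1.0, 2.0])            # one input instance

zeros = np.zeros((3, 4))                   # all-zero initialization
print(np.unique(x @ zeros))               # [0.] -- every unit is identical

rng = np.random.default_rng(1)
small_random = rng.normal(scale=0.01, size=(3, 4))  # small random init
print(len(np.unique(x @ small_random)))   # 4 -- the units start out distinct
```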

Continue Reading 37