Activity regularization provides an approach to encourage a neural network to learn sparse features or internal representations of raw observations. […]

# Archive | Deep Learning Performance

## A Gentle Introduction to Activation Regularization in Deep Learning

Deep learning models are capable of automatically learning a rich internal representation from raw input data. This is called feature […]
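As a rough illustration of the idea behind this entry, activation (activity) regularization adds a penalty on a layer's outputs, rather than its weights, to the training loss, encouraging sparse internal representations. A minimal pure-Python sketch of an L1 activity penalty (the function name and default `lam` are illustrative, not taken from the article):

```python
def l1_activity_penalty(activations, lam=0.001):
    """L1 penalty on a layer's output activations.

    Adding lam * sum(|a|) to the loss pushes activations toward zero,
    which encourages a sparse learned representation.
    """
    return lam * sum(abs(a) for a in activations)

# Mostly-zero activations incur a smaller penalty than dense ones.
sparse = l1_activity_penalty([0.0, 0.0, 2.0], lam=0.5)   # 1.0
dense = l1_activity_penalty([1.0, -1.0, 2.0], lam=0.5)   # 2.0
```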

## How to Reduce Overfitting Using Weight Constraints in Keras

Weight constraints provide an approach to reduce the overfitting of a deep learning neural network model on the training data […]
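For a sense of the mechanics, a common weight constraint is max-norm: after each update, a weight vector whose L2 norm exceeds a threshold `c` is rescaled back to that norm. A minimal pure-Python sketch (in Keras this is expressed via a constraint argument on the layer; the helper below is an illustrative stand-in, not the library API):

```python
import math

def max_norm(weights, c=3.0):
    """Enforce a max-norm constraint on one weight vector.

    If the vector's L2 norm exceeds c, rescale it so the norm equals c;
    otherwise leave it unchanged.
    """
    norm = math.sqrt(sum(w * w for w in weights))
    if norm <= c:
        return list(weights)
    return [w * c / norm for w in weights]

# A vector with norm 5.0 is rescaled to norm 3.0; a small vector is untouched.
print(max_norm([3.0, 4.0]))   # [1.8, 2.4]
print(max_norm([0.3, 0.4]))   # [0.3, 0.4]
```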

## A Gentle Introduction to Weight Constraints in Deep Learning

Weight regularization methods like weight decay introduce a penalty to the loss function when training a neural network to encourage […]

## How to Use Weight Decay to Reduce Overfitting of Neural Networks in Keras

Weight regularization provides an approach to reduce the overfitting of a deep learning neural network model on the training data […]
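To make the mechanism concrete: weight decay (L2 regularization) adds a term proportional to the sum of squared weights to the training loss, so large weights are penalized. A minimal pure-Python sketch of the penalty term (function names and the default `lam` are illustrative assumptions, not the Keras API):

```python
def l2_penalty(weights, lam=0.01):
    """L2 weight-decay term: lam * sum(w^2) over the model weights."""
    return lam * sum(w * w for w in weights)

def regularized_loss(data_loss, weights, lam=0.01):
    """Total training loss = data loss + weight-decay penalty."""
    return data_loss + l2_penalty(weights, lam)

# Larger weights raise the total loss, nudging the optimizer toward
# smaller weights and (often) better generalization.
print(regularized_loss(0.25, [1.0, 2.0], lam=0.5))   # 0.25 + 2.5 = 2.75
```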

## Use Weight Regularization to Reduce Overfitting of Deep Learning Models

Neural networks learn a set of weights that best map inputs to outputs. A network with large network weights can […]

## How to Configure the Number of Layers and Nodes in a Neural Network

Artificial neural networks have two main hyperparameters that control the architecture or topology of the network: the number of layers […]

## Gentle Introduction to the Adam Optimization Algorithm for Deep Learning

The choice of optimization algorithm for your deep learning model can mean the difference between good results in minutes, hours, […]
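The Adam update itself is compact: it keeps exponentially decaying averages of the gradient and the squared gradient, bias-corrects both, and scales the step per parameter. A minimal pure-Python sketch for a single scalar parameter, using the standard default hyperparameters (the function shape here is illustrative):

```python
import math

def adam_step(w, g, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for scalar parameter w with gradient g at step t (t >= 1)."""
    m = beta1 * m + (1 - beta1) * g        # decaying average of gradients
    v = beta2 * v + (1 - beta2) * g * g    # decaying average of squared gradients
    m_hat = m / (1 - beta1 ** t)           # bias-corrected first moment
    v_hat = v / (1 - beta2 ** t)           # bias-corrected second moment
    w = w - lr * m_hat / (math.sqrt(v_hat) + eps)
    return w, m, v

# On the very first step the bias correction makes the update size
# approximately equal to the learning rate (for any nonzero gradient).
w, m, v = adam_step(w=0.5, g=1.0, m=0.0, v=0.0, t=1)
```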

## How To Improve Deep Learning Performance

20 Tips, Tricks and Techniques That You Can Use To Fight Overfitting and Get Better Generalization. How can you get […]