As neural networks become increasingly popular in the field of machine learning, it is important to understand the role that activation functions play in their implementation. In this article, you’ll explore activation functions, which are applied to the output of each neuron in a neural network to introduce non-linearity into the model. […]
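For instance, here is a minimal sketch of applying common PyTorch activation functions element-wise to a tensor of raw neuron outputs (the values below are illustrative, not from the article):

```python
import torch

# Illustrative raw outputs (logits) from a layer of neurons
z = torch.tensor([-2.0, -0.5, 0.0, 0.5, 2.0])

# Activation functions introduce non-linearity element-wise
print(torch.sigmoid(z))  # squashes values into (0, 1)
print(torch.tanh(z))     # squashes values into (-1, 1)
print(torch.relu(z))     # zeroes out negative values
```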
Author Archive | Muhammad Asad Iqbal Khan
Building a Logistic Regression Classifier in PyTorch
Logistic regression is a type of regression that predicts the probability of an event. It is used for classification problems and has many applications in the fields of machine learning, artificial intelligence, and data mining. In logistic regression, a sigmoid function is applied to the output of a linear function to produce a probability. This article […]
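As a rough sketch, the sigmoid-of-a-linear-function idea can be written in PyTorch as follows (the input size and sample batch are placeholders):

```python
import torch
import torch.nn as nn

class LogisticRegression(nn.Module):
    def __init__(self, n_inputs):
        super().__init__()
        self.linear = nn.Linear(n_inputs, 1)  # linear function w*x + b

    def forward(self, x):
        # sigmoid of the linear output gives a probability in (0, 1)
        return torch.sigmoid(self.linear(x))

model = LogisticRegression(n_inputs=4)
print(model(torch.randn(3, 4)))  # three samples, each mapped to a probability
```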
Training Logistic Regression with Cross-Entropy Loss in PyTorch
In the previous session of our PyTorch series, we demonstrated how badly initialized weights can impact the accuracy of a classification model when mean square error (MSE) loss is used. We noticed that the model didn’t converge during training and its accuracy was also significantly reduced. In the following, you will see what happens if […]
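A minimal sketch of training a binary classifier with cross-entropy instead of MSE, assuming a sigmoid-output model and synthetic placeholder data:

```python
import torch
import torch.nn as nn

# Synthetic binary classification data (placeholders for a real dataset)
X = torch.randn(100, 4)
y = (X[:, 0] > 0).float().unsqueeze(1)

model = nn.Sequential(nn.Linear(4, 1), nn.Sigmoid())
criterion = nn.BCELoss()  # binary cross-entropy loss instead of MSE
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

for epoch in range(100):
    optimizer.zero_grad()
    loss = criterion(model(X), y)
    loss.backward()
    optimizer.step()
```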
Building an Image Classifier with a Single-Layer Neural Network in PyTorch
A single-layer neural network, also known as a single-layer perceptron, is the simplest type of neural network. It consists of a single layer of neurons that connects the inputs directly to the outputs. In the case of an image classifier, the input layer would be an image and the output layer would be […]
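A sketch of such a classifier for 28x28 grayscale images with 10 classes (the sizes are assumptions, not tied to a specific dataset):

```python
import torch
import torch.nn as nn

class SingleLayerClassifier(nn.Module):
    def __init__(self, n_pixels=28 * 28, n_classes=10):
        super().__init__()
        self.linear = nn.Linear(n_pixels, n_classes)  # single layer of neurons

    def forward(self, x):
        x = x.view(x.size(0), -1)   # flatten each image into a vector
        return self.linear(x)       # one score (logit) per class

model = SingleLayerClassifier()
images = torch.randn(8, 1, 28, 28)  # a fake batch of 8 images
print(model(images).shape)          # torch.Size([8, 10])
```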
Neural Network with More Hidden Neurons
The traditional neural network model is the multilayer perceptron. It is usually made up of a series of interconnected layers. The input layer is where the data enters the network, and the output layer is where the network delivers the output. The input layer is usually connected to one or more hidden layers, which […]
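As a sketch, a multilayer perceptron with two hidden layers can be stacked with nn.Sequential (the layer widths here are arbitrary):

```python
import torch.nn as nn

# Input layer -> hidden layers with non-linear activations -> output layer
mlp = nn.Sequential(
    nn.Linear(784, 256),  # input features to first hidden layer
    nn.ReLU(),
    nn.Linear(256, 64),   # first hidden layer to second hidden layer
    nn.ReLU(),
    nn.Linear(64, 10),    # second hidden layer to output layer
)
```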
Building a Single Layer Neural Network in PyTorch
A neural network is a set of neuron nodes that are interconnected with one another. Neurons are connected not only to their adjacent neurons but also to ones farther away in the network. The main idea behind neural networks is that every neuron in a layer has one or more input values, and they […]
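A sketch of a single-hidden-layer network in PyTorch, where each neuron combines its inputs through a weighted sum followed by an activation (the sizes are chosen only for illustration):

```python
import torch
import torch.nn as nn

class SingleLayerNet(nn.Module):
    def __init__(self, n_inputs=1, n_hidden=5, n_outputs=1):
        super().__init__()
        self.hidden = nn.Linear(n_inputs, n_hidden)  # each hidden neuron sees all inputs
        self.output = nn.Linear(n_hidden, n_outputs)

    def forward(self, x):
        x = torch.sigmoid(self.hidden(x))  # weighted sum followed by an activation
        return self.output(x)

print(SingleLayerNet()(torch.randn(4, 1)))  # four samples, one output each
```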
Building a Softmax Classifier for Images in PyTorch
The softmax classifier is a type of classifier used in supervised learning. It is an important building block in deep learning networks and a popular choice among deep learning practitioners. The softmax classifier is suited to multiclass classification, outputting a probability for each class. This tutorial will teach you how to build a softmax […]
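As a sketch, a softmax classifier for images can be a single linear layer trained with nn.CrossEntropyLoss, which applies log-softmax internally (the image size, class count, and fake batch are placeholders):

```python
import torch
import torch.nn as nn

model = nn.Linear(28 * 28, 10)        # one logit per class
criterion = nn.CrossEntropyLoss()     # log-softmax + negative log-likelihood
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

images = torch.randn(16, 28 * 28)     # fake batch of flattened images
labels = torch.randint(0, 10, (16,))  # fake integer class labels

optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
```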
Introduction to Softmax Classifier in PyTorch
While a logistic regression classifier is used for binary classification, the softmax classifier is a supervised learning algorithm that is mostly used when multiple classes are involved. The softmax classifier works by assigning a probability distribution over the classes: the probabilities are normalized to sum to 1, and the class with the highest probability […]
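A small sketch of how softmax turns raw class scores (logits) into a probability distribution that sums to 1; the scores below are made up:

```python
import torch

logits = torch.tensor([2.0, 1.0, 0.1])  # illustrative raw scores for 3 classes
probs = torch.softmax(logits, dim=0)    # exponentiate and normalize

print(probs)           # tensor([0.6590, 0.2424, 0.0986])
print(probs.sum())     # tensor(1.)
print(probs.argmax())  # index of the most probable class
```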
Initializing Weights for Deep Learning Models
In order to build a classifier that accurately classifies the data samples and performs well on test data, you need to initialize the weights in a way that lets the model converge well. Usually we initialize the weights randomly. But when we use mean square error (MSE) as the loss for training a logistic regression model, we may […]
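A sketch of explicit weight initialization with the torch.nn.init utilities; the Xavier-uniform/zeros choice here is one common option, not the article's specific recipe:

```python
import torch.nn as nn

model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 1))

def init_weights(module):
    # Apply a chosen initialization scheme to every linear layer
    if isinstance(module, nn.Linear):
        nn.init.xavier_uniform_(module.weight)
        nn.init.zeros_(module.bias)

model.apply(init_weights)  # recursively applies init_weights to all submodules
```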
Making Predictions with Logistic Regression in PyTorch
Logistic regression is a statistical technique for modeling the probability of an event. It is often used in machine learning for making predictions. We apply logistic regression when a categorical outcome needs to be predicted. In PyTorch, the construction of logistic regression is similar to that of linear regression: both start from a linear function of the inputs. […]
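A sketch of making predictions with a logistic regression model by thresholding the sigmoid output at 0.5 (the untrained model and the sample inputs are placeholders):

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(2, 1), nn.Sigmoid())  # placeholder "trained" model

x_new = torch.tensor([[0.5, -1.2], [2.0, 0.3]])  # two new samples
with torch.no_grad():
    probs = model(x_new)          # predicted probabilities of the event
    labels = (probs > 0.5).int()  # convert probabilities to class labels

print(probs, labels)
```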