
Functional Programming in Python

Python is a fantastic programming language. It is likely to be your first choice for developing a machine learning or data science application. Python is interesting because it is a multi-paradigm programming language that can be used for both object-oriented and imperative programming. It has a simple syntax that is easy to read and comprehend. […]
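
As a small illustration of the multi-paradigm point (my own example, not taken from the full article), the same computation can be written with an imperative loop or in a functional style using map and filter:

```python
# Imperative style: accumulate the squares of even numbers with a loop
numbers = [1, 2, 3, 4, 5, 6]
squares_of_evens = []
for n in numbers:
    if n % 2 == 0:
        squares_of_evens.append(n * n)

# Functional style: express the same computation with map and filter
squares_of_evens_fp = list(map(lambda n: n * n, filter(lambda n: n % 2 == 0, numbers)))

assert squares_of_evens == squares_of_evens_fp == [4, 16, 36]
```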

Continue Reading

Some Language Features in Python

The Python language syntax is quite powerful and expressive, so an algorithm can be expressed concisely in Python. Perhaps this is why it is popular in machine learning, as we need to experiment a lot when developing a machine learning model. If you’re new to Python but with experience in another programming […]
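
As one small illustration of that conciseness (my own example, not from the article), a list comprehension can filter and transform a collection in a single readable line:

```python
# Conciseness example: filter and transform in one comprehension
words = ["gradient", "loss", "epoch", "tensor", "model"]

# Keep words longer than four characters and upper-case them
long_words = [w.upper() for w in words if len(w) > 4]
print(long_words)  # ['GRADIENT', 'EPOCH', 'TENSOR', 'MODEL']
```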

Continue Reading

Method of Lagrange Multipliers: The Theory Behind Support Vector Machines (Part 3: Implementing An SVM From Scratch In Python)

The mathematics that powers a support vector machine (SVM) classifier is beautiful. It is important not only to learn the basic model of an SVM but also to know how you can implement the entire model from scratch. This is a continuation of our series of tutorials on SVMs. In Part 1 and Part 2 of this series we […]
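
The tutorial's full from-scratch implementation is not reproduced here, but as a rough sketch of the idea, the hard-margin dual problem can be solved numerically with scipy on a toy dataset (a simplification of what the article covers; the data below is made up):

```python
import numpy as np
from scipy.optimize import minimize

# Toy linearly separable data; labels must be +1/-1
X = np.array([[2.0, 2.0], [2.5, 3.0], [3.0, 2.5],
              [0.0, 0.5], [0.5, 0.0], [1.0, 0.5]])
y = np.array([1, 1, 1, -1, -1, -1], dtype=float)

# Gram matrix for the linear kernel: K_ij = y_i y_j (x_i . x_j)
K = (y[:, None] * X) @ (y[:, None] * X).T

# Dual objective: minimize 1/2 a^T K a - sum(a)
def objective(a):
    return 0.5 * a @ K @ a - a.sum()

# Constraints: sum_i a_i y_i = 0 and a_i >= 0
constraints = {"type": "eq", "fun": lambda a: a @ y}
bounds = [(0, None)] * len(y)

res = minimize(objective, np.zeros(len(y)), bounds=bounds, constraints=constraints)
alpha = res.x

# Recover w and b from the support vectors (alpha above a small threshold)
w = ((alpha * y)[:, None] * X).sum(axis=0)
sv = alpha > 1e-5
b = np.mean(y[sv] - X[sv] @ w)

print("w =", w, "b =", b)
print("predictions:", np.sign(X @ w + b))
```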

Continue Reading

Method of Lagrange Multipliers: The Theory Behind Support Vector Machines (Part 2: The Non-Separable Case)

This tutorial is an extension of Method of Lagrange Multipliers: The Theory Behind Support Vector Machines (Part 1: The Separable Case) and explains the non-separable case. In real-life problems, positive and negative training examples may not be completely separable by a linear decision boundary. This tutorial explains how a soft margin can be built […]
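
For reference (this is the standard soft-margin formulation, not quoted from the article), slack variables allow margin violations and a penalty parameter C controls how much they cost:

```latex
% Soft-margin SVM primal: slack variables \xi_i allow violations, penalized by C
\min_{w,\,b,\,\xi} \quad \frac{1}{2}\lVert w \rVert^2 + C \sum_{i=1}^{n} \xi_i
\quad \text{subject to} \quad
y_i\,(w^\top x_i + b) \ge 1 - \xi_i, \qquad \xi_i \ge 0, \quad i = 1, \dots, n.
```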

Continue Reading

Application of Differentiation in Neural Networks

Differential calculus is an important tool in machine learning algorithms. In neural networks in particular, the gradient descent algorithm depends on the gradient, a quantity computed by differentiation. In this tutorial, we will see how the back-propagation technique is used to find the gradients in neural networks. After completing this tutorial, you will know […]
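
As a toy sketch of the idea (my own example, not the tutorial's code), the chain rule gives the gradient of a single sigmoid neuron's squared error with respect to its weight and bias:

```python
import numpy as np

# One sigmoid neuron: prediction p = sigmoid(w*x + b), loss L = (p - t)^2
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x, t = 1.5, 1.0          # input and target
w, b = 0.3, -0.1         # parameters

z = w * x + b
p = sigmoid(z)
loss = (p - t) ** 2

# Back-propagation: apply the chain rule from the loss back to w and b
dL_dp = 2.0 * (p - t)            # dL/dp
dp_dz = p * (1.0 - p)            # derivative of the sigmoid
dL_dz = dL_dp * dp_dz
dL_dw = dL_dz * x                # dz/dw = x
dL_db = dL_dz * 1.0              # dz/db = 1

print(loss, dL_dw, dL_db)
```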

Continue Reading

Method of Lagrange Multipliers: The Theory Behind Support Vector Machines (Part 1: The Separable Case)

This tutorial is designed for anyone looking for a deeper understanding of how Lagrange multipliers are used in building the model for support vector machines (SVMs). SVMs were initially designed to solve binary classification problems, and were later extended and applied to regression and unsupervised learning. They have shown their success in solving many complex machine […]

Continue Reading

Face Recognition using Principal Component Analysis

Recent advances in machine learning have made face recognition a much less difficult problem. But in the past, researchers made various attempts and developed various techniques to make computers capable of identifying people. One of the early attempts with moderate success was eigenface, which is based on linear algebra techniques. In this tutorial, we will […]
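
As a hedged sketch of the eigenface idea (the face data and array shapes below are hypothetical, not from the tutorial), PCA on flattened face images reduces to an eigen-decomposition, computed here via SVD of the centered data:

```python
import numpy as np

# Hypothetical data: 100 grayscale face images of 32x32 pixels, flattened to rows
rng = np.random.default_rng(0)
faces = rng.random((100, 32 * 32))

# Center the data, then take the top principal components via SVD
mean_face = faces.mean(axis=0)
centered = faces - mean_face
U, S, Vt = np.linalg.svd(centered, full_matrices=False)

k = 20
eigenfaces = Vt[:k]                       # each row is an "eigenface"
weights = centered @ eigenfaces.T         # project each face onto the eigenfaces

# Reconstruct a face from its k weights and measure the error
reconstruction = mean_face + weights[0] @ eigenfaces
print("reconstruction error:", np.linalg.norm(reconstruction - faces[0]))
```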

Continue Reading