Non-Linear Classification in R

In this post you will discover 8 recipes for non-linear classification in R. Each recipe is ready for you to copy and paste and modify for your own problem.

All recipes in this post use the iris flowers dataset provided with R in the datasets package. The dataset describes the measurements of iris flowers and requires classification of each observation into one of three flower species.

Iris flowers. Photo by dottieg2007, some rights reserved.

Mixture Discriminant Analysis

This recipe demonstrates the MDA method on the iris dataset.
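
Below is a minimal sketch of such a recipe. It assumes the mda package is installed and fits the model with default settings, which are illustrative rather than tuned.

# load the package (assumes mda is installed)
library(mda)
data(iris)
# fit the MDA model with default settings
fit <- mda(Species~., data=iris)
# summarize the fit
summary(fit)
# make predictions on the training data
predictions <- predict(fit, iris[,1:4])
# summarize accuracy with a confusion matrix
table(predictions, iris$Species)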

Learn more about the mda function in the mda package.

Quadratic Discriminant Analysis

QDA seeks a quadratic function of the attributes that maximizes the separation between the classes, giving each class its own covariance estimate and so a curved (quadratic) decision boundary.

This recipe demonstrates the QDA method on the iris dataset.
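
A minimal sketch of this recipe follows, using default settings. Note that predict() on a qda fit returns a list, so the predicted labels are taken from its class component.

# load the package (MASS ships with R)
library(MASS)
data(iris)
# fit the QDA model
fit <- qda(Species~., data=iris)
# summarize the fit
print(fit)
# make predictions on the training data (predict returns a list)
predictions <- predict(fit, iris[,1:4])$class
# summarize accuracy with a confusion matrix
table(predictions, iris$Species)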

Learn more about the qda function in the MASS package.

Regularized Discriminant Analysis

This recipe demonstrates the RDA method on the iris dataset.
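
A minimal sketch of this recipe, assuming the klaR package is installed; the gamma and lambda regularization values are illustrative choices, not tuned settings.

# load the package (assumes klaR is installed)
library(klaR)
data(iris)
# fit the RDA model (gamma and lambda are illustrative values)
fit <- rda(Species~., data=iris, gamma=0.05, lambda=0.01)
# make predictions on the training data (predict returns a list)
predictions <- predict(fit, iris[,1:4])$class
# summarize accuracy with a confusion matrix
table(predictions, iris$Species)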

Learn more about the rda function in the klaR package.

Neural Network

A Neural Network (NN) is a graph of computational units that receive inputs and transform them into an output that is passed on. The units are ordered into layers to connect the features of an input vector to the features of an output vector. With training, such as with the Back-Propagation algorithm, neural networks can be designed and trained to model the underlying relationship in data.

This recipe demonstrates a Neural Network on the iris dataset.
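
A minimal sketch of this recipe, assuming the nnet package is installed; the size, decay, and maxit values are illustrative starting points rather than tuned settings.

# load the package (assumes nnet is installed)
library(nnet)
data(iris)
# fit a single-hidden-layer network (hyperparameters are illustrative)
fit <- nnet(Species~., data=iris, size=4, decay=0.0001, maxit=500)
# summarize the fit
summary(fit)
# make predictions on the training data
predictions <- predict(fit, iris[,1:4], type="class")
# summarize accuracy with a confusion matrix
table(predictions, iris$Species)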

Learn more about the nnet function in the nnet package.

Flexible Discriminant Analysis

This recipe demonstrates the FDA method on the iris dataset.
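
A minimal sketch of this recipe, assuming the mda package (which also provides the fda function) is installed and using default settings.

# load the package (assumes mda is installed)
library(mda)
data(iris)
# fit the FDA model with default settings
fit <- fda(Species~., data=iris)
# summarize the fit
summary(fit)
# make predictions on the training data
predictions <- predict(fit, iris[,1:4])
# summarize accuracy with a confusion matrix
table(predictions, iris$Species)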

Learn more about the fda function in the mda package.

Support Vector Machine

Support Vector Machines (SVM) are a method that uses a subset of the training points (the support vectors) in a transformed problem space to define a boundary that best separates two classes. Classification with more than two classes is supported by decomposing the problem into a set of binary problems, such as one-vs-one or one-vs-all. SVM also supports regression by modeling the function with a minimum amount of allowable error.

This recipe demonstrates the SVM method on the iris dataset.
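
A minimal sketch of this recipe, assuming the kernlab package is installed; ksvm's default radial basis kernel is used here as an illustrative rather than tuned choice.

# load the package (assumes kernlab is installed)
library(kernlab)
data(iris)
# fit the SVM with default (radial basis) kernel settings
fit <- ksvm(Species~., data=iris)
# summarize the fit
print(fit)
# make predictions on the training data
predictions <- predict(fit, iris[,1:4], type="response")
# summarize accuracy with a confusion matrix
table(predictions, iris$Species)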

Learn more about the ksvm function in the kernlab package.

k-Nearest Neighbors

The k-Nearest Neighbors (kNN) method makes predictions by locating similar cases to a given data instance (using a similarity function) and returning the majority class (or, for regression, the average outcome) of the k most similar data instances.

This recipe demonstrates the kNN method on the iris dataset.
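
A minimal sketch of this recipe, assuming the caret package is installed; k=5 is an illustrative choice of neighborhood size, not a tuned value.

# load the package (assumes caret is installed)
library(caret)
data(iris)
# fit the kNN model (k=5 is illustrative)
fit <- knn3(Species~., data=iris, k=5)
# make predictions on the training data
predictions <- predict(fit, iris[,1:4], type="class")
# summarize accuracy with a confusion matrix
table(predictions, iris$Species)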

Learn more about the knn3 function in the caret package.

Naive Bayes

Naive Bayes uses Bayes' Theorem to model the conditional relationship of each attribute to the class variable, under the naive assumption that the attributes are independent of one another.

This recipe demonstrates Naive Bayes on the iris dataset.
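
A minimal sketch of this recipe, assuming the e1071 package is installed and using default settings.

# load the package (assumes e1071 is installed)
library(e1071)
data(iris)
# fit the Naive Bayes model
fit <- naiveBayes(Species~., data=iris)
# summarize the fit
print(fit)
# make predictions on the training data
predictions <- predict(fit, iris[,1:4])
# summarize accuracy with a confusion matrix
table(predictions, iris$Species)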

Learn more about the naiveBayes function in the e1071 package.

Summary

In this post you discovered 8 recipes for non-linear classification in R using the iris flowers dataset.

Each recipe is generic and ready for you to copy and paste and modify for your own problem.

2 Responses to Non-Linear Classification in R

  1. Daniel Nee August 28, 2014 at 9:29 pm

    Naive Bayes would generally be considered a linear classifier. The exception being if you are learning a Gaussian Naive Bayes (numerical feature set) and learning separate variances per class for each feature.

    Tom Mitchell has a new book chapter that covers this topic pretty well: http://www.cs.cmu.edu/~tom/mlbook/NBayesLogReg.pdf

  2. Chathurani September 29, 2015 at 9:40 pm

    This example is good, but I would like to know more than this. For example, a different Neural Network model, but one related only to text data.
