It can be difficult to get started in deep learning.

Thankfully, a number of universities have opened up their deep learning course material for free, which can be a great jump-start when you are looking to better understand the foundations of deep learning.

In this post you will discover the deep learning courses that you can browse and work through to develop and cement your understanding of the field.

This is a long post that deep links into many videos. It is intended for you to bookmark, browse and jump into specific topics across courses rather than pick one course and complete it end-to-end.

**Kick-start your project** with my new book Deep Learning With Python, including *step-by-step tutorials* and the *Python source code* files for all examples.

Let’s get started.

## Overview

We will take a quick look at the following 6 deep learning courses.

- Deep Learning at Oxford
- Deep Learning at Udacity by Google
- Deep Learning Summer School at Montreal
- Deep Learning for Natural Language Processing at Stanford
- Convolutional Neural Networks for Visual Recognition at Stanford
- Neural Networks Class at Université de Sherbrooke

There is also an “Other Courses” section at the end that gathers additional video courses that are not free, have broken links, or are smaller in scope and don’t neatly fit into this summary review.

## Course Tips and How To Use This Post

There are a lot of courses and a lot of great free material out there.

My best advice is:

Do not pick a course and work through it end-to-end.

This is counter to what most people suggest.

Your impulse will be to “*get serious*” and pick “*the best*” course and work through all of the material. You will almost certainly fail.

The material is tough and you will need to take your time and get multiple different perspectives on each topic.

The very best way to really get into this material is to work through it topic by topic, drawing from across all of the courses until you really understand a topic before moving on to the next.

You do not need to understand all topics and you do not need to use a single source to understand a single topic.

Bookmark this page, then browse, sample and dip into the material you need, when you need it, as you learn how to implement real deep learning models in code using a platform like Keras.
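As you move between lectures and code, it can help to keep in mind how little computation a single model component involves. Below is a plain-Python sketch of what one neuron in a neural network computes (a weighted sum passed through a sigmoid activation); the inputs, weights and bias here are made up for illustration, and a platform like Keras would learn the weights from data rather than have them hard-coded.

```python
import math

def neuron(inputs, weights, bias):
    """One artificial neuron: a weighted sum of the inputs,
    plus a bias, passed through a sigmoid activation."""
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical inputs and weights -- in a real framework these
# weights are the quantities adjusted during training.
output = neuron([0.5, -1.0], [0.8, 0.2], bias=0.1)
print(round(output, 3))  # prints 0.574
```

A deep learning model is, at heart, many such units stacked in layers, which is why the lecture topics below keep returning to how the weights are trained.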

### Need help with Deep Learning in Python?

Take my free 2-week email course and discover MLPs, CNNs and LSTMs (with code).

Click to sign-up now and also get a free PDF Ebook version of the course.

## Deep Learning at Oxford

This is a machine learning course that focuses on deep learning taught at Oxford by Nando de Freitas.

I really like this course. I watched all of the videos at double speed and took notes. It provides a good foundation in theory and covers modern deep learning topics such as LSTMs. Code examples are shown in Torch.

I noted that the syllabus differed from the actual video lectures available and the YouTube playlist listed the lectures out of order, so below is the list of 2015 video lectures in order.

- Deep Learning Lecture 1: Introduction
- Deep Learning Lecture 2: linear models
- Deep Learning Lecture 3: Maximum likelihood and information
- Deep Learning Lecture 4: Regularization, model complexity and data complexity (part 1)
- Deep Learning Lecture 5: Regularization, model complexity and data complexity (part 2)
- Deep Learning Lecture 6: Optimization
- Deep Learning Lecture 7: Logistic regression, a Torch approach
- Deep Learning Lecture 8: Modular back-propagation, logistic regression and Torch
- Deep Learning Lecture 9: Neural networks and modular design in Torch
- Deep Learning Lecture 10: Convolutional Neural Networks
- Deep Learning Lecture 11: Max-margin learning, transfer and memory networks
- Deep Learning Lecture 12: Recurrent Neural Nets and LSTMs
- Deep Learning Lecture 13: Alex Graves on Hallucination with RNNs
- Deep Learning Lecture 14: Karol Gregor on Variational Autoencoders and Image Generation
- Deep Learning Lecture 15: Deep Reinforcement Learning – Policy search
- Deep Learning Lecture 16: Reinforcement learning and neuro-dynamic programming

The highlight for me was Alex Graves’ talk on RNNs (Lecture 13). A smart guy doing great work. I was reading a lot of Alex’s papers at the time I watched this video, so I may be biased.

### Resources

## Deep Learning at Udacity by Google

This is a mini course collaboration between Arpan Chakraborty from Udacity and Vincent Vanhoucke, a Principal Scientist at Google.

The course is free, hosted on Udacity and focuses on TensorFlow. It is a small piece of the broader Machine Learning Engineer Nanodegree by Google hosted on Udacity.

You must sign-up to Udacity, but once you sign-in you can access this course for free.

All course videos are on YouTube, but (intentionally) really hard to find with poor naming and linking. If anyone knows of a pirate playlist with all of the videos, please post it in the comments.

The course is divided into 4 lessons:

- Lesson 1: From Machine Learning to Deep Learning
- Lesson 2: Deep Neural Networks
- Lesson 3: Convolutional Neural Networks
- Lesson 4: Deep Models for Text and Sequences

The course is short but is broken up into many short videos and the Udacity interface is nice. Vincent seems to present in all of the videos I looked at (which is great) and videos are shown in the YouTube interface.

There is also a discussion forum where you can ask and answer questions, driven by the slick Discourse software.

My preference was to dip into videos that interested me rather than completing the whole course or doing any of the course work.

### Resources

## Deep Learning Summer School at Montreal

A deep learning summer school was held in 2015 at the University of Montreal.

According to the website, the summer school was aimed at graduate students and industrial engineers and researchers who already have some basic knowledge of machine learning.

There were at least 30 talks (there are 30 videos) from notable researchers in the field of deep learning on a range of topics from introductory material to state of the art research.

These videos are a real treasure trove. Take your time and pick your topics carefully. All videos are hosted on the VideoLectures.net site, which has a good enough interface, but it is not as clean as YouTube.

Many (all?) talks had PDF slides linked below the video, and more information is available from the schedule page on the official website.

Here’s the full list of lecture topics with links to the videos. I’ve tried to list related videos together (e.g. part 1, part 2).

- Introduction to Machine Learning
- Deep Learning: Theoretical Motivations
- Multilayer Neural Networks
- Training Deep Neural Networks
- Multilayer Neural Networks
- Deep Learning for Distribution Estimation
- Undirected Graphical Models
- Stacks of Restricted Boltzmann Machines
- On manifolds and autoencoders
- Visual features: From Fourier to Gabor
- Visual features II
- Convolutional Networks
- Learning to Compare
- NLP and Deep Learning 1: Human Language & Word Vectors
- NLP and Deep Learning 2: Compositional Deep Learning
- Seeing People with Deep Learning
- Deep Learning
- Deep Learning 2
- Speech Recognition and Deep Learning
- Tutorial on Neural Network Optimization Problems
- Deep Learning (hopefully faster)
- Adversarial Examples
- From Language Modelling to Machine Translation
- Deep NLP Recurrent Neural Networks
- Deep NLP Applications and Dynamic Memory Networks
- Memory, Reading, and Comprehension
- Smooth, Finite, and Convex Optimization Deep Learning Summer School
- Non Smooth, Non Finite, and Non Convex Optimization
- Variational Autoencoder and Extensions
- Deep Generative Models

Pick a topic and dive in. So good!

It looks like there will be a 2016 summer school and hopefully there will be videos.

### Resources

## Deep Learning for Natural Language Processing at Stanford

This is a deep learning course focusing on natural language processing (NLP) taught by Richard Socher at Stanford.

An interesting note is that you can access PDF versions of student reports, work that might inspire you or give you ideas.

The YouTube playlist has poorly named files and some missing lectures, and the 2016 videos are not all uploaded yet. Below is a list of the 2015 lectures with links to the videos, which makes it much easier to jump into a specific topic.

- Lecture 1: Intro to NLP and Deep Learning
- Lecture 2: Simple Word Vector representations: word2vec, GloVe
- Lecture 3: Advanced word vector representations: language models, softmax, single layer networks
- Lecture 4: Word Window Classification and Neural Networks
- Lecture 5: Project Advice, Neural Networks and Back-Prop (in full gory detail)
- Lecture 6: Practical tips: gradient checks, overfitting, regularization, activation functions, details
- Lecture 7: Recurrent neural networks for language modeling and other tasks
- Lecture 7 (8!?): GRUs and LSTMs for machine translation
- Lecture 9: Recursive neural networks for parsing
- Lecture 10: Recursive neural networks for different tasks (e.g. sentiment analysis)
- Lecture 11: Review Session for Midterm
- Lecture 13: Convolutional neural networks for sentence classification
- Lecture 15: Applications of DL to NLP
- Guest Lecture with Andrew Maas: Speech recognition
- Guest Lecture with Jason Weston: Memory Networks
- Guest Lecture with Elliot English: Efficient Implementations and GPUs

This is great material if you are into deep learning for NLP, an area where it really excels.

### Resources

- CS224d: Deep Learning for Natural Language Processing Homepage
- Course Syllabus
- 2015 Course Video Play List
- 2016 Course Video Play List

## Convolutional Neural Networks for Visual Recognition at Stanford

This course focuses on the use of deep learning for computer vision applications with convolutional neural networks.

It is another course taught at Stanford, this time by Andrej Karpathy and others.

Unfortunately, the course videos were taken down, but some clever people have found ways to put them back up in other places. See the playlists in the resources section below.

I regret to inform that we were forced to take down CS231n videos due to legal concerns. Only 1/4 million views of society benefit served 🙁

— Andrej Karpathy (@karpathy) May 3, 2016

Another great course.

Below are the video lectures for the 2016 course, but I’m not sure how long the links will last. Leave a comment and let me know if you discover the links turned bad and I’ll fix them up.

- Lecture 1 Introduction and Historical Context
- Lecture 2 Data driven approach, kNN, Linear Classification 1
- Lecture 3 Linear Classification 2, Optimization
- Lecture 4 Backpropagation, Neural Networks 1
- Lecture 5 Neural Networks Part 2
- Lecture 6 Neural Networks Part 3 Intro to ConvNets
- Lecture 7 Convolutional Neural Networks
- Lecture 8 Localization and Detection
- Lecture 9 Visualization, Deep Dream, Neural Style, Adversarial Examples
- Lecture 10 Recurrent Neural Networks, Image Captioning, LSTM
- Lecture 11 ConvNets in practice
- Lecture 12 Deep Learning libraries
- Lecture 13 Segmentation, soft attention, spatial transformers
- Lecture 14 Videos and Unsupervised Learning
- Lecture 15 Invited Talk by Jeff Dean

### Resources

- Convolutional Neural Networks for Visual Recognition Homepage
- Course Syllabus (and the 2015 Syllabus)
- Course Videos on Archive.org
- Course Videos on YouTube
- Course Sample Code

## Neural Networks Class at Université de Sherbrooke

This is a course on neural networks taught by Hugo Larochelle at the Université de Sherbrooke in Québec.

There is a ton of material. A ton.

The videos are one-on-one rather than lectures and there are many small videos for each topic rather than large one hour info dumps.

I think this might be a better format than the traditional lectures, but I’m not completely won over yet. A difficulty is that there are 92 videos (!!!) to browse, and it can be hard to find specific videos to watch.

The material is taught covering 10 main topics:

- Topic 1: Feedforward neural networks
- Topic 2: Training neural networks
- Topic 3: Conditional random fields
- Topic 4: Training Conditional random fields
- Topic 5: Restricted Boltzmann machine
- Topic 6: Autoencoders
- Topic 7: Deep learning
- Topic 8: Sparse coding
- Topic 9: Computer vision
- Topic 10: Natural language processing

My recommendation is to use the main course home page to browse the topics and then use those links into the specific videos. The YouTube playlist has far too many videos to browse and understand. The paradox of choice will kill you.

### Resources

## Other Courses

Below are some additional video courses that are either not free, difficult to access or smaller in scope.

- Deep Learning Course at CILVR Lab @ NYU (broken links?)
- Smaller Deep Learning Courses on Udemy
- Deep Learning at CMU
- Nvidia Self-Paced Courses for Deep Learning
- Neural Networks for Machine Learning at Coursera by the University of Toronto (awesome, but no longer free)
- Graduate Summer School: Deep Learning, Feature Learning, 2012

## Summary

In this post you have discovered a number of world-class video courses on deep learning, covering theory, computer vision, natural language processing and more.

Heed the advice at the top of this post.

Browse and dip into lectures by topic and do not try to take on a whole course. Learn one thing rather than trying to learn everything.

Take your time, bookmark this page so you can come back, and have fun.

Do you know of some other video courses on deep learning that I have not listed?

Let me know in the comments and I will update the list.

What about taking the complete Udacity Nanodegree?

I only included the deep learning component rather than the whole Nanodegree.

Machine learning appears to be more or less statistical learning only. If that is true, then why is there so much importance placed on machine learning now?

“Machine learning is a type of artificial intelligence (AI) that provides computers with the ability to learn without being explicitly programmed.” This so-called definition of machine learning increases my curiosity about where exactly our algorithm starts learning from the past and predicting future output.

For example, in simple linear regression we use ‘m’ examples, extract features manually, and finally develop models based on the selected features, iterating to get the best model. My question is: in this case, where does the core learning lie?

If we go exactly by the definition, we only need to feed a dataset to an algorithm; it should create its own features (to some extent, a recommender system does this) and learn from the past data. Over time it should improve its performance automatically by tuning its model.

Is there any learning which will improve its performance by itself?

Kindly enlighten me further on this; I may be wrong in my view as well.
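One way to see where the learning lies in the linear regression example above: the model starts from an arbitrary parameter value and repeatedly nudges it in the direction that reduces its error on the data, so its performance does improve by itself as it iterates. Below is a minimal plain-Python sketch of gradient descent; the toy data and learning rate are chosen purely for illustration.

```python
# Toy data generated from y = 2x; the model must discover w = 2 itself.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]

w = 0.0    # arbitrary starting guess for the parameter
lr = 0.05  # learning rate (step size)

for _ in range(200):
    # Gradient of the mean squared error with respect to w
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    # The "learning": update w so the error on the data shrinks
    w -= lr * grad

print(round(w, 3))  # converges towards 2.0
```

The same update-from-error loop, scaled up to millions of weights, is the core of training the deep networks covered in the courses above.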

Alex Graves gave a brief introduction to hallucination with LSTMs. This video #13 describes how the neural network can learn when it’s appropriate to read, write, or erase the data it has learned. I think this is a good example of a model that can learn to learn.

Thanks for the tip Tom.

Hello sir, which of the above courses do you think we should start with? I know basic machine learning, so according to you, which course will suit me?

Great question.

I teach a top-down and results-first approach. I would advise you to first learn how to work through a problem end-to-end using deep learning, then dive deeper into the theory of these courses.

My book might be a good place for you to start if you like my approach:

https://machinelearningmastery.mystagingwebsite.com/deep-learning-with-python/

Videos of “Deep Learning Summer School, Montreal 2016” have been uploaded:

http://videolectures.net/deeplearning2016_montreal/

Fantastic! Thanks Willy.

Great work Jason. You have shared the most valuable deep learning lecture series on one page. Thanks a lot.

Thanks, I’m glad you found the post useful!

DL Summer school at University of Montreal now has videos from 2016: http://videolectures.net/deeplearning2016_montreal/

Thanks Viktor.

Nice, do you have another list for general machine learning courses?

Perhaps this post:

https://machinelearningmastery.mystagingwebsite.com/16-options-to-get-started-and-make-progress-in-machine-learning-and-data-science/

http://course.fast.ai/

https://www.deeplearning.ai/

Thanks Mike!

If you want to take the Stanford class with Andrej Karpathy, head of the computer vision department at Tesla, you can access the class notes from 2016, rather than the 2020 notes, using this link from the Web Archive.

http://web.archive.org/web/20190724082250/http://cs231n.stanford.edu/syllabus.html

Thanks for sharing.

Hi Jason, thanks for your help.

Just like you said, it seems the CNN course links by Andrej Karpathy are no longer working.

Could anything be done to fix them?

Thanks

Hi John… While I cannot speak to Andrej’s blog, the following is available:

https://cs231n.github.io/