Natural Language Processing, or NLP, is a subfield of artificial intelligence concerned with understanding speech and text data.

Statistical methods and statistical machine learning dominate the field, and more recently deep learning methods have proven very effective on challenging NLP problems like speech recognition and text translation.

In this post, you will discover the Stanford course on the topic of Natural Language Processing with Deep Learning methods.

This course is free and I encourage you to make use of this excellent resource.

After completing this post, you will know:

- The goal and prerequisites of this course.
- A breakdown of the course lectures and how to access the slides, notes, and videos.
- How to make best use of this material.

Let’s get started.

## Overview

This post is divided into 5 parts; they are:

- Course Summary
- Prerequisites
- Lectures
- Projects
- How to Best Use This Material

## Course Summary

The course is taught by Chris Manning and Richard Socher.

Chris Manning is an author of at least two top textbooks on Natural Language Processing, including Foundations of Statistical Natural Language Processing and Introduction to Information Retrieval.

Richard Socher founded MetaMind and is the Chief Scientist at Salesforce.

Natural Language Processing is the study of computational methods for working with voice and text data.

> Goal: for computers to process or “understand” natural language in order to perform tasks that are useful

Since the 1990s, the field has focused on statistical methods. More recently, it has been shifting to deep learning methods, given the demonstrably improved capabilities they offer.

This course is focused on teaching statistical natural language processing with deep learning methods. From the course description on the website:

> Recently, deep learning approaches have obtained very high performance across many different NLP tasks. These models can often be trained with a single end-to-end model and do not require traditional, task-specific feature engineering.

Goals of the Course

- An understanding of and ability to use the effective modern methods for deep learning
- Some big picture understanding of human languages and the difficulties in understanding and producing them
- An understanding of and ability to build systems for some of the major problems in NLP

This course is taught at Stanford, but the lectures have been recorded and made public, and we will focus on these freely available materials.

### Need help with Deep Learning for Text Data?

Take my free 7-day email crash course now (with code).

Click to sign-up and also get a free PDF Ebook version of the course.

## Prerequisites

The course assumes some mathematical and programming skill.

Nevertheless, refresher materials are provided in case the requisite skills are rusty.

Specifically:

- College Calculus
- Statistics and Probability
- Machine Learning
- Python Programming

Code examples are in Python and make use of the NumPy and TensorFlow Python libraries.
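To give a sense of the level of Python and NumPy assumed, the early assignments ask students to implement building blocks such as a softmax function. A minimal sketch of my own (not course code) of what that looks like in NumPy:

```python
import numpy as np

def softmax(x):
    # Shift by the max for numerical stability before exponentiating.
    shifted = x - np.max(x, axis=-1, keepdims=True)
    exps = np.exp(shifted)
    # Normalize so the outputs form a probability distribution.
    return exps / np.sum(exps, axis=-1, keepdims=True)

probs = softmax(np.array([1.0, 2.0, 3.0]))
```

If this sort of array arithmetic is comfortable for you, the programming prerequisite should not be a problem.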

## Lectures

The lectures and material seem to change a little each time the course is taught. This is not surprising given the speed at which the field is changing.

Here, we will look at the CS224n Winter 2017 syllabus and lectures that are publicly available.

I recommend watching the YouTube videos of the lectures, and accessing the slides, papers, and further reading on the syllabus only as needed.

The course is broken down into the following 18 lectures and one review:

- Lecture 1: Natural Language Processing with Deep Learning
- Lecture 2: Word Vector Representations: word2vec
- Lecture 3: GloVe: Global Vectors for Word Representation
- Lecture 4: Word Window Classification and Neural Networks
- Lecture 5: Backpropagation and Project Advice
- Lecture 6: Dependency Parsing
- Lecture 7: Introduction to TensorFlow
- Lecture 8: Recurrent Neural Networks and Language Models
- Lecture 9: Machine Translation and Advanced Recurrent LSTMs and GRUs
- Review Session: Midterm Review
- Lecture 10: Neural Machine Translation and Models with Attention
- Lecture 11: Gated Recurrent Units and Further Topics in NMT
- Lecture 12: End-to-End Models for Speech Processing
- Lecture 13: Convolutional Neural Networks
- Lecture 14: Tree Recursive Neural Networks and Constituency Parsing
- Lecture 15: Coreference Resolution
- Lecture 16: Dynamic Neural Networks for Question Answering
- Lecture 17: Issues in NLP and Possible Architectures for NLP
- Lecture 18: Tackling the Limits of Deep Learning for NLP

I watched them all on YouTube at double playback speed with the slides open while taking notes.
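The central idea of Lectures 2 and 3 is that words can be represented as dense vectors, with similar words landing close together as measured by cosine similarity. The payoff can be sketched with toy vectors (these values are hand-made for illustration, not real word2vec or GloVe embeddings):

```python
import numpy as np

def cosine_similarity(a, b):
    # Cosine of the angle between two vectors: close to 1.0 means similar direction.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy 3-dimensional "embeddings" (illustrative values only).
vectors = {
    "king":  np.array([0.9, 0.8, 0.1]),
    "queen": np.array([0.85, 0.75, 0.2]),
    "apple": np.array([0.1, 0.2, 0.9]),
}

sim_royal = cosine_similarity(vectors["king"], vectors["queen"])
sim_fruit = cosine_similarity(vectors["king"], vectors["apple"])
```

In a real model, the vectors are learned from text so that this similarity structure emerges automatically; the lectures cover how word2vec and GloVe do that.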

## Projects

Students of the course are expected to complete assignments.

You may want to complete the assignments yourself to test the knowledge gained from working through the lectures.

You can see the assignments here: CS224n Assignments

Importantly, students must submit a final project report using deep learning on a natural language processing problem.

These projects can be fun to read if you are looking for ideas for how to test out your newfound skills.

Directories of submitted student reports are available here:

If you find some great reports, please post your discoveries in the comments.

## How to Best Use This Material

This course is designed for students, and the goal is to teach enough NLP and deep learning theory for them to start developing their own methods.

This may not be your goal.

You may be a developer. You may be only interested in using the tools of deep learning on NLP problems to get a result on a current project.

In fact, this is the situation of most of my readers. If this sounds like you, I would caution you to be very careful in the way you work through the material.

- **Skip the Math**. Do not focus on why the methods work. Instead, focus on a summary of how the methods work and skip the large sections on equations. You can always come back later to deepen your understanding in order to achieve better results.
- **Focus on Process**. Take your learnings from the lectures and put together processes that you can use on your own projects. The methods are taught piecewise, and there is little information on how to tie it all together.
- **Stay Tool Invariant**. I do not recommend coding the methods yourself, or even using TensorFlow as demonstrated in the lectures. Learn the principles and use productive tools like Keras to actually implement the methods on your project.

There is a lot of gold in this material for practitioners, but you must keep your wits and not fall into the “*I must understand everything*” trap. As a practitioner, your goals are very different and you must ruthlessly stay on target.
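As one example of what staying tool invariant can mean in practice: the sequence models built up over several lectures can be stood up in a few lines of Keras rather than raw TensorFlow. A minimal sketch assuming the `tensorflow` package, with arbitrary placeholder sizes for the vocabulary and layers:

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# A small text classifier: embed word ids, encode the sequence with an
# LSTM, and predict a binary label.
model = keras.Sequential([
    layers.Embedding(input_dim=1000, output_dim=32),  # vocab of 1,000 word ids
    layers.LSTM(16),                                  # sequence encoder
    layers.Dense(1, activation="sigmoid"),            # binary label
])
model.compile(optimizer="adam", loss="binary_crossentropy")

# One fake batch of 4 sequences, each 10 word ids long.
preds = model.predict(np.random.randint(0, 1000, size=(4, 10)), verbose=0)
```

The point is not this particular architecture, but that once you understand embeddings, recurrent layers, and classifiers from the lectures, wiring them together is a few lines of high-level code.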

## Further Reading

This section provides more resources on the topic if you are looking to go deeper.

- CS224d: Deep Learning for Natural Language Processing
- CS224n: Natural Language Processing with Deep Learning
- CS224n Syllabus (Winter 2017)
- CS224n Video Lectures (Winter 2017)
- CS224d Sub-Reddit
- CS224d Student Project Reports (2015, 2016)
- CS224n Assignments

### Older Related Material

- CS 224N / Ling 284 — Natural Language Processing
- 2015 CS224d Lectures (deprecated by new 2016 lectures)
- 2016 CS224D Lecture Videos
- Deep Learning for Natural Language Processing (without Magic) 2013

## Summary

In this post, you discovered the Stanford course on Deep Learning for Natural Language Processing.

Specifically, you learned:

- The goal and prerequisites of this course.
- A breakdown of the course lectures and how to access the slides, notes, and videos.
- How to make best use of this material.

Did you work through some or all of this course material?

Let me know in the comments below.
