There are so many deep learning libraries to choose from.
Which are the good, professional libraries worth learning, and which are someone's side project that should be avoided? It can be hard to tell the difference.
In this post you will discover the top deep learning libraries that you should consider learning and using in your own deep learning project.
Discover how to develop deep learning models for a range of predictive modeling problems with just a few lines of code in my new book, with 18 step-by-step tutorials and 9 projects.
Let’s get started.
In this post you are going to discover the following deep learning libraries. All are open source, released under various permissive licenses.
Theano is better described as a mathematical expression compiler: you symbolically define what you want, and the framework compiles your program to run efficiently on GPUs or CPUs.
It is a research platform more than a deep learning library. You have to do a lot of work yourself to create the models you want; for example, there are no neural network classes.
Nevertheless, there is an excellent deep learning tutorial that shows you how to create the classes and functions you need. For example, it provides step-by-step examples for the following deep learning algorithms:
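To make the "expression compiler" idea concrete, here is a toy sketch in plain Python. The `Var` class and `compile_fn` function are invented for illustration only; they are not Theano's API (Theano itself uses `theano.tensor` for symbolic variables and `theano.function` for compilation), but they show the same pattern of defining a computation symbolically and then compiling it into a callable:

```python
# Toy illustration of the "expression compiler" idea: build a symbolic
# graph first, then "compile" it into a callable function.
# (Plain-Python stand-in, NOT Theano's API.)

class Var:
    """A node in a symbolic expression graph."""
    def __init__(self, name=None, op=None, args=()):
        self.name, self.op, self.args = name, op, args

    def __add__(self, other):
        return Var(op="add", args=(self, other))

    def __mul__(self, other):
        return Var(op="mul", args=(self, other))

def compile_fn(inputs, output):
    """'Compile' the symbolic graph into an ordinary Python callable."""
    def evaluate(node, env):
        if node.op is None:                      # leaf variable: look up its value
            return env[node.name]
        vals = [evaluate(a, env) for a in node.args]
        return vals[0] + vals[1] if node.op == "add" else vals[0] * vals[1]
    return lambda *vals: evaluate(
        output, dict(zip([i.name for i in inputs], vals)))

x, y = Var("x"), Var("y")
z = x * y + x            # symbolic expression; nothing is computed yet
f = compile_fn([x, y], z)
print(f(2.0, 3.0))       # -> 8.0
```

A real compiler like Theano goes much further at the "compile" step, optimizing the graph and generating native GPU or CPU code, but the user-facing workflow is the same: declare symbolic inputs, compose an expression, compile, then call.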
Theano is really an ecosystem, and in practice you would rarely use Theano directly. There is a long list of libraries built on top of Theano that provide handy wrapper APIs. Some of the more popular projects include:
These are becoming very large projects in and of themselves, providing helpful APIs into the underlying Theano platform, greatly accelerating the speed at which you can put models together.
If you are a Python developer and interested in broader deep learning or research, this is the platform for you.
Torch (called Torch7 because of its odd version numbering) is a Lua deep learning framework developed by Ronan Collobert, Clement Farabet and Koray Kavukcuoglu for research and development into deep learning algorithms. It was used and promoted by the CILVR Lab at NYU (home to Yann LeCun).
Torch is used and has been further developed by the Facebook AI lab, Google DeepMind, Twitter and a host of others.
Under the covers, Torch makes use of C/C++ libraries as well as CUDA for GPU processing. It aims for speed whilst adopting the C-friendly scripting language Lua to provide a less intimidating interface.
The goal of Torch is to have maximum flexibility and speed in building your scientific algorithms while making the process extremely simple.
There is a lot of documentation, but it is a mess. Popular applications of Torch include supervised image problems with Convolutional Neural Networks and agents in more complex domains with deep reinforcement learning.
If you are primarily interested in reinforcement learning, Torch is probably the platform for you.
Caffe is a deep learning library written in C++, with Python and MATLAB bindings, developed by Yangqing Jia at the Berkeley Vision and Learning Center for supervised computer vision problems.
The primary focus is Convolutional Neural Networks, where it is arguably the world leader.
A big benefit of the library is the number of pre-trained networks that can be downloaded from the Caffe Model Zoo and used immediately. This includes state-of-the-art models that achieve world-class results on standard computer vision datasets.
For example, here are some tutorials for world-class models:
If you are primarily interested in Convolutional Neural Networks and image problems, Caffe is probably the platform for you.
DeepLearning4J (or DL4J for short) is a deep learning framework developed in Java (and other JVM languages) by Adam Gibson for commercial deep learning projects.
DL4J is a JVM-based, industry-focused, commercially supported, distributed deep-learning framework intended to solve problems involving massive amounts of data in a reasonable amount of time.
DeepLearning4J is a slick platform that offers a suite of state-of-the-art deep learning algorithms, including but not limited to:
- Deep Belief Networks
- Stacked Denoising Autoencoders
- Convolutional Neural Networks
- Long Short-Term Memory Units
- Recurrent Neural Networks
The documentation is quite good, covering a range of topics including some theory of the algorithms themselves as well as code examples.
It has the benefit of working with the whole Java ecosystem, the predominant platform in enterprise software development, including other JVM languages (e.g. Scala) and big data platforms (Hadoop and Spark).
Deep Learning Tool Round-ups
A lot of people have done round-ups of deep learning libraries and tools. This section lists some of these round-ups and other resources that you can use to dive deeper into deep learning tools.
- KDnuggets has a round-up of deep learning tools titled Popular Deep Learning Tools – A Review, which includes the results of a 2015 survey. It looks like Pylearn2 and Theano were the most popular.
- DL4J has a comparison of all the top tools titled DL4J vs. Torch vs. Theano vs. Caffe vs. TensorFlow.
- The Quora post What is the best deep learning library at the current stage for working on large data? is quite insightful as an overview.
- There is a nice round-up on Teglor titled Deep Learning Libraries by Language.
- DeepLearning.net has a nice list of deep learning software.
- On reddit there is a great discussion titled Best framework for Deep Neural Nets?
- A list of open source deep learning projects titled 100 Best GitHub: Deep Learning.
In this post you discovered the most popular deep learning tools and libraries.
Specifically, you learned about:
Have you used one or more of these tools? Let me know what you think about them in the comments.
Do you have any questions about deep learning or the libraries listed in this post? Ask your question in the comments and I will do my best to answer it.