Yes. I can provide an invoice that you can use for reimbursement from your company or for tax purposes. Please contact me directly with your purchase details: the name of the book or bundle that you purchased; the email address that you used to make the purchase; and, ideally, the order number in your purchase receipt email. Your full […]
Search results for "Machine Learning"
I hope to get to it eventually. Until then, contact me and let me know which topic you want me to cover.
You can get started using deep learning methods such as MLPs, CNNs and LSTMs for univariate, multivariate and multi-step time series forecasting here: Deep Learning for Time Series Forecasting
The LSTM expects data to be provided as a three-dimensional array with the dimensions [samples, time steps, features]. Learn more about how to reshape your data in this tutorial: How to Reshape Input Data for Long Short-Term Memory Networks in Keras. For a reusable function that you can use to transform a univariate or multivariate […]
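As a minimal sketch of that reshape, the snippet below frames a small univariate series (hypothetical data) as overlapping input windows and reshapes them into the [samples, time steps, features] layout an LSTM layer expects:

```python
import numpy as np

# A univariate series of 10 observations (hypothetical data).
series = np.arange(10, dtype=float)

# Frame as supervised learning: windows of 3 time steps, each
# predicting the next observation in the series.
window = 3
X = np.array([series[i:i + window] for i in range(len(series) - window)])
y = series[window:]

# The LSTM expects [samples, time steps, features]; a univariate
# series has a single feature, so the last dimension is 1.
X = X.reshape((X.shape[0], window, 1))
print(X.shape)  # (7, 3, 1)
```

The same reshape applies to multivariate data, where the final dimension becomes the number of parallel series instead of 1.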
Sorry, I do not have material on time series forecasting in R. I do have a book on time series in Python. There are already some great books on time series forecasting in R; for example, see this post: Top Books on Time Series Forecasting With R
Arrays with different sizes cannot be added, subtracted, or generally used in arithmetic. One way to overcome this is to duplicate the smaller array so that it has the same dimensionality and size as the larger array. This is called array broadcasting and is available in NumPy when performing array arithmetic, which can greatly reduce […]
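A minimal sketch of broadcasting in action: NumPy stretches the length-3 vector across each row of the 3x3 matrix, without ever materializing the duplicated copies.

```python
import numpy as np

# A (3, 3) matrix and a length-3 vector.
A = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 9]])
b = np.array([10, 20, 30])

# NumPy broadcasts b against every row of A, as if b had been
# tiled up to shape (3, 3) before the addition.
C = A + b
print(C)
# [[11 22 33]
#  [14 25 36]
#  [17 28 39]]
```

Broadcasting compares the shapes from the trailing dimension backward; two dimensions are compatible when they are equal or one of them is 1.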
Real-world time series forecasting is challenging for a whole host of reasons, not limited to problem features such as having multiple input variables, the requirement to predict multiple time steps, and the need to perform the same type of prediction for multiple physical sites. In this post, you will discover a standardized yet complex time […]
Natural language processing tasks, such as caption generation and machine translation, involve generating sequences of words. Models developed for these problems often operate by generating probability distributions across the vocabulary of output words and it is up to decoding algorithms to sample the probability distributions to generate the most likely sequences of words. In this […]
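To make the decoding step concrete, here is a small sketch over a toy sequence of per-step probability distributions (hypothetical values; a real model would produce these). It contrasts greedy decoding, which picks the most likely word at each step, with a simple beam search that tracks the k most probable whole sequences:

```python
import numpy as np

# Toy model output: 5 decoding steps, each a distribution over a
# vocabulary of 4 words (hypothetical probabilities).
probs = np.array([[0.1, 0.5, 0.3, 0.1],
                  [0.6, 0.2, 0.1, 0.1],
                  [0.1, 0.1, 0.7, 0.1],
                  [0.2, 0.2, 0.2, 0.4],
                  [0.5, 0.3, 0.1, 0.1]])

# Greedy decoding: take the argmax word at every step.
greedy = [int(np.argmax(p)) for p in probs]
print(greedy)  # [1, 0, 2, 3, 0]

def beam_search(probs, k):
    # Each candidate is (sequence, score), where score is the
    # negative log-probability (lower is more likely).
    sequences = [([], 0.0)]
    for row in probs:
        candidates = []
        for seq, score in sequences:
            for j, p in enumerate(row):
                candidates.append((seq + [j], score - np.log(p)))
        # Keep only the k most probable sequences so far.
        sequences = sorted(candidates, key=lambda t: t[1])[:k]
    return sequences

best_seq, best_score = beam_search(probs, k=3)[0]
```

In this toy example the steps are independent, so greedy and beam search agree; with a real autoregressive model, beam search can find sequences that greedy decoding misses.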
Caption generation is a challenging artificial intelligence problem that draws on both computer vision and natural language processing. The encoder-decoder recurrent neural network architecture has been shown to be effective at this problem. The implementation of this architecture can be distilled into inject and merge based models, and both make different assumptions about the role […]
Exploding gradients are a problem where large error gradients accumulate and result in very large updates to neural network model weights during training. This has the effect of making your model unstable and unable to learn from your training data. In this post, you will discover the problem of exploding gradients with deep artificial neural […]
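A common remedy for exploding gradients is gradient norm clipping: if the gradient's L2 norm exceeds a threshold, rescale it to that threshold before the weight update. Below is a minimal standalone sketch of the idea (the helper name `clip_by_norm` is an assumption for illustration, not a specific library API):

```python
import numpy as np

def clip_by_norm(grad, max_norm):
    # Rescale the gradient vector when its L2 norm exceeds max_norm,
    # preserving its direction but capping the update magnitude.
    norm = np.linalg.norm(grad)
    if norm > max_norm:
        grad = grad * (max_norm / norm)
    return grad

g = np.array([30.0, 40.0])        # L2 norm is 50
print(clip_by_norm(g, 5.0))       # [3. 4.] -- rescaled to norm 5
```

Deep learning libraries expose the same idea as a built-in option on their optimizers, so you rarely need to implement it by hand.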