11 Classical Time Series Forecasting Methods in Python (Cheat Sheet)

Machine learning methods can be used for classification and forecasting on time series problems.

Before exploring machine learning methods for time series, it is a good idea to ensure you have exhausted classical linear time series forecasting methods. Classical methods may be focused on linear relationships; nevertheless, they are sophisticated and perform well on a wide range of problems, assuming your data is suitably prepared and the method is well configured.

In this post, you will discover a suite of classical methods for time series forecasting that you can test on your forecasting problem before exploring machine learning methods.

The post is structured as a cheat sheet to give you just enough information on each method to get started with a working code example and where to look to get more information on the method.

All code examples are in Python and use the Statsmodels library. The APIs for this library can be tricky for beginners (trust me!), so having a working code example as a starting point will greatly accelerate your progress.

This is a large post; you may want to bookmark it.

Let’s get started.

Photo by Ron Reiring, some rights reserved.

Overview

This cheat sheet demonstrates 11 different classical time series forecasting methods; they are:

  1. Autoregression (AR)
  2. Moving Average (MA)
  3. Autoregressive Moving Average (ARMA)
  4. Autoregressive Integrated Moving Average (ARIMA)
  5. Seasonal Autoregressive Integrated Moving-Average (SARIMA)
  6. Seasonal Autoregressive Integrated Moving-Average with Exogenous Regressors (SARIMAX)
  7. Vector Autoregression (VAR)
  8. Vector Autoregression Moving-Average (VARMA)
  9. Vector Autoregression Moving-Average with Exogenous Regressors (VARMAX)
  10. Simple Exponential Smoothing (SES)
  11. Holt-Winters Exponential Smoothing (HWES)

Did I miss your favorite classical time series forecasting method?
Let me know in the comments below.

Each method is presented in a consistent manner.

This includes:

  • Description. A short and precise description of the technique.
  • Python Code. A short working example of fitting the model and making a prediction in Python.
  • More Information. References for the API and the algorithm.

Each code example is demonstrated on a simple contrived dataset that may or may not be appropriate for the method. Replace the contrived dataset with your data in order to test the method.

Remember: each method will require tuning to your specific problem. In many cases, I already have examples of how to configure and even grid search parameters on the blog; try the search function.

If you find this cheat sheet useful, please let me know in the comments below.

Autoregression (AR)

The autoregression (AR) method models the next step in the sequence as a linear function of the observations at prior time steps.

The notation for the model involves specifying the order of the model p as a parameter to the AR function, e.g. AR(p). For example, AR(1) is a first-order autoregression model.

The method is suitable for univariate time series without trend and seasonal components.

Python Code

More Information

Moving Average (MA)

The moving average (MA) method models the next step in the sequence as a linear function of the residual errors from a mean process at prior time steps.

A moving average model is different from calculating the moving average of the time series.

The notation for the model involves specifying the order of the model q as a parameter to the MA function, e.g. MA(q). For example, MA(1) is a first-order moving average model.

The method is suitable for univariate time series without trend and seasonal components.

Python Code

We can create an MA model by fitting an ARMA-style model with a zeroth-order AR component; in recent versions of Statsmodels this is done with the ARIMA class. We must specify the order of the MA model in the order argument.

More Information

Autoregressive Moving Average (ARMA)

The Autoregressive Moving Average (ARMA) method models the next step in the sequence as a linear function of the observations and residual errors at prior time steps.

It combines both Autoregression (AR) and Moving Average (MA) models.

The notation for the model involves specifying the order for the AR(p) and MA(q) models as parameters to an ARMA function, e.g. ARMA(p, q). An ARMA model can be used to develop AR or MA models.

The method is suitable for univariate time series without trend and seasonal components.

Python Code

More Information

Autoregressive Integrated Moving Average (ARIMA)

The Autoregressive Integrated Moving Average (ARIMA) method models the next step in the sequence as a linear function of the differenced observations and residual errors at prior time steps.

It combines both Autoregression (AR) and Moving Average (MA) models as well as a differencing pre-processing step of the sequence to make the sequence stationary, called integration (I).

The notation for the model involves specifying the order for the AR(p), I(d), and MA(q) models as parameters to an ARIMA function, e.g. ARIMA(p, d, q). An ARIMA model can also be used to develop AR, MA, and ARMA models.

The method is suitable for univariate time series with trend and without seasonal components.

Python Code

More Information

Seasonal Autoregressive Integrated Moving-Average (SARIMA)

The Seasonal Autoregressive Integrated Moving Average (SARIMA) method models the next step in the sequence as a linear function of the differenced observations, errors, differenced seasonal observations, and seasonal errors at prior time steps.

It combines the ARIMA model with the ability to perform the same autoregression, differencing, and moving average modeling at the seasonal level.

The notation for the model involves specifying the order for the AR(p), I(d), and MA(q) models as parameters to an ARIMA function and AR(P), I(D), MA(Q) and m parameters at the seasonal level, e.g. SARIMA(p, d, q)(P, D, Q)m where “m” is the number of time steps in each season (the seasonal period). A SARIMA model can be used to develop AR, MA, ARMA and ARIMA models.

The method is suitable for univariate time series with trend and/or seasonal components.

Python Code

More Information

Seasonal Autoregressive Integrated Moving-Average with Exogenous Regressors (SARIMAX)

The Seasonal Autoregressive Integrated Moving-Average with Exogenous Regressors (SARIMAX) is an extension of the SARIMA model that also includes the modeling of exogenous variables.

Exogenous variables are also called covariates and can be thought of as parallel input sequences that have observations at the same time steps as the original series. The primary series may be referred to as endogenous data to contrast it with the exogenous sequence(s). The observations for exogenous variables are included in the model directly at each time step and are not modeled in the same way as the primary endogenous sequence (e.g. as an AR, MA, etc. process).

The SARIMAX method can also be used to model the subsumed models with exogenous variables, such as ARX, MAX, ARMAX, and ARIMAX.

The method is suitable for univariate time series with trend and/or seasonal components and exogenous variables.

Python Code

More Information

Vector Autoregression (VAR)

The Vector Autoregression (VAR) method models the next step in each time series using an AR model. It is the generalization of AR to multiple parallel time series, e.g. multivariate time series.

The notation for the model involves specifying the order for the AR(p) model as parameters to a VAR function, e.g. VAR(p).

The method is suitable for multivariate time series without trend and seasonal components.

Python Code

More Information

Vector Autoregression Moving-Average (VARMA)

The Vector Autoregression Moving-Average (VARMA) method models the next step in each time series using an ARMA model. It is the generalization of ARMA to multiple parallel time series, e.g. multivariate time series.

The notation for the model involves specifying the order for the AR(p) and MA(q) models as parameters to a VARMA function, e.g. VARMA(p, q). A VARMA model can also be used to develop VAR or VMA models.

The method is suitable for multivariate time series without trend and seasonal components.

Python Code

More Information

Vector Autoregression Moving-Average with Exogenous Regressors (VARMAX)

The Vector Autoregression Moving-Average with Exogenous Regressors (VARMAX) is an extension of the VARMA model that also includes the modeling of exogenous variables. It is a multivariate version of the ARMAX method.

Exogenous variables are also called covariates and can be thought of as parallel input sequences that have observations at the same time steps as the original series. The primary series are referred to as endogenous data to contrast them with the exogenous sequence(s). The observations for exogenous variables are included in the model directly at each time step and are not modeled in the same way as the primary endogenous sequences (e.g. as an AR, MA, etc. process).

The VARMAX method can also be used to model the subsumed models with exogenous variables, such as VARX and VMAX.

The method is suitable for multivariate time series without trend and seasonal components, and with exogenous variables.

Python Code

More Information

Simple Exponential Smoothing (SES)

The Simple Exponential Smoothing (SES) method models the next time step as an exponentially weighted linear function of observations at prior time steps.

The method is suitable for univariate time series without trend and seasonal components.

Python Code

More Information

Holt-Winters Exponential Smoothing (HWES)

The Holt-Winters Exponential Smoothing (HWES) method, also called Triple Exponential Smoothing, models the next time step as an exponentially weighted linear function of observations at prior time steps, taking trend and seasonality into account.

The method is suitable for univariate time series with trend and/or seasonal components.

Python Code

More Information

Further Reading

This section provides more resources on the topic if you are looking to go deeper.

Summary

In this post, you discovered a suite of classical time series forecasting methods that you can test and tune on your time series dataset.

Did I miss your favorite classical time series forecasting method?
Let me know in the comments below.

Did you try any of these methods on your dataset?
Let me know about your findings in the comments.

Do you have any questions?
Ask your questions in the comments below and I will do my best to answer.


90 Responses to 11 Classical Time Series Forecasting Methods in Python (Cheat Sheet)

  1. Adriena Welch August 6, 2018 at 3:20 pm #

    Hi Jason, thanks for such an excellent and comprehensive post on time series. I sincerely appreciate your effort. As you ask for the further topic, just wondering if I can request you for a specific topic I have been struggling to get an output. It’s about Structural Dynamic Factor model ( SDFM) by Barigozzi, M., Conti, A., and Luciani, M. (Do euro area countries respond asymmetrically to the common monetary policy) and Mario Forni Luca Gambetti (The Dynamic Effects of Monetary Policy: A Structural Factor Model Approach). Would it be possible for you to go over and estimate these two models using Python or R? It’s just a request from me and sorry if it doesn’t go with your interest.

    • Jason Brownlee August 7, 2018 at 6:23 am #

      Thanks for the suggestion. I’ve not heard of that method before.

  2. Kamal Singh August 6, 2018 at 6:19 pm #

    I am working on Time series or Prediction with neural network and SVR, I want to this in matlab by scratch can you give me the references of this materials
    Thank you in advance

    • Jason Brownlee August 7, 2018 at 6:26 am #

      Sorry, I don’t have any materials for matlab, it is only really used in universities.

  3. Catalin August 6, 2018 at 8:50 pm #

Hi Jason! From which editor do you import the python code into the webpage of your article? Or what kind of container is that windowed control used to display the python code?

  4. Mike August 7, 2018 at 2:28 am #

    Thanks for all the things to try!

    I recently stumbled over some tasks where the classic algorithms like linear regression or decision trees outperformed even sophisticated NNs. Especially when boosted or averaged out with each other.

    Maybe its time to try the same with time series forecasting as I’m not getting good results for some tasks with an LSTM.

    • Jason Brownlee August 7, 2018 at 6:30 am #

      Always start with simple methods before trying more advanced methods.

The complexity of advanced methods must be justified by additional predictive skill.

  5. Elie Kawerk August 7, 2018 at 2:36 am #

    Hi Jason,

    Thanks for this nice post!

    You’ve imported the sin function from math many times but have not used it.

    I’d like to see more posts about GARCH, ARCH and co-integration models.

    Best,
    Elie

    • Jason Brownlee August 7, 2018 at 6:30 am #

      Thanks, fixed.

      I have a post on ARCH (and friends) scheduled.

  6. Elie Kawerk August 7, 2018 at 2:38 am #

    Will you consider writing a follow-up book on advanced time-series models soon?

    • Jason Brownlee August 7, 2018 at 6:32 am #

      Yes, it is written. I am editing it now. The title will be “Deep Learning for Time Series Forecasting”.

      CNNs are amazing at time series, and CNNs + LSTMs together are really great.

      • Elie Kawerk August 7, 2018 at 6:40 am #

        will the new book cover classical time-series models like VAR, GARCH, ..?

        • Jason Brownlee August 7, 2018 at 2:29 pm #

          The focus is deep learning (MLP, CNN and LSTM) with tutorials on how to get the most from classical methods (Naive, SARIMA, ETS) before jumping into deep learning methods. I hope to have it done by the end of the month.

          • Elie Kawerk August 7, 2018 at 5:02 pm #

            This is great news! Don’t you think that R is better suited than Python for classical time-series models?

          • Jason Brownlee August 8, 2018 at 6:15 am #

            Perhaps generally, but not if you are building a system for operational use. I think Python is a better fit.

          • Dark7wind August 9, 2018 at 7:16 am #

            Great to hear this news. May I ask if the book also cover the topic of multivariate and multistep?

          • Jason Brownlee August 9, 2018 at 7:34 am #

            Yes, there are many chapters on multi-step and most chapters work with multivariate data.

      • Søren August 7, 2018 at 10:27 pm #

Sounds amazing that you finally 😉 are getting the new book out on time-series models – when will it be available to buy?

        • Jason Brownlee August 8, 2018 at 6:20 am #

          Thanks. I hope by the end of the month or soon after.

  7. Arun Mishra August 10, 2018 at 5:25 am #

    I use Prophet.
    https://facebook.github.io/prophet/docs/quick_start.html

    Also, sometimes FastFourier Transformations gives a good result.

    • Jason Brownlee August 10, 2018 at 6:21 am #

      Thanks.

      • AJ Rader August 16, 2018 at 7:11 am #

        I would second the use of prophet, especially in the context of shock events — this is where this approach has a unique advantage.

  8. Ravi Rokhade August 10, 2018 at 5:19 pm #

    What are the typical application domain of these algos?

  9. Alberto Garcia Galindo August 11, 2018 at 12:14 am #

    Hi Jason!
    Firstly I congratulate you for your blog. It is helping me a lot in my final work on my bachelor’s degree in Statistics!
    What are the assumptions for make forecasting on time series using Machine Learning algorithms? For example, it must to be stationary? Thanks!

    • Jason Brownlee August 11, 2018 at 6:11 am #

      Gaussian error, but they work anyway if you violate assumptions.

      The methods like SARIMA/ETS try to make the series stationary as part of modeling (e.g. differencing).

      You may want to look at power transforms to make data more Gaussian.

  10. Neeraj August 12, 2018 at 4:55 pm #

    Hi Jason
    I’m interested in forecasting the temperatures
    I’m provided with the previous data of the temperature
    Can you suggest me the procedure I should follow in order to solve this problem

    • Jason Brownlee August 13, 2018 at 6:15 am #

      Yes, an SARIMA model would be a great place to start.

  11. Den August 16, 2018 at 12:15 am #

    Hey Jason,

    Cool stuff as always. Kudos to you for making me a ML genius!

    Real quick:
    How would you combine VARMAX with an SVR in python?

    Elaboration.
    Right now I am trying to predict a y-value, and have x1…xn variables.
    The tricky part is, the rows are grouped.
    So, for example.

    If the goal is to predict the price of a certain car in the 8th year, and I have data for 1200 cars, and for each car I have x11_xnm –> y1_xm data (meaning that let’s say car_X has data until m=10 years and car_X2 has data until m=3 years, for example).

    First I divide the data with the 80/20 split, trainset/testset, here the first challenge arises. How to make the split?? I chose to split the data based on the car name, then for each car I gathered the data for year 1 to m. (If this approach is wrong, please tell me) The motivation behind this, is that the 80/20 could otherwise end up with data of all the cars of which some would have all the years and others would have none of the years. aka a very skewed distribution.

    Then I create a model using an SVR, with some parameters.
    And then I try to predict the y-values of a certain car. (value in year m)

    However, I do not feel as if I am using the time in my prediction. Therefore, I turned to VARMAX.

    Final question(s).
    How do you make a time series prediction if you have multiple groups [in this case 1200 cars, each of which have a variable number of years(rows)] to make the model from?
    Am I doing right by using the VARMAX or could you tell me a better approach?

    Sorry for the long question and thank you for your patience!

    Best,

    Den

    • Jason Brownlee August 16, 2018 at 6:09 am #

      You can try model per group or across groups. Try both and see what works best.

      Compare a suite of ml methods to varmax and use what performs the best on your dataset.

  12. Petrônio Cândido August 16, 2018 at 6:36 am #

    Hi Jason!

    Excellent post! I also would like to invite you to know the Fuzzy Time Series, which are data driven, scalable and interpretable methods to analyze and forecast time series data. I have recently published a python library for that on http://petroniocandido.github.io/pyFTS/ .

    All feedbacks are welcome! Thanks in advance!

  13. Chris Phillips August 30, 2018 at 8:19 am #

    Hi Jason,

    Thank you so much for the many code examples on your site. I am wondering if you can help an amatur like me on something.

    When I pull data from our database, I generally do it for multiple SKU’s at the same time into a large table. Considering that there are thousands of unique SKU’s in the table, is there a methodology you would recommend for generating a forecast for each individual SKU? My initial thought is to run a loop and say something to the effect of: For each in SKU run…Then the VAR Code or the SARIMA code.

    Ideally I’d love to use SARIMA, as I think this works the best for the data I am looking to forecast, but if that is only available to one SKU at a time and VAR is not constrained by this, it will work as well. If there is a better methodology that you know of for these, I would gladly take this advice as well!

    Thank you so much!

  14. Eric September 6, 2018 at 6:32 am #

    Great post. I’m currently investigating a state space approach to forecasting. Dynamic Linear Modeling using a Kálmán Filter algorithm (West, Hamilton). There is a python package, pyDLM, that looks promising, but it would be great to hear your thoughts on this package and this approach.

    • Jason Brownlee September 6, 2018 at 2:07 pm #

      Sounds good, I hope to cover state space methods in the future. To be honest, I’ve had limited success but also limited exposure with the methods.

      Not familiar with the lib. Let me know how you go with it.

  15. Roberto Tomás September 27, 2018 at 7:38 am #

    Hi Jason, I noticed using VARMAX that I had to remove seasonality — enforcing stationarity .. now I have test and predictions data that I cannot plot (I can, but it doesn’t look right _at all_). I’m wondering if there are any built-ins that handle translation to and from seasonality for me? My notebook is online: https://nbviewer.jupyter.org/github/robbiemu/location-metric-data/blob/master/appData%20and%20locationData.ipynb

  16. Sara October 2, 2018 at 7:36 am #

    Thanks for your great tutorial posts. This one was very helpful. I am wondering if there is any method that is suitable for multivariate time series with a trend or/and seasonal components?

    • Jason Brownlee October 2, 2018 at 11:03 am #

      Yes, you can try MLPs, CNNs and LSTMs.

      You can experiment with each with and without data prep to make the series stationary.

      • Sara October 3, 2018 at 1:48 am #

        Thanks for your respond. I also have another question I would appreciate if you help me.
        I have a dataset which includes multiple time series variables which are not stationary and seems that these variables are not dependent on each other. I tried ARIMA for each variable column, also VAR for the pair of variables, I expected to get better result with ARIMA model (for non-stationarity of time series) but VAR provides much better prediction. Do you have any thought why?

        • Jason Brownlee October 3, 2018 at 6:20 am #

          No, go with the method that gives the best performance.

  17. Eric October 17, 2018 at 9:52 am #

    Hi Jason,

    In the (S/V)ARIMAX procedure, should I check to see if my exogenous regressors are stationary and difference if them if necessary before fitting?

    Y = data2 = [x + random() for x in range(101, 200)]
    X = data1 = [x + random() for x in range(1, 100)]

    If I don’t, then I can’t tell if a change in X is related to a change in Y, or if they are both just trending with time. The time trend dominates as 0 <= random() <= 1

    In R, Hyndman recommends "[differencing] all variables first as estimation of a model with non-stationary errors is not consistent and can lead to “spurious regression”".

    https://robjhyndman.com/hyndsight/arimax/

    Does SARIMAX handle this automatically or flag me if I have non-stationary regressors?

    Thanks

    • Jason Brownlee October 17, 2018 at 2:27 pm #

      No, the library will not do this for you. Differencing is only performed on the provided series, not the exogenous variables.

      Perhaps try with and without and use the approach that results in the lowest forecast error for your specific dataset.

  18. Andrew K October 23, 2018 at 9:09 am #

    Hi Jason,

    Thank you for this wonderful tutorial.

    I do have a question regarding data that isn’t continuous, for example, data that can only be measured during daylight hours. How would you approach a time series analysis (forecasting) with data that has this behavior? Fill non-daylight hour data with 0’s or nan’s?

    Thanks.

  19. Khalifa Ali October 23, 2018 at 4:48 pm #

    Hey..
    Kindly Help us in making hybrid forecasting techniques.
    Using two forecasting technique and make a hybrid technique from them.
    Like you may use any two techniques mentioned above and make a hybrid technique form them.
    Thanks.

    • Jason Brownlee October 24, 2018 at 6:25 am #

      Sure, what problem are you having with using multiple methods exactly?

  20. Mohammad Alzyout October 31, 2018 at 6:25 pm #

    Thank you for your excellent and clear tutorial.

    I wondered which is the best way to forecast the next second Packet Error Rate in DSRC network for safety messages exchange between vehicles to decide the best distribution over Access Categories of EDCA.

    I hesitated to choose between LSTM or ARMA methodology.

    Could you please guide me to the better method of them ?

    Kindly, note that I’m beginner in both methods and want to decide the best one to go deep with it because I don’t have enouph time to learn both methods especially they are as I think from different backgrounds.

    Thank you in advance.

    Best regards,
    Mohammad.

    • Jason Brownlee November 1, 2018 at 6:03 am #

      I recommend testing a suite of methods in order to discover what works best for your specific problem.

  21. Jawad November 8, 2018 at 12:33 am #

    Hi Jason,
    Thanks for great post. I have 2 questions. First, is there a way to calculate confidence intervals in HWES, because i could not find any way in the documentation. And second, do we have something like ‘nnetar’ R’s neural network package for time series forecasting available in python.
    Regards

  22. Jawad Iqbal November 22, 2018 at 8:46 am #

    Thanks for your reply Jason. “nnetar” is a function in R,
    https://www.rdocumentation.org/packages/forecast/versions/8.4/topics/nnetar
    it is used for time series forecasting. I could not find anything similar in Python.
    but now i am using your tutorial of LSTM for time series forecasting.
    And i am facing an issue that my data points are 750. and when i do prediction the way you have mentioned i.e. feed the one step forecast back to the new forecast step. So, the plot of my forecasting is just the repetition of my data. Forecast look just like the cyclic repetition of the training data. I don’t know what am i missing.

  23. Rima December 4, 2018 at 9:59 pm #

    Hi Jason,
    Thank you for this great post!
    In VARMAX section, at the end you wrote:
    “The method is suitable for univariate time series without trend and seasonal components and exogenous variables.”
    I understand from the description of VARMAX that it takes as input, multivariate time series and exogenous variables. No?
    Another question, can we use the seasonal_decompose (https://www.statsmodels.org/dev/generated/statsmodels.tsa.seasonal.seasonal_decompose.html) function in python to remove the seasonality and transform our time series to stationary time series? If so, is the result residual (output of seasonal_decompose) is what are we looking for?

    Thanks!
    Rima

    • Jason Brownlee December 5, 2018 at 6:16 am #

      Thanks, fixed.

      • Rima December 11, 2018 at 9:36 pm #

        What about Seasonal_decompose method? Do we use residual result or the trend?

        • Jason Brownlee December 12, 2018 at 5:53 am #

          Sorry, I don’t understand, perhaps you can elaborate your question?

          • Rima December 12, 2018 at 7:36 pm #

            The seasonal_decompose function implemented in python gives us 4 resutls: the original data, the seasonal component, the trend component and the residual component. Which component should we use to forecast this curve? the residual or the trend component?

          • Jason Brownlee December 13, 2018 at 7:50 am #

            I generally don’t recommend using the decomposed elements in forecasting. I recommend performing the transforms on your data yourself.

  24. Lucky December 5, 2018 at 12:49 am #

    Hi Jason,

    Could you please help me list down the names of all the models available to forecast a univariate time series?

    Thanks!

  25. Jane December 6, 2018 at 5:21 am #

    Hi Jason,

    Thank you this was super helpful!

    For the AR code, is there any modification I can make so that model predicts multiple periods as opposed to the next one? For example, if am using a monthly time series, and have data up until August 2018, the AR predicts September 2018. Can it predict September 2018, October, 2018, and November 2018 based on the same model and give me these results?

  26. Esteban December 21, 2018 at 6:56 am #

    Hi, thank you so much for your post.
    I have a question, have you used or have you any guidelines for the use of neural networks in forescating time series, using CNN and LSTMboth together?

  27. mk December 22, 2018 at 10:34 pm #

    All methods have common problems. In real life, we do not need to predict the sample data. The sample data already contains the values of the next moment. The so-called prediction is only based on a difference, time lag. That is to say, the best prediction is performance delay. If we want to predict the future, we don’t know the value of the current moment. How do we predict? Or maybe we have collected the present and past values, trained for a long time, and actually the next moment has passed. What need do we have to predict?

    • Jason Brownlee December 23, 2018 at 6:06 am #

      You can frame the problem any way you wish, e.g. carefully define what inputs you have and what output you want to predict, then fit a model to achieve that.

  28. Dr. Omar January 2, 2019 at 1:40 am #

    Dear Jason : your post and book look interesting , I am interested in forecasting a daily close price for a stock market or any other symbol, data collected is very huge and contain each price ( let’s say one price for each second) , can you briefly tell how we can predict this in general and if your book and example codes if applied will yield to future data.
    can we after inputting our data and producing the plot for the past data , can we extend the time series and get the predicted priced for next day/month /year , please explain

  29. AD January 4, 2019 at 7:51 pm #

    Hi Jason,

    Thank you for this great post.
    I have a requirement of predicting receipt values for open invoices of various customers. I am taking closed invoices – whose receipt amount is used to create training data and open invoices as test data.

    Below is the list of columns I will be getting as raw data
    For Test Data – RECEIPT_AMOUNT, RECEIPT_DATE will be blank, depicting Open Invoices
    For Training Data – Closed Invoices will have receipt amount and receipt date

    CUSTOMER_NUMBER
    CUSTOMER_TRX_ID
    INVOICE_NUMBER
    INVOICE_DATE
    RECEIPT_AMOUNT
    BAL_AMOUNT
    CUSTOMER_PROFILE
    CITY_STATE
    STATE
    PAYMENT_TERM
    DUE_DATE
    PAYMENT_METHOD
    RECEIPT_DATE

    It would be a great help if you can guide me which algo be suitable for this requirement. I think a multivariate method can satisfy this requirement

    Thanks,
    AD

  30. Marius January 8, 2019 at 7:17 am #

    Hi Jason,

    Are STAR models relevant here as well?

    Kindest
    Marius

  31. Heracles January 11, 2019 at 8:35 pm #

    Hi Jason,

    Thanks for this.
    I want to forecast whether an event would happen or not. Would that SARMAR actually work work if we have a binary column in it?
    How would I accomplish something like this including the time?

  32. Gary Morton January 13, 2019 at 5:00 am #

    Good morning

A quality cheat sheet for time series, which I took time to re-create and decided to try to augment by adding code snippets for ARCH and GARCH

    It did not take long to realize that Statsmodels does not have an ARCH function, and a Google search took me directly to:

    https://machinelearningmastery.com/develop-arch-and-garch-models-for-time-series-forecasting-in-python/

    Great work =) I thought to include it here as I did not see a direct link, aside from your comment above about considering an ARCH and GARCH module.

    Also for reference:

    LSTM time series model
    https://machinelearningmastery.com/how-to-develop-lstm-models-for-multi-step-time-series-forecasting-of-household-power-consumption/

    MLP and Keras Time Series
    https://machinelearningmastery.com/time-series-prediction-with-deep-learning-in-python-with-keras/

    Cheers and thank you
    -GM
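For readers who, like Gary, want to see what ARCH adds before reaching for a dedicated package, a minimal ARCH(1) simulation in plain NumPy shows the structure the linked post fits: the variance of each shock depends on the previous squared shock. This is only an illustrative sketch; the parameter values omega and alpha below are made up, not fitted.

```python
import numpy as np

# ARCH(1): sigma2_t = omega + alpha * e_{t-1}^2,  e_t = sqrt(sigma2_t) * z_t
rng = np.random.default_rng(1)
omega, alpha = 0.2, 0.5   # illustrative values; alpha < 1 for stationarity
n = 1000

e = np.zeros(n)
sigma2 = np.zeros(n)
sigma2[0] = omega / (1 - alpha)  # start at the unconditional variance

for t in range(1, n):
    sigma2[t] = omega + alpha * e[t - 1] ** 2       # conditional variance
    e[t] = np.sqrt(sigma2[t]) * rng.standard_normal()  # heteroskedastic shock

# The sample variance of e should sit near omega / (1 - alpha) = 0.4,
# while the conditional variance sigma2 clusters through time.
```

Fitting such a model in practice is what the `arch` package (covered in the linked post) automates.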

  33. Mahmut COLAK January 14, 2019 at 10:08 am #

    Hi Jason,

    Thank you very much for this post. I have a time series problem but I can’t find a suitable technique to apply. My dataset includes multiple inputs and one output, like multiple linear regression, but it also has a timestamp. Which algorithm is the best solution for my problem?

    Thanks.

  34. Mahmut COLAK January 14, 2019 at 10:16 am #

    Hi Jason,

    I have a problem with time series data.
    My dataset includes multiple inputs and one output.
    Normally this would be multiple linear regression, but it also has a timestamp 🙁
    So I can’t find a solution or algorithm.
    For example: AR, MA, ARIMA, ARIMAX, VAR, SARIMAX, etc.
    Which one is best for my problem?

    Thanks.

    • Jason Brownlee January 14, 2019 at 11:15 am #

      I recommend testing a suite of methods and discovering what works best for your specific dataset.

  35. RedzCh January 18, 2019 at 11:12 pm #

    One thing: are there any methods to do grouped forecasting by key or category, so that you end up with many forecasts? There is something like this in R, to an extent.

    • Jason Brownlee January 19, 2019 at 5:41 am #

      I’m not sure I follow, can you elaborate please?
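If the question is about producing one forecast per key, the usual pandas pattern is a groupby that applies a small model to each group and collects the results. Below is a minimal sketch with a naive drift forecast standing in for any per-group model; the data and column names are made up for illustration.

```python
import pandas as pd

# Toy panel: one short time series per key.
df = pd.DataFrame({
    "key":   ["a"] * 4 + ["b"] * 4,
    "t":     [1, 2, 3, 4] * 2,
    "value": [10, 12, 14, 16, 5, 5, 6, 6],
})

def naive_drift(series):
    # Last value plus the average step: a placeholder for any
    # per-group model (AR, ARIMA, etc.).
    steps = series.diff().dropna()
    return series.iloc[-1] + steps.mean()

# One forecast per key: a -> 18.0, b -> 6.333...
forecasts = df.sort_values("t").groupby("key")["value"].apply(naive_drift)
```

Swapping `naive_drift` for a function that fits a proper model gives the "lots of forecasts" behavior asked about.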

  36. Rodrigo January 18, 2019 at 11:15 pm #

    First of all, I have read two of your books (Basics_for_Linear_Algebra_for_Machine_Learning and deep_learning_time_series_forecasting), and the simplicity with which you explain difficult concepts is brilliant. I’m using the second one to tackle the problem that I present below.
    I’m facing a prediction problem for food alerts. The goal is to predict the variables of the most probable alert in the next x days (any information I could get about future alerts would also be really useful). Alerts are recorded over time, so it’s a time series problem.
    The problem is that observations are not uniform over time (not separated by equal time lapses): since alerts are only recorded when they happen, there can be one day without alerts and another with 50. As you indicate in your book, it is a discontiguous time series.

    The input to the model could be the alerts of the last x days (each alert correctly encoded, as they are categorical variables), but this input must have a fixed size/format. Since the time windows don’t contain the same number of alerts, I don’t know the correct way to deal with this.

    Any data formatting suggestions to make the observations uniform over time?

    Or should I just face the problem in a different way (different inputs)?

    Thank you for your great work.

    • Jason Brownlee January 19, 2019 at 5:44 am #

      Sounds like a great problem!

      There are many ways to frame and model the problem and I would encourage you to explore a number and discover what works best.

      First, you need to confirm that you have data that can be used to predict the outcome: is it temporally dependent, or, whatever it is dependent upon, can the model get access to that?

      Then, perhaps explore modeling it as a time series classification problem, e.g. will the event occur in this interval? Explore different interval sizes and different input history sizes and see what works.

      Let me know how you go.
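One common first step for the interval framing described above is to resample the irregular event timestamps onto a fixed grid, deriving a count and a binary target per interval. A minimal pandas sketch, with made-up dates for illustration:

```python
import pandas as pd

# Irregular alert timestamps: two on day 1, none on day 2, one on day 3.
events = pd.to_datetime([
    "2019-01-01 09:00", "2019-01-01 14:30",
    "2019-01-03 11:00",
])
s = pd.Series(1, index=events)

# Resampling to daily bins makes the series uniform; empty days become 0.
daily_counts = s.resample("D").sum()

# A binary target ("did any alert occur?") for a classification framing.
target = (daily_counts > 0).astype(int)
# target.tolist() -> [1, 0, 1]
```

Categorical alert attributes could be aggregated per interval in the same way (e.g. counts per category) to get the fixed-size input the commenter asks about.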
