Awesome and informative work with really creative visuals. The graphs and plots were impressive and visually appealing. As a suggestion, though, you may improve the structure by adding headers to separate each section.
Glad you found it inspiring. Parameter tuning isn't always worth spending a lot of time on, at least at first. Once you've got your model set up, you can use an auto-tuner like Optuna and it will tune parameters automatically.
Great tutorial! Learned a lot from these time series forecasting videos. Could you please also make a video on how to detect anomalies in this time series data? Such as using the trained model to detect possible outliers of energy use in your example data set. Thanks!!
Thanks. Good idea. Usually abnormalities in data like this are assessed by checking for values outside some distance from the mean value; 3 standard deviations is typical for a true outlier. It depends what data you want to subset to (like a certain month or hour) to determine which mean and std to use.
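To make the 3-standard-deviation check above concrete, here is a minimal sketch on synthetic data (the series and the injected spike are made up for illustration; in practice you would subset to a month or hour first, as mentioned):

```python
import pandas as pd
import numpy as np

# Hypothetical usage series standing in for real energy data.
rng = np.random.default_rng(0)
s = pd.Series(rng.normal(loc=30000, scale=2000, size=1000))
s.iloc[10] = 60000  # inject an obvious outlier

# Flag points more than 3 standard deviations from the mean.
mu, sigma = s.mean(), s.std()
outliers = s[(s - mu).abs() > 3 * sigma]
```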
@@robmulla Using the mean assumes a symmetric, normal distribution? Where you have a non-symmetric, non-normal distribution, the median and percentiles may be better?
Great video! It would be really great if you did a continuation and showed us how to upload the model and connect it to a GUI so that it can be used by everyday users.
Hi Professor, I need your advice! But first, I want to thank you for the content you create; it's incredibly helpful and inspiring for many people. I have a question regarding forecasting. I work in operations at a delivery start-up and I am responsible for projecting the contact volumes we receive from customers in our CRM. My goal is to determine the number of agents needed to handle the demand efficiently without overspending. The challenging part, and the reason I'm reaching out to you, is that we need to project volumes in 30-minute intervals rather than daily. I need to forecast ahead in 30-minute intervals over several months. I'll provide an example for clarity. Could you advise me on which method would be feasible for this particular requirement? Thank you!
Your videos are always explained with such clarity, well done. I was wondering if we could predict the future using the method you implemented here in the model from the previous video? Thank you.
Hey Rob, thank you so much for this great video! It's helped me a lot! If you could update the code file to include the holiday feature, that would be fantastic!
Thanks for watching. The lag variables represent what the target value was in the past. I just called them lag1, lag2 etc. But if you watch at 12:19 when I create them I create them for a certain number of days in the past. Hope that helps.
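For anyone following along, the lag construction described above can be sketched like this (toy daily data standing in for the tutorial's series; the column name and lag offsets follow the video, but the data here is made up):

```python
import pandas as pd
import numpy as np

# Toy daily series; values are just 0..1499 so lags are easy to verify.
idx = pd.date_range('2015-01-01', periods=1500, freq='D')
df = pd.DataFrame({'PJME_MW': np.arange(1500, dtype=float)}, index=idx)

# Each lag is the target value a fixed number of days in the past,
# looked up by date rather than by row position.
target_map = df['PJME_MW'].to_dict()
df['lag1'] = (df.index - pd.Timedelta('364 days')).map(target_map)  # ~1 year back
df['lag2'] = (df.index - pd.Timedelta('728 days')).map(target_map)  # ~2 years back
```

Dates earlier than the start of the data have no entry in the map, so those rows correctly get NaN.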
Hello, thank you for making such a great teaching video. I would like to ask if you will make a teaching video on how to use xgboost for multi-step time series prediction? Thank you very much!!!
Thanks for watching. I'm not sure what you mean by multi-step, but if I understand it correctly, you would just re-train and predict for each new time period when you are forecasting.
Hi Rob, thanks for making the video, really awesome! I wonder, if I do hyperparameter tuning, do I still need to do TSCV for each set of hyperparameters, or can I just leave a recent history set for validation to avoid overfitting, and use the most recent history as a test set for model selection? If my goal is not to have a time series model that works well over a long period of time but just the near future, is there an advantage of TSCV over the holdout set method? Thank you!
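For reference, the idea behind time series CV (what sklearn's TimeSeriesSplit automates) can be sketched with a hand-rolled expanding window; this is an illustrative stand-in, not the tutorial's exact code:

```python
import numpy as np

# Expanding-window time series CV: the training window always precedes the
# validation window, so no future information leaks into training.
def time_series_splits(n_samples, n_splits, test_size):
    for i in range(n_splits):
        test_end = n_samples - (n_splits - 1 - i) * test_size
        test_start = test_end - test_size
        yield np.arange(0, test_start), np.arange(test_start, test_end)

# 100 samples, 3 folds, each validating on the next 10 points.
splits = list(time_series_splits(n_samples=100, n_splits=3, test_size=10))
```

The holdout-set approach is just the special case of a single split on the most recent history.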
Hi, great video! I watched both parts and the Prophet one too. I have a question: if I understood it correctly, I shouldn't create lag features greater than my prediction horizon window. It's not so clear to me why I shouldn't do that. Could you please share more details about that part?
I'm struggling to grasp this concept because the example involves lag1, lag2, and lag3 even though he is forecasting 1yr into the future. For instance, on '01-01-2015,' you have information to calculate the energy consumption from three years prior (on '01-01-2012').
XGBoost and most GBM models work fine with null values. Because it's a tree based algorithm it is able to split the null values out. However, I've found that imputing values can sometimes help.
Amazing vids Rob!! I'm carefully following part 1 & 2, learning lots of minor tricks along the way, love them. Quick question: any reason why the XGBRegressor objective is "reg:linear", and not just the default? Other params seem non-defaults as well.
@@robmulla I mean next-candle prediction based on historical candle data up to the previous candle: if the next candle is green the model gives 1, if it's red 0. Like this for forex data (EURUSD)..... I'm a newbie to Python facing difficulty in building a good model. I'm getting accuracy of 54-60%. Thanks in advance👍
Great video! One question - if you’re using a feature for which future values don’t exist, let’s say the stock price of a company, then how would you create the future data frame ?
Thanks for the sub! Yes, LightGBM and XGBoost are very similar with just slightly different names for the parameters. If you use the sklearn api they are almost the same.
But if the lag feature is too close in time, won't it produce cumulative errors? How should this be done, for example lagging by just the previous point but predicting the next 24 points?
Thanks for the feedback. If you have a lag feature of 1 hour then you would only be able to predict 1 hour in the future, because beyond that you would not be able to compute a lag feature (those lag values would not have occurred yet).
I really enjoy your videos. Any chance you could do one exploring dimensionality reduction techniques for time series? Particularly non-linear techniques such as VAE
Nice video. Can we have another follow-up video on incorporating an additional pool of time series features that we believe can be used as independent variables (e.g. weather series, price series): how to determine the lag to be used in those features, and how to do feature selection from the feature sets?
Thanks for the suggestions. That would be a good future video. Using additional features is tricky because you need to be sure to only use the feature values you have on the day you're predicting. So for weather data you actually would need to use the forecast from X days prior to the prediction date. Otherwise you will introduce a data leak into the model training.
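A minimal sketch of the point above, using a hypothetical temperature series: shift the exogenous feature by the forecast horizon so the model only sees values that would have been known at prediction time (lagged actuals are used here as a stand-in for an archived forecast):

```python
import pandas as pd
import numpy as np

# Hypothetical daily temperatures; values are 0..29 so the shift is easy to check.
idx = pd.date_range('2022-01-01', periods=30, freq='D')
weather = pd.Series(np.arange(30, dtype=float), index=idx)

horizon = 7  # predicting 7 days ahead
safe_weather = weather.shift(horizon)  # only information available 7 days earlier
```

Joining the unshifted series onto the training frame instead would be exactly the kind of leak described above.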
This is great! Thank you! Question: Why didn't you use a feature scaling function? Isn't it better to normalize the features for gradient descent algorithms?
Great question. For many algorithms that is the case. However for tree based algorithms like xgboost it will just split a branch somewhere in the feature. So scaling isn’t necessary.
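A tiny illustration of the point about tree splits (toy numbers): any split threshold on a raw feature has an equivalent threshold on the scaled feature, so the partition of the data, which is all a tree actually uses, is identical:

```python
import numpy as np

# Toy feature and a monotonic rescaling of it.
x = np.array([1.0, 4.0, 6.0, 9.0])
x_scaled = x / 10.0

# A split learned at 5.0 on the raw feature is the same partition
# as a split at 0.5 on the scaled feature.
left_raw = x < 5.0
left_scaled = x_scaled < 0.5
```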
Thank you for your complete tutorial! I tried your code on Kaggle and a question popped up in my mind: how can the lag values in the future_w_features variable appear where "isFuture == True" while 'PJME_MW' is empty?
I really enjoyed it! However, I'm still curious about how you generate lag values in the future dataframe. For example, if we have data from 2020 to 2023 and create a lag1 feature with a 364-day lag, then the lag1 value for the period from 2020 to 2021 would be NaN. But when you create a future dataframe to predict for the next year (2023-2024), does it work by shifting the lag values accordingly? In other words, are the lag values pushed forward by a year to generate the future lag values?
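A sketch of why those future lag values exist (toy daily data, not the tutorial's dataset): with a 364-day lag and a horizon no longer than 364 days, every future timestamp's lag date falls inside observed history, so the date-based lookup is filled entirely from known values:

```python
import pandas as pd
import numpy as np

# Two years of observed history; values are 0..729 so lookups are verifiable.
hist = pd.Series(np.arange(730, dtype=float),
                 index=pd.date_range('2021-01-01', periods=730, freq='D'))

# One year of future dates to forecast.
future_idx = pd.date_range('2023-01-01', periods=364, freq='D')

# Each future date minus 364 days lands inside observed history.
target_map = hist.to_dict()
future_lag = (future_idx - pd.Timedelta('364 days')).map(target_map)
```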
Thanks for sharing amazing work. I just want to ask about the features created from the DatetimeIndex (year, month, dayofyear, etc.): shouldn't we change their type to category?
Appreciate the feedback. You could try making these categorical; however, they aren't completely ordinal, especially day of year. XGBoost should find splits for these features in trees based on where it determines the best break points to be. But it's always worth trying and seeing what performs best using the validation setup.
Well done, absolutely incredible. Can you do one of these with a Prophet model? It takes away so many of the headaches you were dealing with in the feature creation and cross-validation.
Great video Rob! Helps me a lot! I want to ask something: my project uses a lot of features (X) whose values are still unknown in the future, unlike in your video where the features already have values, like the month, day of week, etc. So, is it impossible for me to predict the future because the features are still unknown? I already tried it and the result shows a straight line with the same values :( Thanks in advance!
The best tutorial I have ever seen on time series forecasting. Thank you so much. You explain so clearly that even a small child could understand.
Believe it or not, this is potentially the best tutorial about time series forecasting out there. Definitely worth attention. Please keep up the good work👍
Wow, that means a lot to me! Glad you found it so helpful. I have no plans of slowing down any time soon!
I second what @Maqsud Hasanov said. Thanks for sharing, and more about TS please 😁
😊
Hands down the best how-to I have yet seen. THANK YOU.
Interesting video Rob. You're a hero.
Here are my two cents on this video for people struggling with timeseries forecasting.
1. Feature engineering is extremely essential. Make sure to get the right features before training your model.
2. Instead of using base features, try using derived ones such as mean, median, std, var, rolling mean, rolling std, rolling median, etc.
3. Use preprocessing to clean your data and make sure to interpolate your missing values instead of dropping them.
4. Never mix things. Forecast trends with trends and linear with linear.
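Tip 3 above in practice, as a minimal pandas sketch (toy hourly series, made up for illustration): time-based interpolation fills the gap instead of dropping the rows:

```python
import pandas as pd
import numpy as np

# Toy hourly series with two missing values in the middle.
s = pd.Series([10.0, np.nan, np.nan, 16.0],
              index=pd.date_range('2022-01-01', periods=4, freq='h'))

# Interpolate linearly in time instead of dropping the NaN rows.
filled = s.interpolate(method='time')
```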
As a person making a career change from an entirely different industry I really appreciate your videos. Finishing up class, with your help! Will be back to learn on my own this summer...thanks again!
Absolutely great tutorial (thanks!), but I still have questions, although they are more general questions and not particularly related to XGBoost:
1) How to cope with data that has repetitions, like repetitions of the same date (pivot tables, but I have difficulty coping with pivots when the dataframe already has multiple features)
2) How does XGBoost (or any other model) cope with categories in data? For example (applied to this tutorial): what if the electricity usage data included regions? How can you integrate those in a forecasting model? This also implies repetitions: date1 -> region1 data; date1 -> region2 data; date1 -> region3 data, etc...
I'll eventually find out how to work with this, but if you wanted some input for a follow up video, here it is :D
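One common way to handle the region question above is to keep one row per (date, region) pair and encode the category, e.g. with one-hot columns. A minimal sketch on made-up data (column names are illustrative, not from the tutorial):

```python
import pandas as pd

# One row per (date, region): "repetitions" of the same date are fine
# as long as the region column distinguishes them.
df = pd.DataFrame({
    'date':   pd.to_datetime(['2022-01-01'] * 3 + ['2022-01-02'] * 3),
    'region': ['east', 'west', 'south'] * 2,
    'usage':  [10.0, 12.0, 9.0, 11.0, 13.0, 8.0],
})

# One-hot encode the region so a tree model can split on it.
X = pd.get_dummies(df[['region']], prefix='region')
```

The alternative mentioned later in the thread, training one model per region, just means grouping by the region column and fitting each group separately.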
I've got the same problems.
Thank you for showing how to train a forecasting model with cross validation. I've never truly understood it, until I saw your video. I'll apply it to my own projects!
Thank you for the amazing tutorial, I appreciate the fact that we get to see advanced methods and the workflow of an expert. It may seem hard to follow at first, especially for a beginner like me, but going through every step, bit by bit, will pay off.
These are such good videos Rob. You cover so much material quickly and clearly, with straightforward language that a novice like me can understand. Much appreciated 👍
Love your teaching style. Practical and to the point, it makes it really easy to understand these features.
My mind is exploding with ideas on how to apply this.
I would love to see this used across multiple categories in a data set. For example, for financial data, creating predictions for COGS, expenses, revenue, assets, liabilities, etc., even adding future data like a large asset being purchased or an acquisition of some sort to create financial statements for years to come.
Thank you so much for the video!
This video is, in equal parts, fascinating and concerning. Fascinating with how easy you make it look. Concerning because there is nowhere to hide in terms of how far I need to go...
I am currently dealing with time series and this is one of the best video tutorials on YouTube regarding this topic. Amazing work, thanks!
This is gold! Thank you :) This validates my approach for my current time series project.
Glad to hear that. Thanks for commenting.
Parts 1 and 2 are such a tremendous job. I managed to do some forecasting and nowcasting for European inflation using your guidance. You should consider creating a course. You are excellent!!
Really appreciate that feedback! Still planning on a few more for this series.
@@robmulla I'm hungry for more! Thank you for sharing this knowledge with us!
Just watched both parts, and I have got to say this was a very good tutorial.
So glad you found it helpful. Please share it anywhere you think people might find it helpful.
Again. Amazing work. I want to use your tutorial at work for our interns in the future.
Note that there is also a lag function you can use in pandas (shift), but it is based on "previous rows" rather than "previous unit of time", so it will not be as accurate as your mapping (when some rows are missing)
Additionally, you can add any other feature that is available in your dataset (multivariate dataset), this example is purely timeseries, but if you have for example a column for "county", you could experiment with making several models per county or one big model with "county" as a feature.
Wow. That's a big honor that you want to use my tutorial to help train people. Please do it!
Yes, I made the lag features in this way to ensure we didn't have any issues if certain rows were missing from the dataset, but you can also use something like `shift` in pandas. This dataset doesn't have detailed features but I agree it's good to add them if available.
Rob, this video is the reason that I am learning python, I am a hardcore R person, but this is amazing. Great work!
I love this! Glad you are starting your python journey here!
Damn. Wish I found this guy sooner, could have spared me so many days and nights of headache. Best python teacher ever.
Thanks so much!
@@robmulla Sorry for asking something like this but a tutorial on time series forecasting with Pytorches Temporal Fusion Transformer (TFT) from you would be awesome!
Amazing. Really covered very well most of my questions from the first video. This is the best video I have seen so far about XGBoost that integrates time series data with feature engineering that uses aggregate data and shows clearly how to use it with cross-validation appropriately to avoid data leakage.
I would make a 3rd part where you focus on integrating everything within a preprocessing pipeline along with hyperparameter tuning using GridSearchCV or any other method in order to make it more robust. Either way you showed me a new way to use
Nice to see you made a second part to your video. Awesome job!!!
Thanks. Glad you liked it!
I am glad I found your channel, every morning before I start my work (Work From Home Days) I watch at least one of your videos and damn! my productivity and skills have improved by a lot (especially pandas data pipelines) at work. Thanks Rob! Keep up the good work!
Great vid man! Perfect intro for someone with other ML experience.
Glad you enjoyed it! Thanks for leaving a comment.
I have never enjoyed time series forecasting; I failed learning it like 7 times in the past year, and now I'm starting again and not letting it go this time.
Your 2 videos (40 mins) are so helpful and the way you explain things is amazing. I think if you continue the series, you would blast YT since there is not much content about it.
You made my day tho 😂🎉
hey, curious to know how you've been doing with your forecasting?
Hi, @@kinduvabigdeal100
Pretty good, I'm using time series to forecast insurance claims.
Really nice video indeed. I am currently using the XGBoost model for time series prediction of water consumption and it does a better job than the ARIMA family so far. Thank you for sharing how to continue with the prediction; this info is really hard to find anywhere. Great videos, keep up the good work, I will definitely follow :)
Please keep making video tutorials, as your videos help beginners like me (career changers) to finish their projects or assignments. I already follow your Kaggle and YT.
I will try my best! Thanks for the feedback and glad to hear you've found them helpful.
@@robmulla can you make tutorial regarding this forecast or any forecast compare between XGBoost and ARIMA please as follow up series for this Time Series Project.
Once again: great video, thanks a lot!
I would love to see you showcasing some more advanced feature engineering and hyperparameter tuning. Where to start? Which rules of thumb do you use (for example for the starting hyperparameters)? What do you look for while tuning, etc.
However: can't wait to see your next video!
Have a nice day :)
Great suggestion! I have considered an XGBoost tuning guide, however it really depends on the dataset. There are some packages out there that can help automate that process. For feature engineering you will be limited for a dataset like this because we only have the target value; the fun feature engineering would come if you also had other correlated features (like weather forecasts, etc).
Hi Rob. Thank you for this. You are very good! Regards from Nairobi, Kenya.
Glad you liked it b!
Please please! If you have time, we would like to see another lecture comparing the performance of XGBoost with deep neural network (DNN) and multiple linear regression (MLR) models in time series forecasting for energy (on the same topic). We will be very grateful to you, because there are no other lectures as clear and useful as the ones you have provided, in such an easy and useful way.
Thanks! I get asked about this a lot so I’ll definitely put it in my list of future videos to make. Thanks for watching.
@@robmulla Thank you so much for your efforts.
I'm so glad I just discovered you, keep the great work.
Glad you found me too! Let me know if you have any feedback and share my videos anywhere you think others might appreciate them.
This is some real quality stuff, dude! I found 100's of YouTubers who talk about .fit() and .predict(), but you have gone way beyond that! Thanks for teaching us a correct methodology for approaching time series analysis.
Thanks a lot !
I appreciate the feedback! I think it's important to know some of these high level ideas like forecasting horizon and model validation. `fit` and `predict` is the easy part! 😆
You saved a huge amount of my time. Thank you so much
Thanks Vadim. Glad the video helped you out.
This is pure G O L D ! You deserve more views and subs. Subbed!
Thanks so much for the feedback. Just share it with 99k friends and have them subscribe 😉
please continue making these great tutorials
I will if you keep watching them!
11:22 The result of 364/7 is the number of weeks in a year. Ty for this video.
Very nicely done. Crisp explanations, optimum amount of details and plenty of crumbs to follow should one care to deep dive into specific areas. This is the second video on forecasting and I'd love to see more videos on the series. How about a tutorial on forecasting with RNN/LSTM?
Glad it was helpful! I try to make the videos with less fluff, but sometimes I worry whether I actually strike the right balance. I definitely have plans for an RNN/LSTM forecasting video, although I am not a big supporter of using them in applications like this. Also thinking about making a video about something like Facebook's Prophet model.
@@robmulla waiting for the fb prophet model tutorials
Really good tutorial. Easy to follow but also quick and succinct. Thanks!
Great video! A simpler way to implement the lag features would be:
df['lag1'] = df['PJME_MW'].shift(364).fillna(0)
Thanks for the feedback! Using the shift method can be really handy; however, you need to be careful and ensure that there are no gaps in your data. The reason I did the mapping the way it's done in this video is because there are some timestamps that are missing, and shifting would then make the values not align correctly. Hopefully that makes sense.
That makes sense. Indeed, I did assume that there are no gaps in the timeseries, because that step is usually done in a preprocessing step beforehand, if necessary (so that there are no gaps in general, not only for the lag features). But of course that was still an assumption from my side. Thanks for the reply!
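To illustrate the difference being discussed (toy daily data with one missing day, not the tutorial's dataset): a row-based shift silently misaligns across the gap, while a date-based lookup correctly returns NaN:

```python
import pandas as pd
import numpy as np

# Six days with Jan 3 missing; values chosen so misalignment is visible.
idx = pd.date_range('2022-01-01', periods=6, freq='D').delete(2)
df = pd.DataFrame({'y': [0.0, 1.0, 3.0, 4.0, 5.0]}, index=idx)

df['lag1_shift'] = df['y'].shift(1)  # previous *row*: wrong across the gap
df['lag1_map'] = (df.index - pd.Timedelta('1 day')).map(df['y'].to_dict())  # previous *day*
```

For Jan 4, the shift pulls Jan 2's value (two days earlier), while the date-based map knows Jan 3 is missing and yields NaN.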
One suggestion -- please change the title to "The Best XGBoost Tutorial for Time Series Forecasting on YouTube!". That's indeed the correct title.
Haha. Thanks so much, I appreciate the kind comment.
Definitely, your tutorial is the best
Great job!
Thank you! Cheers!
Great video, thanks Rob!
Great one!
Can we have a part 3 with more model tuning, by adding weather data and other external factors impacting the energy consumption?
I am planning on making more. I have one about the Prophet model and am also working on one using LSTMs.
When could we expect to see it? I would Looove to continue to scale up my skills in Python thanks to your videos! @@robmulla
Part 2!! Lets goo 🚀
🔥
Every single vid is inspiring, helpful, and informative. You are wonderful. Thank you so much for everything man.
Million thanks for your tutorials... Superb 👍👍
My pleasure 😊
What an awesome explanation!
Thanks for watching! 🧐
Thanks a lot for making the second part of the video.
Thanks for watching. Maybe there will be a part 3!
Man, thanks for the content. This gave me real good insights.
Oh, thank you very much, Rob ! Thanks for answering the inquiry about time series cross validation! It makes things much clearer now! God bless you, man!
Glad you found it helpful. Time series validation can become more of an art than a science especially when you have non-stable trends. Time series is hard!
Excellent work. Keep the content coming! Build a model to predict future topics people will ask about; there is an API to pull data from YT.
But seriously, nice work, love the advanced deep dives. How I got this far: I found a RUclips short from you explaining a topic that popped up in my feed, then you had a 10 minute general knowledge video, then a deep dive series... I really liked that structure.
Thanks for the feedback, I really do appreciate it. Cool to hear that you first found me via the shorts. I've been meaning to make some more of those!
Nice explanation and great work! Please make a video on deploying such a model to the cloud with training and inference pipelines... 👍
Thanks a lot. Very fluent and good explanation.
Glad you liked it! Thanks Sami.
Amazing work as always. Your content is super helpful. Please keep making videos. Thanks for spending your time to make these.
Thanks, will do! I have ideas for more videos coming out soon!
Great video. I think you should've explained that the purpose of doing cross-validation is to do hyper-parameter tuning and/or change your features to improve your results
Great point! I thought that was explained but maybe I wasn't clear enough about it.
Thanks again for the great tutorial; it is a stepping stone to understanding how to use XGBoost and some key techniques for dealing with time series data. However, I think there is one conceptual problem with using lagged versions of the data. Imagine you use just lag 1. During training you always have the actual previous value (lag 1), but when you forecast into the unknown future, even at the 2nd timestamp you no longer have actual data for lag 1. So the trained relationship cannot be applied at every step of the forecast, and any lag shorter than the forecast horizon might be misleading, since those lag values simply don't exist at each step. Please correct me if I am wrong; I am developing a pipeline for time series forecasting with my own data and I really want to achieve the best possible outcome.
You misunderstood the concept of lagging. Try using the pandas shift() method in order to understand it.
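For anyone else confused by the lag concept, here's a tiny sketch of what shift() does (the numbers are made up):

```python
import pandas as pd

# y is the target; lag1 holds yesterday's y, which is what the model
# sees as a feature when predicting today
df = pd.DataFrame(
    {'y': [10.0, 20.0, 30.0, 40.0]},
    index=pd.date_range('2022-01-01', periods=4, freq='D'),
)
df['lag1'] = df['y'].shift(1)  # the first row has no yesterday, so it's NaN
```

The key point: a lag feature is only usable at forecast time if the lagged timestamp lies in the observed past, which is why the lag length has to be at least the forecast horizon.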
Love your explanations. Thanks!
Thanks so much for watching. Share it with anyone you think might learn from it!
Excellent video. Thank you so much.
Glad you found it helpful, Deepak!
Awesome and informative work with really creative visuals. The graphs and plots were impressive and visually appealing. As a suggestion, though, you may improve the structure by adding headers to separate each section.
Thanks. That’s great feedback and I’ll keep that in mind in the future.
Thank you so much for your awesome and clear tutorials. I have learned a lot!
Great to hear!
Super helpful and inspiring. Would be great if we could see how the mysterious parameters are tuned.
Glad you found it inspiring. Parameter tuning isn't always worth spending a lot of time on, at least at first. You can use an auto-tuner like Optuna once you've got your model set up, and it will tune the parameters automatically.
Buddy your videos are excellent, good stuff.
Awesome. Thanks for this tutorial.
Thanks for watching!
Hi. From Argentina, excellent video. Parts 1 and 2. Congrats!
Hello there! Glad to have you watching from Argentina. Thanks for the congrats.
Great tutorial! Learned a lot from these time series forecasting videos. Could you please also make a video on how to detect data anomaly with this time series data? Such as using the trained model to detect possible outliers of energy use in your example data set. Thanks!!
Thanks. Good idea. Usually abnormalities in data like this are assessed by checking for values outside some distance from the mean. 3 standard deviations is typical for a true outlier. It depends what data you want to subset to (like a certain month or hour) to determine which mean and std to use.
@@robmulla Using the mean assumes a symmetric normal distribution? Where you have a non-symmetric, non-normal distribution, the median and percentiles may be better?
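A quick sketch of both flagging approaches on synthetic data (the injected spike, the distribution parameters, and the percentile fences are all made up for illustration):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
usage = pd.Series(rng.normal(30_000, 2_000, size=1_000))
usage.iloc[10] = 60_000  # inject one anomaly

# Mean +/- 3 standard deviations: fine for roughly normal data
mu, sigma = usage.mean(), usage.std()
flag_std = (usage - mu).abs() > 3 * sigma

# Robust alternative for skewed data: percentile fences instead of the mean
lo, hi = usage.quantile([0.001, 0.999])
flag_pct = (usage < lo) | (usage > hi)
```

The percentile version makes no symmetry assumption, which is why it tends to behave better on skewed series like demand or price data.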
Great video! It would be really great if you did a continuation and showed us how to deploy the model and connect it to a GUI so that it can be used by everyday users.
Great job, thank you so much.
You're very welcome! Thanks for watching.
Hi Professor,
I need your advice! But first, I want to thank you for the content you create; it's incredibly helpful and inspiring for many people.
I have a question regarding forecasting. I work in operations at a delivery start-up and I am responsible for projecting the contact volumes we receive from customers in our CRM. My goal is to determine the number of agents needed to handle the demand efficiently without overspending.
The challenging part, and the reason I'm reaching out to you, is that we need to project volumes in 30-minute intervals rather than daily. I need to forecast ahead in 30-minute intervals over several months. I'll provide an example for clarity.
Could you advise me on which method would be feasible for this particular requirement?
Thank you!
When fitting the model in the walk forward validation, the 'eval_set' parameter should not include test data as this is data leakage
Solid work!
Glad you enjoyed it Brandon.
Your videos are always explained with such clarity, well done. I was wondering if we could predict the future using the method you implemented here in the model from the previous video? Thank you.
Really appreciate that feedback. We use a fairly similar model to the previous video in this one; just the validation pipeline is different.
Thanks a lot
As always, great video
👾
Great video. Thanks
Glad you liked it! Share with a friend :D
Great Tutorial.
Glad it was helpful!
Hey Rob, thank you so much for this great video! It's helped me a lot! If you could update the code file to include the holiday feature, that would be fantastic!
Great tutorial and wonderful job , I wish you all the best + Thank you for your work
I have a question ...
what does lag1, lag2 and lag3 represent ?
Thanks for watching. The lag variables represent what the target value was in the past. I just called them lag1, lag2 etc. But if you watch at 12:19 when I create them I create them for a certain number of days in the past. Hope that helps.
Hello, thank you for making such a great teaching video. I would like to ask if you will make a teaching video on how to use xgboost for multi-step time series prediction? Thank you very much!!!
Thanks for watching. I'm not sure what you mean by multi-step, but if I understand correctly, you would just re-train and predict for each new time period when you are forecasting.
Incredible work! Thanks so much for sharing! Do you have in your channel a video with time series using Random Forest algorithm??
I have a video that goes over time series with xgboost, which is essentially a smarter implementation of a multi-tree based model.
Hi Rob, thanks for making the video, really awesome! I wonder, if I do hyperparameter tuning, do I still need to do TSCV for each set of hyperparameters, or can I just leave a recent history set for validation to avoid overfitting and use the most recent history as a test set for model selection? If my goal is not a time series model that works well over a long period of time but just the near future, is there an advantage of TSCV over the holdout set method? Thank you!
Hi, great video! I watched both parts and the Prophet one too.
I have a question: if I understood correctly, I shouldn't create lag features greater than my prediction horizon window. It's not so clear to me why I shouldn't do that. Could you please share more details about that part?
I'm struggling to grasp this concept because the example involves lag1, lag2, and lag3 even though he is forecasting 1yr into the future. For instance, on '01-01-2015,' you have information to calculate the energy consumption from three years prior (on '01-01-2012').
Very good!
Appreciate that!
Amazing work, Master! Does XGBoost support missing values, or are the NaN values of lag features replaced by zero?
XGBoost and most GBM models work fine with null values. Because it's a tree based algorithm it is able to split the null values out. However, I've found that imputing values can sometimes help.
Thanks a lot for the video. Can you please talk about stationarity check as well as model accuracy?
This is amazing! Really great video. Could you do one with other correlated features and feature importance, or another one for pre/post treatment?
Amazing vids Rob!! I'm carefully following part 1 & 2, learning lots of minor tricks along the way, love them. Quick question: any reason why the XGBRegressor objective is "reg:linear", and not just the default? Other params seem non-defaults as well.
Excellent video 👍👏 Please make more videos like this: TS forecasting, an xgb classification model.
Thanks for the positive note. What do you mean by time series classification? Is there a data example you know of that I could look into?
@@robmulla I mean next-candle prediction based on historical candle data up to the previous candle: if the next candle is green the model gives 1, if it's red, 0. For example with forex data like EURUSD. I'm a newbie to Python and facing difficulty building a good model; I'm getting accuracy of 54-60%. Thanks in advance 👍
Great video! One question - if you’re using a feature for which future values don’t exist, let’s say the stock price of a company, then how would you create the future data frame ?
Subbed....great and thorough and practical explanation. I guess using LightGBM will be very much the same and only some parameters change?
Thanks for the sub! Yes, LightGBM and XGBoost are very similar with just slightly different names for the parameters. If you use the sklearn api they are almost the same.
But won't lag features that are too close in time produce cumulative errors? How should this be done, for example lagging by just the previous point but predicting the next 24 points?
Great video! I'm just wondering: if your lag features were, say, 1 hour, can you still predict multiple steps? Thanks!
Thanks for the feedback. If you have a lag feature of 1 hour then you would only be able to predict 1 hour into the future, because beyond that you would not be able to compute the lag feature (those lag values would not have occurred yet).
I really enjoy your videos. Any chance you could do one exploring dimensionality reduction techniques for time series? Particularly non-linear techniques such as VAE
Great suggestion! I might include it in a future video but I'd need to learn a bit more about it first. Thanks for watching.
Thanks
Nice video. Can we have another follow-up video on incorporating an additional pool of time series features that we believe can be used as independent variables (e.g. weather series, price series): how to determine the lag to use for those features and how to do feature selection from the feature sets?
Thanks for the suggestions. That would be a good future video. Using additional features is tricky because you need to be sure to only use the feature values you would actually have on the day you're predicting. So for weather data you would need to use the forecast from X days prior to the prediction date. Otherwise you will introduce a data leak into the model training.
This is great! Thank you! Question: Why didn't you use a feature scaling function? Isn't it better to normalize the features for gradient descent algorithms?
Great question. For many algorithms that is the case. However for tree based algorithms like xgboost it will just split a branch somewhere in the feature. So scaling isn’t necessary.
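A small check of that claim using a plain decision tree (xgboost's trees behave the same way; the data and the scale factor here are made up):

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(42)
X = rng.uniform(0, 100, size=(200, 1))
y = (X[:, 0] > 50).astype(float)

# Fit once on raw features and once on rescaled features
raw = DecisionTreeRegressor(max_depth=3, random_state=0).fit(X, y)
scaled = DecisionTreeRegressor(max_depth=3, random_state=0).fit(X / 100, y)

# Split thresholds simply move with the scale, so predictions are identical
same = np.allclose(raw.predict(X), scaled.predict(X / 100))
```

This is why normalization matters for gradient-descent models (linear models, neural nets) but not for tree-based ones: trees only care about the ordering of feature values, which any monotonic scaling preserves.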
You're a genius, thanks!
Thank you for your complete tutorial!
I have tried your code on Kaggle and it raised a question in my mind: how can the lag values in the future_w_features variable appear where isFuture == True while 'PJME_MW' is empty?
I really enjoyed it! However, I'm still curious about how you generate lag values in the future dataframe. For example, if we have data from 2020 to 2023 and create a lag1 feature with a 364-day lag, then the lag1 values for the period from 2020 to 2021 would be NaN. But when you create a future dataframe to predict the next year (2023-2024), does it work by shifting the lag values accordingly? In other words, are the lag values being pushed forward by a year to generate the future lag values?
Thanks for sharing this amazing work. I just want to ask about the features created from the DatetimeIndex (year, month, dayofyear, etc.): shouldn't we change their type to category?
Appreciate the feedback. You could try making these categorical; however, they aren't completely ordinal, especially day of year. The xgboost model should find splits for these features in trees based on where it determines the best break points to be. But it's always worth trying and seeing what performs best using the validation setup.
This is very impressive. Do you have a lecture on trading analysis?
Glad you liked it. I have a few other videos on time series you can check out.
well done, absolutely incredible
Can you do one of these with a Prophet model?
It takes away so many of the headaches you were dealing with in the feature creation and cross validation.
Thanks so much. Yes, I plan to do one on Prophet and LSTMs at some point.
Great video Rob! Helps me a lot! I want to ask something: my project uses a lot of features (X) whose values are still unknown in the future, unlike in your video where the features already have values, like the month, day of week, etc. So is it impossible for me to predict the future because the features are still unknown? I already tried it and the result shows a straight line with the same values :( Thanks in advance!