Time Series Forecasting with XGBoost - Advanced Methods

  • Published: 27 Nov 2024

Comments • 319

  • @pratapkumar7785
    @pratapkumar7785 6 days ago

    The best tutorial I have ever seen on time series forecasting. Thank you so much. Your voice is so amazing that even a small child can understand.

  • @hasanovmaqsud
    @hasanovmaqsud 2 years ago +166

    Believe it or not, this is potentially the best tutorial about time series forecasting out there. Definitely worth the attention. Please keep up the good work👍

    • @robmulla
      @robmulla  2 years ago +20

      Wow, that means a lot to me! Glad you found it so helpful. I have no plans of slowing down any time soon!

    • @yassssssssssss
      @yassssssssssss 2 years ago +3

      I second what @Maqsud Hasanov said. Thanks for sharing, and more about TS please 😁

    • @GeorgeCherian7
      @GeorgeCherian7 1 year ago

      😊

  • @lashlarue7924
    @lashlarue7924 1 year ago +8

    Hands down the best how-to I have yet seen. THANK YOU.

  • @code2compass
    @code2compass 9 months ago

    Interesting video Rob. You're a hero.
    Here are my two cents on this video for people struggling with time series forecasting.
    1. Feature engineering is extremely essential. Make sure to get the right features before training your model.
    2. Instead of using only base features, try using derived ones such as mean, median, std, var, rolling mean, rolling std, rolling median etc. (see the sketch below).
    3. Use preprocessing to clean your data and make sure to interpolate your missing values instead of dropping them.
    4. Never mix things. Forecast trends with trends and linear with linear.
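
    For anyone who wants to try points 2 and 3 concretely, here is a minimal pandas sketch (assuming a DataFrame df with a DatetimeIndex and the PJME_MW target column used in the video; the 24-hour window is only an example):

      import pandas as pd

      # Fill gaps in the target by time-based interpolation instead of dropping rows (tip 3).
      df = df.sort_index()
      df['PJME_MW'] = df['PJME_MW'].interpolate(method='time')

      # Derived rolling features (tip 2). shift(1) keeps the current value out of its own window.
      past = df['PJME_MW'].shift(1)
      df['rolling_mean_24h'] = past.rolling(window=24).mean()
      df['rolling_std_24h'] = past.rolling(window=24).std()
      df['rolling_median_24h'] = past.rolling(window=24).median()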

  • @danielleshelton6706
    @danielleshelton6706 7 months ago +3

    As a person making a career change from an entirely different industry I really appreciate your videos. Finishing up my class with your help! Will be back to learn on my own this summer... thanks again!

  • @blackbke
    @blackbke 1 year ago +1

    Absolutely great tutorial (thanks!), but I still have questions, although they are more general questions and not particularly related to XGBoost:
    1) How to cope with data that has repetitions, like repetitions of the same date (pivot tables, but I have difficulty coping with pivots when the dataframe already has multiple features)
    2) How does XGBoost (or any other model) cope with categories in data, for example (and applied to this tutorial): what if the electricity usage data would include regions, how can you integrate those in a forecasting model? This also implies repetitions: date1 -> region1 data; date1 -> region2 data; date1 -> region3 data, etc...
    I'll eventually find out how to work with this, but if you wanted some input for a follow up video, here it is :D

  • @TimelyTimeSeries
    @TimelyTimeSeries 10 months ago +1

    Thank you for showing how to train a forecasting model with cross validation. I've never truly understood it until I saw your video. I'll apply it to my own projects!
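
    For reference, the basic pattern looks roughly like this (a sketch, not the video's exact code; it assumes a feature matrix X and target y for hourly data, already sorted by time):

      import xgboost as xgb
      from sklearn.model_selection import TimeSeriesSplit

      tss = TimeSeriesSplit(n_splits=5, test_size=24 * 365, gap=24)  # year-long folds, 1-day gap
      rmse_scores = []
      for train_idx, val_idx in tss.split(X):
          X_train, y_train = X.iloc[train_idx], y.iloc[train_idx]
          X_val, y_val = X.iloc[val_idx], y.iloc[val_idx]
          # early_stopping_rounds in the constructor needs xgboost >= 1.6; older versions take it in .fit()
          reg = xgb.XGBRegressor(n_estimators=1000, learning_rate=0.01, early_stopping_rounds=50)
          reg.fit(X_train, y_train, eval_set=[(X_val, y_val)], verbose=False)
          rmse = ((reg.predict(X_val) - y_val) ** 2).mean() ** 0.5
          rmse_scores.append(rmse)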

  • @InspirationBeel
    @InspirationBeel 3 months ago

    Thank you for the amazing tutorial. I appreciate the fact that we get to see advanced methods and the workflow of an expert. It may seem hard to follow at first, especially for a beginner like me, but going through every step, bit by bit, will pay off.

  • @al_s_
    @al_s_ 1 year ago +9

    These are such good videos Rob. You cover so much material quickly and clearly, with straightforward language that a novice like me can understand. Much appreciated 👍

  • @mpfiesty
    @mpfiesty 1 year ago +2

    Love your teaching style. Practical and to the point, it makes it really easy to understand these features.
    My mind is exploding with ideas on how to apply this.
    I would love to see this used but across multiple categories in a data set. For example, for financial data, creating predictions for COGS, expenses, revenue, assets, liabilities, etc., even adding future data like a large asset being purchased or an acquisition of some sort to create financial statements for years to come.
    Thank you so much for the video!

  • @mangeshmehendale4139
    @mangeshmehendale4139 10 months ago

    This video is, in equal parts, fascinating and concerning. Fascinating with how easy you make it look. Concerning because there is nowhere to hide in terms of how far I need to go...

  • @EndikaMT
    @EndikaMT 1 year ago

    I am currently dealing with time series and this is one of the best video tutorials on YouTube regarding this topic. Amazing work, thanks!

  • @StaceyBeck-h9s
    @StaceyBeck-h9s 8 months ago +1

    This is gold! Thank you :) This validates my approach for my current time series project.

    • @robmulla
      @robmulla  8 months ago

      Glad to hear that. Thanks for commenting.

  • @Pedrommelos
    @Pedrommelos 2 years ago +9

    Parts 1 and 2 are such a tremendous job. I managed to do some forecasting and nowcasting for European inflation using your guidance. You should consider creating a course. You are excellent!!

    • @robmulla
      @robmulla  2 years ago +6

      Really appreciate that feedback! Still planning on a few more for this series.

    • @Pedrommelos
      @Pedrommelos 2 years ago

      @@robmulla I'm hungry for more! Thank you for sharing this knowledge with us!

  • @yBlade05
    @yBlade05 2 years ago +2

    Just watched both parts, and I have got to say this was a very good tutorial.

    • @robmulla
      @robmulla  2 years ago

      So glad you found it helpful. Please share it anywhere you think people might find it helpful.

  • @a.h.s.3006
    @a.h.s.3006 2 years ago +3

    Again. Amazing work. I want to use your tutorial at work for our interns in the future.
    Note that there is also a lag function you can use in pandas (shift), but it is based on "previous rows" rather than "previous unit of time", so it will not be as accurate as your mapping (when some rows are missing)
    Additionally, you can add any other feature that is available in your dataset (multivariate dataset), this example is purely timeseries, but if you have for example a column for "county", you could experiment with making several models per county or one big model with "county" as a feature.

    • @robmulla
      @robmulla  2 years ago +1

      Wow. That's a big honor that you want to use my tutorial to help train people. Please do it!
      Yes, I made the lag features in this way to ensure we didn't have any issues if certain rows were missing from the dataset, but you can also use something like `shift` in pandas. This dataset doesn't have detailed features but I agree it's good to add them if available.
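
      For anyone following along, the mapping idea looks roughly like this (a sketch assuming a DataFrame df with a DatetimeIndex and the PJME_MW target; missing timestamps simply come back as NaN):

        import pandas as pd

        target_map = df['PJME_MW'].to_dict()
        # Look up the target exactly 364/728/1092 days earlier, regardless of any missing rows.
        df['lag1'] = (df.index - pd.Timedelta('364 days')).map(target_map)
        df['lag2'] = (df.index - pd.Timedelta('728 days')).map(target_map)
        df['lag3'] = (df.index - pd.Timedelta('1092 days')).map(target_map)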

  • @prometeo34
    @prometeo34 1 year ago +2

    Rob, this video is the reason that I am learning python, I am a hardcore R person, but this is amazing. Great work!

    • @robmulla
      @robmulla  1 year ago +1

      I love this! Glad you are starting your python journey here!

  • @Lirim_K
    @Lirim_K 1 year ago +1

    Damn. Wish I found this guy sooner, could have spared me so many days and nights of headache. Best python teacher ever.

    • @robmulla
      @robmulla  1 year ago +1

      Thanks so much!

    • @Lirim_K
      @Lirim_K 1 year ago +1

      @@robmulla Sorry for asking something like this, but a tutorial on time series forecasting with PyTorch's Temporal Fusion Transformer (TFT) from you would be awesome!

  • @Personal-f4d
    @Personal-f4d 5 months ago

    Amazing. Really covered very well most of my questions from the first video. This is the best video I have seen so far about xgboost that integrates time series data with feature engineering using aggregate data and clearly shows how to use it with cross-validation appropriately to avoid data leakage.
    I would make a 3rd part where you focus on integrating everything within a preprocessing pipeline along with hyperparameter tuning using GridSearchCV or any other method in order to make it more robust. Either way you showed me a new way to use

  • @gabrielmoreno2554
    @gabrielmoreno2554 2 years ago +2

    Nice to see you made a second part to your video. Awesome job!!!

    • @robmulla
      @robmulla  2 years ago

      Thanks. Glad you liked it!

  • @rohitvenkatesan1895
    @rohitvenkatesan1895 1 year ago

    I am glad I found your channel, every morning before I start my work (Work From Home Days) I watch at least one of your videos and damn! my productivity and skills have improved by a lot (especially pandas data pipelines) at work. Thanks Rob! Keep up the good work!

  • @jakstrike1
    @jakstrike1 2 years ago +2

    Great vid man! Perfect intro for someone with other ML experience.

    • @robmulla
      @robmulla  2 years ago

      Glad you enjoyed it! Thanks for leaving a comment.

  • @ahmadhammad12
    @ahmadhammad12 1 year ago +2

    I have never enjoyed time series forecasting, I failed learning it like 7 times this past year, and now I'm starting again and not letting it go this time.
    Your 2 videos (40 mins) are so helpful and the way you explain things is amazing. I think if you continue the series, you would blast YT since there is not much content about it.
    You made my day tho 😂🎉

    • @kinduvabigdeal100
      @kinduvabigdeal100 1 year ago

      hey, curious to know how you've been doing with your forecasting?

    • @ahmadhammad12
      @ahmadhammad12 1 year ago

      Hi @@kinduvabigdeal100,
      pretty good, I'm using time series to forecast insurance claims.

  • @kristinaarsova5946
    @kristinaarsova5946 10 months ago

    Really nice video indeed. I am currently using the XGBoost model for time series prediction of water consumption and it does a better job than the ARIMA family so far. Thank you for sharing how to continue with the prediction, this is really hard to find anywhere as info. Great videos and keep up the good work, I will definitely follow :)

  • @titiQd
    @titiQd 2 years ago +1

    Please keep making video tutorials, as your videos help beginners like me (career changers) to finish their projects or assignments. I already follow your Kaggle and YT.

    • @robmulla
      @robmulla  2 years ago +1

      I will try my best! Thanks for the feedback and glad to hear you've found them helpful.

    • @titiQd
      @titiQd 2 years ago

      @@robmulla Can you make a tutorial comparing XGBoost and ARIMA on this forecast (or any forecast) as a follow-up for this Time Series Project?

  • @massimothormann272
    @massimothormann272 2 years ago +5

    Once again: great video, thanks a lot!
    I would love to see you showcasing some more advanced feature engineering and hyperparameter tuning. Where to start? Which rules of thumb do you use (for example for the starting hyperparameters)? What do you look for while tuning, etc.
    However: can't wait to see your next video!
    Have a nice day :)

    • @robmulla
      @robmulla  2 years ago +2

      Great suggestion! I have considered an xgboost tuning guide, however it really depends on the dataset. There are some packages out there that can help automate that process. For feature engineering you will be limited with a dataset like this because we only have the target value; fun feature engineering would come if you also had other correlated features (like weather forecasts, etc).

  • @vectorautomationsystems
    @vectorautomationsystems 10 months ago +1

    Hi Rob. Thank you for this. You are very good! Regards from Nairobi, Kenya.

    • @robmulla
      @robmulla  10 months ago +1

      Glad you liked it!

  • @abdogassar9246
    @abdogassar9246 2 years ago +2

    Please, please! If you have time, we would like to see another lecture comparing the performance of XGBoost with deep neural network (DNN) and multiple linear regression (MLR) models in time series forecasting for energy (on the same topic). We will be very grateful to you, because there are no clear and useful lectures like the ones you have provided, in a very easy and useful way.

    • @robmulla
      @robmulla  2 years ago +2

      Thanks! I get asked about this a lot so I’ll definitely put it in my list of future videos to make. Thanks for watching.

    • @abdogassar9246
      @abdogassar9246 2 years ago

      @@robmulla Thank you so much for your efforts.

  • @CarlosReyes-ku6ub
    @CarlosReyes-ku6ub 2 years ago +1

    I'm so glad I just discovered you, keep up the great work.

    • @robmulla
      @robmulla  2 years ago +1

      Glad you found me too! Let me know if you have any feedback and share my videos anywhere you think others might appreciate them.

  • @varunraste3538
    @varunraste3538 1 year ago +1

    This is some real quality stuff dude! I found hundreds of YouTubers who talk about .fit() and .predict(), but you have gone way beyond that! Thanks for teaching us a correct methodology for approaching time series analysis.
    Thanks a lot!

    • @robmulla
      @robmulla  1 year ago

      I appreciate the feedback! I think it's important to know some of these high level ideas like forecasting horizon and model validation. `fit` and `predict` is the easy part! 😆

  • @vadimshatov9935
    @vadimshatov9935 2 years ago +2

    You saved a huge amount of my time. Thank you so much

    • @robmulla
      @robmulla  2 years ago

      Thanks Vadim. Glad the video helped you out.

  • @PG-iq6zv
    @PG-iq6zv 2 years ago +1

    This is pure G O L D ! You deserve more views and subs. Subbed!

    • @robmulla
      @robmulla  2 years ago

      Thanks so much for the feedback. Just share it with 99k friends and have them subscribe 😉

  • @THE8SFN
    @THE8SFN 1 year ago +1

    please continue making these great tutorials

    • @robmulla
      @robmulla  1 year ago

      I will if you keep watching them!

  • @Aristocle
    @Aristocle 1 year ago

    11:22 The result of 364/7 is the number of weeks in a year. Ty for this video.

  • @Arkajyoti
    @Arkajyoti 2 years ago +6

    Very nicely done. Crisp explanations, optimum amount of details and plenty of crumbs to follow should one care to deep dive into specific areas. This is the second video on forecasting and I'd love to see more videos on the series. How about a tutorial on forecasting with RNN/LSTM?

    • @robmulla
      @robmulla  2 years ago +1

      Glad it was helpful! I try to make the videos less fluffy but sometimes I worry if I actually strike the right balance. I definitely have plans for an RNN/LSTM forecasting video, although I am not a big supporter of using them in applications like this. Also thinking about making a video about something like Facebook's Prophet model.

    • @felixakwerh5189
      @felixakwerh5189 2 years ago +1

      @@robmulla waiting for the fb prophet model tutorials

  • @lolmatt9
    @lolmatt9 10 months ago

    Really good tutorial. Easy to follow but also quick and succinct. Thanks!

  • @costadekiko
    @costadekiko 2 years ago +1

    Great video! A simpler way to implement the lag features would be:
    df['lag1'] = df['PJME_MW'].shift(364).fillna(0)

    • @robmulla
      @robmulla  2 years ago +3

      Thanks for the feedback! Using the shift method can be really handy, however you need to be careful and ensure that there are no gaps in your data. The reason I did the mapping the way it's done in this video is because there are some timestamps that are missing, and shifting would then make the values not align correctly. Hopefully that makes sense.

    • @costadekiko
      @costadekiko 2 years ago

      That makes sense. Indeed, I did assume that there are no gaps in the timeseries, because that step is usually done in a preprocessing step beforehand, if necessary (so that there are no gaps in general, not only for the lag features). But of course that was still an assumption from my side. Thanks for the reply!
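
      That preprocessing step could look something like this (a sketch for hourly data, assuming df has a DatetimeIndex and the PJME_MW column; names are illustrative):

        import pandas as pd

        # Reindex to a complete hourly index so there are no gaps, then fill the holes.
        full_index = pd.date_range(df.index.min(), df.index.max(), freq='h')
        df = df.reindex(full_index)
        df['PJME_MW'] = df['PJME_MW'].interpolate(method='time')
        # With an evenly spaced index, a row-based shift is the same as a time-based lag.
        df['lag1'] = df['PJME_MW'].shift(364 * 24)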

  • @waqitshatasheel5875
    @waqitshatasheel5875 2 years ago +1

    One suggestion -- please change the title to "The Best XGBoost Tutorial for Time Series Forecasting on YouTube!". That's indeed the correct title.

    • @robmulla
      @robmulla  2 years ago

      Haha. Thanks so much, I appreciate the kind comment.

  • @ZINALOUDJANI-uk9ft
    @ZINALOUDJANI-uk9ft 1 year ago

    Definitely, your tutorial is the best

  • @massoudkadivar8758
    @massoudkadivar8758 1 year ago +2

    Great job!

  • @DataDeepDive-yh4rf
    @DataDeepDive-yh4rf 9 months ago +1

    Great video, thanks Rob!

  • @business_central
    @business_central 2 years ago +34

    Great one!
    Can we have a part 3 with more model tuning, by adding weather data and other external factors impacting the energy consumption?

    • @robmulla
      @robmulla  2 years ago +25

      I am planning on making more. I have one about the Prophet model and I'm also working on one using LSTMs.

    • @qdupontulb
      @qdupontulb 1 year ago

      When could we expect to see it? I would looove to continue to scale up my skills in Python thanks to your videos! @@robmulla

  • @hasijasanskar
    @hasijasanskar 2 years ago +4

    Part 2!! Lets goo 🚀

  • @averagedailycontent
    @averagedailycontent 1 year ago

    Every single vid is inspiring, helpful, and informative. You are wonderful. Thank you so much for everything man.

  • @suabsakulgururatana9151
    @suabsakulgururatana9151 2 years ago +1

    Million thanks for your tutorials... Superb 👍👍

    • @robmulla
      @robmulla  2 years ago +1

      My pleasure 😊

  • @dagobertocifuentes6845
    @dagobertocifuentes6845 2 years ago +1

    What an awesome explanation!

    • @robmulla
      @robmulla  2 years ago

      Thanks for watching! 🧐

  • @elahe7702
    @elahe7702 2 years ago

    Thanks a lot for making the second part of the video.

    • @robmulla
      @robmulla  2 years ago

      Thanks for watching. Maybe there will be a part 3!

  • @welverd
    @welverd 6 months ago

    Man, thanks for the content. This gave me real good insights.

  • @hasanovmaqsud
    @hasanovmaqsud 2 years ago +1

    Oh, thank you very much, Rob! Thanks for answering the inquiry about time series cross validation! It makes things much clearer now! God bless you, man!

    • @robmulla
      @robmulla  2 years ago +2

      Glad you found it helpful. Time series validation can become more of an art than a science especially when you have non-stable trends. Time series is hard!

  • @TheThunder005
    @TheThunder005 2 years ago +3

    Excellent work. Keep the content coming! Build a model to predict future topics people will ask about; there is an API to pull data from YT.
    But seriously, nice work, love the advanced deep dives. How I got this far was I found a YouTube short from you explaining a topic that popped up in my feed, then you had a 10 minute general knowledge video, then a deep dive series... I really liked that structure.

    • @robmulla
      @robmulla  2 years ago +1

      Thanks for the feedback, I really do appreciate it. Cool to hear that you first found me via the shorts. I've been meaning to make some more of those!

  • @VenuraPussella
    @VenuraPussella 3 months ago

    Nice explanation and great work. Please make a video on deploying such a model to the cloud with training and inference pipelines...👍

  • @sami3592
    @sami3592 2 years ago +1

    Thanks a lot. Very fluent and good explanation.

    • @robmulla
      @robmulla  2 years ago

      Glad you liked it! Thanks Sami.

  • @davoodastaraky7608
    @davoodastaraky7608 2 years ago +4

    Amazing work as always. Your content is super helpful. Please keep making videos. Thanks for spending your time making these videos.

    • @robmulla
      @robmulla  2 years ago

      Thanks, will do! I have ideas for more videos coming out soon!

  • @XartakoNP
    @XartakoNP 1 year ago

    Great video. I think you should've explained that the purpose of doing cross-validation is to do hyper-parameter tuning and/or change your features to improve your results

    • @robmulla
      @robmulla  1 year ago

      Great point! I thought that was explained but maybe I wasn't clear enough about it.

  • @Asparuh.Emilov
    @Asparuh.Emilov 1 year ago

    Thanks again for the great tutorial, it is a stepping stone to understanding how to use XGBoost and some key techniques to apply when dealing with time series data. However, I think there is one conceptual problem with using lagged versions of the data. Imagine you use just lag 1. During training you always have the actual previous value (lag 1), but when you forecast into the unknown future, even at the 2nd timestamp you no longer have the actual data for lag 1, so the trained parameter cannot be applied at any of the future steps. Everything shorter than the forecast horizon might therefore be misleading, in my opinion, since the model simply cannot apply the trained relationship for those lags as they don't exist at each step. Please correct me if I am wrong; I am developing a pipeline for time series forecasting with my own data and I really want to achieve the best possible outcome.

    • @OskarBienko
      @OskarBienko 1 year ago

      You misunderstood the whole concept of lagging. Try using the pandas shift() method in order to understand it.

  • @nicholasbeaton2940
    @nicholasbeaton2940 2 years ago +1

    Love your explanations. Thanks!

    • @robmulla
      @robmulla  2 years ago

      Thanks so much for watching. Share it with anyone you think might learn from it!

  • @deepakramani05
    @deepakramani05 2 years ago +1

    Excellent video. Thank you so much.

    • @robmulla
      @robmulla  2 years ago

      Glad you found it helpful, Deepak!

  • @user-wc8ez2hg5o
    @user-wc8ez2hg5o 2 years ago +2

    Awesome and informative work with really creative visuals. The graphs and plots were impressive and visually appealing. As a suggestion, though, you may improve the structure by adding headers to separate each section.

    • @robmulla
      @robmulla  2 years ago

      Thanks. That’s great feedback and I’ll keep that in mind in the future.

  • @niloufarmsv3815
    @niloufarmsv3815 1 year ago

    Thank you so much for your awesome and clear tutorials. I have learned a lot!

  • @newdata
    @newdata 2 years ago +1

    Super helpful and inspiring. Would be great if we could see how the mysterious parameters are tuned.

    • @robmulla
      @robmulla  2 years ago

      Glad you found it inspiring. Parameter tuning isn't always worth spending a lot of time on, at least at first. You can use an auto-tuner like Optuna once you've got your model set up, and it will tune parameters automatically.
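
      A rough Optuna sketch for anyone curious (illustrative search space; assumes X_train/y_train/X_val/y_val are already split in time order):

        import optuna
        import xgboost as xgb

        def objective(trial):
            params = {
                'n_estimators': trial.suggest_int('n_estimators', 300, 2000),
                'max_depth': trial.suggest_int('max_depth', 3, 10),
                'learning_rate': trial.suggest_float('learning_rate', 0.005, 0.3, log=True),
                'subsample': trial.suggest_float('subsample', 0.5, 1.0),
                'colsample_bytree': trial.suggest_float('colsample_bytree', 0.5, 1.0),
            }
            reg = xgb.XGBRegressor(**params)
            reg.fit(X_train, y_train)
            return ((reg.predict(X_val) - y_val) ** 2).mean() ** 0.5  # RMSE on the validation slice

        study = optuna.create_study(direction='minimize')
        study.optimize(objective, n_trials=50)
        print(study.best_params)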

  • @PandemicGameplay
    @PandemicGameplay 1 year ago

    Buddy your videos are excellent, good stuff.

  • @nurlannurmash4155
    @nurlannurmash4155 2 years ago +1

    Awesome. Thanks for this tutorial.

    • @robmulla
      @robmulla  2 years ago

      Thanks for watching!

  • @agustinkashiraja401
    @agustinkashiraja401 2 years ago +1

    Hi from Argentina, excellent videos, both 1 and 2. Congrats!

    • @robmulla
      @robmulla  2 years ago

      Hello there! Glad to have you watching from Argentina. Thanks for the congrats.

  • @alexkychen
    @alexkychen 2 years ago +1

    Great tutorial! Learned a lot from these time series forecasting videos. Could you please also make a video on how to detect data anomaly with this time series data? Such as using the trained model to detect possible outliers of energy use in your example data set. Thanks!!

    • @robmulla
      @robmulla  2 years ago

      Thanks. Good idea. Usually abnormalities on data like this are assessed by checking the values outside some distance from the mean value. 3 standard deviations is typical for a true outlier. It depends what data you want to subset to (like a certain month or hour) to determine what mean and std to use.

    • @CharlesDibsdale
      @CharlesDibsdale 2 years ago

      @@robmulla Using the mean assumes a symmetric normal distribution? Where you have a non-symmetric, non-normal distribution, median and percentiles may be better?
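
      Both checks are only a couple of lines in pandas (a sketch, assuming the hourly series lives in df['PJME_MW']; thresholds are illustrative):

        s = df['PJME_MW']

        # Symmetric, roughly-normal assumption: values more than 3 standard deviations from the mean.
        z_outliers = (s - s.mean()).abs() > 3 * s.std()

        # Distribution-free alternative: flag the extreme percentiles instead.
        lo, hi = s.quantile(0.005), s.quantile(0.995)
        pct_outliers = (s < lo) | (s > hi)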

  • @key_advice
    @key_advice 1 year ago

    Great video. It would be really great if you did a continuation and showed us how to upload the model and connect it to a GUI so that it can be used by everyday users.

  • @abdogassar9246
    @abdogassar9246 2 years ago +1

    Great job, thank you so much.

    • @robmulla
      @robmulla  2 years ago +1

      You're very welcome! Thanks for watching.

  • @JuanCasalia
    @JuanCasalia 4 months ago

    Hi Professor,
    I need your advice! But first, I want to thank you for the content you create - it's incredibly helpful and inspiring for many people.
    I have a question regarding forecasting. I work in operations at a delivery start-up and I am responsible for projecting the contact volumes we receive from customers in our CRM. My goal is to determine the number of agents needed to handle the demand efficiently without overspending.
    The challenging part, and the reason I'm reaching out to you, is that we need to project volumes in 30-minute intervals rather than daily. I need to forecast ahead in 30-minute intervals over several months. I'll provide an example for clarity.
    Could you advise me on which method would be feasible for this particular requirement?
    Thank you!

  • @vzinko
    @vzinko 1 year ago +4

    When fitting the model in the walk-forward validation, the 'eval_set' parameter should not include test data, as this is data leakage.
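
    One way to avoid that, sketched with illustrative names (a TimeSeriesSplit object tss and feature/target frames X, y): carve the early-stopping eval_set out of the tail of the training fold instead of passing the test fold.

      import xgboost as xgb

      for train_idx, test_idx in tss.split(X):
          cut = int(len(train_idx) * 0.9)                     # keep the last 10% of the training fold
          fit_idx, stop_idx = train_idx[:cut], train_idx[cut:]
          reg = xgb.XGBRegressor(n_estimators=1000, early_stopping_rounds=50)  # xgboost >= 1.6
          reg.fit(X.iloc[fit_idx], y.iloc[fit_idx],
                  eval_set=[(X.iloc[stop_idx], y.iloc[stop_idx])], verbose=False)
          # The test fold is only ever used for scoring, never for early stopping.
          test_rmse = ((reg.predict(X.iloc[test_idx]) - y.iloc[test_idx]) ** 2).mean() ** 0.5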

  • @brandondavis9305
    @brandondavis9305 2 years ago +1

    Solid work!

    • @robmulla
      @robmulla  2 years ago

      Glad you enjoyed it Brandon.

  • @charlesnwevo2706
    @charlesnwevo2706 2 years ago +1

    Your videos are always explained with such clarity, well done. I was wondering if we could predict the future using the method you implemented here in the model from the previous video? Thank you.

    • @robmulla
      @robmulla  2 years ago

      Really appreciate that feedback. We use a fairly similar model to the one from the previous video in this one; just the validation pipeline is different.

  • @TylerMacClane
    @TylerMacClane 1 year ago

    Thanks a lot
    As always, great video
    👾

  • @vlplbl85
    @vlplbl85 2 years ago +1

    Great video. Thanks

    • @robmulla
      @robmulla  2 years ago

      Glad you liked it! Share with a friend :D

  • @poonsimon670
    @poonsimon670 1 year ago +1

    Great Tutorial.

    • @robmulla
      @robmulla  1 year ago

      Glad it was helpful!

  • @WinnieLi-m3b
    @WinnieLi-m3b 8 months ago

    Hey Rob, thank you so much for this great video! It's helped me a lot! If you could update the code file to include the holiday feature, that would be fantastic!

  • @ahmedtambal2560
    @ahmedtambal2560 2 years ago +1

    Great tutorial and wonderful job, I wish you all the best + thank you for your work.
    I have a question...
    What do lag1, lag2 and lag3 represent?

    • @robmulla
      @robmulla  2 years ago +1

      Thanks for watching. The lag variables represent what the target value was in the past. I just called them lag1, lag2, etc., but if you watch at 12:19 you'll see I create them for a certain number of days in the past. Hope that helps.

  • @peterwang7774
    @peterwang7774 2 years ago +1

    Hello, thank you for making such a great teaching video. I would like to ask if you will make a teaching video on how to use xgboost for multi-step time series prediction? Thank you very much!!!

    • @robmulla
      @robmulla  2 years ago

      Thanks for watching. I'm not sure what you mean by multi-step, but if I understand it correctly, you would just re-train and predict for each new time period when you are forecasting.
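
      As a rough sketch of that re-train-and-predict loop (illustrative names: df with a DatetimeIndex, a FEATURES list of column names and the PJME_MW target; monthly steps on hourly data):

        import pandas as pd
        import xgboost as xgb

        predictions = []
        for cutoff in pd.date_range('2017-01-01', '2018-01-01', freq='MS'):
            train = df[df.index < cutoff]
            step = df[(df.index >= cutoff) & (df.index < cutoff + pd.DateOffset(months=1))]
            reg = xgb.XGBRegressor(n_estimators=500)
            reg.fit(train[FEATURES], train['PJME_MW'])
            predictions.append(pd.Series(reg.predict(step[FEATURES]), index=step.index))
        forecast = pd.concat(predictions)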

  • @mariodelgadillo7762
    @mariodelgadillo7762 1 year ago +1

    Incredible work! Thanks so much for sharing! Do you have a video on your channel about time series using the Random Forest algorithm?

    • @robmulla
      @robmulla  1 year ago +1

      I have a video that goes over time series with xgboost, which is essentially a smarter implementation of a multi-tree based model.

  • @MingkaiLiu-j9q
    @MingkaiLiu-j9q 4 months ago

    Hi Rob, thanks for making the video, really awesome! I wonder, if I do hyperparameter tuning, do I still need to do TSCV for each set of hyperparameters, or can I just leave a recent history set for validation to avoid overfitting and use the most recent history as a test set for model selection? If my goal is not to have a time series model that works well over a long period of time but just the near future, is there an advantage of TSCV over the holdout set method? Thank you!

  • @zericardo182
    @zericardo182 1 year ago +1

    Hi, great video! I watched both parts and the Prophet one too.
    I have a question: if I understood it correctly, I shouldn't create lag features greater than my prediction horizon window. It's not so clear to me why I shouldn't do that. Could you please share more details about that part?

    • @ivanb.2914
      @ivanb.2914 10 months ago

      I'm struggling to grasp this concept because the example involves lag1, lag2, and lag3 even though he is forecasting 1yr into the future. For instance, on '01-01-2015,' you have information to calculate the energy consumption from three years prior (on '01-01-2012').

  • @flel2514
    @flel2514 1 year ago +1

    Very good!

  • @thibautsaah3379
    @thibautsaah3379 2 years ago +1

    Amazing work, Master! Does XGBoost support missing values, or are the NaN values of the lag features replaced by zero?

    • @robmulla
      @robmulla  2 years ago

      XGBoost and most GBM models work fine with null values. Because it's a tree based algorithm it is able to split the null values out. However, I've found that imputing values can sometimes help.

  • @SedaStepanyan
    @SedaStepanyan 11 months ago

    Thanks a lot for the video. Can you please talk about stationarity check as well as model accuracy?

  • @eduardomanotas7403
    @eduardomanotas7403 1 year ago

    This is amazing! Very great video. Can you do one with other correlated features and feature importance? Or another one for pre/post treatment?

  • @JuanManuelBerros
    @JuanManuelBerros 1 year ago +1

    Amazing vids Rob!! I'm carefully following part 1 & 2, learning lots of minor tricks along the way, love them. Quick question: any reason why the XGBRegressor objective is "reg:linear", and not just the default? Other params seem non-defaults as well.

  • @stay-amazed1295
    @stay-amazed1295 2 years ago +1

    Excellent video 👍👏 Please make more videos like TS forecasting with an xgb classification model.

    • @robmulla
      @robmulla  2 years ago

      Thanks for the positive note. What do you mean by time series classification? Is there a data example you know of that I could look into?

    • @stay-amazed1295
      @stay-amazed1295 2 years ago

      @@robmulla I mean next-candle prediction based on historical candle data up to the previous candle: if the next candle is green the model gives 1, if it's red 0. For example forex data like EURUSD... I'm a newbie to Python and facing difficulty building a good model. I'm getting accuracy of 54-60%. Thanks in advance👍

  • @shahryar.s
    @shahryar.s 7 months ago

    Great video! One question - if you're using a feature for which future values don't exist, let's say the stock price of a company, then how would you create the future data frame?

  • @fbrand
    @fbrand 2 years ago +1

    Subbed... great, thorough and practical explanation. I guess using LightGBM will be very much the same and only some parameters change?

    • @robmulla
      @robmulla  2 years ago

      Thanks for the sub! Yes, LightGBM and XGBoost are very similar with just slightly different names for the parameters. If you use the sklearn api they are almost the same.
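
      For example, with the sklearn API the two look nearly identical (a sketch; X_train, y_train and X_test are placeholder names, and only the parameter names differ slightly):

        import xgboost as xgb
        import lightgbm as lgb

        xgb_reg = xgb.XGBRegressor(n_estimators=1000, learning_rate=0.01, max_depth=6)
        lgb_reg = lgb.LGBMRegressor(n_estimators=1000, learning_rate=0.01, num_leaves=63)

        for model in (xgb_reg, lgb_reg):
            model.fit(X_train, y_train)
            preds = model.predict(X_test)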

  • @dieyu6374
    @dieyu6374 1 year ago +1

    But if the lag features are too close in time, will they produce cumulative errors? How should this be done, for example lagging by only the previous point but predicting the next 24 points?

  • @francoli2281
    @francoli2281 2 years ago +1

    Great video! I'm just wondering: if your lag features were, say, 1 hour, can you still predict multiple steps? Thanks!

    • @robmulla
      @robmulla  2 years ago

      Thanks for the feedback. If you have a lag feature of 1 hour then you would only be able to predict 1 hour into the future, because beyond that you would not be able to compute the lag feature (those lag values would not have occurred yet).
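
      In other words, lags have to be at least as long as the forecast horizon. A small sketch of that constraint (hypothetical names; hourly data with the target in df['PJME_MW'] and a 24-hour horizon):

        import pandas as pd

        HORIZON_HOURS = 24
        target_map = df['PJME_MW'].to_dict()
        for lag_hours in (24, 48, 168):                      # anything >= HORIZON_HOURS is usable
            assert lag_hours >= HORIZON_HOURS
            df[f'lag_{lag_hours}h'] = (df.index - pd.Timedelta(hours=lag_hours)).map(target_map)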

  • @michaeltownsend1459
    @michaeltownsend1459 1 year ago +2

    I really enjoy your videos. Any chance you could do one exploring dimensionality reduction techniques for time series? Particularly non-linear techniques such as VAE

    • @robmulla
      @robmulla  1 year ago +1

      Great suggestion! I might include it in a future video but I'd need to learn a bit more about it first. Thanks for watching.

  • @Depaxa
    @Depaxa 1 year ago

    Thanks

  • @ayuumi7926
    @ayuumi7926 2 years ago +1

    Nice video. Can we have another follow-up video on incorporating an additional pool of time series features that we believe can be used as independent variables (e.g. weather series, price series) - how to determine the lag to be used for those features and how to do feature selection from the feature sets?

    • @robmulla
      @robmulla  2 years ago

      Thanks for the suggestions. That would be a good future video. Using additional features is tricky because you need to be sure to only use the feature values you have on the day you are predicting. So for weather data you would actually need to use the forecast from X days prior to the prediction date; otherwise you will introduce a data leak into the model training.
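
      One hypothetical way to wire that up, assuming a weather_fc table with issued_at, valid_at and temp_forecast columns (all names are made up for illustration) and a df with a DatetimeIndex:

        import pandas as pd

        # Only keep forecasts that were issued at least one day before the hour they describe,
        # then take the most recently issued one for each target timestamp.
        usable = weather_fc[weather_fc['issued_at'] <= weather_fc['valid_at'] - pd.Timedelta('1D')]
        latest = usable.sort_values('issued_at').groupby('valid_at').last()['temp_forecast']
        df['temp_forecast_1d'] = df.index.map(latest)        # NaN wherever no forecast was available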

  • @Alexweno07
    @Alexweno07 2 years ago +2

    This is great! Thank you! Question: Why didn't you use a feature scaling function? Isn't it better to normalize the features for gradient descent algorithms?

    • @robmulla
      @robmulla  2 years ago +3

      Great question. For many algorithms that is the case. However for tree based algorithms like xgboost it will just split a branch somewhere in the feature. So scaling isn’t necessary.

  • @pablobandeira5461
    @pablobandeira5461 1 year ago

    A genius, thank you!

  • @rizkiatthoriq5925
    @rizkiatthoriq5925 1 year ago

    Thank you for your complete tutorial!
    I have tried your code on Kaggle and it made a question pop up in my mind: how can the lag values in the future_w_features variable appear where isFuture == True while PJME_MW is empty?

  • @bhavikpatel6088
    @bhavikpatel6088 2 months ago

    I really enjoyed it! However, I'm still curious about how you generate lag values in the future dataframe. For example, if we have data from 2020 to 2023 and create a lag1 feature with a 364-day lag, then the lag1 value for the period from 2020 to 2021 would be NaN. But when you create a future dataframe to predict the next year (2023-2024), does it work by shifting the lag values accordingly? In other words, are the lag values being pushed forward by a year to generate the future lag values?

  • @aaltinozz
    @aaltinozz 2 years ago +1

    Thanks for sharing this amazing work. I just want to ask about features created from the DatetimeIndex (year, month, dayofyear, etc.) - shouldn't we change their type to category?

    • @robmulla
      @robmulla  2 years ago

      Appreciate the feedback. You could try making these categorical, however they aren't completely ordinal, especially day of year. XGBoost should find splits for these features in trees based on where it determines the best break points to be. But it's always worth trying and seeing what performs best using the validation setup.
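
      The experiment is cheap to set up (a sketch; FEATURES is an illustrative list of column names, and native categorical support needs a reasonably recent xgboost with tree_method='hist'):

        import xgboost as xgb

        for col in ['dayofweek', 'month', 'quarter']:
            df[col] = df[col].astype('category')

        reg = xgb.XGBRegressor(n_estimators=500, tree_method='hist', enable_categorical=True)
        reg.fit(df[FEATURES], df['PJME_MW'])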

  • @Boy90547
    @Boy90547 1 year ago +1

    This is very impressive. Do you have a lecture on trading analysis?

    • @robmulla
      @robmulla  1 year ago +1

      Glad you liked it. I have a few other videos on time series you can check out.

  • @BravePrune
    @BravePrune 2 years ago +1

    Well done, absolutely incredible.
    Can you do one of these with a Prophet model?
    It takes away so many of the headaches you were dealing with in the feature creation and cross validation.

    • @robmulla
      @robmulla  2 years ago +1

      Thanks so much. Yes, I plan to do one on Prophet and LSTMs at some point.

  • @rayhanabyasa5013
    @rayhanabyasa5013 7 months ago

    Great video Rob! It helps me a lot! I wanna ask something: my project uses a lot of features (X) whose values are still unknown in the future, unlike in your video where the features already have values like the month, day of week, etc. So is it impossible for me to predict the future because the features are still unknown? Because I already tried it and the result shows a straight line with the same values :( Thanks in advance!