Thank you so much for making this video! Finding articulate tutorials describing the how and why is so difficult. You're video is PERFECT! Thank you thank you thank you
Hi Sir, your videos are worth more than any lakhs-paid course.. You are so generous to share your hard-earned knowledge with the AI community..
Could you please add a few more real-time, scenario-based time series videos..
Got great help from your video. I wasn't able to figure out how to set a frequency of '10T', but I saw a similar case in your video.
Your video is truly incredible man thank you for this
I watched all your videos with pleasure, you are great 👏 I will ask you a question for the first time. I am doing a project with the VARMAX method to estimate the number of rail system passengers, but since I have hourly data, I think it causes noise from both annual seasonality and seasonality during the day. How can I overcome this problem? Also, my prediction variable is not normally distributed; do I need to do anything about this? Or should I use deep learning methods instead?
Hey sir,
Will you please tell me whether I should use a VAR model when I have more than 2 variables???
Nice explanation.
Wonderful. Thanks for sharing!
You are awesome sir, explanation is clear.
How to choose the lag? Good input!
Hi,
Are we supposed to remove the seasonality from the data before we run the adfuller test and the Granger causality check?
Please comment here.
When we do diff() to get stationary data (in my case I applied diff(n=2)), how can I get back to the original non-stationary data?
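On recovering the original series: differencing can be inverted with cumulative sums, provided you keep the first few original values. A minimal sketch with a toy series (assuming numpy-style `diff(n=2)`, i.e. second-order differencing):

```python
import numpy as np

# Toy series standing in for the non-stationary data in the question.
original = np.array([10.0, 12.0, 15.0, 19.0, 24.0, 30.0])

# Second-order differencing, as with np.diff(series, n=2).
second_diff = np.diff(original, n=2)

# To invert, the first two original values must be kept aside.
# First undo the second difference to recover the first differences:
first_diff = np.concatenate(([original[1] - original[0]], second_diff)).cumsum()
# Then undo the first difference to recover the original series:
recovered = np.concatenate(([original[0]], first_diff)).cumsum()

print(recovered)  # matches `original`
```

The same idea applies to forecasts: invert each level of differencing in turn, seeding each cumsum with the last known value(s) of the less-differenced series.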
you are simply awesome !! 👍
Thank you. This is very helpful.
Thanks for the explanation, very well explained. I was wondering whether I could have access to the notebook?
In my git repo you will have all the notebook - github.com/srivatsan88/End-to-End-Time-Series/blob/master/Multivariate_Time_Series_using_VAR.ipynb
Where is the sample notebook link? Can you please help me with that?
Sir, kindly do a video on end-to-end time series with deployment, which would be really helpful 👍
I will for sure towards the end. I have been trying to complete some more concepts in TS, and after that I will do it.
@@AIEngineeringLife ok thanks 😊👍
Thank you so much for your clarification. Can we use categorical variables in VAR analysis? Thanks and appreciated.
Thank you so much for sharing and teaching. I do have one question to ask.
When you delete some columns by finding high p-value features, I found that the p-value of
lag 1 is high but the other lags' p-values are small. For example,
Column - lights : P_Values - [0.1451, 0.0005, 0.0001, 0.0002, 0.0001, 0.0001, 0.0006, 0.0012].
So I think it would be logical to keep some lags of lights (from lag 2 to lag 8) and get rid of lag 1.
How can we do this in Python?
I ask because I am forecasting NBA game results, so I wonder if I can mix time-series
features and non-time-series features together. For some of my features I do need lags, but for other features, like 'Home' or 'Away' in the coming game, the value is fixed and I don't want any lag on that feature. Do you know how I can do this? Any algorithm suggestions? Thank you so much.
Yes you can.. Typically what I do is create the lags as individual features and then add the non-time-series data to train regular ML models. I have worked with XGBoost as well as Random Forest. Depending on the data size, you can settle on an algorithm.
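The approach described above (lags as individual features, static features left unlagged, then a regular ML model) can be sketched as follows. The column names (`points`, `home`) and the choice of which lags to keep are hypothetical, chosen to mirror the question about dropping lag 1:

```python
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

# Hypothetical game-level data: 'points' needs lags; 'home' is known
# in advance for the coming game and stays unlagged.
df = pd.DataFrame({
    "points": [102, 98, 110, 95, 107, 101, 115, 99, 104, 108],
    "home":   [1, 0, 1, 1, 0, 1, 0, 0, 1, 0],
})

# Create lag features only for the time-dependent column,
# skipping lag 1 since its p-value suggested it is not significant.
for lag in [2, 3]:
    df[f"points_lag{lag}"] = df["points"].shift(lag)

df = df.dropna()  # drop the rows lost to shifting

X = df[["home", "points_lag2", "points_lag3"]]
y = df["points"]

# Any tabular model works here; Random Forest as one of the
# algorithms mentioned in the reply.
model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y)
print(model.predict(X.iloc[[-1]]))
```

With real data you would build the lag columns from past games only and evaluate with a time-ordered train/test split rather than a random one.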
Hello sir, do you have the Google Colab link for this? Thank you.
Could you please explain how impulse response functions and variance decomposition are calculated from the VAR model?
Hi, your videos are great. I'm writing my seminar thesis using these videos. But as a source I need some literature, so I would be very happy if you could tell me where you got your information from.
Thanks a lot!
Sophie, you can get VAR literature from the link below - otexts.com/fpp2/VAR.html
The code in the video was developed by me, and there is no reference for that part.
@@AIEngineeringLife Thank you :)
Hi, can you add a video on how to use VARMAX models?
Nicely explained, sir, it cleared most of the concepts. Can we get this notebook in your GitHub repo, sir?
Here it is - github.com/srivatsan88/End-to-End-Time-Series/blob/master/Multivariate_Time_Series_using_VAR.ipynb
Could you please explain how I can forecast the target variable for a specified period using the last derived equation?
After this forecast, if I see that one or more feature forecasts are coming out wrong, should I use Prophet? Or how can I make a forecast from the equation you made at the end? Please answer :-)
Awesome video @AIEngineeringLife! I'm very interested in showing causality given a dataset. Would you know any other methods/tests apart from the Granger causality you used here?
Sergey.. Even though the test says causality, frankly it only shows the relationship of one time series with another using lagged variables. It may not mean strict causality here. To my knowledge, most causality tests are based only on observation over time rather than on experiment. I may be wrong here as well.
Hi, thanks for the awesome video. For the line model = VAR(df_train, freq="1H"), how can I adapt this to a monthly time series, please?
To the best of my knowledge, a p-value of 0.5 is 50% (since p-values range from 0 to 1), i.e. the correct p-value threshold of 5% for rejecting H0 should be 0.05, not 0.5.
Thank you so much for sharing. Can you please also create a small video on how to interpret the impulse response function in Python?
Very useful video, thank you so much. I tried your code on my time series data. Everything was going well, but I got the following error when fitting the model after building the LSTM architecture:
ValueError: Failed to find data adapter that can handle input: ,
Could you please advise why I got this error?
While iterating over lags after model = VAR(df_train) on my time series, I got the following error:
LinAlgError: 2-th leading minor of the array is not positive definite
I am unable to resolve it, please help!
Did you use my sample notebook, or is it a custom one you created? Did you check that there are no nulls in the data?
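The null check suggested above can be done in one line before fitting. A small sketch with made-up data (nulls, as well as constant or duplicated columns, are common culprits behind linear-algebra errors during VAR fitting):

```python
import pandas as pd

# Made-up frame with two nulls, standing in for df_train.
df = pd.DataFrame({"a": [1.0, None, 3.0], "b": [4.0, 5.0, None]})

# Count nulls per column before calling VAR(df).fit(...).
print(df.isnull().sum())

# One simple remedy: drop rows containing any null
# (interpolation is another option for time series).
df_clean = df.dropna()
print(len(df_clean))
```

It is also worth checking `df.nunique()` for constant columns, which make the covariance matrix singular.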
This is very well explained!!! Thank you so much. Can you please explain: when we build a separate model to validate, instead of taking the coefficients of all 7 lags, can we take the sum of the coefficients of all 7 lags per dependent variable and multiply it by the input variable?
Thank you.. We can think of a separate model if the lag order is quite high and only a few lags and features are significant. That way a new model can be built on only the significant variables. If you want to forecast all variables, then yes, we multiply the coefficients of all lags, and that is what the VAR forecast function does internally as well.
Good job
I think your p-value needs to be compared to the significance level, and 0.05 is what is commonly used, not the 0.5 you said in the video.
Every time, you keep saying 0.5 for rejecting the null hypothesis, but it should be 0.05. I think this is a blunder on your side because you kept repeating it.
Excellent presentation 👍. Please, how can I reach you via email?