This was illuminating (and fun!) You are a *great* teacher!
Thanks!
This is in fact a beautiful use of the operator theory, thank you for the video
Very helpful. I'm working through an online course on time series and your playlist is going to be an excellent supplement to the course content. Thank you!!
YOU HAVE SAVED MY DEGREE THANK YOU FOR YOUR VIDEOS ON TIME SERIES ANALYSIS
The causal diagram was just too good!
I see a recommendation for his channel, and on Time Series no less, so I click PLAY!
your videos are amazing!!! THANK YOU SO MUCH!!
He is so innovative; I've watched every video more than once.
Let's imagine you have a toy car that you play with daily. How you play with the car one day might affect how you play with it the next day. Now, imagine if we wanted to predict how you'll play with the car tomorrow based on how you played with it in the past.
An AR(∞) model is like trying to predict how you'll play with the car tomorrow by looking at every single way you played with it in the past, even going back forever! But that's impossible because we can't remember or keep track of how you've played with the car since birth. So, it's like having too much information to deal with.
On the other hand, an MA(1) model is more straightforward. It only looks at how you played with the car yesterday and uses that to guess how you might play with it tomorrow. It's like saying, "Hey, since you played with the car this way yesterday, you might play with it in a similar way tomorrow." It's easier to work with because it only focuses on the most recent way you played with the car, not all the ways from the past.
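A minimal numpy sketch of that idea (the coefficient value 0.5 is just an illustrative assumption, and the sign convention C_t = eps_t + phi * eps_{t-1} is assumed): the AR(∞) weights on past observations shrink geometrically, so "remembering forever" effectively collapses to the last few days.

```python
import numpy as np

# Hypothetical MA(1) coefficient for illustration; any |phi| < 1 works.
phi = 0.5

# Under C_t = eps_t + phi * eps_{t-1}, the inverted AR(inf) form is
# C_t = eps_t - sum_{k>=1} (-phi)^k * C_{t-k}.
# The weights on past observations decay geometrically:
weights = [-((-phi) ** k) for k in range(1, 11)]
print(np.round(weights, 5))
# After ~10 lags the weights are already around 0.001, so the "infinite"
# memory is, in practice, a short one when |phi| < 1.
```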
thank you very much, I love your playlist on time series. wonderful explanations!!!
great video! just wondering why we don't need to include "mu" in the MA(1) model, which was shown in the "Time Series Talk: Moving Average Model" video? Thanks!
I was struggling with the same thing here and I think it would be great if this were explained in the video. I think, for simplicity's sake, they just considered mu = 0.
thank you my friend, you're the best
It's so cool... I was able to guess at the end that it was an AR model before you said it... Your videos are so relatable... awesome
thanks!
1:50 can you use phi and theta interchangeably when referring to an MA process? In other videos you used theta only for MA
What a great explanation! Congrats
please keep it up; I wish you'd re-organize the playlist so it's easier for us to follow
Sir, please make a detailed video on cointegration, especially Johansen cointegration...
Outstanding!!
I love your videos they are really helpful. Thank you so much
Thank you for your great video!
My pleasure!
Hi, there! I assist the students of a Time Series Econometrics course in college. Found this video while preparing a revision lesson. Pretty good!
The best explanation ever! Thanks
Thanks for the videos. I am really enjoying studying these concepts from your playlists.
Just a short comment. From the MA(1) model, C_1 = -phi * e_0 + e_1, so e_2 = C_2 + phi * e_1 = C_2 + phi * C_1 + phi^2 * e_0. Propagation gives e_n = C_n + phi * C_(n-1) + phi^2 * C_(n-2) + phi^3 * C_(n-3) + ... + phi^n * e_0. When n is large enough and |phi| < 1, the last term goes to 0, and e_n is expressed as a sum of the current and past C_(n-k) values, which is what gives the AR(∞) form.
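A compact way to write out that propagation, using the same sign convention as the comment above (C_t = e_t - phi * e_(t-1)), as a worked equation:

```latex
\begin{aligned}
e_n &= C_n + \phi e_{n-1}
     = C_n + \phi C_{n-1} + \phi^2 C_{n-2} + \cdots + \phi^{n-1} C_1 + \phi^n e_0 , \\
C_n &= e_n - \phi e_{n-1}
     \;\approx\; e_n - \sum_{k \ge 1} \phi^k C_{n-k}
     \quad\text{(since } |\phi| < 1 \text{ sends } \phi^n e_0 \to 0\text{)} ,
\end{aligned}
```

and the second line is exactly the AR(∞) representation of the MA(1).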
Great presentation!
Excellent explanation! Thank you!
So we are saying that because of invertibility we don't have to figure out the error terms, and can use lagged values of the actual time series itself. Brilliant!
It would be nice to have at the end of each video a homework data set and a list of two or three questions.
wonderful job!!!!!!!omg i love you
@ritvikmath that was sooooo clear. thank you!
you are the best of the best
@ritvikmath Just a question: how do we prove that the absolute value of Phi is less than one? or is this given?
The arrow in the left diagram is more like a "function of", instead of "caused by".
What doesn't work out for me is saying that eps_t is a function of C_t, because eps_t is supposed to be white noise right?
Brilliant!
omg thank you for making this make sense to me
Good work, sir.
I think it can be much more intuitively illustrated by simply changing the subject of the equation. Instead of Ct = ... you formalize it as EPSt = ..., and recursively plug in the corresponding formula.
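A minimal numpy sketch of that recursive plug-in, assuming the MA(1) is written as C_t = eps_t + phi * eps_{t-1} (flip the sign of phi if the video writes it with a minus); the coefficient 0.6 and the pre-sample setup are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
phi = 0.6                          # illustrative MA(1) coefficient, |phi| < 1
n = 500
eps = rng.normal(size=n + 1)       # eps[0] plays the role of the unseen pre-sample shock
C = eps[1:] + phi * eps[:-1]       # C_t = eps_t + phi * eps_{t-1}

# Change the subject: eps_t = C_t - phi * eps_{t-1}, then recurse,
# pretending we don't know the pre-sample shock (guess it as 0).
eps_hat = np.zeros(n)
eps_hat[0] = C[0]
for t in range(1, n):
    eps_hat[t] = C[t] - phi * eps_hat[t - 1]

# The error from the wrong starting guess decays like phi^t, so the
# recovered shocks quickly match the true ones.
print(np.max(np.abs(eps_hat[20:] - eps[1:][20:])))
```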
Thank you very much!
You're welcome!
I really miss your old video format with the white board only. Can I ask why you changed it?
Tysm for the helpful vids! I have a question: in your Lag Operator video, you rewrite the ARMA in terms of the lag operator as (phi_1 L y_t + phi_2 L^2 y_t + ... + phi_k L^k y_t), but in this video the phi's themselves get squared (phi^2, phi^3, ...) as well. Why is it different? Thanks in advance, greetings
brilliant vid!!!
Can you help with the logic of why MA(1) processes do not follow the Markov property?
Pure genius
Hi Ritvikmath, I was wondering if you give tutoring lessons in Mathematics for Data Science?
You are a hero.🤣
Why have you omitted the mean in the MA(1) model?
Awesome
How do we solve this equation: v[k] = e[k] + 1.4e[k-1] + 0.38e[k-2]???
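One way to check whether that MA(2) can be rewritten in AR form is to look at the roots of its MA polynomial 1 + 1.4z + 0.38z^2; a rough numpy sketch (the invertibility criterion below is the standard "roots outside the unit circle" rule, not something from the video):

```python
import numpy as np

# MA(2): v[k] = e[k] + 1.4*e[k-1] + 0.38*e[k-2]
# MA polynomial: theta(z) = 1 + 1.4 z + 0.38 z^2
# np.roots expects coefficients from highest power to lowest.
roots = np.roots([0.38, 1.4, 1.0])
print(roots, np.abs(roots))
# The process is invertible only if every root lies strictly outside the
# unit circle (|root| > 1); otherwise it cannot be rewritten as a
# convergent AR(infinity) in past values of v.
```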
damn... this goes hard
Very clear!
Thank you so much for the video! I am rewatching it because I indeed have trouble understanding this topic :D I have a question: Why do you use the coefficient Phi for the MA-process? In my textbook, we use this letter for the AR-process, and for the MA-process we use the letter Theta. Or is that not so important?
You're right, but it is just notation. I'm actually facing this problem because every book or video I read/watch has a different notation, slowing the learning process.
Cool!
you're great, thx
Hi, I was wondering how invertibility is useful - what can I do with that information?
thank you
When I look at that causal diagram, all I see is a RNN
Bro, you saved my ass again!
That's cool.
You got this one wrong. You need to prepare these topics better. Thanks 🌹🌹
Sir, why do we apply the log to time series variables, like log GDP, log CPI, log oil price... What is the benefit of taking logs? Please answer, sir
Exponentially growing time series are hard to model directly; taking logs removes the exponential growth and turns it into a roughly linear trend.
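A tiny illustration of that point, just as a sketch with a made-up growth rate of 2% per period:

```python
import numpy as np

t = np.arange(100)
gdp = 100 * np.exp(0.02 * t)       # toy exponentially growing series
log_gdp = np.log(gdp)

# The raw series has ever-larger increments; its log grows by a constant
# amount per period, which is what linear time-series models expect.
print(np.diff(gdp)[:3], np.diff(gdp)[-3:])   # increasing increments
print(np.diff(log_gdp)[:3])                  # constant 0.02 increments
```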
Such a great video. Thank you!
Brilliant!
Thanks!
Thank You