This was a big part of my MA in Applied Math; considering that not many real-world equations have closed-form solutions, this is the only way to go, and it's a beautiful branch of math and computer science.
Same... you can get arbitrarily close to any analytic function.
I watched so many videos from prof. Brunton that I hear “welcome back” in my head already at the opening 😂
The best math instruction on the Internet
A wonderful lecture, as usual. Thank you, Prof. Brunton!
Thank you, Steven. You are a good teacher, and it is teachers like you who motivate me to continue on my path of education.
Awesome explanation! I really enjoy your videos
Happy to hear it :) Thanks for watching!
you made life easier for me in my graduate study. Thank you
Happy to help! Thanks for watching :)
Thank you so much Steve! You're an inspiration
Wow! This is so great. Thanks Steve, and looking forward to the next course.
Do you really write backwards, or do you use software to invert it?
A very nice video sir! Thanks for this.
Thanks for the video! At 3:53, f(x) seems to be a typo for f(t). At 22:17, O(∆t^5) seems to be a typo for O(∆t^4).
Do you know why the error tends to the fourth power?
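Likely because the odd powers of Δt cancel when the two Taylor expansions are added, so the first surviving term after Δt² is the fourth-order one. A sketch of the standard step (assuming this is the derivation at 22:17):

$$f(t+\Delta t) + f(t-\Delta t) = 2f(t) + \Delta t^2\,\ddot{f}(t) + \frac{\Delta t^4}{12}\,f^{(4)}(t) + \mathcal{O}(\Delta t^6)$$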
I saw Steven at APS DFD yesterday; it was great.
Great video!
I think Prof. Brunton is missing a 2 in the denominator of the Δt² term at 22:00 (not that it's relevant for the result anyway). Great lecture as always!
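For reference, the standard expansion with the 2! the comment is pointing at:

$$f(t+\Delta t) = f(t) + \Delta t\,\dot{f}(t) + \frac{\Delta t^2}{2!}\,\ddot{f}(t) + \frac{\Delta t^3}{3!}\,\dddot{f}(t) + \mathcal{O}(\Delta t^4)$$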
Great video. Thank you
Perfect explanation!! Thanks
I'd be looking forward to seeing a treatment of partial derivatives, i.e., stencils for PDEs. I never took the time to properly study this stuff, and now whenever some PDE blows up on me (especially with Python; it happens a lot, even for trivial stuff like a slightly modified diffusion equation), I'm left wondering whether I'm misusing functions or the damned thing just doesn't work very well.
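For what it's worth, the most common reason a plain diffusion solve blows up is a violated stability bound rather than a misused function: the explicit (FTCS) scheme for u_t = D·u_xx is only stable when D·Δt/Δx² ≤ 1/2, and "slightly modifying" the equation changes the effective coefficient. A minimal sketch (all parameter values illustrative):

```python
import numpy as np

# 1-D diffusion u_t = D * u_xx with the explicit (FTCS) stencil.
# The scheme is stable only if r = D*dt/dx**2 <= 0.5; exceeding that
# bound makes the solution blow up even though the code is "correct".
D, L, nx = 1.0, 1.0, 101
dx = L / (nx - 1)
dt = 0.4 * dx**2 / D                # safely below the r = 0.5 limit
r = D * dt / dx**2
assert r <= 0.5, f"unstable: r = {r:.3f} > 0.5; reduce dt"

x = np.linspace(0, L, nx)
u = np.exp(-100 * (x - 0.5) ** 2)   # initial condition: Gaussian bump
for _ in range(500):
    u[1:-1] += r * (u[2:] - 2 * u[1:-1] + u[:-2])  # interior update
    u[0] = u[-1] = 0.0                             # Dirichlet BCs

print(u.max())  # stays bounded; with r > 0.5 it would grow step by step
```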
Thank you very much...❤❤❤.
beautifully explained. 👍👍
Thanks!
You are the GOAT
For the first code example, which IDE did you use, Prof?
🙂
I'm hoping Functional Data Analysis will be covered.
It's a very powerful numerical framework.
Δt is the leading-order error term only if Δt < 1, right? That's the only way Δt is actually larger than its higher powers. So the unit of t matters: if we use seconds instead of hours, the numerical value of Δt may actually be larger than 1, making the first-order term not necessarily larger than the higher-order terms and, more specifically, dependent on the numerical values of their products with the derivative terms (rescaled by the factorials) in our chosen system of units. Correct?
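A quick way to see what "leading order" means here: the statement is asymptotic as Δt → 0, and changing units rescales the derivative values along with Δt (f″ carries units of 1/time², etc.), so the ratio between consecutive terms is unit-independent. An illustrative check of the first-order convergence itself:

```python
import numpy as np

# Forward-difference error for f = sin at t = 1.0: halving dt roughly
# halves the error once dt is small, which is exactly the O(dt) claim.
t = 1.0
for dt in [0.4, 0.2, 0.1, 0.05, 0.025]:
    err = abs((np.sin(t + dt) - np.sin(t)) / dt - np.cos(t))
    print(f"dt = {dt:<6} error = {err:.3e}")
```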
Wouldn't it be better to fit a cubic spline around t and then extract the first and second derivatives from the coefficients?
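That's a reasonable approach; a minimal sketch with SciPy's CubicSpline, assuming clean (noise-free) samples. On noisy data an interpolating spline will chase the noise, so a smoothing fit is usually safer:

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Sample a known function, fit a cubic spline, and evaluate the
# spline's first and second derivatives at an interior point t0.
t = np.linspace(0, 2 * np.pi, 50)
cs = CubicSpline(t, np.sin(t))

t0 = 1.0
print(cs(t0, 1), np.cos(t0))    # spline d/dt   vs the exact cos(t0)
print(cs(t0, 2), -np.sin(t0))   # spline d2/dt2 vs the exact -sin(t0)
```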
Thank you for your effort; I appreciate this video and you a lot!
Would it be true that the central difference approximation doesn't just have the Δt² vs. Δt advantage over the forward and backward approximations, but also the divide-by-3 from the 3!? So to get a 100-times-smaller error you would only need a 10/1.7-times-smaller Δt? (The 1.7 is sqrt(3).)
Really looking forward to differentiating real data, to see how you treat the noise from digitization and from noisy measurements. Thanks for the clear explanations!
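For anyone who wants to check the constants numerically: the forward-difference error leads with (Δt/2)·f″(t) and the central-difference error with (Δt²/6)·f‴(t), the 6 being the 3! mentioned above, so the prefactors differ by the factor of 3 in question. A quick illustrative check:

```python
import numpy as np

# Compare measured finite-difference errors against the predicted
# leading terms: forward ~ (dt/2)*|f''|, central ~ (dt**2/6)*|f'''|.
t, dt = 1.0, 1e-3
fwd = (np.sin(t + dt) - np.sin(t)) / dt
ctr = (np.sin(t + dt) - np.sin(t - dt)) / (2 * dt)

print(abs(fwd - np.cos(t)), dt / 2 * abs(np.sin(t)))     # f''  = -sin
print(abs(ctr - np.cos(t)), dt**2 / 6 * abs(np.cos(t)))  # f''' = -cos
```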
Nice video. I have some basic questions related to extremum-seeking control. Although I've watched many of your past videos and read many papers, small things still confuse me. How can I ask them?
Thank you so much
Hi, I'm Kia. I'd like to discuss something with you, if that's possible. I've been studying a journal paper on high-order compact finite differences, but I have a problem with my method. Could you help me?
Thank you
Call me lazy, but I just take a polynomial regression of N neighboring points, and the polynomial coefficients of this regression give me the first, second, third, etc. derivatives of the function at the neighborhood's origin. Done!
Wow, very clever. I didn't know fit coefficients were the derivatives. I'm going to go try that out now. Thanks for sharing -- I'll subscribe to your channel too.
@@johnsinclair1447
Just remember to multiply the Nth coefficient by N!, and you'll get the Nth derivative of the polynomial at its origin. This is a pretty straightforward result that you can prove by differentiating the polynomial N times and then setting x=0. Another way of deriving it comes from the Taylor expansion of a function.
@@johnsinclair1447
And let me know how it went.
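For anyone who wants to try the trick from this thread, a minimal NumPy sketch (the window size, degree, and test function are illustrative choices, not from the thread):

```python
import math
import numpy as np

# Fit a local polynomial around a point and read derivatives off the
# coefficients: if f(t0 + s) ~ sum(a_n * s**n), then f^(n)(t0) = n! * a_n.
t = np.linspace(0, 2 * np.pi, 200)
y = np.sin(t)

i, w = 60, 5                       # point of interest, window half-width
s = t[i - w:i + w + 1] - t[i]      # shift so the window's origin is t[i]
a = np.polynomial.polynomial.polyfit(s, y[i - w:i + w + 1], deg=4)

for n in range(1, 4):
    print(n, math.factorial(n) * a[n])  # compare to cos, -sin, -cos at t[i]
```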
Can you please talk about HIGH ORDER finite difference methods?
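Not covered in this video, but one compact way to derive high-order stencils yourself: the weights for the m-th derivative on any set of node offsets solve a small Vandermonde system; Fornberg's algorithm is the numerically robust version of the same idea. A sketch (the helper name fd_weights is mine):

```python
import math
import numpy as np

# Weights for the m-th derivative on node offsets s_j follow from
# matching Taylor terms: sum_j w_j * s_j**k = k! * delta(k, m) for
# k = 0..len(s)-1; divide the stencil by dt**m when applying it.
# (Fine for small stencils; use Fornberg's recursion for large ones.)
def fd_weights(offsets, m):
    s = np.asarray(offsets, dtype=float)
    A = np.vander(s, increasing=True).T   # A[k, j] = s_j**k
    b = np.zeros(len(s))
    b[m] = math.factorial(m)
    return np.linalg.solve(A, b)

print(fd_weights([-2, -1, 0, 1, 2], 1))  # 4th-order 1st derivative:
                                         # [1, -8, 0, 8, -1] / 12
print(fd_weights([-2, -1, 0, 1, 2], 2))  # 4th-order 2nd derivative:
                                         # [-1, 16, -30, 16, -1] / 12
```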
Think I just fell in love in a comment section… shawty is fine 🫢
How did you record these videos? Did you write on a piece of glass and then mirror the video?
Probably, so we have the same perspective as him.
Hey, you're right-handed here 😃
Go on, hit like!
Problem solved.
Please, Professor, kindly upload more Python-related videos, from basic to advanced.
- Vasanth from Tamil Nadu, India