blackpenredpen said that writing an integral without dx is like driving without seatbelt
It’s ok, at this level of math we don’t need to worry about it 😉
In elementary calculus, the differential is necessary. In more generalized integration, as Dr. Peyam says, it isn't needed, and is sometimes extraneous.
BTW, the analogy is flawed; before the late 1950's, we always drove without seatbelts, *because there weren't any!* And that worked just fine.
I'd say it's more like driving a stick-shift without depressing the clutch pedal. Or just driving without a transmission.
Fred
As much as I don't like it, trained mathematicians do that to speed things up. Honestly, limits of integration are redundant as well, so you will see some people omitting them too. Look up the Einstein summation convention for a similar concept, where you will see "an orgy of indices" -- as my math professor used to say -- with an implicit rule that all pairs of repeated indices are to be summed over. It is a mindfuck when you see it for the first time, but it proves to be very powerful and concise and, honestly, a necessity for the people in the business.
@@michalbotor Yes! Einstein summation convention *does* actually make much of the math of GR reasonably concise.
And it does observe its own rigor. There's even a way to annotate non-summation, when that's the desired result.
Fred
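For anyone curious, the convention is easy to play with: numpy's `einsum` takes exactly this index notation and sums over every repeated index. A small sketch (assuming numpy is available):

```python
import numpy as np

A = np.arange(6).reshape(2, 3)
B = np.arange(12).reshape(3, 4)

# "ij,jk->ik": the repeated index j is summed over implicitly --
# the Einstein convention, with no explicit summation sign.
C = np.einsum('ij,jk->ik', A, B)
assert np.array_equal(C, A @ B)

# A repeated index on a single tensor contracts it, e.g. the trace:
M = np.arange(9).reshape(3, 3)
assert np.einsum('ii->', M) == np.trace(M)
```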
Legendre polynomials are part of the solutions of the Schrödinger equation for the hydrogen atom
@Ranjit Tyagi what
Yes, and that's because they show up in spherical harmonics, which are an orthogonal set of functions in ℝ³ , and which figure in the solutions of many important PDE's in physics.
Fred
Ohhh boy. How long I have waited for this. I love this channel.
These polynomials are used for the angular part of the Laplacian in spherical coordinates. The time-independent Schrödinger equation in spherical coordinates is a great application of spherical harmonics. The Legendre function is the product of Legendre polynomials with a sinusoid.
Orthogonal polynomials also appear in the spherical harmonics, the analytical solutions of the angular part of Schrödinger's equation for a potential V = blabla/r. Section 4.1.2 in Griffiths, if you want a nice read =)
Griffiths is weak on this and establishes it as essentially voodoo magic, Mary Boas's "Mathematical Methods in the Physical Sciences" Chapter 12 Section 2 is a much more complete introduction.
"It looks scary but this is not that scary." Thanks, Dr. Peyam, for sharing your wisdom; simple but to the point.
This is very interesting; this basis is important in electromagnetism in matter. Thank you
Thank you, Grandma Schmidt.
Legen... wait for it... drery!!! Legendrery!
Hey Dr. Peyam, I've got an interesting linear algebra problem that I read when revisiting the Splinder book:
-Snow White distributed 21 liters of milk among the seven dwarfs.
The first dwarf then distributed the contents of his pail evenly to the pails of the other six dwarfs. Then
the second did the same, and so on. After the seventh dwarf distributed the contents
of his pail evenly to the other six dwarfs, it was found that each dwarf had exactly as
much milk in his pail as at the start.
What was the initial distribution of the milk?
Generalize to N dwarfs
Sounds like a 7×7 system of equations problem :)
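Not a full solution, but the distribution (6, 5, 4, 3, 2, 1, 0) liters (dwarf i starts with 7 − i liters, summing to 21) works, and that's easy to confirm by simulating the redistribution in plain Python:

```python
# Simulate the seven redistributions for the candidate answer
# (6, 5, 4, 3, 2, 1, 0) and check we return to the start.
n = 7
milk = [float(n - 1 - i) for i in range(n)]   # [6, 5, 4, 3, 2, 1, 0]
start = milk[:]

for i in range(n):                 # dwarf i empties his pail...
    share = milk[i] / (n - 1)      # ...evenly into the other n-1 pails
    milk = [m + share for m in milk]
    milk[i] = 0.0                  # his own pail ends up empty

assert all(abs(a - b) < 1e-9 for a, b in zip(milk, start))
```

Running the same simulation with any multiple of (N−1, N−2, ..., 1, 0) suggests the N-dwarf generalization; proving that's the only fixed point is the actual linear algebra exercise.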
I like the way he first says thanks
This is what DIFFERENTIATES him from others
Thanks for watching!!
We have the generating function
f(x,t) = 1/sqrt(1 - 2xt + t^2).
If someone likes ODEs: the Legendre polynomial is the particular solution of the following ODE,
(1-x^2)y'' - 2xy' + n(n+1)y = 0,
which satisfies the condition y(1) = 1.
I found in tables an integral which gives not quite the Legendre polynomial, but the Legendre polynomial is easy to get from it:
\frac{2}{\pi}\int_{\theta}^{\pi}\frac{\sin{\left(\left(n+\frac{1}{2}\right)t\right)}}{\sqrt{2\left(\cos{\left(\theta\right)} - \cos{\left(t\right)}\right)}}\,\mbox{d}t
with the assumption that theta is in an interval where cos(\frac{\theta}{2}) > 0, e.g. (-\pi, \pi).
There are connections with Chebyshev polynomials:
\sum_{k=0}^{n}P_{k}(x)P_{n-k}(x) = U_{n}(x)
\sum_{k=0}^{n}P_{k}(x)T_{n-k}(x) = (n+1)P_{n}(x)
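Those two Chebyshev identities are quick to verify symbolically with sympy's built-in `legendre`, `chebyshevt`, and `chebyshevu` (a sanity check for small n, not a proof):

```python
import sympy as sp

x = sp.symbols('x')

for n in range(6):
    # sum P_k P_{n-k} should equal U_n
    lhs_u = sum(sp.legendre(k, x) * sp.legendre(n - k, x) for k in range(n + 1))
    # sum P_k T_{n-k} should equal (n+1) P_n
    lhs_t = sum(sp.legendre(k, x) * sp.chebyshevt(n - k, x) for k in range(n + 1))
    assert sp.expand(lhs_u - sp.chebyshevu(n, x)) == 0
    assert sp.expand(lhs_t - (n + 1) * sp.legendre(n, x)) == 0
```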
IT'S OVER 9000
Hahaha
0:40 it's over 9000! :-D
It says polynonomials :)
Yes, so I guess that's a "no-no."
Fred
is there any motivation behind the p_n(1) = 1 condition? the first normalization condition that comes to my mind is ||p_n|| = 1
Both are nice, I think what makes the first one nice is that they all have a common value, think of particles ending at the same position
Do more about these and polynomial solutions like it are often introduced as solutions to differential equations but never explained.
I solved the recurrence relation and got
G(x,t) = 1/sqrt(1 - 2xt + t^2).
To expand it I used the binomial expansion twice, then shifted indices.
Finally I got
P_{m}\left(x\right) = \sum\limits_{k=0}^{\lfloor\frac{m}{2}\rfloor}{{2m - 2k \choose m-k} \cdot {m - k \choose k} \cdot \frac{\left(-1\right)^{k}}{2^{m}} \cdot x^{m-2k}}
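That closed form can be sanity-checked against sympy's built-in `legendre` for small m (a quick check, not a derivation):

```python
import sympy as sp

x = sp.symbols('x')

def legendre_sum(m):
    # The closed form obtained from expanding the generating function:
    # P_m(x) = sum_{k=0}^{floor(m/2)} C(2m-2k, m-k) C(m-k, k) (-1)^k / 2^m x^{m-2k}
    return sum(sp.binomial(2*m - 2*k, m - k) * sp.binomial(m - k, k)
               * sp.Rational((-1)**k, 2**m) * x**(m - 2*k)
               for k in range(m // 2 + 1))

for m in range(8):
    assert sp.expand(legendre_sum(m) - sp.legendre(m, x)) == 0
```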
I must have missed something, because you kept saying that each polynomial needed to evaluate to 1 at x=1.
Isn't it rather that in an orthonormal basis, each vector (polynomial) dotted with itself has to = 1?
("Ortho" means mutually orthogonal; "normal" means normalized to unit length.)
Fred
No, normalization in the sense that at 1 we all get 1, you’re talking about a different normalization
@@drpeyam OK, so in some sense of the function (value) itself, not vector normalization.
I just thought that vector normalization would be wanted, because an orthonormal basis makes it so simple to find components for an arbitrary vector (function in this case). Also, this is what the Gram-Schmidt process is based on.
Just out of curiosity, what does having always f(1) = 1, do for us?
Fred
^ I’m also curious :O
@@ffggddss You're right. The basis is orthogonal and it's normalized in a certain way, but it's not orthonormal since ⟨P_n, P_n⟩ =/= 1
@@martinepstein9826 Right. At about 1m20s, he says it's "an orthonormal basis." That means that v[i]•v[j] = δ[ij]. That isn't the case here.
Fred
I feel oddly referenced at 0:40 ...
Great video btw, as always! ^^
Legendre polynomials and other orthogonal polynomials are used for numerical approximation and integration. They arise in particular from eigenproblems for PDEs.
Chebyshev polynomials rule for interpolation. 😁
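A classic illustration of "Chebyshev polynomials rule for interpolation": interpolating Runge's function 1/(1+25x²) at Chebyshev nodes versus equally spaced nodes (a numpy sketch; equispaced nodes blow up with the Runge phenomenon):

```python
import numpy as np

# Runge's function on [-1, 1]: the standard example of why Chebyshev
# nodes beat equally spaced nodes for polynomial interpolation.
f = lambda t: 1 / (1 + 25 * t**2)
deg = 14

# Chebyshev nodes (roots of T_{deg+1}) vs equally spaced nodes.
cheb = np.cos((2 * np.arange(deg + 1) + 1) * np.pi / (2 * (deg + 1)))
equi = np.linspace(-1, 1, deg + 1)

xs = np.linspace(-1, 1, 1001)
err = {}
for name, nodes in [('cheb', cheb), ('equi', equi)]:
    coef = np.polyfit(nodes, f(nodes), deg)   # interpolating polynomial
    err[name] = np.max(np.abs(np.polyval(coef, xs) - f(xs)))

# Equispaced interpolation oscillates wildly near the endpoints;
# Chebyshev nodes stay accurate.
assert err['cheb'] < err['equi']
```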
I love your video!
“Legendary” Polynomials!!!
How easy is it to tell a Fourier-Legendre expansion apart from a Fourier-Chebyshev or Fourier-Laguerre expansion of a function?
ok I tried it, they don't look very distinct.
Can you do a video for intuition or significance of Special Functions?
There is no intuition behind them
@@drpeyam Thank you for the reply; I've been having trouble understanding them! Can you suggest a clever strategy to study/get used to special functions?
Hello Dr. Peyam, this again was a very interesting video. The dot product of functions (or rather inner product) here exists because the integral always converges, since the functions are bounded and their support is finite. Do you also plan to make a video on the space of bounded functions with infinite support that possesses an inner product (L2 space on all R^n or C^n) and how it is expanded by distributions (a Gelfand triple) to create the expanded Hilbert space that forms the very heart of signal processing (in which the Fourier transform is still an automorphism, but cool functions like non-decaying complex exponentials exist whose integrals don't converge in standard L2)?
Furthermore, the orthogonal basis you create contains infinitely many elements. It would be very nice if you added a video about the difference between a Schauder basis and a Hamel basis (apologies, if you did already and I didn't find it)!
I don’t know about any of that stuff, unfortunately :(
I have heard about a similar procedure that could give you a polynomial that would fit the sin function better than the Taylor series, but I couldn't understand it fully back then and I can't find it anywhere. Do you know what that was, and can you make an episode about it?
I think those are the Legendre polynomials
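If it helps, here's a numerical sketch of the idea: the Taylor polynomial of sin is only accurate near 0, while the least-squares (Fourier-Legendre) fit of the same degree minimizes the mean-square error over all of [-1, 1]. This uses numpy's `numpy.polynomial.legendre` module:

```python
import numpy as np
from numpy.polynomial import legendre as L

# Least-squares (Fourier-Legendre) fit of sin on [-1, 1], degree 5.
# Coefficients: c_n = (2n+1)/2 * integral_{-1}^{1} sin(x) P_n(x) dx,
# computed here with Gauss-Legendre quadrature.
nodes, weights = L.leggauss(50)
deg = 5
coeffs = [(2*n + 1) / 2 * np.sum(weights * np.sin(nodes) * L.Legendre.basis(n)(nodes))
          for n in range(deg + 1)]
approx = L.Legendre(coeffs)

# Degree-5 Taylor polynomial of sin about 0.
taylor = lambda t: t - t**3/6 + t**5/120

xs = np.linspace(-1, 1, 1001)
err_leg = np.max(np.abs(approx(xs) - np.sin(xs)))
err_tay = np.max(np.abs(taylor(xs) - np.sin(xs)))

# Taylor wins right at 0, but the Legendre fit wins uniformly on [-1, 1].
assert err_leg < err_tay
```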
Hey Dr. Peyam, I've seen some weird closed form expressions for the n'th Legendre polynomial. It would be cool to see you derive one in a video!
You could record a video about it.
You can get them via orthogonalization with the inner product \int\limits_{-1}^{1}p(x)q(x)\,\mbox{d}x.
You can get them by solving a recurrence relation.
You can get them by solving an ordinary differential equation.
The generating function you can easily get from the recurrence relation.
The Rodrigues formula you can get from the ODE.
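The first route, the one in the video, is short enough to script with sympy. A sketch: Gram-Schmidt on {1, x, x², ...} with the L² inner product on [-1, 1], then rescale so that p_n(1) = 1 (the normalization used in the video):

```python
import sympy as sp

x = sp.symbols('x')

def ip(p, q):
    # L^2 inner product on [-1, 1], as in the video
    return sp.integrate(p * q, (x, -1, 1))

# Gram-Schmidt on the monomials 1, x, x^2, ...
ps = []
for n in range(5):
    p = x**n
    for q in ps:
        p = sp.expand(p - ip(p, q) / ip(q, q) * q)
    ps.append(p)

# Rescale so p_n(1) = 1 -- this recovers the Legendre polynomials.
ps = [sp.expand(p / p.subs(x, 1)) for p in ps]

for n, p in enumerate(ps):
    assert sp.expand(p - sp.legendre(n, x)) == 0
```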
Why is P0(x) = 1? Its integral from -1 to 1 is 2.
He's taking the norm to be its value at x=1, not the integral over [-1,1]. Since p_0(1)=1 it has norm 1. It's just a simpler choice of normalization.
@@gianmarcomolino1065 Okay, so here the norm is not the square root of the inner product of a "vector" with itself.
@@vangrails Right. You could make that choice (and it makes some things work better if you study Hilbert spaces) but the only difference would be to multiply each polynomial by a constant (e.g. p_0 should be multiplied by 1/sqrt(2)). I'm guessing Dr. Peyam preferred to keep that part of the video as simple as possible.
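For what it's worth, both normalizations are easy to see side by side in sympy: with the p_n(1) = 1 convention the basis is orthogonal but not orthonormal, since ⟨P_n, P_n⟩ = 2/(2n+1); dividing each P_n by sqrt(2/(2n+1)) would give the orthonormal version. A quick check:

```python
import sympy as sp

x = sp.symbols('x')

for m in range(5):
    for n in range(5):
        ipmn = sp.integrate(sp.legendre(m, x) * sp.legendre(n, x), (x, -1, 1))
        # Cross terms vanish (orthogonal); the norm squared is 2/(2n+1), not 1.
        expected = sp.Rational(2, 2*n + 1) if m == n else 0
        assert sp.simplify(ipmn - expected) == 0
```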
Does convolution have a relation with these polynomials?
Not sure 🙂
Very nice!
These are useful in function approximation..
Wow. This was a much more straightforward approach to Legendre polynomials. How does making an orthonormal set from the basis {1, x, x^2, ...} on the interval [-1,1] relate to the solution of the Legendre equation? They seem completely unrelated to me.
Sir, please explain the Hermite and Laguerre polynomials
What about Rodrigues' formula?
The what?
@@drpeyam It is a formula which allows us to get these polynomials by differentiating with respect to x, as opposed to the generating function, which needs to be differentiated with respect to the other variable, e.g. with respect to t.
This is what I have read so far:
Let the orthogonal polynomials satisfy the following ordinary differential equation,
Q(x)y'' + L(x)y' + λy = 0.
Let R(x) be the reciprocal of the Wronskian.
The weight function will be W(x) = R(x)/Q(x), and then
P_{n}(x) = 1/(e_{n} W(x)) · d^n/dx^n ( W(x) (Q(x))^n ),
where e_{n} is a factor depending on n, but it is not clear to me how to determine it.
In my opinion these are the formulas that mathadventuress had in mind.
Lol , " It kind of becomes a nightmare at this point so let's stop."
Or use a recursion formula! ;)
Lol I was just studying least squares approximation with orthonormal functions
I guess youtube recommendation system is just THAT good :p
Thanks a lot for the useful video, Dr. Peyam. Peace be upon you.
This was awesome! I knew about GS in linear algebra but never thought about it for functional spaces. Does this generalize to differential operators? I ask because they too are elements of a vector space. Thank you for enlightening me! I tried deriving this based on the behavior of polynomials I was seeking, but kept getting P1(x)=0 and thinking it didn’t make sense. You showed that everything is okay 🙏🏽😊
He is crazy, but I like him.
Thank you sir
Please center your camera a little more. It is difficult to see your writing. Otherwise, love your videos, thank you
Haha, I actually thought you were going to solve the Schrödinger equation :D
Analytically, for all V, haha
You are defying the law of gravity lol
Gracias!!
Strange... Thanks.
Yes, maybe it would be a good idea for him to record a series of videos on orthogonal polynomials
(calculating the general form, applications in numerical methods, applications in physics).
So far I have found only Indian crap
😂😂haha Indian crap
Fortunately there are formulas for the Legendre polynomials, like
Rodrigues' formula, Pₙ(x) = (1/(2ⁿn!)) (dⁿ/dxⁿ) (x² - 1)ⁿ, and
Bonnet's recursion formula, (n + 1)Pₙ₊₁(x) = (2n + 1)xPₙ(x) - nPₙ₋₁(x),
that make the computation of the n-th Legendre polynomial considerably easier.
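Both formulas are easy to verify symbolically with sympy (a quick check for small n, comparing against sympy's built-in `legendre`):

```python
import sympy as sp

x = sp.symbols('x')

def rodrigues(n):
    # Rodrigues' formula: P_n(x) = 1/(2^n n!) d^n/dx^n (x^2 - 1)^n
    return sp.expand(sp.diff((x**2 - 1)**n, x, n) / (2**n * sp.factorial(n)))

# Bonnet's recursion: (n+1) P_{n+1} = (2n+1) x P_n - n P_{n-1},
# starting from P_0 = 1, P_1 = x.
p_prev, p_curr = sp.Integer(1), x
for n in range(1, 6):
    p_next = sp.expand(((2*n + 1) * x * p_curr - n * p_prev) / (n + 1))
    assert sp.expand(p_next - rodrigues(n + 1)) == 0       # the two agree...
    assert sp.expand(p_next - sp.legendre(n + 1, x)) == 0  # ...and match sympy
    p_prev, p_curr = p_curr, p_next
```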