Not only is this video an excellent, digestible exploration of the Laplace operator, it is a celebration of it
I am continually amazed by the quality and value of your videos as well as your papers and notes. They are not only very helpful but also enjoyable to read and watch! Thanks!
This is one of the best videos on the Laplacian on the entire internet. If you made a Laplacian/Fourier component, that would complete the complexity cycle personally!!
Incredible content, it's rare to see such clear maths lectures online! I thought I had a good grasp of the Laplacian before, but this video blew my mind anyway! Thanks a lot and keep up the good work! :D
Excellent content. The intuition part is what makes it stand out. Thanks a bunch!
this may be the best lecture series I’ve ever seen❤
your videos are hands down some of the best mathematics lectures I have ever seen
Wonderful lectures. Thank you for posting these lectures on the web.
Thank you so much for uploading this brilliant lecture series! Please post Lecture 19! It is missing from this perfect set
I second this :D
You'll find a link to a conference lecture on the course page. It's not missing, just not part of the playlist.
ruclips.net/video/4YHmaoQoT9s/видео.html
Same!
Was the discretized Laplacian operator lecture omitted? I know the first thirteen or so lectures were not recorded, but is lecture 19 MIA?
Yes, I was looking forward to this one too, but couldn't find it. It would be great if it's uploaded.
According to the 2020 course site, the L19 content is covered in a video by Prof. Crane titled "Conformal geometry processing" posted under the account name "CG Group Telecom ParisTech".
I won't risk posting a direct link, because RUclips algorithms have recently been obsessed with removing comments containing any link.
Searching with either group of keywords might be insufficient, but including both works for me.
Thank you very much for posting this video, I got chills several times watching it.
Great content... jogged some nice memories back to previous PDE upper division coursework
Awesome lecture series! Could you please upload the lectures 1-13?!
Wonderful! I have never seen such clear visual explanations in my life! Lots of thanks for all!
Please make some videos about random motion and diffusion...
Thank you for these amazing lectures, Dr. Crane! I was wondering: at the end of lecture 18 you mention the discrete Laplace operator, but I was not able to find Lecture 19 for it. The link for Lecture 19 on your website points to a lecture on conformal mapping.
Thank you very much! Could you please upload lecture 19?
Very informative :) A simple practical application of the Graph Laplacian is to regularize mesh deformation, something I talk about in my own DGP playlist.
Great animation around minute 12 about the deviation from the average. One note: as epsilon -> 0 the deviation itself will become smaller and smaller. You have to divide by epsilon^2 to make it converge to something in that limit. This is made very clear on the next slide, the one with the average over a ball.
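The epsilon^2 scaling mentioned in the comment above is easy to check numerically. A minimal sketch (the test function u(x, y) = x^2 + y^2 and the sample point are made up for illustration; in 2D, the circle-average deviation divided by epsilon^2 tends to Laplacian(u)/4):

```python
import math

# u(x, y) = x^2 + y^2 has constant Laplacian: Delta u = 4.
def u(x, y):
    return x * x + y * y

def circle_average(f, cx, cy, eps, n=10000):
    """Average of f over a circle of radius eps centered at (cx, cy)."""
    total = 0.0
    for k in range(n):
        t = 2.0 * math.pi * k / n
        total += f(cx + eps * math.cos(t), cy + eps * math.sin(t))
    return total / n

cx, cy = 0.3, -0.2
for eps in [0.1, 0.01, 0.001]:
    deviation = circle_average(u, cx, cy, eps) - u(cx, cy)
    # The deviation alone vanishes as eps -> 0; dividing by eps^2
    # recovers a multiple of the Laplacian (here Delta u / 4 = 1).
    print(eps, deviation, deviation / eps**2)
```

For this particular u the deviation is exactly eps^2, so the rescaled quantity is constant; for a general smooth function it only converges in the limit.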
Amazing explanations. I really appreciate this so much. One thing I was wondering about is how you made all these visuals. For instance at 55:10 did you use Matlab to compute this, or do you use multiple different software? How do you make it look so nice?
Thanks a ton for the amazing series... Can you please post Lecture no. 19? It's missing...
Love these lectures and wish I had found them years ago! One minor "aargh!" What happened to Lecture 19? I came here for discrete Laplacian but it's the only missing lecture. So sad!
What do you use to make your 3D images? They are beautiful
There's a little bit of information about my process here: www.cs.cmu.edu/~kmcrane/faq.html#figures
Awesome lecture. Is the discrete Laplacian lecture unfortunately lost? The notes and suggested readings are great, but I would have liked to cement my understanding.
49:04 There is a minus sign missing for E_D(u) at the bottom left.
where is lecture 19? I miss that.
47:00. That u has large Dirichlet energy.... "u has big Dirichlet energy." lol nice.
insert the "u good?... no" cat meme... Great work, thank you for uploading your lectures. I always follow your "Discrete Differential Geometry" course.
Thanks so much, very useful series!
1:07:36 Shouldn't there be a dL along the boundary in the last equation?
Very informative! Thank you!
So freakin' good.
19:00 inverse metric tensor has upper indices, not lower
Here the inverse is already written explicitly, so upper indices would imply yet another inverse. In other words, g^{ij} = (g^{-1})_{ij}, and likewise, (g^{-1})^{ij} would be equal to ((g^{-1})^{-1})_{ij} = g_{ij}. Of course, the reason for explicitly writing g^{-1} here is that the upper/lower index convention can get confusing if you're not used to it! (You notice I also don't use the Einstein summation convention here. ;-))
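The algebraic point in the reply above can be made concrete with a toy 2x2 matrix (the metric values below are made up for illustration): the components g^{ij} are just the entries of the matrix g^{-1}, so inverting that matrix again returns the original g_{ij}.

```python
# Toy illustration: g^{ij} = (g^{-1})_{ij}, so ((g^{-1})^{-1})_{ij} = g_{ij}.

def inv2(m):
    """Inverse of a 2x2 matrix given as [[a, b], [c, d]]."""
    (a, b), (c, d) = m
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

g = [[2.0, 1.0], [1.0, 3.0]]   # a (hypothetical) metric g_{ij}
g_inv = inv2(g)                # entries are the components g^{ij}
g_back = inv2(g_inv)           # inverting again recovers g_{ij}

for i in range(2):
    for j in range(2):
        assert abs(g_back[i][j] - g[i][j]) < 1e-12
```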
really awesome
Where are all the video lectures? I can't find the first one.
We switched to remote teaching mid-semester, so I've only recorded some of the lectures. I may try to record the first set after the semester ends.
@@keenancrane Ah ok, are the presentations available already? Are you sharing them?
@@lucagagliano5118 Yep! You can find slides at the link in the video description. There are also some written course notes that go along with the slides. So the only thing that's missing is vocal narration.
@@keenancrane That's great!!!
@@keenancrane But your narration is still critical. Your explanations are really lucid and simple. I sincerely hope you can record and add Lectures 1-13 sometime soon after the semester. There are very few courses on this topic. Thanks a ton for making these available though!!