Amazing explanation, still providing value after all these years
This is one of the most valuable channels on YouTube for me. Thank you for your work and your contribution.
Thanks
Same here. It's literally saving my career. Thank you :)
Excellent lecture! Thank you and team very much for sharing your wisdom and your hard work!
Crystal clear lecture, you saved my day! Many thanks!
The best videos on mobile robotics: concise and crisp
Thank you for the premium content.
Thank you so much, sir. It's very hard to find such a detailed explanation of this topic on the internet. It really helped a lot.
Just discovered your channel while looking for explanations related to localisation and mapping for my Robotics BEng and could not be more appreciative of the videos! great content!
Thank you!
This is the best explanation of bayes filter. Thank you!
I think it's not possible to find a better explanation, thank you!
Thanks
Nicely explained. Thank you, Cyrill Stachniss.
Very good explanation! Man, I wish every professor were like this :)
So great a video! Thank you so much, Prof. Cyrill.
Great presentation ❤️👌
Hi Cyrill, which previous lecture are you referring to here? 7:28 "And one way to simplify it is to apply Bayes' rule. Remember, the thing that we did just a few minutes ago in the previous lecture on probability theory?" What is the title, or how can I find that one? I would like to watch it before this one.
ruclips.net/video/JS5ndD8ans4/видео.html
@@CyrillStachniss, thank you - that was indeed very helpful, especially the derivation of Bayes' rule and the note about background knowledge (additional givens), which had been the part really confusing me previously. I am still a little unclear on why it is permissible to reduce the "evidence" denominator to a "normalization constant", but I think I can find out more about that by searching around.
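For anyone with the same question about the "evidence" denominator: p(z) does not depend on the state x, so once you have computed likelihood × prior for every candidate state, dividing by p(z) is the same as rescaling the values so they sum to one. A minimal sketch of this in Python (the prior and likelihood numbers here are made up purely for illustration, not from the lecture):

```python
import numpy as np

# Made-up discrete prior belief over 3 candidate robot positions
prior = np.array([0.5, 0.3, 0.2])
# Made-up observation likelihoods p(z | x) for the same 3 positions
likelihood = np.array([0.1, 0.7, 0.3])

# Bayes' rule numerator: posterior is proportional to likelihood * prior
unnormalized = likelihood * prior

# The evidence p(z) = sum over all states of the numerator.
# It is the same number for every state, so it acts purely as a
# normalization constant (the "eta" in the lecture slides).
eta = 1.0 / unnormalized.sum()
posterior = eta * unnormalized

print(posterior)  # sums to 1
```

Because eta is identical for all states, it never changes which state is most likely; it only makes the belief a proper probability distribution again.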
Will it be possible to have access to the pdf files of the slides?
@4:33 Hmm why does the distribution move when the robot moves 1 m?
Because your belief about where you are has moved, and the motion has also introduced noise, so the distribution has a less sharp peak.
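A quick way to see both effects (the belief shifting and flattening) is the prediction step on a 1-D grid. This sketch uses made-up numbers and a hypothetical noisy "move one cell" motion model, not the exact model from the lecture:

```python
import numpy as np

# Made-up belief over 6 grid cells: robot probably at cell 2
belief = np.array([0.0, 0.1, 0.8, 0.1, 0.0, 0.0])

# Hypothetical motion model for a commanded 1-cell move:
# 10% stay put, 80% move 1 cell, 10% move 2 cells
motion = {0: 0.1, 1: 0.8, 2: 0.1}

# Prediction step (law of total probability):
# new belief at a cell = sum over start cells of
#   p(end | start, command) * old belief(start)
predicted = np.zeros_like(belief)
for shift, prob in motion.items():
    predicted[shift:] += prob * belief[: len(belief) - shift]

print(predicted)
```

After prediction, the peak has moved one cell to the right, and its height has dropped below the original 0.8 because the motion noise spreads the probability mass out.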
You are a gem.
Excellent 👌
Very good content
Great video
Is there any relationship between the Viterbi algorithm used in HMMs and this filter?
excellent
Can anyone help me out? While watching this video I couldn't understand why we are learning this concept, and the notations (e.g. z, x, u) were hard to figure out until the professor defined them verbally.
thank you professor
Cool! I came here after attending Burgard's course. Also nice.
Thank you so much
You're welcome!