For people with ANY background on independent events and PMF distributions, this is a great refresher/review. Well done and thanks so much for posting.
For anyone considering following this tutorial, I'd suggest watching the 2014 version, which covers the same material but in a much clearer and more structured fashion (I've watched both the 2012 and 2014).
Allen Downey: Bayesian statistics made simple - PyCon 2014
Unlike 3/4 of my lecturers, you got me interested in your subject and in what you are trying to teach. Keep it up.
Thank you! Tough subject... practice makes perfect. I liked the fact that the time to (try to) solve the exercises was left in the video!
This professor is cool! Nice lecture!
Sir, you delivered it very nicely, starting from the very basics. Thanks!!
This teacher is pretty awesome.
Great explanation. It helped me understand Bayes. Thanks for sharing.
Site for the slides etc.: sites.google.com/site/simplebayes/home/pycon-2012/prepara
Enjoyed lecture; excellent examples and sequencing of materials.
Great explanation. Thanks for sharing.
Awesome stuff! Really good teaching!
Too difficult to read the screens, which are not in focus.
Would it be possible to edit them directly into this video?
Very good video! Kudos to Allen!
I was the 1K upvote!
Wow. It seems that you can understand the idea behind 'maximum likelihood estimation' (google it). The point of maximizing P(x1 | θ)P(x2 | θ)...P(xn | θ) = the joint distribution P(x1, x2, ..., xn | θ) is that the model parameter θ of a theory is our hypothesis H, and we must give the best estimate of it after being given n observations x1, x2, ..., xn (in this lecture, the teacher invited us to work through the repeated observation of the coin flip). Given the observations x, for every candidate parameter θ we compute the joint probability and find the largest one. The parameter that gives the largest product-probability is the best estimate of θ!
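To make that concrete, here is a minimal Python sketch of the grid-search idea (my own illustration, not code from the lecture; the flip data are made up):

# Hypothetical example: estimate the heads probability theta
# by maximizing the joint likelihood of an observed flip sequence.
flips = [1, 0, 1, 1, 0, 1, 1, 1]  # made-up data: 1 = heads, 0 = tails

def likelihood(theta, data):
    # Joint probability P(x1, ..., xn | theta) for independent flips.
    p = 1.0
    for x in data:
        p *= theta if x == 1 else (1 - theta)
    return p

# Evaluate the likelihood on a grid of candidate parameters
# and keep the one with the largest product-probability.
grid = [i / 100 for i in range(101)]
best_theta = max(grid, key=lambda t: likelihood(t, flips))
print(best_theta)  # 0.75, the observed fraction of heads (6/8)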
nice and cool lecture
Really liked the simplicity of the instruction. The questions were not audible; it would have been even better if you had repeated them.
Well, you could have just approached him after the lecture and asked.
According to your formula for the confidence remaining after flipping the coin, why does your confidence always go to 0 regardless of the sequence you obtained?
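If the "confidence" there means the probability of the exact sequence you observed, a tiny sketch (my own illustration, not the lecturer's formula) shows why it always shrinks: each flip multiplies in a factor less than 1, so for a fair coin any specific sequence of n flips has probability 0.5 ** n.

for n in [1, 10, 50]:
    print(n, 0.5 ** n)
# 1 0.5
# 10 0.0009765625
# 50 8.881784197001252e-16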
Excellent
Thanks.
Here's an updated version from the 2016 lecture:
ruclips.net/video/TpgiFIGXcT4/видео.html
You can also find the source code at his repo on github:
github.com/AllenDowney/BayesMadeSimple/blob/master/boston16.ipynb
github.com/AllenDowney/BayesMadeSimple
awesome...
But if we know he's married, the probability that he is married is 1, right?
I thought the same.
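For what it's worth, the definition of conditional probability confirms this intuition: P(married | married) = P(married and married) / P(married) = P(married) / P(married) = 1.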
Where is the URL of the website?
Seems like some interesting stuff. Unfortunately, the screens are mostly out of focus, and the questions are not at all audible. This is a very, very raw presentation that could do with a lot of cleaning up.
the solution to the Elvis problem is at 2:27:45
For anyone just starting this circa 2014: Python has released newer versions since, and the thinkbayes.py script doesn't work in Python 3. I'm having much better luck using Python 2. It's also easier (i.e., possible) to import numpy and scipy in Python 2.
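If you'd rather stay on Python 3, the standard 2to3 tool converts most scripts of this vintage automatically (the filename below assumes you downloaded thinkbayes.py from the book's site):

2to3 -w thinkbayes.py   # rewrites the file in place, keeping a .bak backup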
Actually, I thought he was a bit condescending towards the students. The video also had several interruptions where the coordinator (or whoever was running the seminar) kept walking up to the front and nagging the instructor.
"run this experiment in multiple universes" ....
stats are iffy
This system is great for gambling
First they hide the room, then they close the door, then they make it socially uncomfortable, and then the title says "as easy as possible". Cool.
He looks like Steve Jobs in his final days. Lean.
What a cutie patootie.
The teacher is very good at explaining Bayesian statistics, but his Python implementations are not very 'pythonic'. He should have implemented the dunder methods, like __mul__ instead of Mult for multiplying, and his CamelCase method and function names confuse me :D
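For instance, a dict-backed PMF (my own sketch, not the book's actual thinkbayes.Pmf API) could overload the operator like this:

class Pmf(dict):
    # Minimal sketch of a PMF supporting elementwise '*' via __mul__.

    def __mul__(self, likelihoods):
        # Scale each hypothesis's probability by its likelihood,
        # returning a new, unnormalized Pmf.
        return Pmf({h: p * likelihoods.get(h, 0) for h, p in self.items()})

    def normalize(self):
        total = sum(self.values())
        for h in self:
            self[h] /= total
        return self

# A Bayesian update then reads naturally:
prior = Pmf({'fair': 0.5, 'biased': 0.5})
heads_likelihood = {'fair': 0.5, 'biased': 0.9}  # P(heads | hypothesis)
posterior = (prior * heads_likelihood).normalize()
print(posterior)  # {'fair': 0.357..., 'biased': 0.642...}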
After taking a course on stochastic processes this term, I think it is pretty much impossible to understand what he is talking about. Saying something is not the same as explaining it; there is just way too much to cover to really understand Bayes, hypotheses, likelihood, and so on.
bae?