Decision Analysis 5: Posterior (Revised) Probability Calculations

  • Published: 27 Oct 2024
  • Calculating Posterior Probabilities for Decision Trees
    Other videos:
    Decision Analysis 1: Maximax, Maximin, Minimax Regret
    • Decision Analysis 1: M...
    Decision Analysis 1.1 (Costs): Maximax, Maximin, Minimax Regret
    • Decision Analysis 1.1 ...
    Decision Analysis 2.1: Equally Likely (Laplace) and Realism (Hurwicz)
    • Decision Analysis 1b: ...
    Decision Analysis 2: EMV & EVPI - Expected Value & Perfect Information
    • Decision Analysis 2: E...
    Decision Analysis 3: Decision Trees 1
    • Decision Analysis 3: D...
    Decision Trees 2: EVSI - Expected Value of Sample Information
    • Decision Analysis 4 (T...
    Sensitivity Analysis in Decision Analysis: • Decision Analysis - Se...

Comments • 98

  • @barnabasprince3306
    @barnabasprince3306 7 years ago +10

    You have no idea how you just saved my life Joshua. Great help and simplified explanations. Thanks so much.

  • @mosessei428
    @mosessei428 9 years ago +9

    I just want to say thanks Joshua. Your material has been of great help. God Bless You

    • @joshemman
      @joshemman  9 years ago

      +Moses Sei
      You're welcome Moses.

  • @DanaLoveRobinson
    @DanaLoveRobinson 9 years ago +5

    Joshua Emmanuel - you are awesome!! Thank you so much for simplifying this concept. You have blessed me tremendously. I was going cross-eyed trying to understand this principle when it was rearranged mathematically in a way that was more difficult for me to conceptualize. Thank you!

  • @eshrn04102
    @eshrn04102 2 years ago +3

    Omg!! You just saved my whole semester with just a 3-minute video! Thanks a lot!

  • @mdsaimumhossain7878
    @mdsaimumhossain7878 7 months ago +1

    Best explanation of Bayesian Posteriors I've seen so far!

  • @sarun777
    @sarun777 5 years ago +6

    Finally a Bayes Classification explanation I can actually understand. Thank you !

  • @rowing-away
    @rowing-away 4 years ago +1

    Amazing video. I've watched at least 5 others & didn't understand, but the way you used tables & explained everything both with words & visuals made it so easy to follow. Thanks for the help!

  • @OhadS
    @OhadS 9 years ago +39

    Joshua, thanks man! So much better than the university's lectures!

  • @souravbiswas-tk3se
    @souravbiswas-tk3se 8 years ago +12

    It is described in a simple, short and nice way that makes the concept clear. The material and presentation are very good. Thank you for sharing.....

  • @joshuapretorious5332
    @joshuapretorious5332 2 years ago +2

    Wow, thank you so much. I am a 4th-year engineer and this is going to help so much with my test tomorrow on Decision Theory.

  • @AOK342
    @AOK342 5 years ago +2

    You really have a gift for teaching! Thank you!

  • @MarketingRsch
    @MarketingRsch 3 years ago

    This is one of the best explanations of this topic I've seen. Awesome!

  • @lopamudra22
    @lopamudra22 2 years ago

    Thank you so much, Joshua Sir. Your explanation of posterior probability saved me. I was able to resolve my issue.

    • @joshemman
      @joshemman  2 years ago

      Glad it helped, Lopamudra.

  • @mohammadrahman2356
    @mohammadrahman2356 2 years ago

    Awesome! A very simplified presentation of a complex concept.

  • @shubhradatta6763
    @shubhradatta6763 4 years ago

    Thanks so much, man. You teach in a way that's so easy to understand. Appreciate it.

  • @charlesopikkwot7247
    @charlesopikkwot7247 16 days ago

    Thank you for your online video, presented in a clear way.

  • @masudmortaaza2052
    @masudmortaaza2052 2 years ago

    Joshua, the man with the brain 💥💥💥
    Love you, brother ❣️❣️❣️❣️

  • @ammazainuddin1910
    @ammazainuddin1910 7 years ago

    Thanks for the instructions on how to make a decision tree. Honestly, this really helped me in the final stage of my final project.. thank you so much.. 😇😇

  • @arpithadivakar8273
    @arpithadivakar8273 1 year ago

    Absolutely love your explanation and the table format was super helpful
    Thank you so much!

  • @danielgalasso7061
    @danielgalasso7061 4 years ago

    Thank you so much for the video, Joshua! My stats teacher is a complete bozo, so thanks for saving my grade.

  • @iidtxbc
    @iidtxbc 5 years ago +1

    Thank you for the excellent, concise lecture. How should I conclude or interpret the analysis?

  • @selvinmacwan2689
    @selvinmacwan2689 3 years ago

    Hey Joshua, thank you so much for this video, mate! Thanks to this video I will be able to score high on my exam. More power to you!!!

  • @saipawankumar5297
    @saipawankumar5297 3 years ago

    You explained in 4 minutes what hours of lectures couldn't

  • @tnmyk_
    @tnmyk_ 2 years ago

    Amazing tutorial! Concise and precise! Thanks for making this video!

  • @andreinafara9310
    @andreinafara9310 8 years ago +1

    Thanks Joshua! You make this material easy to understand and really helpful.
    GBU

  • @jlsfjl
    @jlsfjl 8 months ago +1

    I subscribed immediately. That was clear!

  • @AILab0001
    @AILab0001 2 years ago

    OMG 😍😍🙏🏾🙏🏾 Thank you so much, Joshua. Please, can you show us how to draw the decision tree after determining the posterior probabilities?

    • @joshemman
      @joshemman  2 years ago +1

      See if this helps:
      ruclips.net/video/FUY07dvaUuE/видео.html

    • @AILab0001
      @AILab0001 2 years ago

      I watched this video but I am not able to apply it to my exercise. I have many questions. Can you help me?

  • @Texanator34
    @Texanator34 3 years ago +1

    Please do a Bayes Theorem Video! Love your content!

  • @VivekYadav-xh9ds
    @VivekYadav-xh9ds 4 years ago

    You explained it in a very simple manner.

  • @livyaamonicaa
    @livyaamonicaa 5 years ago

    Thank you so much!! It helps me a lot for tomorrow's exam!!

  • @aminedrift4339
    @aminedrift4339 3 years ago

    This was very well explained, thank you !

  • @arturolara8183
    @arturolara8183 8 years ago +1

    These are great videos. Thank you so much!!

  • @aadityakarthik6963
    @aadityakarthik6963 3 years ago

    This video was amazing!!! Thank you so much!

  • @christianadrcwxomonua540
    @christianadrcwxomonua540 1 year ago

    Hello Mr. Joshua. Can you kindly explain the value of information, that is, the market value, cost value and economic value? This is related to the new infonomics trend.

  • @pateldharmil2974
    @pateldharmil2974 11 months ago

    Great video, and helpful.

  • @AldosWorldTV
    @AldosWorldTV 8 years ago +5

    Lmao, I made the exact same mistake you mentioned by trying to do this before you explained it all, haha. Love this video.

  • @fk1384
    @fk1384 8 years ago

    Excellent material, big thanks.

  • @furkanbulut
    @furkanbulut 3 years ago

    very good presentation, thank you so much :)

  • @jessedonkers9584
    @jessedonkers9584 6 years ago

    Wow thanks for this clear video!!

  • @mebratedinku778
    @mebratedinku778 3 years ago

    Helpful one.

  • @ashutoshrogye7837
    @ashutoshrogye7837 7 years ago

    Thank you!! You really made it easy to understand :)

  • @danielgalloecheverri2770
    @danielgalloecheverri2770 6 years ago

    Thank you so much, I could understand the concept!!!

  • @toluadewara
    @toluadewara 1 year ago +1

    Simplified teaching.

  • @angienepton2257
    @angienepton2257 4 years ago

    Hello Joshua... Thanks a lot... It's helpful... But I want to inquire about how to make a decision given data to calculate the posterior probabilities... How do you draw a conclusion from the posterior probabilities for a certain problem at hand...

  • @spunky6059
    @spunky6059 3 years ago

    Thank you so much

  • @sreekanthhartex5541
    @sreekanthhartex5541 2 years ago

    Excellent. Can you please explain how the posterior is interpreted for a decision?

    • @joshemman
      @joshemman  2 years ago

      "Posteriori" refers to "later" or "after".
      In essence, the prior probabilities are revised after the additional information is obtained.
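
      For reference, the relationship behind the table calculation is the standard Bayes' theorem (State and Report here are generic placeholders, not names from the video):
      P(State | Report) = P(Report | State) × P(State) / P(Report),
      where P(Report) = Σ P(Report | State) × P(State), summed over all states.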

  • @Megan-vi5uu
    @Megan-vi5uu 5 years ago

    Final exams are two days away... thank you for saving my life.

  • @Rodamedima
    @Rodamedima 5 years ago

    Thank you, you're the best.

  • @nurullllllraihana13
    @nurullllllraihana13 2 years ago

    Joshua, what if the question is worded the other way around? For example, "when the consultant's report was positive, 65% of the time the economy grew". So P(Growth | Positive) = 0.65 and P(Decline | Positive) = 0.35. Am I right? Please answer my question because I get confused when they switch the placement of the word "given", which appears as "when" in this question.

    • @joshemman
      @joshemman  2 years ago

      Correct. You can replace "when" with "given".

  • @GUNKELFIVE
    @GUNKELFIVE 8 years ago +1

    Thank you!

  • @JLCalle
    @JLCalle 4 years ago

    Joshua, excellent videos. By any chance, do you have a video about decision making under risk and sensitivity analysis?

    • @joshemman
      @joshemman  4 years ago

      Sorry, I have nothing on Sensitivity in Decision Analysis.

    • @DavidsonLoops
      @DavidsonLoops 2 years ago

      Awesome question. Have you found any answers in the year since?

  • @danelleduardo1313
    @danelleduardo1313 1 year ago

    If we never had a consultant, we have no historical data about their performance. How should we assign the probabilities of a negative or positive report?

    • @joshemman
      @joshemman  1 year ago +1

      Unfortunately, we cannot assign those probabilities unless they are provided. We will just stick with the prior probabilities in making the decision.

  • @HGooner94
    @HGooner94 6 years ago

    Thanks again! you saved me

  • @villanuevaliannepaze.9413
    @villanuevaliannepaze.9413 5 years ago

    thank you so much for this!!

  • @michaeldonlin1950
    @michaeldonlin1950 4 years ago

    Love the stuff. Super helpful for reviewing for a big stats exam comin up!

  • @jagrutimahajan3265
    @jagrutimahajan3265 6 years ago

    Thank you so much!!

  • @KundanKumar-ul1sw
    @KundanKumar-ul1sw 7 years ago +1

    thanks a lot ....

  • @zoozo2007
    @zoozo2007 4 years ago

    I have a couple of examples about decision making with probability. Could you please solve them for me? Thank you.

  • @rhondaross8475
    @rhondaross8475 7 years ago

    Thank you!!

  • @khanyisanimlaba3026
    @khanyisanimlaba3026 9 years ago

    You and a friend are playing toss-the-coin, and the coin is tossed twice. Each of you gets a turn to call heads or tails to indicate the way the coin will land after it has been tossed. You are required to:
    a. Compute the probability that a head would result on the first toss.
    b. Compute the probability that a tail would result on the second toss given that the first toss resulted in a head.
    c. Compute the probability that two tails would result.
    d. Compute the probability that a tail would result on the first toss and a head on the second toss.
    e. Compute the probability that a tail would result on the first toss and a head on the second toss, or a head on the first and a tail on the second toss. Elaborate your answer.
    f. Compute the probability that at least one head would occur on the two tosses. Elaborate your answer.

    • @joshemman
      @joshemman  9 years ago

      Khanyisani Mlaba
      Check out the solutions here: ruclips.net/video/6HppFWelx64/видео.html
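
      For quick reference, a sketch of the answers assuming a fair coin and independent tosses (the linked video has the full working):
      a) P(head on 1st) = 1/2
      b) P(tail on 2nd | head on 1st) = 1/2, since the tosses are independent
      c) P(two tails) = 1/2 × 1/2 = 1/4
      d) P(tail on 1st and head on 2nd) = 1/2 × 1/2 = 1/4
      e) P(tail-then-head or head-then-tail) = 1/4 + 1/4 = 1/2, since the two outcomes are mutually exclusive
      f) P(at least one head) = 1 − P(no heads) = 1 − 1/4 = 3/4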

  • @elly5630
    @elly5630 2 years ago

    Hello @Joshua, Do you do coaching?

    • @joshemman
      @joshemman  2 years ago +1

      Sure. Please see the About section of this channel for contact info.

  • @banterification
    @banterification 8 years ago +1

    thanks man

  • @Ayyash1908
    @Ayyash1908 1 year ago

    Should the conditional probabilities add up to 1, or can they be more or less?

  • @daisyflorence289
    @daisyflorence289 3 years ago

    Why is this example so easy compared to my professor's exam questions? His questions cannot be solved at all.

  • @emmanuelmark8071
    @emmanuelmark8071 7 years ago

    Thanks so much, sir.
    I really appreciate these videos.

  • @mrn.i3753
    @mrn.i3753 2 years ago

    Is this the same as calculating Bayes' probability, or am I missing something?

    • @joshemman
      @joshemman  2 years ago

      Yes, it is the same...just in table format.

  • @memo1413a
    @memo1413a 8 years ago +1

    Is it necessary to get 1, or is it okay to get 0.99?

    • @joshemman
      @joshemman  8 years ago +2

      The posterior probabilities normally should add up to 1. If they don't, then check your rounding or increase the number of decimal places you round to. I will suggest rounding intermediate results to 4 decimal places.
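
      A minimal Python sketch of the table method (the priors and conditionals below are illustrative numbers, not from the video), rounding intermediate results to 4 decimal places as suggested above:

      # Posterior (revised) probabilities via the table method:
      # joint = prior * conditional, marginal = sum of joints, posterior = joint / marginal.
      priors = [0.40, 0.60]            # illustrative P(State)
      conditionals = [0.80, 0.30]      # illustrative P(Report | State)

      joints = [round(p * c, 4) for p, c in zip(priors, conditionals)]
      marginal = round(sum(joints), 4)                       # P(Report)
      posteriors = [round(j / marginal, 4) for j in joints]  # P(State | Report)

      print(joints, marginal, posteriors)  # the posteriors should sum to (approximately) 1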

  • @Priyanka.___.pradhan.
    @Priyanka.___.pradhan. 2 years ago

    Prior probability     Conditional probability
    P(E1) = 0.10          0.4
    P(E2) = 0.70          0.7
    P(E3) = 0.20          0.5
    I can't work out how this should be calculated.

    • @joshemman
      @joshemman  2 years ago

      How many conditions do you have?
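
      In case it helps: if those three conditional values are all for one and the same report, i.e. P(report | E1), P(report | E2), P(report | E3) — an assumption, since the comment doesn't say — the table method would give:
      Joint:     0.10 × 0.4 = 0.04,  0.70 × 0.7 = 0.49,  0.20 × 0.5 = 0.10
      Marginal:  0.04 + 0.49 + 0.10 = 0.63
      Posterior: 0.04/0.63 ≈ 0.0635,  0.49/0.63 ≈ 0.7778,  0.10/0.63 ≈ 0.1587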

  • @TheKennethfilm
    @TheKennethfilm 6 years ago

    Thanks, man.

  • @sineadcapapas
    @sineadcapapas 5 years ago

    Can the marginal probabilities not add up to 1?

    • @joshemman
      @joshemman  5 years ago

      The sum of the marginal probabilities must equal 1.

  • @user-vd8ot8ez4h
    @user-vd8ot8ez4h 3 years ago

    I lovvvvve youu

  • @gasher2000
    @gasher2000 5 years ago

    Why don't the conditional probabilities add up to 1?

    • @joshemman
      @joshemman  5 years ago

      Conditional probabilities don't have to add up to 1 because they condition on different events; they are not complementary.
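
      A hypothetical illustration (made-up numbers, not from the video): with P(Positive | Growth) = 0.8 and P(Positive | Decline) = 0.3, the two values condition on different states, so 0.8 + 0.3 = 1.1 is perfectly fine; what must sum to 1 is the pair for a single given state, e.g. P(Positive | Growth) + P(Negative | Growth) = 0.8 + 0.2 = 1.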

  • @bejjeambaw322
    @bejjeambaw322 7 years ago

    How can I get it?

  • @Hani-qe3vq
    @Hani-qe3vq 4 years ago

    What is your email?

  • @SurvivingPerspective
    @SurvivingPerspective 6 years ago

    thank you!