Thanks, this is a very helpful video. One question: in the video you mentioned that since a probability is between 0 and 1 and probabilities sum to 1, you need to raise e to the power of each score and divide by the sum of those exponentials to obtain a probability. Is there a reason that you chose e as the base of the exponent? Why not choose another number? My confusion is that if I chose a number like 10 as the base, I'm pretty sure my softmax model would classify everything the same as if I had chosen e, but that the calculated probabilities would be different. I'm wondering whether softmax is actually returning the real probability, or just a number between 0 and 1 that behaves like the real probability. Thanks!
This is a really good question that I hadn't thought about before. First, using base 10 will probably work fine, for all the reasons you say. If you were training a neural network, you could probably use any base and the network would just adjust the logits accordingly. There are also some practical reasons to use e: forums.fast.ai/t/why-does-softmax-use-e/78118 And finally I want to ask, tongue-in-cheek: what do you mean by "real probability"? :) No one knows the real probability except the Creator, and all we're doing is trying to model it ;)
@@kamperh Yeah maybe the "real probability" can only be 0 or 1, as the data point either does belong to the class or does not. But we don't know which class it belongs to, so SoftMax gives us a probability that is different from the so-called "real probability" but that helps us make a guess. Thank you for your help!
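To illustrate the point discussed in this thread, here is a minimal sketch (the `softmax` helper and the example scores are made up for illustration): changing the base rescales the logits by a constant factor, so the ranking of the classes stays the same, but the individual probabilities change.

```python
import math

def softmax(scores, base=math.e):
    """Softmax with an arbitrary base: base**s_i / sum_j base**s_j."""
    exps = [base ** s for s in scores]
    total = sum(exps)
    return [x / total for x in exps]

scores = [2.0, 1.0, 0.5]

probs_e = softmax(scores)            # base e
probs_10 = softmax(scores, base=10)  # base 10

# Both distributions sum to 1 and pick the same most-likely class,
# but the probabilities themselves differ: base 10 is equivalent to
# base-e softmax with every logit multiplied by ln(10) ≈ 2.3,
# which sharpens the distribution.
```

Because a trained model is free to scale its logits, training with base 10 would simply learn logits smaller by that factor, which is why the classifier's decisions come out the same either way.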
The whole logistic regression playlist is so good. Great job!
Very happy it helps!
Fantastic explanation, thanks a lot Herman!
Your videos are quite easy to comprehend. Good job.
Thank you for explaining the whole concept in a much simpler way.
Very clear and concrete explanation! Thank you so much.
Very big pleasure! :D
Thank you very much, it helped me a lot for my university classes. Very clear explanation. :)
Very happy it helped!! :D
This is a remarkable explanation, simple and complete, thank you very much !
Love it! So well explained, and love the mathematical proof.
Thank you for the explanation.
Better than the professor at University ! ...
What an awesome video, thank you! I made sure to watch lots of ads for you, even though it probably doesn't make a difference :D
Thanks for the encouragement!! :)
Very well explained!
Great work man!!
Very good Explanation :)
Great job
Nicely explained, Sherlock Holmes.
I'm a big fan of the Sherlock books and series, so I'll take it as a compliment :P
Prof. Herman, I have an issue with this. Can I send it to your email, please?