Hey I had a question about the log-likelihood. What if the derivative of log-likelihood wrt theta gives me a constant? How do I interpret that in terms of the MLE of theta?
Interesting question! It depends on what the constant is. If the derivative is identically zero, the log-likelihood is flat in theta: every real number is an MLE, since all values of theta are equally 'good' (or bad) at optimizing the log-likelihood. The broader intuition in that case is that the PDF of the random variable simply doesn't change with the parameter theta, so it makes sense that we can't build an estimate for theta from the observed data. If instead the derivative is a nonzero constant, the log-likelihood is linear in theta, so it has no interior maximum; the likelihood just keeps increasing in one direction, and the MLE sits on the boundary of the parameter space (or doesn't exist if the space is unbounded). Does that help?
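To make the zero-derivative case concrete, here is a small sketch (the data and the Exp(1) model are my own illustrative assumptions, not anything from the question): a model whose density never mentions theta has a log-likelihood that is flat in theta, so its derivative is identically zero and no value of theta is preferred.

```python
import math

# Hypothetical observed data (assumption: any positive sample works here)
data = [0.5, 1.2, 0.3, 2.1]

def log_likelihood(theta, xs):
    # Model whose PDF ignores theta entirely: Exp(1), f(x) = exp(-x) for x > 0.
    # log f(x) = -x, so the sum below is the log-likelihood; theta never enters.
    return sum(-x for x in xs)

# The log-likelihood is the same at wildly different theta values,
# i.e. d/d(theta) log L = 0 everywhere and every theta is "an" MLE.
values = [log_likelihood(t, data) for t in (-10.0, 0.0, 3.7, 100.0)]
print(values)  # every entry is identical: the data carry no information about theta
```

Plotting `log_likelihood` against theta would just give a horizontal line, which is the visual version of "the derivative is the constant zero."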
your videos are about to save my semester :)
Good luck! Let me know if you have any questions.
how can i get ahold of you outside youtube?
you the goat breh
Glad you're enjoying these!!
@chessability yes it does, thank you. It was definitely difficult to interpret.