Looking forward to part 3 !
Great, easy-to-understand part 2!
Just what I needed for my master's degree proposal! Full of ideas thanks to you guys! :)
Happy to help! 🎓
Nicely organized and neat introduction. Thanks!
Really interesting, love the videos, the presentations look really good and the way you explain the material really helps with understanding
Extremely helpful, thanks so much!
Thanks for this video. Very informational and to the point.
Very informative video. Thanks
Your videos are such great resources, thank you! I am just wondering, how do we know the theoretical distribution that the coverage should follow? Is there any resource that explains this in more detail? Thanks!
One good resource is this paper: arxiv.org/abs/1209.2673
Informally, you can think of the conformal scores as being uniformly distributed on the coverage scale, and then apply the fact that order statistics of uniformly distributed random variables are beta distributed. en.wikipedia.org/wiki/Order_statistic .
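If you want to see it empirically, here is a quick simulation sketch (not from the video; uniform scores stand in for any continuous conformal score) showing that the coverage, conditional on the calibration set, matches the Beta distribution of the relevant order statistic:

```python
import numpy as np

# With n calibration points and level alpha, the coverage of split conformal
# (conditional on the calibration set) follows Beta(n + 1 - l, l),
# where l = floor((n + 1) * alpha).
n, alpha, trials = 1000, 0.1, 2000
l = int(np.floor((n + 1) * alpha))
k = int(np.ceil((n + 1) * (1 - alpha)))  # index of the conformal quantile

coverages = []
for _ in range(trials):
    # uniform scores stand in for any continuous conformal score
    cal_scores = np.random.uniform(size=n)
    test_scores = np.random.uniform(size=20000)
    qhat = np.sort(cal_scores)[k - 1]        # k-th smallest calibration score
    coverages.append(np.mean(test_scores <= qhat))

# empirical mean coverage vs. the Beta(n + 1 - l, l) mean, (n + 1 - l) / (n + 1)
print(np.mean(coverages), (n + 1 - l) / (n + 1))
```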
@anastasiosangelopoulos Thanks a lot!
Thanks for the video, it is very accessible despite the complexity of the topic.
That said, I have a question: what about applying conformal prediction to binary classification?
My doubt concerns the size of the prediction set: since in binary classification we only have two labels, I guess a prediction set larger than one label (i.e., containing both of them) would be useless, even though it achieves coverage.
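To make my question concrete, here is a toy sketch of the binary case as I picture it (all names and the score choice are my own assumptions, not from the video):

```python
import numpy as np

# Toy binary example: score = 1 - softmax probability of the true label.
rng = np.random.default_rng(0)
n, alpha = 500, 0.1
p1_cal = rng.uniform(size=n)                       # model's P(y = 1) on calibration data
y_cal = (rng.uniform(size=n) < p1_cal).astype(int)
probs_cal = np.column_stack([1 - p1_cal, p1_cal])  # softmax over the two labels
cal_scores = 1 - probs_cal[np.arange(n), y_cal]

k = int(np.ceil((n + 1) * (1 - alpha)))
qhat = np.sort(cal_scores)[min(k, n) - 1]

# For a test point the set is every label whose score is <= qhat,
# so it can only be {0}, {1}, or the uninformative {0, 1}.
p1_test = 0.55
pred_set = [label for label, p in enumerate([1 - p1_test, p1_test]) if 1 - p <= qhat]
print(pred_set)
```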
Great tutorials on Conformal Prediction, thanks a lot!
I have a question regarding the algorithm to evaluate the performance. There, in line 4, you compute 1-D_cal.max(axis=1). Shouldn't the score be defined as in your first tutorial, 1-D_cal[np.arange(len(y_cal)), y_cal], i.e., using the estimated probabilities of the true labels? Say we have a classifier that assigns the wrong class a probability of almost 1. Then the scores are close to zero, and hence so is our quantile. In line 6, the right-hand side would be close to one, so only false predictions would be added to the set, and 'covered' computed in line 7 would mostly be false. Or am I missing something here?
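For concreteness, this is the variant I have in mind (a rough sketch; any names other than D_cal and y_cal are mine, not necessarily what the notebook uses):

```python
import numpy as np

def empirical_coverage(D_cal, y_cal, D_val, y_val, alpha=0.1):
    """Rough sketch: D_* are (num_points, num_classes) softmax outputs,
    y_* are integer labels."""
    n = len(y_cal)
    # score = 1 - softmax probability of the TRUE label (as in tutorial 1),
    # rather than 1 - D_cal.max(axis=1)
    cal_scores = 1 - D_cal[np.arange(n), y_cal]
    # conformal quantile with the (n + 1) finite-sample correction
    k = int(np.ceil((n + 1) * (1 - alpha)))
    qhat = np.sort(cal_scores)[min(k, n) - 1]
    # include every label whose score 1 - p is below the quantile
    pred_sets = (1 - D_val) <= qhat
    covered = pred_sets[np.arange(len(y_val)), y_val]
    return covered.mean()
```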
up
What do you guys use to generate these slides? I really like the font and layout.
Goodnotes, and we write by hand!