Thank you for the positive feedback. I will keep making more videos with first principles of Machine Learning and Deep Learning. Collecting more content. Stay tuned!
You decide to reduce the dimensionality of your data (N × p) using Best Subset Selection. The library you're using has a function regress(X, Y) that takes in X and Y and regresses Y on X. What is the expected number of times regress(·, ·) will be called during your dimensionality reduction?
O(2^N)
O(2^p)
O(Np)
O(p^2)
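The answer is O(2^p): Best Subset Selection fits a regression for every subset of the p predictors, and there are 2^p such subsets (2^p − 1 non-empty ones). A minimal sketch of the counting argument, where regress is a hypothetical stand-in for the library's function, implemented here with NumPy least squares:

```python
from itertools import combinations

import numpy as np

def regress(X, Y):
    # Hypothetical stand-in for the library's regress(X, Y):
    # ordinary least squares on the chosen columns.
    beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return beta

def best_subset_selection(X, Y):
    """Fit every non-empty subset of the p predictors; return the call count."""
    N, p = X.shape
    calls = 0
    for k in range(1, p + 1):              # subset sizes 1 .. p
        for cols in combinations(range(p), k):
            regress(X[:, cols], Y)
            calls += 1
    return calls                            # 2**p - 1 non-empty subsets

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 4))                # N = 50, p = 4
Y = rng.normal(size=50)
print(best_subset_selection(X, Y))          # prints 15 = 2**4 - 1
```

Note that the count depends only on p, not on N, which is why O(2^N) and O(Np) are distractors.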
Thank you for making these videos. You take the pain away from learning and understanding complicated materials.
Thank you for the positive feedback. I will keep making more videos with first principles of Machine Learning and Deep Learning. Collecting more content. Stay tuned!
You saved me hours of reading and lectures. Thanks!
Thanks dude, for making me understand.
You are most welcome. I am back to recording again so expect more videos :)
Saved the day
thank you for making
Just a silly question: is this method also called "all possible regressions"? Can you recommend a book where I can find an exercise on this?
The English is a little weak, but I understood. The presenter has a good command of the subject and the ability to make someone understand.
Awesome! Thanks for the feedback. I am back to recording so expect more videos :)
wtf
Tony Zhou what’s wrong Tony?
Nothing is wrong. Ur amazing
@AIPlayerrrr Haha thanks for the positive feedback. I will be recording some more videos including Deep Learning ones so stay tuned! :)