- 529 videos
- 60,512 views
Yang Song
United States
Joined Nov 18, 2012
Videos
24-1 Review1-1 Final exam, adjacency list review
119 views · 2 years ago
20-1 Review of Graph Representation BFS
42 views · 2 years ago
thanks
wish you were my professor
Awesome, thank you so much!
YANG SONG MY GOAT BEST PROFESSOR🔥🔥🔥🙏🙏
do you choose a new function each time you insert or only once when you initialize the hashtable? and if its each time you insert, how do you search?
it's a single hash function per hashtable. very important note.
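To make the answer above concrete, here is a minimal sketch (hypothetical names, not the lecture's code) of a chained hash table using universal hashing: the parameters of the hash function are drawn once in `__init__` and reused by both insert and search, which is exactly why later lookups still find earlier inserts.

```python
import random

class HashTable:
    """Chained hash table with integer keys. The hash parameters are
    drawn ONCE at construction (universal hashing) and then reused by
    every insert and search."""

    def __init__(self, m=8, p=2_147_483_647):
        self.m, self.p = m, p
        self.a = random.randrange(1, p)   # fixed for the table's lifetime
        self.b = random.randrange(0, p)
        self.slots = [[] for _ in range(m)]

    def _h(self, key):
        # h(k) = ((a*k + b) mod p) mod m -- same function for all ops
        return ((self.a * key + self.b) % self.p) % self.m

    def insert(self, key, value):
        self.slots[self._h(key)].append((key, value))

    def search(self, key):
        for k, v in self.slots[self._h(key)]:
            if k == key:
                return v
        return None
```

If a fresh function were drawn on every insert, `search` could not recompute the slot an earlier insert used, so lookups would break.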
Very Good Channel. Thank you from India. Keep uploading stay healthy.
thank you sir
Fantastic explanation, it was really illuminating! I just wish you had spent more time explaining the multiplication method. Thank you very much, nevertheless!
It is really fruitful to learn from you
This was asked yesterday in GATE 2023 CSE.
Thanks a lot!
could u please share the jupyter notebook
Thanks a lot <3
Hi, this is wonderful lecture. I just have one doubt. At timestamp 8:15 , why did you looped K times over the training set to find the nearest neighbors(making it O(m*k)? If we take the distance from training samples into a list and then sort the list in O(m*log(m)) time, then take top k values, overall time in my understanding would be [O(m*d) + O(m log(m)) + O(k)] for one sample, will it be worse than what you suggested, ie O(m*d + m*k) ? If someone asks me the time complexity, can I suggest my approach to them also?
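For what it's worth, the two selection strategies discussed in the comment above can be sketched as follows (hypothetical helper names; the per-sample distances are assumed already computed). A full sort costs O(m log m), while a bounded-heap selection costs O(m log k), and both return the same k nearest indices.

```python
import heapq

def k_nearest_by_sort(dists, k):
    # full sort of all m distances: O(m log m)
    return sorted(range(len(dists)), key=dists.__getitem__)[:k]

def k_nearest_by_heap(dists, k):
    # bounded-heap selection of the k smallest: O(m log k)
    return heapq.nsmallest(k, range(len(dists)), key=dists.__getitem__)
```

Which asymptotic bound wins depends on k: for small constant k the O(m·k) repeated scan and the heap are both effectively linear in m, while the full sort pays log m regardless.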
nice video
Great explanation, make a video on fusion trees
Can we use this if the target variable y has more than just two possible values?
Great video yang, keep it up!
RB Trees for Insert and Delete have an additional option - they can flip colours of nodes without changing the tree structure. When this is done, the number of black nodes is preserved in paths so the black node count is the same downwards from any node. This option means RB trees do fewer rotations under Insert (2 maximum) or Delete (3 maximum)
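The colour flip described above can be illustrated with a toy fragment (not a full red-black tree implementation): recolouring a black node and its two red children changes no pointers, yet the number of black nodes on a root-to-nil path is preserved.

```python
RED, BLACK = 0, 1

class Node:
    def __init__(self, color, left=None, right=None):
        self.color, self.left, self.right = color, left, right

def black_height(n):
    # black nodes on the leftmost root-to-nil path (nil counts as black)
    if n is None:
        return 1
    return black_height(n.left) + (1 if n.color == BLACK else 0)

def flip_colors(n):
    # recolour a black parent with two red children: parent becomes red,
    # children become black; the tree's structure is untouched
    n.color = RED
    n.left.color = BLACK
    n.right.color = BLACK
```

The flip trades one black node above for one black node below on every path through `n`, so the black count is invariant.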
Finally, I found the best explanation for randomized selection after i watched idk maybe 5 to 10 videos... Thank you Sir!
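For readers landing on this comment, randomized selection (quickselect) can be sketched as below: a minimal three-way-partition version, expected O(n) time, worst case O(n²). This is a generic sketch, not the lecture's exact code.

```python
import random

def randomized_select(a, k):
    """Return the k-th smallest element (1-indexed), expected O(n)."""
    a = list(a)
    while True:
        pivot = random.choice(a)            # random pivot => expected O(n)
        lo = [x for x in a if x < pivot]    # three-way partition
        eq = [x for x in a if x == pivot]
        if k <= len(lo):
            a = lo                          # answer is among the smaller ones
        elif k <= len(lo) + len(eq):
            return pivot                    # pivot is the k-th smallest
        else:
            k -= len(lo) + len(eq)          # discard smaller-or-equal block
            a = [x for x in a if x > pivot]
```

The three-way partition makes duplicate values safe: all copies of the pivot are handled in one step.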
Thank you
Hello, can you share your code please!!! Thanks in advance
Hello, I really love your video 💕 because it helps me a lot 👍👍👍 Mind if I ask you something? How can we interpret the result of features_important... Can you explain it based on the result in your video? Many thanks 🎉 (sorry if my English is bad)
!!! Amazing! Please, could you explain RFECV?
Excellent explanation.
Can you please explain why time complexity of hierarchical clustering is O(n^3)?
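The question above has a standard answer, which a naive sketch makes visible (hypothetical 1-D centroid-linkage code, not any particular library): each merge round scans all O(n²) cluster pairs, and there are n−1 rounds, giving O(n³) overall. Smarter implementations using priority queues or nearest-neighbour chains bring this down to O(n² log n) or O(n²).

```python
def naive_agglomerative(points, target_clusters=1):
    # each cluster: (centroid, members); one merge round scans all
    # O(n^2) cluster pairs, and there are n-1 rounds -> O(n^3) total
    clusters = [(p, [p]) for p in points]
    while len(clusters) > target_clusters:
        best = None
        for i in range(len(clusters)):              # O(n) choices of i
            for j in range(i + 1, len(clusters)):   # O(n) choices of j
                d = abs(clusters[i][0] - clusters[j][0])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        ci, cj = clusters[i], clusters.pop(j)       # merge the closest pair
        members = ci[1] + cj[1]
        clusters[i] = (sum(members) / len(members), members)
    return [m for _, m in clusters]
```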
Thank you so much & greetings from Germany :D
Thank you very much sir
Hi Yang, do you have any GitHub page where I can find these codes?
Sir, can you provide the GitHub link?
Thank you! Your video helps me a lot !
I'm doing last-minute prep for my final this semester, and I am so happy to have found your videos. They have been a great recap, especially hearing the material explained differently, and I find the way you present it easy to retain. Thank you!
can u help with some problems?
please share notebook with us
Thank you in advance sir🙏🏻
Greetings sir, thank you very much for your teaching and clear explanations. In the decision tree code we did not specify anything related to the target variable (absent or present), but the tree graph still shows it. We only passed feature names, not the target. Can you please explain this?
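A likely answer (assuming the lecture uses scikit-learn): the classifier records the distinct target values in `clf.classes_` during `fit`, so the tree renderer can display a majority class at each node even though only feature names were passed; `class_names` is just an optional display relabelling. A minimal sketch on the iris data:

```python
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, export_graphviz

X, y = load_iris(return_X_y=True)
clf = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)

# classes_ is inferred from y at fit time, not passed in by hand
print(clf.classes_)

# class_names only relabels the display; the splits are unchanged
dot = export_graphviz(
    clf,
    out_file=None,
    feature_names=["sepal_len", "sepal_wid", "petal_len", "petal_wid"],
    class_names=["setosa", "versicolor", "virginica"],
)
print("setosa" in dot)
```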
Ive tried all three of those peppers!
That was very helpful!
You are so good sir!
Don't pick up the fruit.
Still going to pick up the fruit anyway.
Very good explanation sir. It was very helpful.
Thanks for the tutorial! May i ask you if you know anything about SHAP and LIME?