Tutorial III

  • Published: 4 Nov 2024

Comments • 8

  • @adityabikramarandhara9477 3 years ago +3

    Finally understood k-nearest neighbors. Thanks!

  • @abhijeetsharma5715 3 years ago +2

    The professor's lectures said that feature reduction can be of two types: feature selection and feature extraction.
    This tutorial refers to feature extraction as feature reduction, doesn't it?

    • @_datwangai 2 years ago

      Yes, but both names mean the same thing here. Both aim to reduce the number of features that are redundant or highly correlated. Thus, we try to reduce the feature space by extracting the valuable features among them.
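
      A minimal sketch of the distinction being discussed, assuming scikit-learn is available (the iris dataset, k=2, and n_components=2 are illustrative choices, not from the tutorial): feature selection keeps a subset of the original columns, while feature extraction such as PCA builds new features from combinations of them.

      ```python
      from sklearn.datasets import load_iris
      from sklearn.decomposition import PCA
      from sklearn.feature_selection import SelectKBest, f_classif

      X, y = load_iris(return_X_y=True)  # 4 original features

      # Feature selection: keep a subset of the ORIGINAL features.
      selected = SelectKBest(score_func=f_classif, k=2).fit(X, y)
      print("kept columns:", selected.get_support(indices=True))

      # Feature extraction: derive NEW features as combinations of the old ones.
      pca = PCA(n_components=2).fit(X)
      print("explained variance:", pca.explained_variance_ratio_)
      ```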

  • @41abhishek 5 years ago +1

    In the eigenvalue and eigenvector calculation, shouldn't we derive the covariance matrix before the |A - λI| = 0 calculation?
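
    For PCA, yes: the characteristic equation |A - λI| = 0 is solved on the covariance matrix of the centered data, so that matrix is computed first. A minimal NumPy sketch of this order of operations (the toy data is illustrative):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 3))         # toy data: 100 samples, 3 features

    X_centered = X - X.mean(axis=0)       # center each feature first
    A = np.cov(X_centered, rowvar=False)  # 3x3 covariance matrix: the "A"

    # Solve |A - lambda*I| = 0; eigh is used because a covariance
    # matrix is symmetric.
    eigenvalues, eigenvectors = np.linalg.eigh(A)
    print(eigenvalues)
    ```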

  • @lakshman587 3 years ago +2

    33:02 The K value should always be an odd number
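
    An odd K rules out tied votes only in binary classification; with three or more classes, ties remain possible. A tiny sketch of the tie an even K can produce (the toy labels are illustrative, not from the video):

    ```python
    from collections import Counter

    # K = 4 neighbours in a binary problem: the vote can split 2-2.
    print(Counter([0, 0, 1, 1]).most_common())      # [(0, 2), (1, 2)] -> tie

    # K = 5: with two classes, a strict majority always exists.
    print(Counter([0, 0, 1, 1, 1]).most_common(1))  # [(1, 3)]
    ```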

  • @rashmi_ambale 4 years ago +1

    Where can we get the lecture material? The GitHub link is not working.

  • @alaazakaria4630 6 years ago +1

    Thanks, sir!

  • @simba1026 5 years ago

    Super kick bumper kick