Co-variance of a data Matrix Dimensionality reduction and visualization Lecture 8 @Applied AI Course

  • Published: 3 Jan 2025

Comments • 8

  • @snake1625b · 6 years ago · +3

    Your entire series on this is 10/10!

  • @helloshifna4532 · 1 year ago

    Clarity of concepts and simple explanations... Thank you very much, sir.

  • @shreeshapr7207 · 6 years ago · +3

    Is the n in the denominator missing from the final equation?

  • @lix2k3 · 3 years ago

    Excellent explanations!

  • @samcool6569 · 4 years ago

    The final equation at the end of the video, at 23:23, is it also missing the 'n' in the denominator?

  • @amanoswal7391 · 5 years ago · +1

    Please provide the link to your correlation video.

  • @questforprogramming · 5 years ago · +1

    At 12:50, is that 'xi2' or should it be 'xj1'? Also,
    at 16:08, is that 'xi2' or should it be 'xj1'?
    In my view it should be (1/n sigma... xi1 * xj1).
    Please correct me if I am wrong!

  • @rana31ify · 5 years ago

    The gist is that if a matrix is square & symmetric, then your corresponding calculations are equal xD. But well explained.
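
A note on the formula the comments above keep asking about: the possibly missing 'n' in the denominator (at 23:23) and the xi1 * xj1 indexing (at 12:50 and 16:08) both concern the sample covariance of a data matrix. Below is a minimal NumPy sketch of that computation, assuming a data matrix X with n rows (points) and d columns (features); the shapes and random test data are illustrative assumptions, not taken from the video.

    import numpy as np

    # Illustrative data: n = 100 points, d = 3 features (assumed, not from the lecture).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 3))

    # Column-centre the data: subtract each feature's (column's) mean.
    Xc = X - X.mean(axis=0)

    # Sample covariance matrix S = (1/n) * Xc^T Xc, so its (i, j) entry is
    # (1/n) * sum over k of x_ki * x_kj -- note the 1/n factor the comments ask about.
    n = X.shape[0]
    S = (Xc.T @ Xc) / n

    # Cross-check against NumPy's own covariance (bias=True divides by n rather than n - 1).
    assert np.allclose(S, np.cov(X, rowvar=False, bias=True))
    print(S)

Whether one divides by n or by n - 1 is a population-versus-sample-estimator convention; the structure of the equation, a sum over all points of products of mean-centred feature values, is the same either way.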