What is Norm in Machine Learning?

  • Published: 1 Oct 2024

Comments • 120

  • @jsridhar72
    @jsridhar72 4 years ago +80

    I felt as if I were watching 3Blue1Brown :)

  • @krmt
    @krmt 3 years ago +52

    Brilliant, you explained something my lecturer and about 5 other websites couldn't in plain clear terms. Thank you.

    • @NormalizedNerd
      @NormalizedNerd  3 years ago +1

      Happy to help :D

    • @PunmasterSTP
      @PunmasterSTP 1 year ago

      I just came across this and was curious. How’d the rest of your class go?

    • @krmt
      @krmt 1 year ago +1

      @@PunmasterSTP Thanks for asking; not great, unfortunately, so I pulled out. I think the course plan was very well thought out, but the teaching was really lacking when it came to explaining some new and difficult concepts. I'll look for other paths for learning this topic.

    • @PunmasterSTP
      @PunmasterSTP 1 year ago +1

      @@krmt I'm sorry to hear that, and I hope your current academic endeavors are going well.

    • @BadMeditator
      @BadMeditator 8 months ago

      which course is this?

  • @martynasvenckus423
    @martynasvenckus423 2 years ago +5

    Wait, isn't it RMSE, and not MSE, that uses the l2 norm? Because in MSE we are not taking a square root after summing the squared elements of the Y - Y_pred vector, which is done when calculating the l2 norm.

  • @khalilalkebsi2580
    @khalilalkebsi2580 2 years ago +5

    Great and intuitive explanation with a very clear visualization. Thanks for your effort, but I would like to draw your attention to a small mistake in the MSE part: the L2 norm corresponds to the RMSE, not the MSE, since the MSE has no square root. Keep going.

    • @finalpurez
      @finalpurez 1 year ago

      I was rather confused when he mentioned mean squared error until I saw your comments. Thanks for the clarification!

  • @anuragdhadse1426
    @anuragdhadse1426 3 years ago +2

    It's RMSE (Root Mean Squared Error) that corresponds to the L2 norm, not MSE (Mean Squared Error). At 4:09 the equation should not be squared; the power of 2 should not be there...
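
    A minimal NumPy sketch of the relationship these comments point out (the arrays below are made-up values, just for illustration): the RMSE equals the L2 norm of the error vector divided by sqrt(n), while the MSE is the squared L2 norm divided by n.

    import numpy as np

    y_true = np.array([3.0, -0.5, 2.0, 7.0])   # hypothetical targets
    y_pred = np.array([2.5,  0.0, 2.0, 8.0])   # hypothetical predictions
    r = y_true - y_pred                        # error (residual) vector
    n = r.size

    l2 = np.linalg.norm(r)                     # L2 norm = sqrt(sum of squared errors)
    mse = np.mean(r ** 2)                      # mean squared error, no square root
    rmse = np.sqrt(mse)

    print(np.isclose(rmse, l2 / np.sqrt(n)))   # True: RMSE = ||r||_2 / sqrt(n)
    print(np.isclose(mse, l2 ** 2 / n))        # True: MSE  = ||r||_2^2 / n

    So, up to the 1/sqrt(n) scaling, it is the RMSE rather than the MSE that matches the L2 norm.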

  • @chocolatechipturtle
    @chocolatechipturtle 2 years ago +5

    Thank you! I'm just starting linear algebra, and this helped me understand/visualize what's going on with norms much better. Loved it!

  • @libelldrian173
    @libelldrian173 3 years ago +1

    Great video, this guy is probably 3brown1blue. You know, cause Indian people have mostly brown eyes :v

  • @liamw6976
    @liamw6976 1 year ago +4

    I'm taking my first fundamental analysis class and it's kicking my ass.
    This video cleared up 90% of my confusion about wtf a norm was. You're doing a good thing, keep doing it, I hope you sleep amazing tonight!

  • @namanacharya1148
    @namanacharya1148 3 years ago +5

    Short, crisp, and rich in information; that's rare nowadays on YouTube.

  • @sannalun845
    @sannalun845 3 years ago +3

    It was such a rewarding 5 mins watching this video. Please keep them coming.🔥

  • @PunmasterSTP
    @PunmasterSTP 1 year ago +1

    Norm in machine? More like “Natural explanation that’s just the thing!” Thanks for sharing. 👍

  • @tsunningwah3471
    @tsunningwah3471 3 years ago +4

    Love you. You deserve a place in heaven

  • @nirbhay_raghav
    @nirbhay_raghav 3 years ago +2

    Please make a video on manim and how to use it. There are no good tutorials out there.

    • @NormalizedNerd
      @NormalizedNerd  3 years ago +1

      Many people are already doing that. Check these out:
      Theorem of Beethoven (YT channel)
      Talking Physics (Blog)
      eulertour.com
      r/manim

  • @trantrungnghia9642
    @trantrungnghia9642 26 days ago

    "the victory"?? IT'S "THE VECTOR A"

  • @hello-lb3vf
    @hello-lb3vf 1 year ago

    Great video, amazing explanation.
    May I please ask that you remove the social tags you included in the top left? They're extremely distracting.

  • @wokeclub1844
    @wokeclub1844 1 year ago +7

    Thanks man. Just had a class where I got lost what norms were. The college education system is broken. You guys are the saviours.

  • @arno.claude
    @arno.claude 5 months ago

    You're putting the "brown" in 3Blue1Brown 🙂

  • @robtuke
    @robtuke 3 years ago +1

    This is an insanely concise and informative video. I found not just the information I was looking for but also other information I didn't know I needed. I disagree with VEERARAGHAVAN J; this is better than 3Blue1Brown.

  • @bhargavartworks
    @bhargavartworks 3 years ago +2

    Great Video, simple and informative explanation.

  • @chiraggamer3047
    @chiraggamer3047 5 months ago

    Hello, people from the future, this video really, really helped.

  • @dr_drw
    @dr_drw 4 years ago +2

    Great job. Love manim

  • @hudasedaki5529
    @hudasedaki5529 7 months ago

    wow, it's that simple! thank you!!!!

  • @thecarlostheory
    @thecarlostheory 2 years ago

    WHAT SOFTWARE DID YOU USE TO ANIMATE THE VIDEO?

  • @abdellahelguermez3786
    @abdellahelguermez3786 2 years ago +1

    Great explanation, thank you so much for the video.

  • @sanketemala1118
    @sanketemala1118 1 year ago

    Such high-quality stuff, it feels illegal to watch it for free.

  • @PritishMishra
    @PritishMishra 3 years ago +1

    Thanks, amazing video and especially the animations (are you using Manim?)

  • @GamduchuBingshuti
    @GamduchuBingshuti 7 months ago

    3:58 what is the maximum of the vector? 😊

  • @highvolt8881
    @highvolt8881 9 months ago

    Sorry, but I did not understand anything.

  • @donfeto7636
    @donfeto7636 10 months ago

    4:03 L2 is RMSE, not MSE

  • @user-or7ji5hv8y
    @user-or7ji5hv8y 3 years ago +1

    Can you do a video on Gaussian naive Bayes?

    • @NormalizedNerd
      @NormalizedNerd  3 years ago

      Thanks for the suggestion...I'll try to make one.

  • @tommasobassignana1943
    @tommasobassignana1943 4 years ago +3

    Great, as always. Keep 'em coming!

  • @gauravaghariya
    @gauravaghariya 1 month ago

    Appreciate you 👍

  • @mathematicsclass2021
    @mathematicsclass2021 4 years ago +1

    👍

  • @andywang4189
    @andywang4189 1 year ago

    Thanks, very clear

  • @snehalkumar.singh.cse2070
    @snehalkumar.singh.cse2070 2 years ago

    |x|^n + |y|^n = 1
    if n --> inf,
    the smaller of |x| and |y| contributes nothing, so the equality can only hold
    if either |x| = 1 or |y| = 1 (with the other coordinate at most 1),
    which gives us a square
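
    A small NumPy sketch of that limit (the starting vector is an arbitrary example): rescaling a point so its n-norm equals 1 and letting n grow pushes its largest absolute coordinate towards 1, so the unit "circle" flattens onto the sides of the square.

    import numpy as np

    v = np.array([0.6, 0.8])                 # any nonzero 2D point
    for n in [1, 2, 4, 16, 64, 256]:
        u = v / np.linalg.norm(v, ord=n)     # rescale so the n-norm of u is exactly 1
        print(n, np.max(np.abs(u)))          # max |coordinate| -> 1 as n -> inf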

  • @__________________________6910
    @__________________________6910 2 years ago

    I am from the year 2022

  • @KrishnaKSanjay
    @KrishnaKSanjay 7 months ago

    👍

  • @gg4y
    @gg4y 3 years ago

    What do you mean by "every point on the circumference of the square is a vector with l1 = 1"? Do you mean the perimeter of the square? at ruclips.net/video/FiSy6zWDfiA/видео.html

  • @pitiwatkittiwimonchai4656
    @pitiwatkittiwimonchai4656 2 years ago

    great sir

  • @ga7073
    @ga7073 2 years ago

    Genius

  • @Ngkforever9543
    @Ngkforever9543 2 years ago

    Great!

  • @vigneshbalaji21
    @vigneshbalaji21 11 months ago

    Nice video, but I am a little confused about how the L2 norm and the mean are related. I see that the L2 norm and the mean (not the mean error) are exactly the same. Does that imply the L2 norm's other name is the mean?

  • @rami8038
    @rami8038 1 year ago

    Thanks. Could you please tell me the name of the program you made this video with?

  • @zhaoyun1230
    @zhaoyun1230 1 year ago

    What is the name of the piano piece at the 3:03 mark? Thanks.

    • @bhavyakukkar
      @bhavyakukkar 1 year ago

      'No. 10 - A New Beginning' by Esther Abrami, part of the YouTube Audio Library

  • @KARTIKEYAKULKARNI
    @KARTIKEYAKULKARNI 1 year ago

    Thanks for the video, these 5 minutes cleared up doubts that I had accumulated over 5 lectures haha

  • @swaralipibose9731
    @swaralipibose9731 3 years ago +1

    Lost for words, just amazing 😃

  • @rishimishra691
    @rishimishra691 3 years ago

    Great. Nice explanation

  • @tagoreji2143
    @tagoreji2143 2 years ago

    Good explanation. Thank you

  • @ufuoma833
    @ufuoma833 3 years ago

    Now it makes sense. Thank you very much.

  • @abrahanpinedo
    @abrahanpinedo 3 years ago

    Good job man

  • @benebeck5628
    @benebeck5628 3 years ago

    Damn top right corner. So annoying

  • @husamwadi2635
    @husamwadi2635 2 years ago

    You're a saint, thank you

  • @osamansr5281
    @osamansr5281 3 years ago +1

    THANK YOU SO MUCH!!!

  • @pranabjyoti5265
    @pranabjyoti5265 4 years ago +1

    nice brother

  • @三条新月厨
    @三条新月厨 2 years ago

    Love the energy in your voice!

  • @tp5111
    @tp5111 2 years ago

    Thank you for the quick and effective explanation of norm.

  • @almonddonut1818
    @almonddonut1818 2 years ago

    Thank you so much for this!!

  • @Rudra-go6us
    @Rudra-go6us 3 years ago

    A quality education, thank you

  • @saikiransrivatsav4187
    @saikiransrivatsav4187 3 years ago

    500th like! First time the ridge and lasso ambiguity got cleared up, once and for all

  • @litttlemooncream5049
    @litttlemooncream5049 2 years ago

    the dynamic change really helps a lot!! thanks

  • @daresendez
    @daresendez 2 years ago

    This is great! Thank you!

  • @ARADHAKCA
    @ARADHAKCA 2 years ago

    wonderfully explained. I appreciated it a lot.

  • @piku9712
    @piku9712 3 years ago

    Really good. Thanks !! 🔥

  • @xruan6582
    @xruan6582 3 years ago

    I like this video

  • @AmirHX
    @AmirHX 3 years ago

    Thanks sir 🙏🏼

  • @sayantansadhu6380
    @sayantansadhu6380 4 years ago +1

    This was really good. ❤❤

  • @amalnasir9940
    @amalnasir9940 3 years ago

    Great thanks! Do we have to go through the center to measure the distance between the two points? In Euclidean distance, for example, aren’t we measuring the distance through the hypotenuse? After we got the lengths of the two lines from the center?

  • @helloiamstefan
    @helloiamstefan 2 years ago

    Most underrated video on math YouTube

  • @spensernaor3811
    @spensernaor3811 3 years ago

    perfect explanation. thanks so much!

  • @augustoc.romero1130
    @augustoc.romero1130 3 years ago

    Really good man, thanks

  • @ndmarich
    @ndmarich 3 years ago

    Wow you have opened my eyes to the truth

  • @shubhpachchigar1457
    @shubhpachchigar1457 3 years ago

    Can you share resources where I can learn about L(inf) norms?

    • @NormalizedNerd
      @NormalizedNerd  3 years ago +2

      Well, it's actually pretty easy to see...
      L_n = (|X_1|^n + |X_2|^n + ... + |X_k|^n)^(1/n)
      When n -> inf, the largest term |X_max|^n >> all the other terms.
      And that is why taking the nth root leaves only |X_max| = max_i |X_i| (see the quick numerical check after this thread).

    • @shubhpachchigar1457
      @shubhpachchigar1457 3 years ago

      @@NormalizedNerd yes exactly
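
    A quick numerical check of the explanation above (the example vector is arbitrary): as n grows, the n-norm collapses onto the largest absolute entry, which is exactly the L(inf) norm.

    import numpy as np

    x = np.array([3.0, -7.0, 2.0, 5.0])            # hypothetical vector
    for n in [1, 2, 4, 8, 32, 128]:
        l_n = np.sum(np.abs(x) ** n) ** (1.0 / n)  # L_n = (sum |x_i|^n)^(1/n)
        print(n, l_n)                              # values approach max|x_i| = 7
    print(np.linalg.norm(x, ord=np.inf))           # 7.0, the infinity norm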

  • @Icebreaker9876
    @Icebreaker9876 3 years ago

    Thank you so much, your explanation makes it so much easier to grasp

  • @lukaradulovic339
    @lukaradulovic339 3 years ago

    Amazing!

  • @ОлегМоргалюк
    @ОлегМоргалюк 3 years ago

    Great !!! Keep going !)

  • @marcocanil4147
    @marcocanil4147 3 years ago

    Nice, may I ask you what you use to create animations? :)

    • @NormalizedNerd
      @NormalizedNerd  3 years ago +1

      I use manim (an open-source Python library)

    • @marcocanil4147
      @marcocanil4147 3 years ago

      @@NormalizedNerd ok, thank you very much :)

    • @kmishy
      @kmishy 3 years ago

      @@marcocanil4147 I thought Manim was a standalone application, but I was amazed to see you are using a Python library