Animated guide to Linear Regression

  • Published: 3 Feb 2025

Comments • 85

  • @mrnobody1286
    @mrnobody1286 1 year ago +75

    still breathing … good

  • @PwnFunction
    @PwnFunction  1 year ago +58

    how is the new outro?

    • @coder_rc
      @coder_rc 1 year ago +7

      Loving it, fantastic job

    • @Aditya_khedekar
      @Aditya_khedekar 1 year ago +1

      the intro is soo good!! dont change

    • @Aditya_khedekar
      @Aditya_khedekar 1 year ago

      and also that's not simple maths 'cause I don't remember anything

    • @HHHjb_
      @HHHjb_ 1 year ago

      is that what made you take so long?
      yes it's good

    • @roschabdolaziz3828
      @roschabdolaziz3828 1 year ago

      dude i missed your videos

  • @Nulledx
    @Nulledx 11 months ago +4

    Dude, I can't stress enough how amazing I felt when I saw that you uploaded a video, it's been a while, hope you are doing great!

  • @hellfishii
    @hellfishii 1 year ago +17

    This is the canonical problem of least squares in optimization theory; some linear algebra can help us skip the iteration and find the best 1- or n-dimensional subspace that fits the R^n -> R problem to optimize. Great video btw.
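
    For reference, the closed-form least-squares solution this comment alludes to, written for the video's single-feature case y = mx + b (standard theory, not something shown in the video):

        m = \frac{\sum_i (x_i - \bar{x})(y_i - \bar{y})}{\sum_i (x_i - \bar{x})^2}
        b = \bar{y} - m\,\bar{x}

    Both follow from setting the partial derivatives of E = \sum_i (y_i - (m x_i + b))^2 to zero, which is why no iteration is needed.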

  • @maurolimaok
    @maurolimaok 8 months ago +1

    Don't let the channel die.
    I'm learning to code on Odin, and bookmarked it.
    Soon I'll be able to get it.

  • @arghosaha6994
    @arghosaha6994 1 year ago +9

    You are alive .
    Was waiting for you.....
    Welcome back😊

  • @fugoogle_was_already_taken
    @fugoogle_was_already_taken 1 year ago +9

    Linear regression is usually done with the help of linear-algebra methods for matrix factorization, i.e. methods of least squares. Gradient descent is usually used when no non-iterative method is available, like when fitting ML models.
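
    A minimal sketch of that non-iterative route, using NumPy's least-squares solver; the data and variable names below are made up for illustration, not taken from the video:

        import numpy as np

        x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
        y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

        # Design matrix: one column for the slope, one column of ones for the intercept.
        A = np.column_stack([x, np.ones_like(x)])

        # Solves min ||A @ [m, b] - y||^2 directly via an SVD factorization, no iteration.
        (m, b), *_ = np.linalg.lstsq(A, y, rcond=None)
        print(m, b)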

  • @HuangTim1018
    @HuangTim1018 11 months ago +7

    finally the penguin is back, need more content!

  • @Walking_W
    @Walking_W 11 months ago +1

    its a good day when pwn uploads

  • @aidenpham2405
    @aidenpham2405 1 year ago +2

    What a great way to start a (lunar) new year! Welcome back!

  • @rodneynsubuga6275
    @rodneynsubuga6275 1 year ago +3

    Finally the man who taught me randomness is back

  • @uClash
    @uClash 5 months ago

    Woa, watching this before 8am... this was too much thinking for me this early :) needs a warning at the beginning hah! Oh man, just brings back memories from College. My nephew texted me the other day, asking if calculus was important for programming. I think you came up with a decent example of this in use. I've never had to use calculus in over 20 years of programming, but I do think that, depending on the job, some will use it. My best advice/guidance to him was that all of the math that I took in college really helped with problem solving, and really taught me to think at a very deep level.

  • @Ciano56
    @Ciano56 1 year ago +1

    Love the simple explanation and visuals to explain these problems. Really helps!

  • @adrian_sp6def
    @adrian_sp6def 1 year ago +13

    14:58 indent error line 57

  • @dan2te2
    @dan2te2 9 months ago

    200k subs! Congratulations!

  • @vinayagarwal770
    @vinayagarwal770 11 months ago +1

    Please make more such videos, as you said covering gradient descent.
    I really wanna watch an explanation and application of Neural networks by you. I know it might be a long video, and hard to follow through, but just, train us, like bit by bit, teach us from the ground up. Keep this up. Also, can you please provide a list of topics that should be learned, to reach the complexity of neural networks? Like a roadmap for machine learning concepts like this, that would really help out.
    Thanks.
    Love your videos❤.

  • @JeersNX
    @JeersNX 11 months ago +1

    babe, wake up, pwnfunction posted!

  • @The_Great_Pietro
    @The_Great_Pietro 1 year ago

    Heyyy! Really nice to see you back bro! 😊

  • @trevorhart4120
    @trevorhart4120 11 months ago +1

    Welcome back! 🎉

  • @lacno29
    @lacno29 1 year ago

    Hey! Welcome back! Nice to see you again.

  • @_abdul
    @_abdul 11 months ago

    Welcome Back Man. Was missing you.

  • @theonlyasher
    @theonlyasher 1 year ago +1

    damnn a video after a year, less gooo

  • @psibarpsi
    @psibarpsi 11 months ago +1

    Yo! How do you make your videos? What software do you use? Where did you learn to use it? And, how much time did it take for you to actually learn it?

  • @josephseed3393
    @josephseed3393 1 year ago +2

    THE GOAT IS BACKKK

  • @beastnighttv
    @beastnighttv 11 months ago

    feels good seeing you alive

  • @userou-ig1ze
    @userou-ig1ze 6 months ago +1

    Why no new videos?

  • @motbus3
    @motbus3 2 months ago +1

    it sucks that YouTube kind of punishes you if you don't post regularly, but I miss videos on this channel

  • @qoobes
    @qoobes 11 months ago

    Amazing content, keep it up!

  • @wolfrevokcats7890
    @wolfrevokcats7890 1 year ago +1

    The legend is back

  • @x1expert1x
    @x1expert1x 11 months ago

    you would be a great teacher

  • @kevinnyawakira4600
    @kevinnyawakira4600 1 year ago

    Woow...i thought i was dreaming. Welcome back

  • @sankalpa02
    @sankalpa02 9 months ago

    please make more content like this

  • @LetsDark
    @LetsDark 1 year ago +2

    The YouTube compression hates your gradients and your axis

  • @violinsheetmusicblog
    @violinsheetmusicblog 1 year ago +1

    How did you know you had to subtract the learning_rate * dedm and learning_rate * dedb when adjusting m and b? Why not add?
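
    A small sketch of why the update subtracts, assuming a mean-squared-error loss (the dedm/dedb names mirror the question; the data and starting values are made up): the gradient points in the direction of increasing error, so stepping against it decreases the error, while adding would increase it.

        import numpy as np

        x = np.array([1.0, 2.0, 3.0])
        y = np.array([2.0, 4.0, 6.0])
        m, b, learning_rate = 0.0, 0.0, 0.01

        def error(m, b):
            return np.mean((y - (m * x + b)) ** 2)

        # Partial derivatives of the mean squared error with respect to m and b.
        dedm = np.mean(-2 * x * (y - (m * x + b)))
        dedb = np.mean(-2 * (y - (m * x + b)))

        print(error(m, b))                                                 # current error
        print(error(m - learning_rate * dedm, b - learning_rate * dedb))   # smaller: stepped downhill
        print(error(m + learning_rate * dedm, b + learning_rate * dedb))   # larger: stepped uphill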

  • @hiimthelegend6644
    @hiimthelegend6644 9 months ago

    Hello, would you like to share the name of the software you're using for editing videos? They look quite awesome!

  • @0xExploitXpErtz
    @0xExploitXpErtz 11 months ago

    Nice to see you're alive...

  • @qexat
    @qexat 11 months ago

    Hi Pwn, the link (and its label) in the description are from the previous video, you forgot to change them when pasting.

  • @RolandOwner-dj1zc
    @RolandOwner-dj1zc 2 months ago

    master come back

  • @cyndaguy
    @cyndaguy 1 year ago

    I know elementary school linear regression but I'm lost immediately at 6:30 with this weird python notation. We're unpacking a list, zipping it together, and then casting it to a list again?

    • @d00dEEE
      @d00dEEE 1 year ago +2

      Think of zip as transpose. And since zip has lazy execution (it's a generator), the outer list forces it to run and actually produce the data.
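
      A tiny example of the pattern being discussed (the variable name points is illustrative, not necessarily what the video uses):

          points = [(1, 2), (3, 4), (5, 6)]   # list of (x, y) pairs
          xs, ys = list(zip(*points))         # * unpacks the pairs as separate arguments to zip
          print(xs)                           # (1, 3, 5)
          print(ys)                           # (2, 4, 6)
          # zip is lazy (it returns an iterator), so wrapping it in list()
          # forces it to actually produce the transposed rows.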

  • @shubhxms
    @shubhxms 11 months ago

    cool vid! whats that code theme?

  • @Blendernoob14
    @Blendernoob14 1 year ago

    The legend is back!

  • @Bloubz77
    @Bloubz77 11 months ago +1

    at 1:15 the line should rotate around x=0

  • @levoganisian5852
    @levoganisian5852 1 year ago

    PWN STILL ALIVE!

  • @nitrogenez
    @nitrogenez 11 months ago

    oh, so you're alive? cool

  • @hakimmalik6995
    @hakimmalik6995 7 months ago

    Bringg more videosss

  • @AgentM124
    @AgentM124 1 year ago

    Any reason for 'm' instead of 'a'?
    I know
    y = ax+b
    Variable names are arbitrary?

  • @deidara_8598
    @deidara_8598 11 months ago

    Using machine learning for linear extrapolation is like hunting ducks with a GAU-8 gatling gun

  • @xahmi
    @xahmi 11 months ago +1

    long time no see bruh

  • @ContentYTpro
    @ContentYTpro 11 months ago

    Show me how to make such a model for casino crash games, based on the end times of the rounds and their coefficients. We would be very grateful to you.

  • @Masterix.
    @Masterix. 1 year ago +5

    I thought you were dead

  • @antonaparin
    @antonaparin 1 year ago

    What happened to the intro? I liked it soo much

  • @motyakskellington7723
    @motyakskellington7723 11 months ago

    quality content

  • @madhavanand756
    @madhavanand756 1 year ago +2

    Fantastic job, it's wonderful content. The kind of depth and clarity you have about the concept is commendable, and you've ignited my curiosity about the maths.
    If possible, please make a video on how you learn things in depth and breadth. Seriously wonderful job dude.
    🫡

  • @MithicSpirit
    @MithicSpirit 1 year ago +1

    There's no way my man just did gradient descent when there's a simple closed-form formula for linear regression lmao. Fair enough if you want a simple example, but I feel like this neither shows the power of gradient descent nor an efficient way to find a line of best fit.

  • @agastronics
    @agastronics 1 year ago

    Where have you been?

  • @einsjannis
    @einsjannis 1 year ago

    sounds like least squares but overcomplicated for the sake of buzzwords

  • @scliffbartoni9771
    @scliffbartoni9771 1 year ago

    Omg he's back

  • @IsarEdits
    @IsarEdits 1 year ago +2

    Bro forgot indent at line 56 15:01

  • @user-zf9oe2rw9f
    @user-zf9oe2rw9f 1 year ago +1

    Hi its nice you are back 🎉🎉

  • @javidking63
    @javidking63 1 year ago

    you should keep the glasses, that's much better. 3:11

  • @Stdvwr
    @Stdvwr 1 year ago

    A video can have no sound effects or editing at all, but as soon as money is mentioned there's got to be that cha-ching sound... At this point I suspect this is a YouTube bug.

  • @danielnielsen24
    @danielnielsen24 1 year ago

    yooo he's back

  • @xfxox
    @xfxox 1 year ago

    "y" is almost the same as "4" in this font

  • @craftminerCZ
    @craftminerCZ 1 year ago +1

    pro tip for a beginner content creator :) don't do a gradient background, the banding over YT bitrate is horrible ^^

  • @coder_rc
    @coder_rc 1 year ago +3

    hi pwn

  • @ehrenmann69
    @ehrenmann69 1 year ago

    he back :D

  • @pu239
    @pu239 11 months ago

    i was waiting for a pwning video :/

  • @zer0day463
    @zer0day463 10 months ago

    Missing the old voice bro, 😭

  • @edems131
    @edems131 1 year ago

  • @maxniederman9411
    @maxniederman9411 11 months ago

    Oh god... please don't use gradient descent for linear regression.

  • @johanngambolputty5351
    @johanngambolputty5351 1 year ago

    Or you can just look at the principal eigenvector of the covariance matrix ;)
    Edit: actually, it's way simpler (you can just analytically solve for d/dm = 0 straight away):
    m = y'·x' / (x'·x')   (dot product)
    b = ⟨y⟩ - m⟨x⟩
    where x' = x - ⟨x⟩, y' = y - ⟨y⟩, and ⟨x⟩, ⟨y⟩ are the averages of the vectors.