How I think about
  • Videos: 6
  • Views: 74,473
How I think about Logistic Regression - Part 3
A (hopefully) simple and intuitive explanation of how to interpret the parameters of logistic regression.
We first discuss how interpretation works in the context of probability, then we dive into how to think about logistic regression as a machine learning tool.
Intro & Overview 00:00-02:30
Probabilistic Interpretation 02:30-08:02
Machine Learning Interpretation 08:02-
Part 1: ruclips.net/video/smLdMzVlmyU/видео.html
The Math Behind Logistic Regression: ruclips.net/video/xgY05vLWicA/видео.html
Part 2: ruclips.net/video/Wug2BlYaNdU/видео.html
Visualization and animation code on GitHub: github.com/gallettilance/reproduce-those-animations
Thumbnail by endless.yarning
#statistics #machi...
Views: 185
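
A minimal sketch of the parameter-interpretation idea described above, assuming NumPy; the weight, intercept, and "hours studied" feature values are hypothetical and not taken from the video:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical fitted parameters for a single feature (hours studied).
w, b = 1.2, -3.0

hours = np.array([1.0, 2.0, 3.0, 4.0])
probs = sigmoid(w * hours + b)
print(probs)      # predicted probability of the positive class at each value

# Log-odds view: each extra hour adds w to the log-odds,
# i.e. multiplies the odds p / (1 - p) by exp(w).
print(np.exp(w))  # odds ratio per additional hour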

Videos

How I think about Neural Networks (Super Accessible No Math Intro for Beginners)
45K views · 1 month ago
Through simple and intuitive examples, this video will not only teach you what neural networks are designed to do and how they work, but also present a perspective and way of thinking that helps you grasp their limitations and pitfalls. My hope is that this video will help provide a foundation to better understand the current AI/ML landscape. 00:00 - 00:50 Intro 00:51 - 02:48 Machine Le...
How I think about Logistic Regression - Part 2
539 views · 2 months ago
A (hopefully) simple and intuitive explanation of how logistic regression works with more than 2 classes. We first talk about One v Rest Classification, then talk about the softmax function, how it's derived, and what role it plays in logistic regression. Note: if you watched the previous version of this video, I decided to republish it with a few minor edits. It's mostly the same but I wanted ...
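
A minimal sketch of the softmax function mentioned in the description, assuming NumPy; the three class scores are made up for illustration:

import numpy as np

def softmax(z):
    # Subtract the max for numerical stability; the result is unchanged
    # because softmax is invariant to adding a constant to every score.
    e = np.exp(z - np.max(z))
    return e / e.sum()

scores = np.array([2.0, 1.0, -1.0])   # hypothetical per-class scores
print(softmax(scores))                # class probabilities
print(softmax(scores).sum())          # sums to 1.0
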
How I think about Logistic Regression - Technical Interlude
1.8K views · 3 months ago
The Math Behind Logistic Regression. Negative Log Likelihood 00:00-05:25 Gradient Descent Step by Step 05:26-07:20 Scale Your Data 07:21-09:09 Part 1: ruclips.net/video/smLdMzVlmyU/видео.html Part 2: ruclips.net/video/Wug2BlYaNdU/видео.html Visualization and animation code on GitHub: github.com/gallettilance/reproduce-those-animations Thumbnail by endless.yarning #mathformachinel...
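
For reference, a minimal sketch of the pieces named in the chapter list above (negative log likelihood, a gradient descent step, pre-scaled data), assuming NumPy; the toy dataset is invented and not from the video:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def nll(w, X, y):
    # Negative log likelihood of binary labels y under p = sigmoid(X @ w).
    p = sigmoid(X @ w)
    return -np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

def grad_nll(w, X, y):
    # Gradient of the negative log likelihood with respect to the weights.
    return X.T @ (sigmoid(X @ w) - y)

# Toy data: a bias column plus one feature that is already scaled to [-1, 1].
X = np.array([[1.0, -1.0], [1.0, -0.2], [1.0, 0.3], [1.0, 1.0]])
y = np.array([0.0, 0.0, 1.0, 1.0])

w = np.zeros(2)
learning_rate = 0.1
for _ in range(50):
    w -= learning_rate * grad_nll(w, X, y)   # one gradient descent step
print(w, nll(w, X, y))
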
How I think about Logistic Regression - Part 1
4.7K views · 4 months ago
A (hopefully) simple and intuitive explanation of logistic regression for binary classification. Intro and Overview 00:00-02:15 How to think about the Threshold 02:15-04:35 Assigning Likeliness 04:36-08:14 Maximum Likelihood 08:15-11:02 Finding the Best Threshold 11:03-11:34 Recap 11:35-12:35 The Math Behind Logistic Regression: ruclips.net/video/xgY05vLWicA/видео.html Part 2: ruclips.net/video...
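
A small sketch of the threshold and likelihood ideas from the chapter list above, assuming NumPy; the parameters and data points are hypothetical:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical one-feature binary classifier.
w, b = 2.0, -5.0

# The decision threshold is where the sigmoid crosses 0.5, i.e. where w * x + b = 0.
threshold = -b / w
print(threshold, sigmoid(w * threshold + b))   # 2.5, 0.5

# Likelihood of some observed labels under this model: the product of
# p for the positive examples and (1 - p) for the negative ones.
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([0, 0, 1, 1])
p = sigmoid(w * x + b)
print(np.prod(np.where(y == 1, p, 1 - p)))
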
How I think about Gradient Descent
22K views · 5 months ago
What is gradient descent optimizing exactly? Source code to generate these animations: github.com/gallettilance/reproduce-those-animations #gradientdescent #machinelearning #neuralnetworks #optimization #math #datascience #educational #machinelearningtutorialforbeginners #datasciencebasics #datasciencetutorial #machinelearning #datascience #datasciencebasics #datasciencetutorial #machinelearnin...

Comments

  • @مرتضی_افروزه
    @مرتضی_افروزه 14 days ago

    That's awesome.🤠

  • @مرتضی_افروزه
    @مرتضی_افروزه 14 days ago

    wow that's very good😍

  • @Khimg-vlogs
    @Khimg-vlogs 16 days ago

    Wowow nice sie😊❤😊❤😊

  • @mie5953
    @mie5953 16 days ago

    Very useful perspective. I can't wait to see the videos about SVMs and other cool approaches.

  • @athuljyothis1951
    @athuljyothis1951 17 days ago

    great way to think about Logistic Regression from another point of view

  • @shahulrahman2516
    @shahulrahman2516 1 month ago

    Great

  • @Rakesh123456789rak
    @Rakesh123456789rak 1 month ago

    Hello, thank you for this beautiful explanation. The point you covered here is exactly what had confused me so far: in this example the input can move in a 2D plane, but the function lives in 3D, so "moving down the hill" is not quite appropriate. Thanks for making that distinction. Also, let me know when the follow-up video will be available.

    • @howithinkabout
      @howithinkabout 1 month ago

      So glad to hear it! To be honest I’m not sure. I really want to publish quality videos and I haven’t yet found a narrative for part 2 that ties everything together. I also only make these videos in my spare time which factors into things. I’ll do my best to at the very least release part 2 by the end of 2024

    • @Rakesh123456789rak
      @Rakesh123456789rak 1 month ago

      @@howithinkabout that works.

  • @mie5953
    @mie5953 1 month ago

    This is a very good and clear explanation! I'm really looking forward to the next videos.

    • @howithinkabout
      @howithinkabout 1 month ago

      @@mie5953 so glad to hear it! This was definitely the most challenging video I’ve made so far

    • @SahilKhan-v5g
      @SahilKhan-v5g 12 days ago

      A😅 jjhki8ioo9pp oui me out with this p😂 0kjkiii8iio 8😊

  • @InspirationBeel
    @InspirationBeel 1 month ago

    Very instructive, thank you so much

  • @jakeaustria5445
    @jakeaustria5445 2 months ago

    Thank you

  • @mie5953
    @mie5953 2 months ago

    Nice video, keep going! On the next step, the gradients will be small, [-0.12, 0.1, -0.001], so we won't get a large improvement. The weights after the next step will be [0.574, -2.302, 2.927], and the NLL there equals 0.0476, which is again smaller than the previous one (0.0486).
    Step: 0,   NLL: 0.5707   W = [1.0000, -2.0000, 3.0000]   grad = [4.3791, 2.9224, 0.7301]
    Step: 1,   NLL: 0.0486   W = [0.5621, -2.2922, 2.9270]   grad = [-0.1198, 0.1043, -0.0016]
    Step: 2,   NLL: 0.0476   W = [0.5741, -2.3027, 2.9271]   grad = [-0.0801, 0.1181, 0.0038]
    ... (a few hundred steps later)
    Step: 999, NLL: 0.0024   W = [1.4923, -4.3960, 2.8096]   grad = [-0.0029, 0.0067, 0.0004]
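
For anyone who wants to reproduce this kind of trace, a rough sketch of the loop, assuming NumPy; the dataset below is randomly generated, so only the starting weights match the comment and the printed numbers will differ:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def nll_and_grad(W, X, y):
    # Mean negative log likelihood and its gradient for logistic regression,
    # written with logaddexp so it stays stable for large |X @ W|.
    z = X @ W
    loss = np.mean(np.logaddexp(0.0, z) - y * z)
    grad = X.T @ (sigmoid(z) - y) / len(y)
    return loss, grad

# Made-up data: a bias column plus two features.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(20), rng.normal(size=(20, 2))])
y = (X[:, 1] + X[:, 2] > 0).astype(float)

W = np.array([1.0, -2.0, 3.0])   # same starting point as the comment
lr = 1.0
for step in range(1000):
    loss, grad = nll_and_grad(W, X, y)
    if step <= 2 or step == 999:
        print(f"Step: {step}, NLL: {loss:.4f} W = {np.round(W, 4)} grad = {np.round(grad, 4)}")
    W -= lr * grad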

  • @squib3083
    @squib3083 2 months ago

    AMAZING work here. Really enjoy your presentation style.

  • @stunks6147
    @stunks6147 2 months ago

    Excited to see how you explain neural networks! Keep up the good work!

    • @howithinkabout
      @howithinkabout 2 months ago

      Thank you! Part 3 will be out next (hopefully in the next week) and right after that will be a neural network series

  • @Giovimax98
    @Giovimax98 2 months ago

    This is actually also a good introduction to Maximum Likelihood Estimation, even though you didn't mention the method explicitly.

    • @howithinkabout
      @howithinkabout 2 months ago

      @@Giovimax98 Thanks! Yeah I found the name intimidating and distracting more than helpful

  • @xavierchen-t8p
    @xavierchen-t8p 2 months ago

    Another goated video! Keep it up!

  • @xavierchen-t8p
    @xavierchen-t8p 2 months ago

    FKING GODSEND… BLESS UR LIFE… BLESS UR FAMILY… U’RE A GREAT TEACHER

  • @FitsumWondessen
    @FitsumWondessen 3 months ago

    really great video

  • @jakeaustria5445
    @jakeaustria5445 3 months ago

    Wow, this channel needs to explode in views!

  • @cornevanzyl5880
    @cornevanzyl5880 3 months ago

    Mitochondria are the powerhouse of the cell

  • @jakesimonds5051
    @jakesimonds5051 3 months ago

    These videos are fantastic. Your pacing is (for me at least) excellent, the illustrations are awesome, and you're doing a fantastic job of motivating everything. Keep it up!!!!!

    • @howithinkabout
      @howithinkabout 3 months ago

      So glad to hear it! Thanks for the kind and encouraging words :) I'll do my best!

  • @ssingh7317
    @ssingh7317 3 months ago

    Keep this series of machine learning concept videos going :)

    • @howithinkabout
      @howithinkabout 3 months ago

      I got big plans for this channel :) but let me know what you would like to learn about!

    • @ssingh7317
      @ssingh7317 3 months ago

      @@howithinkabout I would love to watch videos on the mathematics behind the algorithms, for a better understanding.

  • @frannydonington9925
    @frannydonington9925 3 months ago

    Another great video!! Such good quality explanations. A really great study tool :)

  • @sriveralopez
    @sriveralopez 3 months ago

    Yet another one of those videos that, in your head, you think have 100k+ views but it turns out we're just lucky to be here first.

    • @howithinkabout
      @howithinkabout 3 months ago

      🙏🙏🙏 that’s so encouraging and kind! Thank you!

  • @HM-wo6ic
    @HM-wo6ic 3 months ago

    That was a pleasure to watch.

  • @shahulrahman2516
    @shahulrahman2516 3 months ago

    Great video

    • @howithinkabout
      @howithinkabout 3 months ago

      thanks so much!! I hope you enjoy part 2 when you get around to it :)

  • @kryzhaist2483
    @kryzhaist2483 4 months ago

    Just discovered your channel. Amazing content! Thank you very much for your work, looking forward to seeing more of it!

    • @howithinkabout
      @howithinkabout 4 months ago

      It's definitely hard work to make these videos but comments like yours make it so worth it - thank you so much!

  • @ryanschofield6160
    @ryanschofield6160 4 months ago

    Awesome!

  • @Mystic2122
    @Mystic2122 4 months ago

    Great video, excited for more

    • @howithinkabout
      @howithinkabout 4 months ago

      Thank you so much for watching!! Part 2 (and 3!) should be out real soon

  • @frannydonington9925
    @frannydonington9925 4 months ago

    This was so helpful! A really great, new perspective (and way of explaining it)

  • @mrjackrabbitslim1
    @mrjackrabbitslim1 4 months ago

    Awesome. Question: why would you be adjusting the constants, i.e. in the basic hours/exam-length chart? The final probability went up, but what does that two-fold increase represent in the real world, not in math language?

    • @howithinkabout
      @howithinkabout 4 months ago

      Thanks so much for watching!! And great question! As you change these parameters you generate different probabilities across your space, allowing you to describe it better (and thus make better predictions, etc.). So it's less about what these parameters **mean** and more about what they let you **do** (which is to mold the sigmoid function to the data). In logistic regression there is an interpretation of the parameters as increases to the log-odds, but that's pretty mathy and, as far as I can tell, just happenstance and not by design (happy to elaborate on this if you want). In probit regression, for example, there exists no such interpretation but the mechanism is the same. Similarly for Neural Networks.

    • @mrjackrabbitslim1
      @mrjackrabbitslim1 4 months ago

      @@howithinkabout ah I see. It makes sense that you'd want to shape the sigmoid by manipulating the parameters; intuitively it's still hard to convert the abstract constant value into a tangible example. But as von Neumann said, "in mathematics you don't understand things. You just get used to them". Waiting eagerly for the next vid!

    • @howithinkabout
      @howithinkabout 4 months ago

      @@mrjackrabbitslim1 I'm more of the opinion that if you don't understand it, someone's not explaining it well enough :) I made an animation just for you: github.com/gallettilance/reproduce-those-animations/blob/main/examples/linear_function.gif to demonstrate what happens to the threshold as you change the parameters. This lets you rotate, shift, and center the sigmoid function. The constant specifically is responsible for shifting things. I'll try to make this more clear in part 2 - thanks for sharing your thoughts!

    • @mrjackrabbitslim1
      @mrjackrabbitslim1 4 months ago

      @@howithinkabout wow, thank you!
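
A tiny sketch of the shift-versus-steepness point from the thread above (an illustration with made-up numbers, not the linked animation code):

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w = 2.0  # the weight controls how steeply the sigmoid rises
# The constant b slides the whole curve: the prediction hits 0.5 exactly where w * x + b = 0.
for b in (-2.0, 0.0, 2.0):
    crossing = -b / w
    print(f"b = {b:+.1f}: sigmoid(w*x + b) = {sigmoid(w * crossing + b):.2f} at x = {crossing:+.2f}")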

  • @MarcOBrien-ie4vz
    @MarcOBrien-ie4vz 5 months ago

    Wow what an informative and clear summation with such cool animations well done!

  • @Cheke_180
    @Cheke_180 5 months ago

    Brooo, really loved your content haha. To add to the analogy, let's say you have your eyes closed / you have no touch sensation. I think that might be a great idea, but correct me if I'm wrong. In any case, really loved the video, keep up the good work 🎉🎉

    • @howithinkabout
      @howithinkabout 5 months ago

      Love that idea! If I find a way to visualize it I'll include this in part 2 :)

    • @Cheke_180
      @Cheke_180 5 months ago

      @@howithinkabout sounds awesome! I will be watching it ;)

  • @dann_y5319
    @dann_y5319 5 months ago

    Cool video!!!!

  • @sama32lambda
    @sama32lambda 5 months ago

    Awesome video. It's super intuitive looked at this way

    • @howithinkabout
      @howithinkabout 5 months ago

      Thank you so much! That makes me so happy to hear!

  • @ijosakawi
    @ijosakawi 5 months ago

    Very nice video! I think there's a slight issue: the derivative of x^5 - 40x^3 - 5x can be solved really easily. Its derivative is just 5x^4 - 120x^2 - 5, and you can set that to zero, substitute u for x^2 to get 5u^2 - 120u - 5 = 0, use the quadratic formula to solve for u, take its square roots to get x, and check which is lowest in the original f(x). But the specific equation isn't what's important, and the video is very nice otherwise!

    • @ijosakawi
      @ijosakawi 5 months ago

      (by "the derivative can be solved really easily" I mean "you can easily find the zeroes of the derivative")

    • @howithinkabout
      @howithinkabout 5 months ago

      You're absolutely right! I had to make a decision as to what is "easy" to solve and decided that u substitutions are not :D But great to point out I'm sure many watching the video will learn something from your comment!
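
A quick numeric check of the u-substitution route described in this thread (plain Python, not from the video); the two real critical points come out at roughly x = ±4.90, and comparing f at them picks out the lower one, as the comment suggests:

import math

def f(x):
    return x**5 - 40 * x**3 - 5 * x

def df(x):
    return 5 * x**4 - 120 * x**2 - 5

# Setting u = x^2 turns f'(x) = 0 into 5u^2 - 120u - 5 = 0; quadratic formula:
disc = math.sqrt(120**2 + 4 * 5 * 5)
u_roots = [(120 + disc) / 10, (120 - disc) / 10]

# Only the positive root of u gives real critical points x = ±sqrt(u).
candidates = [s * math.sqrt(u) for u in u_roots if u > 0 for s in (1, -1)]
for x in candidates:
    print(f"x = {x:+.4f}  f'(x) = {df(x):+.2e}  f(x) = {f(x):+.2f}")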

  • @mrjackrabbitslim1
    @mrjackrabbitslim1 5 months ago

    Awesome. We'll watch as many of these as you're going to make.

  • @arihansharma6384
    @arihansharma6384 5 months ago

    This is an AWESOME introduction to gradient descent! I also love that it's more of a high-level overview rather than delving into the nitty gritty details of the calculus required to make it happen- it's surprisingly beneficial for those that are already used to the concepts. Looking forward to watching the Part 2 soon!

    • @howithinkabout
      @howithinkabout 5 months ago

      So glad to hear that! That means a lot to me especially in this early stage of starting this channel! And I completely agree. The nitty gritty often gets in the way of truly understanding certain concepts, but since those are often the only details we're tested on in school, it's hard to realize that something is missing.

  • @FreerunnerCamilo
    @FreerunnerCamilo 5 months ago

    First year CS major here dipping my toes in ML and this explanation makes a lot of sense, would love more videos like this! Subbed.

    • @howithinkabout
      @howithinkabout 5 months ago

      That’s so great to hear, thank you for your encouraging words! Please feel free to suggest topics you would like me to cover

  • @dhruvssharma1458
    @dhruvssharma1458 5 months ago

    How would the concept of momentum tie into your explanation? Because when using Adam one usually specifies the learning rate (which is the step size) and the momentum.

    • @howithinkabout
      @howithinkabout 5 months ago

      Great question! I'm planning to talk more about variants of GD (which includes Adam) in an upcoming video about how to avoid some of the pitfalls of GD. But the tl;dr is this: Adam tries to use historical information contained in the successive gradients to make better step adjustments.
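
As a rough sketch of what "using the history of gradients" can look like, here are the standard momentum and Adam update rules in their generic textbook form (not code from the channel); the quadratic in the usage example is made up:

import numpy as np

def momentum_step(w, grad, velocity, lr=0.1, beta=0.9):
    # Momentum: step along an exponentially decaying average of past gradients.
    velocity = beta * velocity + grad
    return w - lr * velocity, velocity

def adam_step(w, grad, m, v, t, lr=0.001, b1=0.9, b2=0.999, eps=1e-8):
    # Adam: first- and second-moment estimates of the gradient history,
    # bias-corrected, give each parameter its own effective step size.
    m = b1 * m + (1 - b1) * grad
    v = b2 * v + (1 - b2) * grad**2
    m_hat = m / (1 - b1**t)
    v_hat = v / (1 - b2**t)
    return w - lr * m_hat / (np.sqrt(v_hat) + eps), m, v

# Toy usage on f(w) = w^2, whose gradient is 2w.
w, m, v = np.array([5.0]), np.zeros(1), np.zeros(1)
for t in range(1, 301):
    w, m, v = adam_step(w, 2 * w, m, v, t, lr=0.1)
print(w)  # close to 0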

  • @Shourya-bc7ku
    @Shourya-bc7ku 5 months ago

    loved the video, the format, the animations. hope to see more from you

    • @howithinkabout
      @howithinkabout 5 months ago

      thank you so much for the encouraging words!

  • @alefalfa
    @alefalfa 5 months ago

    I watched 3blue1brown and thought there was nothing else to learn about gradient descent. I was wrong. Thank you for the video!

    • @howithinkabout
      @howithinkabout 5 months ago

      thank you so much for the kind words!! It means so much

  • @HIMANIEC
    @HIMANIEC 5 months ago

    wow!