still breathing … good
how is the new outro?
Loving it, fantastic job
the intro is soo good!! dont change
and also thats not simple maths cz i don't remember anything
is that what made you take so long?
yes its good
dude i missed your videos
Dude, I can't stress enough how amazing I felt when I saw that you uploaded a video, it's been a while, hope you are doing great!
This is the canonical least-squares problem from optimization theory; some linear algebra can help us skip the iteration entirely and find the best 1- or n-dimensional subspace that fits the R^n -> R problem we're optimizing. Great video btw.
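For anyone curious, a minimal sketch of that non-iterative route via the normal equations (the data here is made up, not from the video):

import numpy as np

# made-up sample points
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 4.0, 6.2, 7.9, 10.1])

# design matrix with a column of ones for the intercept
X = np.column_stack([x, np.ones_like(x)])

# normal equations: (X^T X) [m, b]^T = X^T y -- one solve, no iteration
m, b = np.linalg.solve(X.T @ X, X.T @ y)
print(m, b)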
Don't let the channel die.
I'm learning to code on Odin, and bookmarked it.
Soon I'll be able to get it.
You are alive .
Was waiting for you.....
Welcome back😊
Linear regression is usually done with linear algebra methods (matrix factorizations) via least squares. Gradient descent is usually reserved for when no non-iterative method can be used -> like fitting ML models
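In NumPy that one-shot solve is a single factorization-backed call (again, hypothetical data):

import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 2.9, 5.1, 7.0])

# lstsq factorizes the design matrix internally (SVD) -- no gradient steps
X = np.column_stack([x, np.ones_like(x)])
(m, b), *_ = np.linalg.lstsq(X, y, rcond=None)
print(m, b)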
finally the penguin is back, need more content!
its a good day when pwn uploads
What a great way to start a (lunar) new year! Welcome back!
Finally the man who taught me randomness is back
Whoa, watching this before 8am... this was too much thinking for me this early :) needs a warning at the beginning hah! Oh man, just brings back memories from college. My nephew texted me the other day, asking if calculus was important for programming. I think you came up with a decent example of this in use. I've never had to use calculus in over 20 years of programming, but I do think that, depending on the job, some will use it. My best advice/guidance to him was that all of the math that I took in college really helped with problem solving, and really taught me to think at a very deep level.
Love the simple explanation and visuals to explain these problems. Really helps!
14:58 indent error line 57
This is why i hate python
200k subs! Congratulations!
Please make more such videos, as you said covering gradient descent.
I really wanna watch an explanation and application of Neural networks by you. I know it might be a long video, and hard to follow through, but just, train us, like bit by bit, teach us from the ground up. Keep this up. Also, can you please provide a list of topics that should be learned, to reach the complexity of neural networks? Like a roadmap for machine learning concepts like this, that would really help out.
Thanks.
Love your videos❤.
babe, wake up, pwnfunction posted!
Heyyy! Really nice to see you back bro! 😊
Welcome back! 🎉
Hey! Welcome back! Nice to see you again.
Welcome Back Man. Was missing you.
damnn a video after a year, less gooo
Yo! How do you make your videos? What software do you use? Where did you learn to use it? And, how much time did it take for you to actually learn it?
THE GOAT IS BACKKK
feels good seeing you alive
Why no new videos?
it sucks that YouTube kinda punishes you if you don't post regularly, but I miss the videos on this channel
Amazing content, keep it up!
The legend is back
you would be a great teacher
Woow...i thought i was dreaming. Welcome back
please make more content like this
The YouTube compression hates your gradients and your axes
How did you know you had to subtract the learning_rate * dedm and learning_rate * dedb when adjusting m and b? Why not add?
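The gradient points in the direction of steepest *increase* of the error, so you step against it to go downhill; adding it would make the error grow. A toy example with a made-up one-parameter error (not the video's code):

m, learning_rate = 0.0, 0.1
for _ in range(50):
    # e(m) = (m - 3)**2 has its minimum at m = 3
    dedm = 2 * (m - 3)
    m -= learning_rate * dedm  # minus: walk downhill
print(m)  # converges to ~3; with `m +=` it runs away from the minimum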
Hello, would you like to share the name of the software you're using for editing videos? They look quite awesome!
Nice to see you're alive...
Hi Pwn, the link (and its label) in the description are from the previous video, you forgot to change them when pasting.
My bad, thanks!
master come back
I know elementary school linear regression but I'm lost immediately at 6:30 with this weird python notation. We're unpacking a list, zipping it together, and then casting it to a list again?
Think of zip as transpose. And since zip has lazy execution (it's a generator), the outer list forces it to run and actually produce the data.
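Concretely, with a hypothetical list of points like the one in the video:

points = [(1, 2.1), (2, 4.0), (3, 6.2)]

# zip(*points) unpacks the pairs and regroups them column-wise -- a transpose
xs, ys = zip(*points)
print(xs)  # (1, 2, 3)
print(ys)  # (2.1, 4.0, 6.2)

# zip is lazy (an iterator), so list(...) forces it to produce the data
print(list(zip(*points)))  # [(1, 2, 3), (2.1, 4.0, 6.2)]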
cool vid! whats that code theme?
The legend is back!
at 1:15 the line should rotate around x=0
PWN STILL ALIVE!
oh, so you're alive? cool
Bringg more videosss
Any reason for 'm' instead of 'a'?
I know
y = ax+b
Variable names are arbitrary?
Using machine learning for linear extrapolation is like hunting ducks with a GAU-8 gatling gun
long time no see bruh
Show me how to make such a model for casino crash games. Based on the time of the end of the rounds and its coefficient. We will be very grateful to you.
I thought you were dead
Wtf
What happened to the intro? I liked it soo much
quality content
Fantastic job, it's wonderful content; the kind of depth and clarity you have about the concept is commendable, and you've ignited my curiosity about maths.
If possible please make a video on how you learn things in depth and breadth. Seriously wonderful job dude.
🫡
There's no way my man just did gradient descent when there's a simple closed-form formula for linear regression lmao. Fair enough if you want a simple example, but I feel like this neither shows the power of gradient descent nor an efficient way to find a line of best fit.
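For reference, that closed form is a one-liner with NumPy (made-up data):

import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([3.2, 4.9, 7.1, 9.0])

# degree-1 polynomial fit == the ordinary least squares line of best fit
m, b = np.polyfit(x, y, 1)
print(m, b)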
Where have you been?
sounds like least squares but overcomplicated for the sake of buzzwords
Omg he's back
Bro forgot indent at line 56 15:01
Hi its nice you are back 🎉🎉
you should keep the glasses. thats much better. 3:11
A video can have no sound effects or editing at all, but as soon as money is mentioned there's got to be that cha-ching sound... At this point I suspect this is a YouTube bug.
yooo he's back
"y" is almost the same as "4" in this font
pro tip for a beginner content creator :) don't do gradient background, the banding over yt bitrate is horrible ^^
hi pwn
👋
he back :D
i was waiting for a pwning video :/
Missing the old voice bro, 😭
Oh god... please don't use gradient descent for linear regression.
Or you can just look at the principal eigenvector of the covariance matrix ;)
Edit: actually, it's way simpler (you can just analytically solve de/dm = 0 straight away):
m = (y' · x') / (x' · x')  (dot products)
b = mean(y) - m * mean(x)
where x' = x - mean(x), y' = y - mean(y), i.e. the vectors with their averages subtracted.
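A quick check of that formula in plain NumPy (sample data is invented):

import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 4.1, 5.9, 8.2, 9.8])

# center both vectors, then one dot-product ratio gives the slope
xc, yc = x - x.mean(), y - y.mean()
m = (yc @ xc) / (xc @ xc)
b = y.mean() - m * x.mean()
print(m, b)  # matches np.polyfit(x, y, 1)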