To learn more about Lightning: github.com/PyTorchLightning/pytorch-lightning
To learn more about Grid: www.grid.ai/
Support StatQuest by buying my book The StatQuest Illustrated Guide to Machine Learning or a Study Guide or Merch!!! statquest.org/statquest-store/
Oh ok 👍
Can you please do a video on transformers?
@@rskandari I'm working on one.
I almost quit trying to understand CNNs because of the fancy jargon all over the internet. After watching your playlist, you gave me a ray of hope. You are a freaking genius at explaining things simply. I hope to see a playlist with advanced CNN topics (object detection, semantic segmentation, and Siamese networks). Thank You 3000
Glad I could help!
Oh yeah, it would be very helpful to have videos on the advanced topics!
"Mathematicians and machine learning people define tensors in different ways".
This one sentence made a world of difference for my learning.
Maybe it's just me, but I can't thank you enough.
Thank you! :)
@@statquest Thanks man. Most statements you made really were eye openers in this field. Thank you again.
Me reading ML papers and finding tensors: Ugh
Me watching StatQuest and finding tensors: Triple BAM!!!
Hooray!
Triple BAM indeed!
You rock! I learned much more from your series on NNs in 3 days than sitting in a machine learning class for a whole semester!
Wow, thanks!
I thought that I understood ANN, but now I feel that everything is so much more intuitive. Thank you!
Glad it was helpful!
So excited! I’ve been trying to understand tensors; can’t wait 🥳
Hooray! :)
I can't thank you enough, sir; this is so well explained I'm almost crying. Thank you so much for your efforts. I'll definitely buy some of the study material you offer when I'm able to.
Thank you! :)
Thank you! As a physicist I was originally confused by tensors in neural networks. Great video. It'll be cool to include some tensor manipulations in this video or a future one :)
Thanks! My videos on coding in PyTorch show tensors in action: statquest.org/video-index/
The best channel I've ever seen for data science ❤️
Thank you!
Cool. I just gave a lecture today on how to do linear regression with Pytorch using basic tensor operations. I'm sure your presentation will be great!
Cool!
You are the best teacher on my list !!
Wow, thanks!
I have a CS study project about GNNs and was looking up tensors, and I was hit by the agony of tensors in the context of deep mathematics and physics. The moment I open a CS video about tensors, I'm met with music and good vibes.
bam! :)
Just got this video at the right time, and I already know that after seeing it I will have my concepts cleared up.
BAM! :)
Thank you very much for all the effort you put into your presentations! and thank you for making it as fun, simple and useful as possible! You're the best dude out there! 💘💘
Thank you very much!
Finally I am not confused, as I did not know the ML tensor is different from the tensor in maths (even though I still don't know how GPUs work)! Thank you!!!
Hooray!
Subbed. I need more StatQuest in my life.
BAM! :)
Everyone: Darth Vader is the greatest villain to Luke Skywalker's hero
StatQuest: Bam, meet Ugh
Exactly! BAM vs ugh....
Very well explained with those interesting pictorial representations of inputs, activation functions, and all.
Thanks!
Perhaps a top-3 jingle; I really enjoyed it. Even with time to reflect, I am going with: 1) StatQuest, it's bad to the bone; and 2) we're going to do a lot of maths step by step by step... StatQuest... the bangers
This is definitely one of my favorites. I also really like this one: ruclips.net/video/azXCzI57Yfc/видео.html
Simple and nice tutorial, Professor. But I expected a more in-depth and comprehensive tutorial about tensors.
Thank you Professor.
Noted
Thanks for sharing after a long time
Thanks!
This is what I am waiting for BAM!!!
Hooray! :)
I am in love with tensors after seeing your video🤣
bam! :)
Very well explained as usual. Can we have one video for Automatic Differentiation also please?
I'll keep that in mind.
Me everytime Josh uploads: YIPPEEEEE
Bam!
Whoaaa!
That is a very clear and fun explanation; I've never learned like this before.
Feeling Blessed
[Edit: This Guy is seriously Under Rated]
Thanks!
Thank you for this really good explanation!
Thank you!
I am in a bit of a quandary, trying to decide which of your skills is superior: your skill as a singer or your skill as a teacher of machine learning!
bam! :)
@@statquest I should add; both skills are extraordinary!
@@exxzxxe You're too kind!
I literally needed this
Bam! :)
I live for the guitar intro and BAMs
YES! :)
Loved it. Great vid. An ML explainer I can actually understand. Exciting, such BAM! Gonna watch everything else next. I should take your ML course, I assume you have one -- with exercises and such?
I don't have a course yet. I hope that one day I will. :)
I liked the intro.
Tensormaster!
Bam! :)
BAM, BAM, BAM, BAM..................BAM.. Great Sir
Thank you!
Thank you, I was about to leave this planet because of the wonderful people who are given the task of teaching students about ML but can't teach a thing and give a zero when the students eventually fail.
Glad this was helpful.
looking forward to more fancy topics in Deep Learning. Btw, thanks for sharing.
Thanks!
Sorry, Mr. Josh, I can't watch the premiere, because it will be 01:00 in my area at that time 😂😂😂 I will definitely watch the video the 2nd day 🤔🤔🤔👍🏻👍🏻👍🏻
Bam! :)
Subscribed just for that intro
bam! :)
It would be great if you did a video covering automatic differentiation next!
I'll keep that in mind! :)
@@statquest Please double keep it in mind, it would be super helpful!
Baam
Super excited for this one!
Bam!
Uhhahah, boxes with numbers inside🤗. Very exciting!! They come in different colors, right? 🤩
Ha! Of course!!! BAM! :)
Very interesting way to teach :)
Thanks!
9:02 shameless self promo -> proudly self promo 😆 😆 😆
bam!
You are amazing, bro! Thanks for the amazing videos.
Glad you like them!
Sir, please continue this series on Tensors. Especially tensor factorization.
Please.
I'll keep that in mind!
Sir, you have taught me more in a few videos than my professors did in one full year. I am ever grateful to you.
Also, could you please do more videos on TensorFlow (the theory part, e.g., eager/graph execution, name scopes, placeholders, etc.)?
I'm doing PyTorch right now if you are interested in that. Just search for PyTorch on this page: statquest.org/video-index/
Why such a long delay from the time this video is posted to the time it is actually available? A full week seems excessive...
This is the first time I've even tried doing a Premiere, so I have no idea what the normal procedure is. How long do people usually have to wait? I picked a week out simply because I thought 1) it would be fun to try a premiere (since I've never done one before and want to see what it is like) and 2) I'm all booked until a week from today. Would it be better to not announce the video/premiere until later this week?
@@statquest I'd suggest 4-24 hours between upload and the release time, maybe up to 48 hours for a major event release. The reasons it is awkward to set a longer delay:
1) For people who get notifications, unless it is a premiere for an unusually important video that they should indeed be looking forward to as an event, it can be annoying to get notified about something that can't be watched for multiple days.
2) The video gets added to the Subscriptions feed right away (as a Premiere), even though it isn't watchable until the date. So it just sits there cluttering the feed. This can have two effects: (a) For people who hide videos after they've watched them in this feed, it's tempting to just hide the video if it is sitting there for too long. (b) And for those who don't use the "hide" feature, the video will be buried by the time it goes live, even if it is resurfaced at the release. In this second case, the value of the Premiere is largely lost, because the reminder was buried under a bunch of other videos for several days.
That's my line of thinking anyways.
Awesome!!! Thanks for the tips!!! I really appreciate it. I'll keep this in mind for the next Premiere that I do.
the song is really awesome
bam! :)
StatSquatch is totally awesome!
Hooray! BAM! :)
Your videos are great!! I just saw your video on decision trees and you explained the concepts so clearly, I immediately subscribed.
Would you ever go over Patient Rule Induction Methods (PRIM)? It seems like a really interesting algorithm in OLAP contexts, but all I really see of it are complicated, math-notation-heavy white papers and patent applications that tweak the original to be more efficient (but use their own made-up lexicon to describe it).
I'll keep that in mind.
Excellent video!
Thank you! Cheers!
I love your videos about neural networks. Could you also make some videos about policy gradients, which tend to be nice for continuous data?
I'll keep that in mind.
Hoping that you are gonna make a series on CNN from this video🤞
I have a video on CNNs here: ruclips.net/video/HGwBXDKFk9I/видео.html however, in the future I plan on more applied videos that show how to do it in PyTorch-Lightning.
😆 Let's go! I think I won't be able to sleep well tonight; I'll need at least 3 days to properly classify and get command of it. But as always, it really helped me a lot to clear up all my doubts and confusion 💥💥 double bam 😄 👍
:)
@@statquest Thanks from the bottom of my heart, sir; lots of people are getting skills-based, quality knowledge 🙏👍
Would you be able to make a video on how tensors support automatic differentiation?
That's a good idea. I'll keep that in mind.
Multiple bams!!
Thats so easily bammed to me now!!
:)
Great video. I don't see that tensors in math and physics are somehow different from ML ones, though, because they are still the same tool, just with different applications. You even still have Einstein's summation notation (einsum).
noted
Ooo0Oooo very exciting!
Bam!
Tensors for students, their mamas and papas
Tensor for breakfast and those who's from Belfast
Bim para bam bom paw...
StatQuest ! 💘
That's awesome!!! :)
This was tense 😊
:)
Hi Josh! Thank you for all your amazing videos! Can you make a video about Graph Neural Network? Thanks a lot!
I'll keep that in mind.
Interesting. Thanks. I come from a manifold/engineering point of view, which turns out to be a useful mental tool for some sorts of chemistry. You have to imagine, often, how some sorts of molecules interact. Using, or having a background in, manifolds or linear algebra turns out to be an excellent adjunct. Who knew? I thought that the maths were just a lot of fun at the time.
:)
Nice video... btw, any plans on making videos on transformer neural networks and attention?
Yep!
A little slow, but a great explanation.
Thanks!
Thanks!
@@statquest No no, thank you!!
At 1.25 speed it was awesome!
@@Antz_411 1.25xBAM!!!
Hi, I need a more advanced video about tensors... The feed-forward step can be written as g(Wx + b), where W is the weight matrix, x is the input vector, b is the bias, and g is the activation function. Now, what if x is not a vector but a matrix or a cube? I need the generalized algorithm for the feed-forward step. There is no place on the internet with that algorithm. Thank you.
I'll keep that in mind.
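For what it's worth, one common way to generalize the feed-forward step is to stack the input vectors as the rows of a matrix X and compute g(XW + b), letting broadcasting handle the bias. A minimal NumPy sketch (all sizes here are made up for illustration):

```python
import numpy as np

def relu(z):
    # A common choice for the activation function g
    return np.maximum(0.0, z)

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 3))   # 4 samples, each ROW is one 3-feature input vector
W = rng.normal(size=(3, 2))   # weight matrix: 3 inputs -> 2 output units
b = np.zeros(2)               # bias, broadcast across all rows of X @ W

out = relu(X @ W + b)         # shape (4, 2): one output row per sample
```

For a "cube" of inputs (e.g., a batch of images), frameworks apply the same idea, with the matrix multiplication acting on the last axis of the higher-dimensional array.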
So Tensors are basically just faster matrices?
And also, is there a difference between tensors and safetensors when talking about image generation AI?
Tensors also have automatic differentiation. And, as far as I can tell, "safetensor" is a way to store tensors on disk that comes with some nice features, like not having to load the entire file into memory in order to inspect the values.
@@statquest Ahh okay, I think I got it now :) Thanks a lot!
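To make the automatic-differentiation part concrete, here is a tiny, self-contained sketch of forward-mode autodiff using dual numbers. It is not how PyTorch works internally (PyTorch uses reverse mode), but it shows how derivatives can be computed automatically alongside values:

```python
# Minimal dual-number sketch of forward-mode automatic differentiation.
class Dual:
    def __init__(self, value, deriv):
        self.value, self.deriv = value, deriv

    def __add__(self, other):          # sum rule: (u + v)' = u' + v'
        return Dual(self.value + other.value, self.deriv + other.deriv)

    def __mul__(self, other):          # product rule: (uv)' = u'v + uv'
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)

x = Dual(3.0, 1.0)        # seed the derivative with dx/dx = 1
y = x * x + x             # f(x) = x^2 + x
# y.value == 12.0 and y.deriv == 7.0, since f'(x) = 2x + 1 = 7 at x = 3
```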
Could u also make one for the assumptions of linear and logistics regression
I'll keep that in mind.
Bam! Tensor is flowing
Ha! you made me laugh! :)
"Tensor cores are processing units that accelerate the process of matrix multiplication," so then we're calling them tensors instead of matrices, so we can use tensor cores, which multiply matrices. Makes sense.
Unfortunately Neural Networks have lots of terminology along these lines.
Well, tensors are generalized matrices not limited to two dimensions, so just as 2D concepts are useful in our 3D world, I'm assuming matrix operations are useful with tensors.
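That matches how NumPy (and PyTorch) treat things: a "tensor" in the ML sense is just an n-dimensional array, and matrices are the special 2-D case. A quick sketch:

```python
import numpy as np

scalar = np.array(7.0)           # 0-D "tensor": a single number
vector = np.array([1.0, 2.0])    # 1-D: a column of numbers
matrix = np.ones((2, 3))         # 2-D: the familiar matrix
cube   = np.zeros((2, 3, 4))     # 3-D: beyond what a matrix can hold

# ndim counts the axes, which is what ML people informally call the
# tensor's "dimension"
dims = [scalar.ndim, vector.ndim, matrix.ndim, cube.ndim]  # [0, 1, 2, 3]
```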
Sooo, a tensor is an array or ndarray with extra properties for storing neural-net weights and biases?
Yes, and they store your data so that you can take advantage of hardware acceleration and automatic differentiation.
Indeed, tensors also store input and output values.
Another banger
:)
Could you do a video about "Bach training", or whatever it is called :), and how all the partial derivatives are handled in those situations? For example, whether they are added into a sum or the average derivative is calculated.
For details on "batch training" see my video on Stochastic Gradient Descent: ruclips.net/video/vMh0zPT0tLI/видео.html Also, whether or not we add or average the derivatives depends on the loss function. If we use the Sum of the Squared Residuals, we simply add. If we use Mean Squared Error, we use the average.
@@statquest I'm not clear about what you mean by adding the derivatives. Are you referring to adding the derivative to the weight/bias?
@@manujarora5062 For each data point, we calculate the derivative. We can add them, or we can average them. For details, see: ruclips.net/video/sDv4f4s2SB8/видео.html
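As a concrete illustration of adding versus averaging the per-data-point derivatives, here is a tiny Python sketch for a one-parameter model pred = w * x (the data values are made up):

```python
# Made-up data for a simple model pred = w * x
xs = [1.0, 2.0, 3.0]
ys = [2.0, 4.0, 6.0]
w = 1.0  # current guess for the weight

# d/dw (y - w*x)^2 = -2 * x * (y - w*x): one derivative per data point
per_point = [-2 * x * (y - w * x) for x, y in zip(xs, ys)]

grad_ssr = sum(per_point)            # Sum of Squared Residuals: add them
grad_mse = sum(per_point) / len(xs)  # Mean Squared Error: average them
```

Either way, the gradient points in the same direction; only its scale differs, which in practice interacts with the learning rate.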
I would really love it if you could do a video on Projection Pursuit Analysis, since there aren't any great videos explaining the statistical underpinnings. Thanks for the excellent content as always!
I'll keep that in mind.
Ugh, math? Anti-BAM!!!
Awesome explanation :)!!
I'm a biologist and used to think that tensors in math and ML are the same! Does anyone know how to think about them?
The tensors from math have specific mathematical properties that are completely ignored by people who do neural networks.
@@statquest thank you Josh :)
The simple explanation is that a tensor is something that transforms like a tensor
Noted
Automatic Differentiation
Yep!
RNN, NLP and word embedding pliss !!! Tkss!!!
I'm working on them.
Waiting for vanishing and exploding(BAMMM) gradients!
Noted!
thanks
bam!
BAM!
:)
My only question is: why can tensors run on GPUs? I've been trying to find information on it for the longest time and still found nothing.
Why can't NumPy arrays be stored on a GPU?
Thanks in advance!
PS: Thanks to statquest, I was able to pass my data science class!!
GPUs have their own instruction set, which is different from what you find on a standard CPU, so you have to code for that specifically. For details, see: en.wikipedia.org/wiki/CUDA
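In practice, that is why NumPy arrays stay in CPU memory while framework tensors can be moved to a GPU. A hedged sketch (the GPU branch only runs if PyTorch and CUDA are actually available):

```python
import numpy as np

a = np.arange(6.0).reshape(2, 3)   # a plain NumPy array: CPU memory only

try:
    import torch
    t = torch.from_numpy(a)                        # wraps the same CPU data
    device = "cuda" if torch.cuda.is_available() else "cpu"
    t = t.to(device)                               # copies to the GPU when one is present
except ImportError:
    t = None  # PyTorch not installed; the NumPy array just stays on the CPU
```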
Oh wow! He's gone heavy metal now.
:)
I think it would be awesome to cover GRU units so we can compare them with LSTMs. Please!!!
I'll keep that in mind.
Noice 👍
Thanks Matt!
they be creating tension. thas it
:)
This guy sounds like Mr. Garrison from South Park.
ha! :)
Why is the SMOTE video gone?
I haven't done a video on SMOTE yet...
I have a Deep Learning exam in two days, so thanks I guess
Best of luck! BAM!
NOOO MATH IS NOT UGHH, ITS AWESOMEEEE
:)
once again, saved my ass
bam! :)
Bamm !!!
:)
As a physicist, now I'm very confused
Yes, for Physics, tensors are a little more than just fancy data structures that are optimized for high speed computing.
triple bam,!!!!!!!!!!
:)
triple ugh... so much math!!
bam! :)
exciting 😆
Thank you! :)
Thank you. However, there was not enough explanation. I'd like to ask you to do this tutorial with more details and very slowly! Please.
I'll keep that in mind.
BAM
:)
I love you
:)
It's always helped me to remember that a tensor is a thing that transforms like a tensor, but a *tensor* is a thing like a thing that transforms like a tensor but which may or may not transform like a tensor.
ha!
Hello this one is homework
Question:
To examine the bone mineral density of women with ankle fractures, the investigators recruited 10 postmenopausal women with ankle fractures and 12 healthy postmenopausal women to serve as controls. The stiffness index of the lunar Achilles in each woman was obtained. The mean stiffness index for the ankle fracture group was 76.4 with a standard deviation of 5.83. In the control group, the mean was 82.3 with a standard deviation of 6.34. Assume that both samples are drawn from normal populations.
(i) Test at 5% level of significance, whether the variances of the stiffness
indices for the two groups are equal.
(ii) Using the p-value approach, examine whether these data provide sufficient
evidence to conclude that, in general, the mean stiffness index is higher
in healthy postmenopausal women than in postmenopausal women
with ankle fractures? Take α = 0.05
(iii) Obtain a 95% confidence interval for the difference of two population
mean stiffness indices. Does this interval confirm the conclusion derived
in part (ii)?
If you want help with homework, you should post to some of the stats channels on Reddit. Those people are super helpful! BAM!
@@statquest What is the name of the channel, please?