To learn more about Lightning: github.com/PyTorchLightning/pytorch-lightning
To learn more about Grid: www.grid.ai/
Support StatQuest by buying my book The StatQuest Illustrated Guide to Machine Learning or a Study Guide or Merch!!! statquest.org/statquest-store/
Oh ok 👍
Can you please do a video on transformers?
@@rskandari I'm working on one.
I almost quit trying to understand CNNs because of all the fancy jargon on the internet. After watching your playlist, you gave me a ray of hope. You are a freaking genius at explaining things simply. I hope to see a playlist with advanced CNN topics (object detection, semantic segmentation, and Siamese networks). Thank You 3000
Glad I could help!
Oh yeah, it would be very helpful to have videos on the advanced topics!
Me reading ML papers and finding tensors: Ugh
Me watching StatQuest and finding tensors: Triple BAM!!!
Hooray!
Triple BAM indeed!
"Mathematicians and machine learning people define tensors in different ways".
This one sentence made a world of difference for my learning.
Maybe it's just me, but I can't thank you enough.
Thank you! :)
@@statquest Thanks man. Most statements you made really were eye openers in this field. Thank you again.
You rock! I learned much more from your series on NNs in 3 days than from sitting in a machine learning class for a whole semester!
Wow, thanks!
I thought that I understood ANN, but now I feel that everything is so much more intuitive. Thank you!
Glad it was helpful!
I can't thank you enough, sir; this is so well explained I'm almost crying. Thank you so much for your efforts. I'll definitely buy some of the study material you offer when I'm able to.
Thank you! :)
So excited! I've been trying to understand tensors. Can't wait 🥳
Hooray! :)
The best channel I've ever seen for data science ❤️
Thank you!
Thank you! As a physicist I was originally confused by tensors in neural networks. Great video. It'll be cool to include some tensor manipulations in this video or a future one :)
Thanks! My videos on coding in PyTorch show tensors in action: statquest.org/video-index/
Cool. I just gave a lecture today on how to do linear regression with PyTorch using basic tensor operations. I'm sure your presentation will be great!
Cool!
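For anyone curious, here's a minimal sketch of that idea: fitting a straight line with basic tensor operations in PyTorch. The toy data and learning rate are made up purely for illustration.

```python
import torch

# made-up toy data: y is roughly 2*x + 1
x = torch.tensor([[1.0], [2.0], [3.0], [4.0]])
y = torch.tensor([[3.1], [4.9], [7.2], [9.1]])

# the weight and bias live in tensors that track gradients
w = torch.zeros(1, 1, requires_grad=True)
b = torch.zeros(1, requires_grad=True)

for step in range(1000):
    y_hat = x @ w + b                 # prediction, using basic tensor ops
    loss = ((y_hat - y) ** 2).mean()  # mean squared error
    loss.backward()                   # automatic differentiation
    with torch.no_grad():
        w -= 0.01 * w.grad            # plain gradient descent update
        b -= 0.01 * b.grad
        w.grad.zero_()
        b.grad.zero_()

print(w.item(), b.item())             # should end up close to 2 and 1
```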
Finally I'm not confused! I didn't know that an ML tensor is different from a tensor in math (even though I still don't know how a GPU works). Thank you!!!
Hooray!
I have a CS study project about GNNs and was looking up tensors, and I was hit by the agony of tensors in the context of deep mathematics and physics. The moment I open a CS video about tensors, I'm met with music and good vibes.
bam! :)
Just got this video at the right time, and I already know that after seeing it my concepts will be clear.
BAM! :)
You are the best teacher on my list !!
Wow, thanks!
Thank you very much for all the effort you put into your presentations! and thank you for making it as fun, simple and useful as possible! You're the best dude out there! 💘💘
Thank you very much!
This is what I am waiting for BAM!!!
Hooray! :)
Very well explained with those interesting pictorial representations of inputs, activation functions, and all.
Thanks!
Simple and nice tutorial, Professor. But I expected a more in-depth and comprehensive tutorial about tensors.
Thank you, Professor.
Noted
Subbed. I need more StatQuest in my life.
BAM! :)
Whoaaa!
That is a very clear and fun explanation; I've never learned like this before.
Feeling Blessed
[Edit: This guy is seriously underrated]
Thanks!
I am in a bit of a quandary, trying to decide which of your skills is superior: your skill as a singer or your skill as a teacher of Machine Learning!
bam! :)
@@statquest I should add; both skills are extraordinary!
@@exxzxxe You're too kind!
Loved it. Great vid. An ML explainer I can actually understand. Exciting, such BAM! Gonna watch everything else next. I should take your ML course, I assume you have one -- with exercises and such?
I don't have a course yet. I hope that one day I will. :)
Me every time Josh uploads: YIPPEEEEE
Bam!
Very well explained as usual. Can we have one video for Automatic Differentiation also please?
I'll keep that in mind.
Everyone: Darth Vader is the greatest villain to Luke Skywalker's hero
StatQuest: Bam, meet Ugh
Exactly! BAM vs ugh....
Thanks for sharing after a long time!
Thanks!
I am in love with tensors after seeing your video🤣
bam! :)
Perhaps a top 3 jingle; I really enjoyed it. Even with time to reflect, I am going with: 1) StatQuest, it's bad to the bone; and 2) we're going to do a lot of math, step by step by step... StatQuest... the bangers.
This is definitely one of my favorites. I also really like this one: ruclips.net/video/azXCzI57Yfc/видео.html
I literally needed this
Bam! :)
It would be great if you did a video covering automatic differentiation next!
I'll keep that in mind! :)
@@statquest Please double keep it in mind, it would be super helpful!
Baam
looking forward to more fancy topics in Deep Learning. Btw, thanks for sharing.
Thanks!
Subscribed just for that intro
bam! :)
Thank you for this really good explanation!
Thank you!
Your videos are great!! I just saw your video on decision trees and you explained the concepts so clearly, I immediately subscribed.
Would you ever go over Patient Rule Induction methods (PRIM)? It seems like a really interesting algorithm in OLAP contexts, but all I really see of it are complicated, math-notation-heavy white papers and patent applications that tweak the original to be more efficient (but use their own made up lexicon to describe it).
I'll keep that in mind.
I liked the intro.
Tensormaster!
Bam! :)
Very interesting way to teach :)
Thanks!
You are amazing, bro! Thanks for the amazing videos.
Glad you like them!
I love your videos about neural networks! Could you also make some videos about policy gradients, which tend to be nice for continuous data?
I'll keep that in mind.
Uhhahah, boxes with numbers inside🤗. Very exciting!! They come in different colors, right? 🤩
Ha! Of course!!! BAM! :)
Sorry, Mr. Josh, I can't watch the premiere, because it will be 01:00 in my area at that time 😂😂😂 I will definitely watch the video the next day 🤔🤔🤔👍🏻👍🏻👍🏻
Bam! :)
I live for the guitar intro and BAMs
YES! :)
Sir, you have taught me more in a few videos than my professors did in a full year. I am ever grateful to you.
Also, could you please do more videos on TensorFlow (the theory side, e.g., eager/graph execution, name scopes, placeholders, etc.)?
I'm doing PyTorch right now if you are interested in that. Just search for PyTorch on this page: statquest.org/video-index/
BAM, BAM, BAM, BAM..................BAM.. Great Sir
Thank you!
Sir, please continue this series on Tensors. Especially tensor factorization.
Please.
I'll keep that in mind!
StatSquatch is totally awesome!
Hooray! BAM! :)
Why such a long delay from the time this video is posted to the time it is actually available? A full week seems excessive...
This is the first time I've even tried doing a Premiere, so I have no idea what the normal procedure is. How long do people usually have to wait? I picked a week out simply because I thought 1) it would be fun to try a premiere (since I've never done one before and want to see what it is like) and 2) I'm all booked until a week from today. Would it be better to not announce the video/premiere until later this week?
@@statquest I'd suggest 4-24 hours between upload and the release time, maybe up to 48 hours for a major event release. The reasons it is awkward to set a longer delay:
1) For people who get notifications, unless it is a premiere for an unusually important video that they should indeed be looking forward to as an event, it can be annoying to get notified about something that can't be watched for multiple days.
2) The video gets added to the Subscriptions feed right away (as a Premiere), even though it isn't watchable until the date. So it just sits there cluttering the feed. This can have two effects: (a) For people who hide videos after they've watched them in this feed, it's tempting to just hide the video if it has been sitting there for too long. (b) And for those who don't use the "hide" feature, the video will be buried by the time it goes live, even if it is resurfaced at the release. In this second case, the value of the Premiere is largely lost, because the reminder was buried under a bunch of other videos for several days.
That's my line of thinking anyways.
Awesome!!! Thanks for the tips!!! I really appreciate it. I'll keep this in mind for the next Premiere that I do.
Thank you! I was about to leave this planet because of the wonderful people who are given the task of teaching students about ML but can't teach a thing and give a zero when the students eventually fail.
Glad this was helpful.
Hoping that you are gonna make a series on CNNs from this video 🤞
I have a video on CNNs here: ruclips.net/video/HGwBXDKFk9I/видео.html however, in the future I plan on more applied videos that show how to do it in PyTorch-Lightning.
Excellent video!
Thank you! Cheers!
Interesting, thanks. I come from a manifold/engineering point of view, which turns out to be a useful mental tool for some sorts of chemistry. You often have to imagine how certain molecules interact, and having a background in manifolds or linear algebra turns out to be an excellent adjunct. Who knew? I thought the math was just a lot of fun at the time.
:)
Great video. I don't see that tensors in math and physics are somehow different from ML, though, because they are still the same tool, just with different applications. You even still have Einstein's summation notation (einsum).
noted
Super excited for this one!
Bam!
A little slow, but a great explanation.
Thanks!
Thanks!
@@statquest No no, thank you!!
At 1.25 speed it was awesome!
@@Antz_411 1.25xBAM!!!
Hi Josh! Thank you for all your amazing videos! Can you make a video about Graph Neural Network? Thanks a lot!
I'll keep that in mind.
😆 Let's go! I don't think I'll be able to sleep well tonight; I need at least 3 days to properly get a handle on classification. But as always, it really helped me clear up all my doubts and confusion 💥💥 double bam 😄 👍
:)
@@statquest Thanks from the bottom of my heart, sir. Lots of people are getting skills-based, quality knowledge 🙏👍
the song is really awesome
bam! :)
Multiple bams!!
That's so easily bammed to me now!!
:)
Tensors for students, their mamas and papas
Tensors for breakfast and those who are from Belfast
Bim para bam bom paw...
StatQuest ! 💘
That's awesome!!! :)
9:02 shameless self promo -> proudly self promo 😆 😆 😆
bam!
This was tense 😊
:)
BAM!
:)
Another banger
:)
I would really love it if you could do a video on Projection Pursuit Analysis, since there aren't any great videos explaining the statistical underpinnings. Thanks for the excellent content as always!
I'll keep that in mind.
Ooo0Oooo very exciting!
Bam!
Automatic Differentiation
Yep!
Bam! Tensor is flowing
Ha! you made me laugh! :)
"Tensor cores are processing units that accelerate the process of matrix multiplication", so then we're calling them Tensors instead of Matricies, so we can use Tensor cores, which multiply matricies. Makes sense.
Unfortunately Neural Networks have lots of terminology along these lines.
Well, tensors are generalized matrices that aren't limited to two dimensions, so just as 2D concepts are useful in our 3D world, I'm assuming matrix operations are useful with tensors.
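To make that concrete, here is a tiny PyTorch sketch (the shapes are arbitrary examples): a 3-dimensional tensor can be treated as a stack of matrices, and a single matrix-multiplication call applies to every slice at once.

```python
import torch

A = torch.randn(10, 3, 4)  # a "stack" of 10 matrices, each 3x4
B = torch.randn(10, 4, 5)  # a matching stack of 4x5 matrices

C = A @ B                  # batched matrix multiplication, one call
print(C.shape)             # torch.Size([10, 3, 5])
```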
Nice video... btw any plan on making videos on transformer neural networks and attention?
Yep!
My only question is: why can tensors run on GPUs? I've been trying to find information on it for the longest time and still found nothing.
Why can't NumPy arrays be stored on a GPU?
Thanks in advance!
PS: Thanks to statquest, I was able to pass my data science class!!
GPUs have their own instruction set, which is different from what you find on a standard CPU, so you have to code for that specifically. For details, see: en.wikipedia.org/wiki/CUDA
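For what it's worth, in PyTorch moving data to the GPU is explicit; this minimal sketch assumes a CUDA-capable GPU may or may not be available.

```python
import torch

x = torch.randn(1000, 1000)     # starts out in ordinary (CPU) memory

if torch.cuda.is_available():
    x = x.to("cuda")            # copy the tensor into GPU memory
    y = x @ x                   # this matrix multiply now runs on the GPU
    y = y.cpu()                 # copy the result back, e.g. to use with NumPy
else:
    y = x @ x                   # no GPU found, so it runs on the CPU
```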
The simple explanation is that a tensor is something that transforms like a tensor
Noted
thanks
bam!
Ugh, math? Anti-BAM!!!
Awesome explanation :) !!
I'm a biologist and used to think that tensors in math and ML are the same! Does anyone know how to think about them?
The tensors from math have specific mathematical properties that are completely ignored by people that do neural networks.
@@statquest thank you Josh :)
RNNs, NLP, and word embeddings please!!! Thanks!!!
I'm working on them.
Would you be able to make a video on how tensors support automatic differentiation?
That's a good idea. I'll keep that in mind.
So Tensors are basically just faster matrices?
And also, is there a difference between tensors and safetensors when talking about image generation AI?
Tensors also have automatic differentiation. And, as far as I can tell, "safetensor" is a way to store tensors on disk that comes with some nice features, like not having to load the entire file into memory in order to inspect the values.
@@statquest Ahh okay, I think I got it now :) Thanks a lot!
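For anyone wanting a concrete picture of the automatic-differentiation part, here's a tiny sketch with PyTorch tensors (the function is just an example).

```python
import torch

x = torch.tensor(3.0, requires_grad=True)
y = x ** 2 + 2 * x   # y = x^2 + 2x

y.backward()         # PyTorch computes dy/dx for us
print(x.grad)        # tensor(8.) because dy/dx = 2x + 2 = 8 when x = 3
```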
Oh wow! He's gone heavy metal now.
:)
Hi, I need a more advanced video about tensors... The feed-forward step can be written as g(Wx+b), where W is the weights matrix, x is the input vector, b is the bias, and g is the activation function. Now, what if x is not a vector, but a matrix or a cube? I need the generalized algorithm for the feed-forward step. There is no place on the internet with that algorithm. Thank you.
I'll keep that in mind.
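In the meantime, one common convention (just a sketch, with made-up sizes): if X holds one sample per row, the whole batch can be pushed through as g(XWᵀ + b), with the activation applied element-wise.

```python
import torch

n_samples, n_inputs, n_outputs = 5, 4, 3   # made-up sizes

X = torch.randn(n_samples, n_inputs)       # one sample per row
W = torch.randn(n_outputs, n_inputs)       # weight matrix
b = torch.randn(n_outputs)                 # bias vector

A = torch.relu(X @ W.T + b)                # the whole batch in one step
print(A.shape)                             # torch.Size([5, 3])
```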
once again, saved my ass
bam! :)
Sooo, a tensor is an array or ndarray with extra properties for storing neural net weights and biases?
Yes, and they store your data so that you can take advantage of hardware acceleration and automatic differentiation.
Indeed, tensors also store input and output values.
they be creating tension. thas it
:)
Waiting for vanishing and exploding (BAMMM) gradients!
Noted!
exciting 😆
Thank you! :)
Bamm !!!
:)
I have a Deep Learning exam in two days, so thanks I guess
Best of luck! BAM!
I love you
:)
I think it would be awesome to cover GRU units so we can compare them with LSTMs. Please!!!
I'll keep that in mind.
triple bam!!!!!!!!!!
:)
Noice 👍
Thanks Matt!
This guy sounds like Mr. Garrison from South Park.
ha! :)
BAM
:)
Hello, this one is a homework question:
To examine the bone mineral density of women with ankle fractures, the investigators recruited 10 postmenopausal women with ankle fractures and 12 healthy postmenopausal women to serve as controls. The stiffness index of the Lunar Achilles in each woman was obtained. The mean stiffness index for the ankle fracture group was 76.4 with a standard deviation of 5.83. In the control group, the mean was 82.3 with a standard deviation of 6.34. Assume that both samples are drawn from normal populations.
(i) Test, at the 5% level of significance, whether the variances of the stiffness indices for the two groups are equal.
(ii) Using the p-value approach, examine whether these data provide sufficient evidence to conclude that, in general, the mean stiffness index is higher in healthy postmenopausal women than in postmenopausal women with ankle fractures. Take α = 0.05.
(iii) Obtain a 95% confidence interval for the difference of the two population mean stiffness indices. Does this interval confirm the conclusion derived in part (ii)?
If you want help with homework, you should post to some of the stats channels on Reddit. Those people are super helpful! BAM!
@@statquest What is the name of the channel, please?
@@statquest please
Could you also make one on the assumptions of linear and logistic regression?
I'll keep that in mind.
Could you do a video about "Bach training", or whatever it is called :), and how all the partial derivatives are handled in those situations? For example, whether they are added into a sum, or whether the average derivative is calculated.
For details on "batch training" see my video on Stochastic Gradient Descent: ruclips.net/video/vMh0zPT0tLI/видео.html Also, whether or not we add or average the derivatives depends on the loss function. If we use the Sum of the Squared Residuals, we simply add. If we use Mean Squared Error, we use the average.
@@statquest I'm not clear about what you mean by adding the derivatives.
Are you referring to adding the derivative to the weight/bias?
@@manujarora5062 For each data point, we calculate the derivative. We can add them, or we can average them. For details, see: ruclips.net/video/sDv4f4s2SB8/видео.html
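To make the add-vs-average point concrete, here's a small sketch using PyTorch's built-in loss reductions (the numbers are made up): the MSE gradient is just the SSR gradient divided by the number of data points.

```python
import torch
import torch.nn.functional as F

y = torch.tensor([1.5, 1.5, 2.0])

# Sum of Squared Residuals: the per-sample derivatives are added up
y_hat = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
F.mse_loss(y_hat, y, reduction="sum").backward()
print(y_hat.grad)    # tensor([-1.,  1.,  2.])

# Mean Squared Error: the per-sample derivatives are averaged
y_hat = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
F.mse_loss(y_hat, y, reduction="mean").backward()
print(y_hat.grad)    # tensor([-0.3333,  0.3333,  0.6667]), i.e. the sums / 3
```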
BAMM!!!
:)
As a physicist, now I'm very confused
Yes, for Physics, tensors are a little more than just fancy data structures that are optimized for high-speed computing.
NOOO MATH IS NOT UGHH, ITS AWESOMEEEE
:)
💋
:)
Thank you. HOWEVER, NOT ENOUGH EXPLANATION. I'D LIKE TO ASK YOU TO DO THIS TUTORIAL WITH MORE DETAIL AND VERY SLOWLY! PLEASE
I'll keep that in mind.
goood
Thanks!