For absolute beginners, go through this course 2-3 times, try to note important terms, topics, and processes, and then take every topic and go deep sequentially. I am happy that you now have such a great mentor in him. Best wishes to all.
I watched a lot of deep learning tutorials before this, some of them even twice as long as this one, and yet this explained things the best and the most. Thank you for the awesome tutorial without ads for free. You are a hero.
Yeah me too..
@@monleyson8668 Any other video recommendations on the topic of deep learning or Python?
Do not forget that you already have weights in your model of understanding neural networks, since you have seen other videos prior to this.
If you don't speak English, let someone read it. It bothers me a lot to hear the stupid accent.
@@claudioa.dmedina2020 Yes, agreed. This video is best suited for someone with a certain background; otherwise it overwhelms. Don't get me wrong, this video is very good. I will watch it again.
I can't believe this material is published on YouTube for free. Best course I have taken on YouTube ever!!!!!
Short and crisp overview of deep learning with cool intuitions. This is the only video that I will recommend to anyone who wants to start with deep learning and even machine learning in general.
Some (hopefully constructive) annotations to improve the video for better clarity:
* 6:33 - "Channels have weight". And the slide matches it. But 7:08 says the weight is something of a neuron (how important is the neuron, rather than how important is the relationship). I think it is confusing; aren't weights a property of the relationship/channel, in graph theory?
* in slides that list advantages/disadvantages, you might use the color red for disadvantages, and not always just green to highlight terms. Ex. "small" at 29:08.
* I like the examples of descending the Everest, and the one about memorising songs.
* 40:21 - the slide disappears until 40:57
* same at 52:01, until 53:11
* 58:15 - "sometimes you may find the ???? depicted over time" (the automatic subtitles don't get it either)
* 1:04:17 - audio says "input gate, output gate, and a forget gate"; but the slide shows "Update, Reset & Forget gates".
* 1:10:30 - "although if you are interested I'll leave them in the notes below"; yet, it would be useful if a list (or a link to more info) could be added in the description.
These are the major notes that I think should be fixed, for better clarity.
Anyway it is well made, a good explanatory overview of the neural networks world, which I had no idea about.
It was OK for me to understand, if I skip over the names of the specific algorithms (which in the end are implementation details).
But I already had some basic knowledge of statistics & data analysis, and of graph theory. My only doubt is whether others who have never dug into those topics can follow this video as well.
Keep up this interesting content! Thanks for your time and effort!
[Finished watching the video and adding notes.]
⌨️ (0:00) Introduction
⌨️ (1:18) What is Deep Learning
⌨️ (5:25) Introduction to Neural Networks
⌨️ (6:12) How do Neural Networks LEARN?
⌨️ (12:06) Core terminologies used in Deep Learning
⌨️ (12:11) Activation Functions
⌨️ (22:36) Loss Functions
⌨️ (23:42) Optimizers
⌨️ (30:10) Parameters vs Hyperparameters
⌨️ (32:03) Epochs, Batches & Iterations
⌨️ (34:24) Conclusion to Terminologies
⌨️ (35:18) Introduction to Learning
⌨️ (35:34) Supervised Learning
⌨️ (40:21) Unsupervised Learning
⌨️ (43:38) Reinforcement Learning
⌨️ (46:25) Regularization
⌨️ (51:25) Introduction to Neural Network Architectures
⌨️ (51:37) Fully-Connected Feedforward Neural Nets
⌨️ (54:05) Recurrent Neural Nets
⌨️ (1:04:40) Convolutional Neural Nets
⌨️ (1:08:07) Introduction to the 5 Steps to EVERY Deep Learning Model
⌨️ (1:08:23) 1. Gathering Data
⌨️ (1:11:27) 2. Preprocessing the Data
⌨️ (1:19:05) 3. Training your Model
⌨️ (1:19:33) 4. Evaluating your Model
⌨️ (1:19:55) 5. Optimizing your Model's Accuracy
⌨️ (1:25:15) Conclusion to the Course
Hello
@@weyoflife Hello
It's in the description though
@@-hikikomori-7191 100%
Ctrl+C, Ctrl+V from the description, or effort for nothing, sorry
This is honest to god a great lecture and perfect for introducing deep learning! I hope there’s another video in the future that shows some light programming.
After watching a lot of tutorials and courses about deep learning, I can truly say this is probably the best! Everything organized and clear. Congratulations, it will help us a lot! Thank you
TOO GOOD FOR REVISION OF DEEP LEARNING CONCEPTS FROM SCRATCH.....THANK YOU, AWESOME CRYSTAL CLEAR EXPLANATION....
Top quality lullaby. Slept like a baby 15 minutes in! 👌
🤣
Literally here to fall asleep
I came here to watch it dedicatedly, but this comment manipulated me 😂
True😂😂
OK, finally someone that understands how to teach this. Excellent pace and details.
Perfect for a beginner like me. It made me fall in love with Deep Learning!!
These are exactly the things one wants to know to do a project based on deep learning.
For those examples we can assume all possible directions until the ball moves; as for "dog", we can assume all statements/questions are accurate until more content is provided. At least that would be a temporary solution.
Good one. If someone has finished an end-to-end training, this one serves as a quick refresher. And at the end, this is what one would remember...
This is a way better introduction than a University's Introduction to Deep Learning Course for beginners. While the university assumes some ML and computer science background before starting DL courses, this video is for complete beginners. I appreciate it!
Thank you, this is one of the best deep learning tutorials for beginners
Just when I need it the most! What a great timing!
Same as me
I love your venom
You've probably saved my grade in my university "Deep Learning" course!! Thank you! ❣
Great job! As an FYI, when I used to teach, by the time I got to ANNs, I'd already covered statistical modeling. So it was always an "a-ha!" moment when I'd ask the class if they remembered the sigmoid function from before ... and that all the sigmoid functions acted like mini logistic regression models.
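To make that connection concrete, here is a minimal sketch (my own addition, assuming NumPy; the weights are made-up illustrative values) showing that a single sigmoid neuron computes exactly the logistic regression prediction p(y=1|x) = sigmoid(w·x + b):

```python
import numpy as np

def sigmoid(z):
    # Logistic function: squashes any real number into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

x = np.array([2.0, -1.0, 0.5])   # one example with 3 features (made up)
w = np.array([0.4, -0.2, 0.1])   # weights (illustrative values)
b = -0.3                         # bias

# A single sigmoid "neuron"...
neuron_output = sigmoid(np.dot(w, x) + b)

# ...is exactly the logistic regression model's predicted probability.
print(neuron_output)             # strictly between 0 and 1
```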
Absolutely amazed by the potential of deep learning. Love this vid.
Wow, you covered almost everything you need to know about DL. Great work.
True
Thank You for this, this is my 2nd day into this field, and I think I know the big picture of how it all works and learnt around a month of stuff in a day or two.
how did you end up doing? where are you now?
Thank you for this amazing video. Honestly, out of all of the tutorials I've watched, it is the first time that someone has explained it in such a clear and understandable way. Again, thank you for sharing your knowledge!!!
I'm a beginner and you just nailed it.
Thanks for providing us with such quality material 😊😊
Hi Jason, nice video. The range of the sigmoid function (16:24 in your video) is not [0,1] but rather ]0,1[, as 0 and 1 are only approached as the input tends to +/- infinity. The same applies to the hyperbolic tangent, which ranges over ]-1,+1[ and not [-1,+1].
its (0,1) not [0,1]
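A quick numerical check of that point (my own sketch, assuming NumPy): for any finite input, sigmoid and tanh only approach their bounds without reaching them.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for z in [-10.0, 0.0, 5.0, 10.0]:
    s, t = sigmoid(z), np.tanh(z)
    print(f"z={z:6.1f}  sigmoid={s:.12f}  tanh={t:.12f}")

# In exact arithmetic the bounds 0, 1 (and -1, +1 for tanh) are reached only in the
# limit z -> +/- infinity, so the ranges are the open intervals (0, 1) and (-1, 1).
# (Floating point eventually rounds to the bound for very large |z|, but that is a
# numerical artifact, not the mathematical range.)
```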
This was such a great refresher course. Everything I needed to recollect and on point. Great job and thank you!
No clue what this is; I'm going to start working through it
All the best !
keep us posted!
@@nagendradevara1 better to be blind than to have the ability to see
Excellent course for the revision of all concepts of deep learning
Nice introduction. Some of the ideas probably should be reined in a bit. For example, people who study learning do not think machine learning models how the human brain learns at all. Calling neural network nodes "neurons" misleads people into thinking you are actually trying to imitate a neuron instead of just using a software node. Stating that activation functions are non-linear is not always correct; in fact, the equation you showed when you said that looks like a linear sum of terms.
All this doesn't take away from the work you've done to make these ideas accessible. It's a good intro - but hopefully people understand there are some sweeping statements that might not hold up under close scrutiny.
Neuron and node are interchangeable enough in this context. Had 'neuron' not been used, a consumer of this course will still eventually run into each term. This course certainly wasn't the first to use 'neuron'.
NNs do roughly model brain functions. Like Jason said, each works to identify patterns based on data.
Jason also goes on to explain the case where linear activation functions are less than optimal. Of course there are exceptions, but beginners probably wouldn't be concerned with them.
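On the linearity point, here is a small sketch (my own, with random NumPy matrices) of why purely linear activations are less than optimal: stacking any number of linear layers collapses into a single linear layer, so the non-linearity is what actually buys you anything from depth.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(4,))                       # one 4-dimensional input

# Two "layers" with identity (linear) activation: h = W1 x + b1, y = W2 h + b2
W1, b1 = rng.normal(size=(5, 4)), rng.normal(size=(5,))
W2, b2 = rng.normal(size=(3, 5)), rng.normal(size=(3,))
y_two_layers = W2 @ (W1 @ x + b1) + b2

# The same mapping written as ONE linear layer: y = (W2 W1) x + (W2 b1 + b2)
W, b = W2 @ W1, W2 @ b1 + b2
print(np.allclose(y_two_layers, W @ x + b))     # True: depth added nothing

# With a non-linearity (e.g. ReLU) between the layers, no such collapse exists
relu = lambda z: np.maximum(z, 0.0)
y_nonlinear = W2 @ relu(W1 @ x + b1) + b2
print(np.allclose(y_nonlinear, W @ x + b))      # generally False
```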
Bro, please make one on how to compute the loss, to check the accuracy level, F1 score, mAP, etc.
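If it helps in the meantime, a minimal sketch (my own, assuming scikit-learn; the labels and probabilities below are made up) of computing accuracy, F1, and a cross-entropy style loss from predictions:

```python
from sklearn.metrics import accuracy_score, f1_score, log_loss

# Hypothetical true labels and model outputs for a binary task
y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]                    # hard class predictions
y_prob = [0.9, 0.2, 0.4, 0.8, 0.1, 0.7, 0.6, 0.3]    # predicted P(y = 1)

print("accuracy:", accuracy_score(y_true, y_pred))
print("F1 score:", f1_score(y_true, y_pred))
print("log loss:", log_loss(y_true, y_prob))         # cross-entropy loss
# (mAP is task-specific, e.g. object detection, and needs per-box scores/IoU.)
```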
Umm... I am searching for something greater than "Thank you so much"
but for now tqsm for such a osum video on DL.
goofy ahh abbrev
It's crazy how much information this video has. Thanks :)
ikr
Guys! The cover pic says LEANING not learning, correct it please.
Heard a lot about this but I'm gonna be honest I have no idea what it is. Let's get into it tho 🙂
Great information! Super well written. Clear and interesting!
Very enlightening for beginners! Very nice voice-over! Thanks!
Honestly the best course I came across at the moment
(6:17) That formula is incorrect. There is only one bias value per neuron, not "n". Thus, the formula should have "b" not "b_i".
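Agreed. For reference, a tiny sketch (my own, NumPy) of the corrected formula, z = w_1*x_1 + ... + w_n*x_n + b, with a single bias b per neuron:

```python
import numpy as np

def neuron(x, w, b, activation=np.tanh):
    # One neuron: weighted sum of the inputs plus ONE bias, then an activation
    z = np.dot(w, x) + b          # z = w_1*x_1 + ... + w_n*x_n + b
    return activation(z)

x = np.array([0.5, -1.2, 3.0])    # n = 3 inputs (made-up values)
w = np.array([0.1, 0.4, -0.2])    # one weight per input connection
b = 0.05                          # exactly one bias for this neuron
print(neuron(x, w, b))
```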
This is the first time I actually understand a little bit about AI. Great work.
Excellent introduction to the topic. Great slides, great explanation, right pace. Really good.
The output shouldn't be right or wrong; it should be a variable, no? Because not all input we receive is binary. If anything, it should be on a scale of 1-10, with 1 being very negative (lots of adjustments to the neurons) and 10 being perfect, which should increase the weight in the future.
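That is roughly what the loss function already provides: the feedback is not just right/wrong but a number measuring how wrong, and the size of that error scales the adjustment. A minimal sketch (my own, plain Python) of one gradient descent step for a single linear neuron:

```python
# One training example for a single linear neuron y_hat = w*x + b (made-up numbers)
x, y_true = 2.0, 10.0
w, b = 1.0, 0.0
learning_rate = 0.05

y_hat = w * x + b              # prediction: 2.0
error = y_hat - y_true         # signed error: -8.0 (very wrong, so a big adjustment)
loss = 0.5 * error ** 2        # squared-error loss: a graded "how wrong" score

# Gradients of the loss with respect to the parameters scale with the error
grad_w = error * x
grad_b = error

w -= learning_rate * grad_w    # large error -> large update, tiny error -> tiny update
b -= learning_rate * grad_b
print(w, b, loss)
```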
If I'm having a difficult time keeping up with the activation functions, what should I study to be better prepared for this tutorial?
This is awesome!
Supervised learning can be a regression as well. It's not only predicting the correct label (classification problem).
Probably the best video I've watched on deep learning.
great intro to DL.. cheers
Good course but tell us about the Advanced project
14:36: the function represented by the graph is not linear, since it doesn't pass through the origin.
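Right: in the strict sense a function is linear only if it satisfies the condition below, which forces it through the origin; a straight line with a non-zero intercept is affine rather than linear (my summary, not from the video):

```latex
f \text{ linear} \iff f(\alpha x + \beta y) = \alpha f(x) + \beta f(y)
  \;\; \text{for all } \alpha, \beta, x, y
  \;\;\Rightarrow\;\; f(0) = 0.
\qquad
f(x) = a x + b \text{ with } b \neq 0 \;\Rightarrow\; f(0) = b \neq 0,
  \text{ so } f \text{ is affine, not linear.}
```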
This was so fucking awesome. Thanks for doing this.
Refreshed all my DL concepts.
:)
You explained the concepts extremely well, thanks for this amazing video!
This is so deep and didactic at the same time! Thanks a lot for putting the effort to produce the video!
Is it okay to say that an RNN "digests" data a little longer (as an analogy to feedback loops) so it can "spit out" better results?
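Sort of: the feedback loop means the network carries a hidden state forward, so each step's output can use a summary of everything seen so far, rather than taking longer to "digest" a single input. A minimal sketch (my own, NumPy with random weights) of a vanilla RNN cell stepped over a sequence:

```python
import numpy as np

rng = np.random.default_rng(1)
input_size, hidden_size = 3, 4
W_xh = rng.normal(scale=0.1, size=(hidden_size, input_size))   # input -> hidden
W_hh = rng.normal(scale=0.1, size=(hidden_size, hidden_size))  # hidden -> hidden (the feedback loop)
b_h = np.zeros(hidden_size)

h = np.zeros(hidden_size)                     # initial hidden state ("memory")
sequence = rng.normal(size=(5, input_size))   # 5 time steps of made-up input

for x_t in sequence:
    # The new state depends on the current input AND the previous state,
    # which is what lets step t's output reflect earlier steps.
    h = np.tanh(W_xh @ x_t + W_hh @ h + b_h)

print(h)   # final hidden state: a running summary of the whole sequence
```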
I found the fake news detection statement amusing. This was clearly before we realised that every side of an issue has fake news (and some truths are just not available on any network because the data regarding them has been deleted). If created today, I would suggest the software would be used to determine the truth according to the client's parameters and specifications. Additionally, the software would need to be directed to what counts as 'reliable sources', or otherwise risk verifying undesirable information. Sorry for overly discussing this issue. This video was very interesting, thank you.
It would be better to show some demo of the 5 steps of a Deep Learning Model. Any simple demo is good enough to highlight those steps.
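In case it helps, a minimal sketch of the five steps (my own, assuming TensorFlow/Keras and scikit-learn; the data is random just to make it runnable):

```python
import numpy as np
import tensorflow as tf
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

# 1. Gathering data (random stand-in for a real dataset: 1000 samples, 10 features)
X = np.random.rand(1000, 10)
y = (X.sum(axis=1) > 5).astype(int)                 # made-up binary labels

# 2. Preprocessing: train/test split and feature scaling
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
scaler = StandardScaler()
X_train = scaler.fit_transform(X_train)
X_test = scaler.transform(X_test)

# 3. Training a small fully-connected model
model = tf.keras.Sequential([
    tf.keras.Input(shape=(10,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X_train, y_train, epochs=10, batch_size=32, verbose=0)

# 4. Evaluating on held-out data
loss, acc = model.evaluate(X_test, y_test, verbose=0)
print(f"test accuracy: {acc:.3f}")

# 5. Optimizing: tweak hyperparameters (layer sizes, learning rate, epochs, ...) and repeat
```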
0:10 The computer runs an algorithm on each player's possible moves and picks the one that's closest to the king WITHOUT BEING EATEN.
There is no AI in that. It's just iterating over all the players and iterating over all of their moves, going from player to player.
There are actually many moves to remember, and remembering all of them on a move can sometimes be difficult when there are so many possibilities.
A computer's algorithm in that framework is 100% correct all of the time.
In terms of the facial AI, it's just objects which are outlined and marked with a set of attributes.
Trigonometry is really important here.
Business functions also come into play here quite frequently.
It's also remembering situations and figuring out the best path I took at a path.
A path is a set of steps I took. How many times has that played out before?
What were the moves that won me the game on that path?
That's the hard part. That's deep learning. That's path memory. Very difficult.
This is where you learn to flip the binary tree on its side and use them as open ports.
Drawing it differently and analyzing is all we need to see.
DNA
Path memory
watch your career with great interest young Jedi
I needed just this course, a perfect one for revising all the concepts in a short time. Great work. Thank you.
Very good for recapping the knowledge of Deep Learning. Thanks
This video is really helpful,
Thank you so much for the video!!!
Thank you so much. This is very helpful.
It was a good introductory course! Although there's a lack of examples in the explanation of the neural network part. But overall you can get the idea of how deep learning works.
Any advanced version coming? Such a fantastic course.
The Deep “Leaning” got fixed? Was driving me crazy.
My brain is really tired but this was very helpful. Addressed almost everything my lecturer mentioned in class
2:17 In 2016, AlphaGo beat Lee Sedol, not in 2015. Am I wrong?
Great intro to DL..cheers
Correct the spelling mistake in the thumbnail... an 'r' is missing in 'Learning'
Awesome video! Super comprehensive yet compact and simply explained
Great course for beginners, thank you
Best for Course for an new Deep Learning aspirants....Kudos to Jason Dsouza
Such a great video - the explanations and pace were just perfect. I'm sure to use this video to review concepts time and time again... this was such well-put-together beginner material. Thank you so much for making this available for free! You are incredible!
Awesome video, Jason. You've helped make this information accessible to thousands of new people.
Is there a way to have access to the slides used in the video? It would be really helpful for those who want to revise everything without having to go through the entire video.
Exactly
This has been up for a year and no one has mentioned the thumbnail says LEANING instead of LEARNING 😂😂
I loved the video, it's excellent. Is there a recommended model for time series?
Use the Facebook Prophet or NeuralProphet libraries.
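For anyone looking for a starting point, a minimal sketch of the Prophet workflow (my own, assuming the `prophet` package; it expects a dataframe with `ds` and `y` columns, and the series here is made up):

```python
import numpy as np
import pandas as pd
from prophet import Prophet

# Made-up daily series: trend + weekly seasonality + noise
days = pd.date_range("2022-01-01", periods=365, freq="D")
t = np.arange(365)
values = 0.1 * t + 5 * np.sin(2 * np.pi * t / 7) + np.random.randn(365)
df = pd.DataFrame({"ds": days, "y": values})

model = Prophet()                                   # default trend + seasonalities
model.fit(df)

future = model.make_future_dataframe(periods=30)    # forecast 30 days ahead
forecast = model.predict(future)
print(forecast[["ds", "yhat", "yhat_lower", "yhat_upper"]].tail())
```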
This is gold! Brilliant job done by you guys.
Did you cover working with MNIST and CIFAR-10?
Thank you, it was a really amazing course!!
I love you bro, keep it up 🙂
this course is priceless
Yeah, it's free, so priceless is a good choice of word! 😅
Loved it.
Thank you Jason!!!
(7:19) Weight is assigned to a link and not a neuron right? And the bias term shifts the graph up or down, not left or right.
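On the bias point, both pictures get used: the bias is added to (shifts) the weighted sum, and viewed as a function of the input that shows up as a left/right shift of the activation curve. A tiny check (my own, plain Python) of the horizontal-shift view for a sigmoid neuron:

```python
# For output = sigmoid(w*x + b), the input where the output crosses 0.5
# is where w*x + b = 0, i.e. x = -b / w.
w = 1.0
for b in (-2.0, 0.0, 2.0):
    x_half = (0 - b) / w          # solve w*x + b = 0
    print(f"bias={b:+.1f} -> output crosses 0.5 at x = {x_half:.1f}")
# A larger bias moves the crossing point to the left: the curve shifts horizontally,
# even though b itself is literally added to (shifts) the weighted sum.
```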
Excellent video, helped me a lot and translated into 7.5 pages of notes.
Can you please share your notes?
@@ameyakhot4458 It would be too much for a YouTube comment; would it work as a Google Doc?
@@benjystrauss2524 Yes if it is possible
Thanks, your hard work is commendable.
21:11 Why not just use multiple activation functions?
If your goal is to find a specified output, would running the data through multiple "filters" be optimal?
The only argument against it I can think of is cost, like time to compute, etc. But what do you think about this in theory?
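In practice that is exactly what happens between layers: real networks routinely mix activation functions, e.g. ReLU in the hidden layers and sigmoid or softmax at the output, and the cost argument mainly applies to stacking extra transformations that add no modelling power. A small sketch (my own, assuming TensorFlow/Keras) of a model using several different activations:

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(20,)),
    tf.keras.layers.Dense(32, activation="relu"),     # hidden layer 1: ReLU
    tf.keras.layers.Dense(16, activation="tanh"),     # hidden layer 2: tanh
    tf.keras.layers.Dense(1, activation="sigmoid"),   # output: sigmoid for a probability
])
model.summary()
```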
Excellent..... please upload the very basic ideas of machine learning and artificial intelligence.... it's very useful for interns like us and for the future... please upload, freeCodeCamp...
Thank you bro. Great one.
best explanation for DL
Concise and clear!
Thanks for all your videos. But I request that you please upload two different 6-7 hour videos on the Calculus and Linear Algebra required for Machine Learning and Data Science.
I'm only 10 minutes in and there are already problems. We don't talk about the number of neurons in the input and hidden layers, and yet we do a sum from 1 to n where the same n is used, implying either that there are the same number of input and hidden layer neurons or that the formula is incorrect. The adjustment due to backpropagation is somewhat skimmed over. Things are adjusted, but what is adjusted, and by what method/calculation?
The weights and biases get adjusted, using whatever loss function you have, so as to minimize the loss.
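Concretely, every weight and bias gets nudged in the direction that reduces the loss, using the gradients that backpropagation computes. A bare-bones sketch (my own, NumPy, random made-up data) of one such update for a single linear layer with mean squared error:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(8, 3))        # 8 examples, 3 features (made-up data)
y = rng.normal(size=(8, 1))

W = rng.normal(size=(3, 1))        # the parameters that training adjusts
b = np.zeros((1,))
lr = 0.1

y_hat = X @ W + b                  # forward pass (a linear layer, for simplicity)
error = y_hat - y
loss = np.mean(error ** 2)         # the quantity we are trying to minimize

# Backward pass: gradients of the loss with respect to each parameter
grad_W = 2 * X.T @ error / len(X)
grad_b = 2 * error.mean(axis=0)

W -= lr * grad_W                   # the actual "adjustment": a gradient descent step
b -= lr * grad_b
print(loss)
```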
Just a suggestion: how about this channel also provides a PDF link in the description?
Thanks, man. I love you!
Good going, Jason! Great content.... Keep adding more courses..... Looking forward to it.
Teeny tiny remark: There is a typo in the text of the thumbnail, it reads "Deep Leaning" instead of "Deep Learning" 😁
How do I get the transcript of this whole video into organised notes, or at least broken down into timestamps?
granddaddy of optimizers got me lol
Great video btw!!
Please make full course on deep learning
that's what this course is there for duh