My first formal introduction to Keras. Halfway through. Did not think I would make it this far without difficulty. Your narration style is not a chatty style; it is more like a professional news reader's. But your script anticipates everything and preempts the questions that arise in the mind. That makes it effective. Will continue again tomorrow.
Well, there are quite a few like them on RUclips that provide our generation the best of their content, explaining every aspect of computer science.
Rather, we should at least give them some appreciation.
The internet is expanding tremendously, and these are our heroes who make the internet a golden source of knowledge.
@@dhananjaygola4786 well, to expand on your point, you should name a few of them.
@@Nootey33 don't worry, just believe in the youtube recommendation AI & soon you'll find them
@@dhananjaygola4786 what are the other channels pls?
You are absolutely right on! This is a great presentation.
This was an excellent instruction set. Really appreciate all the work on it.
Thank you, Paul!
@Paul Mcwhorter So good to see u here sir! I'm currently doing ur Arduino lessons and that's really amazing
@@zaief7016 Thanks, yep, I am always trying to learn as well.
:)
All I can think of is laying with her in the bed behind her
Thank you Mandy. This was a great tutorial and insight into deep learning. This is surely the best one I have seen on RUclips. Thanks a lot again for your efforts 😊
Glad to hear that, Palash :)
Glad I'm being taught by Gal Gadot. ☺️👍
Me too !!
I saw abella danger
Definitely that kind of Pokemon for sure. Gal with the makeup on, it's hard to tell, but with no makeup they are the spitting image of one another. Forget being a programmer, she should be a body double.
@@code-to-design bruh
freckles, just saying. tbh it's awesome to have an instructor who is an attractive gal. good to see the stereotype of good-looking ladies being bimbos is really being proven foolish. though like 75% of tech jobs are still held by guys, so if u learn all this and get a great job it won't feature as many as it should, which is the personification of :(
26:16 - note that the parameter "lr" was renamed to "learning_rate" in Keras 2.3.0 (September 2019), so older versions only accept "lr".
This is the first long tutorial I watched from start to finish and it says a lot! Thank you very much!!!
Your multi-API approach (plus CPU/GPU heads up) was indeed a major factor to consider while choosing a source of insight into Keras. Thank you for the thorough and very well presented content.
Great to hear. You're welcome Filipe!
Since my laptop did not have a GPU, it threw an error, so I added an if statement in case the length of the device array was zero. I was a little green with envy when the VGG16 ran 8 to 10 seconds per epoch on your system, but my laptop with an i7, 24 GB of RAM, and a 1 TB SSD took a whopping 850 seconds more or less per epoch.
I suspect it's running single-threaded, and I vaguely remember an option to let this CPU use 8 threads instead. I remember doing something like that in Octave or MATLAB in Andrew Ng's first course or Geoffrey Hinton's course.
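For anyone else hitting this: a minimal sketch of both tweaks mentioned above, guarding the GPU setup when no GPU exists and raising the CPU thread counts. The thread count of 8 is an assumption for an 8-thread i7; thread settings must run before TensorFlow does any other work.

```python
import tensorflow as tf

# Let TensorFlow use more CPU threads (must be called before TF initializes).
tf.config.threading.set_intra_op_parallelism_threads(8)  # threads within one op
tf.config.threading.set_inter_op_parallelism_threads(8)  # threads across ops

# Only touch GPU settings if a GPU actually exists, avoiding the error above.
gpus = tf.config.list_physical_devices('GPU')
if len(gpus) > 0:
    tf.config.experimental.set_memory_growth(gpus[0], True)
else:
    print("No GPU found; running on CPU with",
          tf.config.threading.get_intra_op_parallelism_threads(),
          "intra-op threads")
```

Whether extra threads help depends on the model; convolutions parallelize well, so this can cut CPU epoch times noticeably.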
I can't describe in words how much this video helped me with my research project! You are a great teacher. Thank you so much!
hey what was your research project?
Great material, just one remark at 14:40: according to your problem definition the sample sizes should be 50 and 950, hence the second loop should be for i in range(950).
Yeah, i think it is a mistake.
It's "~"95%, not 95%
There are 2100 patients.
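The correction in this thread can be sketched as follows. This is a hedged reconstruction of the course's data-generation step (variable names assumed from the video): with range(950) for the majority group, the 5%/95% split is exact and the dataset has 2000 samples rather than 2100.

```python
from random import randint

train_samples = []
train_labels = []

for i in range(50):
    # ~5% outliers: younger patients who DID experience side effects
    train_samples.append(randint(13, 64))
    train_labels.append(1)
    # ~5% outliers: older patients who did NOT experience side effects
    train_samples.append(randint(65, 100))
    train_labels.append(0)

for i in range(950):
    # ~95%: younger patients with no side effects
    train_samples.append(randint(13, 64))
    train_labels.append(0)
    # ~95%: older patients with side effects
    train_samples.append(randint(65, 100))
    train_labels.append(1)

print(len(train_samples))  # 2000 samples, exactly 5% outliers per age group
```

With the video's range(1000), the ratio is 50/1050 ≈ 4.8%, which is why the course says "~95%" rather than 95%.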
Mandy, you are a very gifted instructor. There have been hundreds of instructors in my life and you are the BEST!
I am not a programmer/coder. I found this video very soothing and inspiring simply the vision it infused in creative aspects. I sat 6 years in the hole so the slightest intellectual understanding blows up into wisest yet connected set for output variable. I am a mystic, someone needs to code what I see.
I think it's better to learn to program.
You need to see a psychiatrist
You're the chosen one padawan.
Thank you Deeplizard and Free Code Camp. It's a great tutorial and a good video. I have been learning ML and DL, having started out recently. However, it's only after seeing this video that I think I have the confidence to carry out something on my own now. Thank you to the Keras and TF2 team as well.
2:26:42 You need to add one more layer before the Dense output with 10 neurons, since the chosen modified layer is not working as of today (22nd July 2022). You can add it as x = tf.keras.layers.Flatten()(x). Hope this helps.
With this line the number of non-trainable parameters is the same as in this course, but the total number of parameters and the number of trainable parameters both increase. To get exactly the same result as in the course, I modified that part of the code like this:
x = mobNet.layers[-6].output
x = tf.keras.layers.GlobalAveragePooling2D()(x)
output = Dense(units=10, activation='softmax')(x)
I don't know which is best but if anyone wants to follow this course religiously, that works (as of 27th July 2022).
@@robertoprestigiacomo253 works like a charm!
@@profsrmq the funny part is that I haven't worked in Keras since July and now I have absolutely no clue about the thing I wrote and can't even remember how I came up with it 😅
yes, exactly:
x = mobile.layers[-5].output
x = tf.keras.layers.Flatten()(x)
output = Dense(units=10, activation='softmax')(x)
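The fixes in this thread all follow the same functional-API pattern: grab an intermediate layer's output, attach a new head, and rebuild a Model. Here is a hedged, minimal sketch of that pattern on a tiny stand-in network (so it runs without downloading MobileNet weights); the layer sizes, the -3 index, and the `base` name are illustrative only, standing in for mobile/mobNet and its -5/-6 indices above.

```python
import tensorflow as tf
from tensorflow.keras.layers import Input, Conv2D, GlobalAveragePooling2D, Dense
from tensorflow.keras.models import Model

# Tiny stand-in for a pretrained network like MobileNet.
inputs = Input(shape=(32, 32, 3))
x = Conv2D(8, 3, activation='relu')(inputs)
x = Conv2D(16, 3, activation='relu')(x)
x = GlobalAveragePooling2D()(x)
base = Model(inputs=inputs, outputs=Dense(100)(x))

# The pattern from the thread: take a 4-D conv feature map from inside the
# model, pool it down to 2-D, then attach the new 10-class output head.
x = base.layers[-3].output                  # analogous to mobNet.layers[-6].output
x = GlobalAveragePooling2D()(x)
output = Dense(units=10, activation='softmax')(x)
model = Model(inputs=base.input, outputs=output)

# Freeze everything except the new head, as in the course.
for layer in model.layers[:-1]:
    layer.trainable = False
```

Whether you pool (GlobalAveragePooling2D) or Flatten determines the size of the Dense layer's input, which is exactly why the parameter counts differ between the two fixes above.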
i clicked because she was pretty, now i know about keras
I clicked because i wanted to watch you comment
Finally, the wait ended. Thanks guys. Lots of love!
Dear Lady DeepLizard,
Thank you so much for the energy, time, and thought you've put into this course! I have benefited a lot from your channel.
Abella Danger teaching us about ML is amazing
🤣My first thought too.. We are degenerates.
This is excellent. I'm glad I spent a lot of time learning about machine learning and deep learning theory before I started this so I understand basically what is going on, and this is a super simple API. I think I'll use keras primarily, but I think I'll also learn tensorflow more thoroughly just in case.
no one gives a shii
I am an Italian guy, and that is not an espresso, it is a cappuccino :D Watching that part of the video was a stab to my heart :D
Espresso is the Italian coffee made with a bar espresso machine.
Mandyy! You came and you gave without taking!!
You are so incredibly easy to listen to for hours on end. Very well done; I look forward to learning a bunch more from these videos.
Near-zero computer programming training. Just admiring from the nosebleed seats the code that goes into making the fundamental building blocks of neural networks, similar to how our own human brains function. Recently read Max Tegmark's Life 3.0 book on AGI, which has piqued my interest in deep learning.
Great book, dude! Read it at the beginning of this year. I would highly recommend Nick Bostrom's Superintelligence, which I personally think gives even better insights into artificial intelligence.
I enjoyed watching this tutorial, I ended up finishing the whole video without realizing it, thanks!
Feedback on video production: when the discussion is about the last few lines on a screen, it is difficult to watch the screen when it is overlapped by the subtitles. You are capturing the screen showing Jupyter and not using slides, so it should be easy for you to scroll the Jupyter notebook to make the line under discussion appear in the middle.
This is a great tutorial for Keras image classification. Can you do a similar one for object detection using Keras? That would be very helpful.
thanks abella danger, i learned a lot :)
🤣
Damn!!!
😭😭😭 JAILLLLL
😅 damnn i thought i was the only one who saw that. She looks like a prettier Abella Danger
Lmao bri
You’re a great teacher! This was perfect, learned a lot in a short time.
1:03:45 I think all the photos were meant to be directly under the dogs-vs-cats folder as per the later demonstration, because at 1:08:33 all the remaining photos were directly under the dogs-vs-cats folder, not dogs-vs-cats/train.
did u get past the for loops part at 1:08:14?
i've been stuck on it for ages
the error that is occurring is a ValueError
@@54M1WUL same
Hope you could see it. I renamed the folder train containing all the pics to trains. And then updated the code as below, it works on mac.
os.chdir('trains')
for i in random.sample(glob.glob('cat*'), 500):
    shutil.move(i, '../train/cat')
for i in random.sample(glob.glob('dog*'), 500):
    shutil.move(i, '../train/dog')
for i in random.sample(glob.glob('cat*'), 100):
    shutil.move(i, '../valid/cat')
for i in random.sample(glob.glob('dog*'), 100):
    shutil.move(i, '../valid/dog')
for i in random.sample(glob.glob('cat*'), 50):
    shutil.move(i, '../test/cat')
for i in random.sample(glob.glob('dog*'), 50):
    shutil.move(i, '../test/dog')
I thought it was a new Hotel Trivago advertisement.
IKR - I was waiting for Captain Obvious to show up!
Lol I guess it's what happens when you film courses from your Airbnb 🤷♀️😂
@@deeplizard 😅 nice video though
reporting bias.
Ahahahah
Such a detailed and amazingly designed course. Covered every question I had in mind!
Incredible video! Very well taught with clear explanations for all the different concepts. This has allowed me to put my first foot through the door to understand Keras/TensorFlow!
Loved your video Mandy and need more content from you. Great explanation.
Brilliant illustrations and the best DL material I have seen. Thank you, Andy.
*Mandy
In the first data generation step, I believe the second loop ( ~95% of young people who did not experience side effects) should be range(50,1000) instead of range(1000)
I think there is a mistake in 15:27. If it is 95% of the population, the code should be 'for i in range(950)'.
This was the answer to my prayer. I know linear algebra and Python. I just needed some specific code examples with comments. Thank you!!!
One of the best videos on Keras Deep Learning. Thanks for your wonderful teaching.
insane, thanks mandy for your great explanation, also really loved the background scenery of the video.
Fell in love with ur tutorials, deeplizard 🦎 ... super cool. Hope the learning will continue... Thanks a lot 👩🏫
Thanks for a well prepared, well organized, professional presentation. GREATLY appreciated
Love the course! Hate the constant ads, youtube
Thank you for this nice video, it really got me started. But I think there is one bad habit involved: "Never normalize your test data on its own but rather on the train data!" (This is what you read in many forums and, for example, in Chollet's "Deep Learning with Python" book.) I think at 43:24 it should be scaler.transform(test_samples.reshape(-1,1)), as this uses the scaler fitted on train_samples. Correct me if I am wrong :)
You're right 100%
That's correct
Also noticed, though in this example it doesn't matter as ranges of features are the same for test and train datasets
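The point in this thread can be shown in a few lines. This is a minimal sketch with made-up numbers; the variable names follow the course, and the key detail is that the test set goes through transform, not fit_transform.

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler

train_samples = np.array([13, 25, 47, 64, 65, 80, 100])
test_samples = np.array([20, 70, 90])

scaler = MinMaxScaler(feature_range=(0, 1))
# Fit the scaler on the training data only.
scaled_train = scaler.fit_transform(train_samples.reshape(-1, 1))

# Right: reuse the train-fitted scaler on the test data (no fit here).
scaled_test = scaler.transform(test_samples.reshape(-1, 1))

# Wrong would be scaler.fit_transform(test_samples...): a fresh fit would map
# 20 -> 0.0 and 90 -> 1.0, so identical ages would land on different scales
# in train vs. test.
print(scaled_test.ravel())
```

As noted above, the difference only matters when the train and test ranges differ, but fitting on the training data is the habit to build.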
2:46:15 your cam got augmented too (and flipped upwards) :D
Nonetheless awesome course!
Thank You So Much for this in-depth hands on deep learning experience
"alright, that's it for the manual labor" at the one hour mark haha... i love it.
I have watched this video completely. It was worth my time on a beautiful Saturday afternoon. Keep up with this nice project Mandy
Perfect, code explanations were clear and straightforward
Very good intro example, easy to setup problem can be tweaked to explore more and doesn't require pictures or strange formats and other downloads.
Can't thank you enough. I needed this.
absolute gem!! Way better than all those paid courses.
Great course, I've been following this for the last week. Well organised and presented..
Thanks, finally someone with a normal English accent so to speak
Tensorflow + Keras technically isn't "from scratch" anymore. They apply many abstraction and functional layers that calculate things for you without exposing the true nature of a NN.
Then how would you suggest?
Did you know Python technically isn't from scratch? It all translates to machine code.. so go write in machine code if you like doing everything 'from scratch'.
@@nick9198 then how would you suggest?
@@shivankitss8396 Clearly I was disagreeing with the original comment lol. I would suggest using TF and Keras and not whining about how it isn't `from scratch`.
Frameworks
WHOA ... this instructor is sooo smart!
Thank you Mandy, It was a great video with such a fantastic explanation!!!
I have always been a fan of Keras! Great video.
Did they explain audio processing also?
@@krishnachauhan2850 They did not but the concepts can apply to that too.
@@fazalali2894 The models are of course the same in deep learning... but data processing is very different. There is nothing like an audioDatastore analogous to imageDatastore. Also, loading WAV files is different.
@@krishnachauhan2850 if I were to tackle that problem I wouldn't try to use direct audio files. I don't see the benefit of that but that may be different based on your end goal. From what I have seen, log-mel spectrograms are the way to go and those can be loaded in as images (or matrices which is probably what I'd use for a more precise representation) or Stacked Spectrograms. Have you tried any of those out? If so, what problems were you facing that required the need for an audio generator?
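The spectrogram idea above can be sketched without any audio libraries. This is a rough, dependency-free illustration of turning a 1-D waveform into a 2-D log-magnitude spectrogram that an image-style CNN could consume; a true log-mel spectrogram additionally applies a mel filterbank (e.g. via librosa.feature.melspectrogram), which is omitted here, and the frame/hop sizes are arbitrary choices.

```python
import numpy as np

def log_spectrogram(signal, frame_len=256, hop=128):
    """Frame the signal, window each frame, FFT it, and take log magnitude."""
    n_frames = 1 + (len(signal) - frame_len) // hop
    window = np.hanning(frame_len)
    frames = np.stack([signal[i * hop : i * hop + frame_len] * window
                       for i in range(n_frames)])
    magnitude = np.abs(np.fft.rfft(frames, axis=1))  # (n_frames, frame_len//2 + 1)
    return np.log(magnitude + 1e-10)                 # log compresses dynamic range

# Example: one second of a 440 Hz tone at a 16 kHz sample rate.
sr = 16000
t = np.arange(sr) / sr
wave = np.sin(2 * np.pi * 440 * t)
spec = log_spectrogram(wave)
print(spec.shape)  # (124, 129): time frames x frequency bins, a 2-D "image"
```

Once the audio is in this form, the same image pipelines from the course apply, which is the point made above.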
I love you Mandy
This course was awesome
Thank you for such a good explanation.
This channel is a gold mine :) Keep up the good work.
When you are good you are good, and deeplizard is VERY GOOD. I recommend this course and the Deep Learning classic as an excellent way to get familiar with deep learning, ANNs, and Keras implementation.
Also the text versions on the blog are very good.
Great job!
Great course, thanks guys, you always know what we want 👍👍
Fantastic presentation! Thanks for sharing!!
You are one of a kind. Thanks so much, Mandy ❤️
*For my own reference*
⭐️🦎 COURSE CONTENTS 🦎⭐️
⌨️ (00:00:00) Welcome to this course
⌨️ (00:00:16) Keras Course Introduction
⌨️ (00:00:50) Course Prerequisites
⌨️ (00:01:33) DEEPLIZARD Deep Learning Path
⌨️ (00:01:45) Course Resources
⌨️ (00:02:30) About Keras
⌨️ (00:06:41) Keras with TensorFlow - Data Processing for Neural Network Training
⌨️ (00:18:39) Create an Artificial Neural Network with TensorFlow's Keras API
⌨️ (00:24:36) Train an Artificial Neural Network with TensorFlow's Keras API
⌨️ (00:30:07) Build a Validation Set With TensorFlow's Keras API
⌨️ (00:39:28) Neural Network Predictions with TensorFlow's Keras API
⌨️ (00:47:48) Create a Confusion Matrix for Neural Network Predictions
⌨️ (00:52:29) Save and Load a Model with TensorFlow's Keras API
⌨️ (01:01:25) Image Preparation for CNNs with TensorFlow's Keras API
⌨️ (01:19:22) Build and Train a CNN with TensorFlow's Keras API
⌨️ (01:28:42) CNN Predictions with TensorFlow's Keras API
⌨️ (01:37:05) Build a Fine-Tuned Neural Network with TensorFlow's Keras API
⌨️ (01:48:19) Train a Fine-Tuned Neural Network with TensorFlow's Keras API
⌨️ (01:52:39) Predict with a Fine-Tuned Neural Network with TensorFlow's Keras API
⌨️ (01:57:50) MobileNet Image Classification with TensorFlow's Keras API
⌨️ (02:11:18) Process Images for Fine-Tuned MobileNet with TensorFlow's Keras API
⌨️ (02:24:24) Fine-Tuning MobileNet on Custom Data Set with TensorFlow's Keras API
⌨️ (02:38:59) Data Augmentation with TensorFlow's Keras API
⌨️ (02:47:24) Collective Intelligence and the DEEPLIZARD HIVEMIND
i cant thank you enough for this video. God bless you.
Thank you for this very good tutorial. You are a great teacher and very pleasant to listen to. I learned a ton.
You are right about the part "Deep Learning"
You shouldn't call fit_transform on test_samples at 43:20. You should use the same scaler that was fitted on train_samples.
yea, you only need to transform it
Excellent tutorial. Thank you for the effort. The only drawback is that I get sleepy when I look at your background :-)
the best video ever, easy to understand. Great thanks!
Great work Mandy. I really enjoyed your video. I noticed though that "classes = cm_plot_labels" wasn't defined in the video; hence, my plot of the confusion matrix was somewhat different. I would be glad if you defined it. Thank you.
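On the missing definition above: cm_plot_labels is presumably just a list of class-name strings passed to the plotting function's classes parameter for the axis ticks. A dependency-light sketch of the matrix it labels, with the class names and sample predictions below being assumptions for illustration:

```python
import numpy as np

def confusion_matrix(y_true, y_pred, n_classes):
    """cm[i, j] counts samples whose true class is i and predicted class is j."""
    cm = np.zeros((n_classes, n_classes), dtype=int)
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1
    return cm

cm_plot_labels = ['no_side_effects', 'had_side_effects']  # names assumed, not from the video

y_true = [0, 0, 1, 1, 1, 0]   # toy ground-truth labels
y_pred = [0, 1, 1, 1, 0, 0]   # toy model predictions
cm = confusion_matrix(y_true, y_pred, n_classes=len(cm_plot_labels))
print(cm)  # [[2 1]
           #  [1 2]]
```

The course's plot_confusion_matrix(cm=cm, classes=cm_plot_labels, ...) then just draws this matrix with those strings as the tick labels.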
Clear communicator. Interesting lessons. Good vid
This tutorial amazingly helps me. Thanks!
Thank you very much, Miss Mandy... what a tidy room! Greetings from Peru.
Amazing tutorial! I thought about improving the custom CNN model, and I got it up to 0.8993 val_accuracy. My model:
model = Sequential([
    Conv2D(filters=8, kernel_size=(3,3), activation='relu', padding='same', input_shape=(224, 224, 3)),
    MaxPool2D(pool_size=(2,2), strides=2),
    Conv2D(filters=16, kernel_size=(3,3), activation='relu', padding='same'),
    MaxPool2D(pool_size=(2,2), strides=2),
    Conv2D(filters=32, kernel_size=(3,3), activation='relu', padding='same'),
    MaxPool2D(pool_size=(2,2), strides=2),
    Conv2D(filters=64, kernel_size=(3,3), activation='relu', padding='same'),
    MaxPool2D(pool_size=(2,2), strides=2),
    Conv2D(filters=128, kernel_size=(3,3), activation='relu', padding='same'),
    MaxPool2D(pool_size=(2,2), strides=2),
    Conv2D(filters=256, kernel_size=(3,3), activation='relu', padding='same'),
    MaxPool2D(pool_size=(2,2), strides=2),
    Flatten(),
    Dense(units=2048, activation='relu'),
    Dense(units=2, activation='softmax')
])
I also changed the Adam learning rate from 0.0001 to 0.001 (the default value) and the epochs to 30, and lastly I used all of the included 25000 pictures (9617/animal for training, 1922/animal for validation and 961/animal for testing).
imgur.com/a/DprZDhl
try adding some dropout layers, i think it will help with the overfitting.
Finished this. Liked it very much. Going to your website to find more.
This video deserves 1 million views.
Excellent Course to understand the documentation and practice DL
great now i should learn semantic segmentation
Lol, this video is impossible to learn from. Every 10 minutes I find myself drifting off, daydreaming about the instructor. Gorgeous!
The right time , the right video😁😁
She is really good at teaching
Thank you for this great contribution. On my list to watch soon.
so wonderful, hope you guys keep this up to date
2:09:10 - Espresso is just the coffee. It becomes a cappuccino when you add milk and foam. If it's getting classified as espresso, then I wonder if the original dataset labels were added incorrectly by human editors who didn't know the difference.
This pic has the classic look of a cappuccino. I too was thinking about some mislabeling in dataset. Note, espresso is both the brewed coffee ingredient in combination drinks like cappuccino, latte and cortado, as well as a drink in itself.
You're very talented... thanks for your well-explained instructions.
Very well explained.. Thank you Mandy @deeplizard. Would be great if you can make one such video series on RNN
Now gal gadot teaches you keras for free
Lol
Most beautiful teacher <3
2:29 - as of today (05/05/2023) you need this:
x = mobile.layers[-5].output
x = tf.keras.layers.Flatten()(x)
output = Dense(units=10, activation='softmax')(x)
thank you man, needed that, do you have any explanation for that?
1:16 I am sure I missed something. Still... where do those labels come from? How did it distinguish cats and dogs?
Really appreciate your Hardwork 👍.
At 2:09:04 into the video, the picture 2.png is identified as espresso and not cappuccino. This may be due to the fact that ImageNet does not have a class by the name of cappuccino.
How do I verify that the indices actually correspond to the labels? What if my labels are 3 and 10, for example? How can I be sure that index 0 does not correspond to label 1 and vice versa?
At 2:09, I think the model did a deep learning to see the espresso layer under the cappuccino top.
Running huge blocks of code, like 20 lines of imports and 20-30 lines of folder creation, was a huge inconvenience for me. Those huge blocks of code should have been freely available to people watching this video.
a very nice and clear explanation. Thank you mandy
Just started my journey ❤🎉
So cool to learn data science from Ann Perkins!
Not only was I not distracted by her beautifulness, I was actually able to understand everything she said. Thank you!