Transfer Learning | Kaggle
- Published: Oct 4, 2024
- The 4th video in the deep learning series at kaggle.com/learn/deep-learning
About Kaggle:
Kaggle is the world's largest community of data scientists. Join us to compete, collaborate, learn, and do your data science work. Kaggle's platform is the fastest way to get started on a new data science project. Spin up a Jupyter notebook with a single click. Build with our huge repository of free code and data. Stumped? Ask the friendly Kaggle community for help.
This dude is a good dude
this_comment_acc: 0.955
and you couldn't stand it and disliked it ?
How could you dislike arjen robben talking about machine learning
why give him a dislike man
Dan is always number 1 in DL and every AI-related field. I really appreciate the efforts he is deploying to transfer knowledge to everyone, even overseas. Thank you, Dan
0:55 Early layers of a deep learning model identify simple shapes, later layers identify more complex visual patterns, and the very last layer makes predictions. So most layers (the early ones) from a pre-trained model are useful in a new application, because most computer vision problems involve finding similar simple shapes. So we reuse most of the pre-trained model, replacing the last layer that makes the final classification, and train that new last layer alone... I guess
Yeah, you're right: the early layers are more generic (common) features; the last layers, however, are more and more specific to the original training data. Thus the need to replace the final layers.
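The idea in the two comments above can be sketched in Keras: keep the pre-trained convolutional base, drop its original 1000-class head, freeze the base, and train only a new final layer. This is a minimal sketch, not the exact course notebook; `num_classes = 2` (urban vs rural) and the optimizer choice are assumptions, and `weights=None` is used here only to keep the sketch light — in practice you would pass `weights='imagenet'` so the base actually carries pre-trained knowledge.

```python
from tensorflow.keras.applications import ResNet50
from tensorflow.keras.layers import Dense
from tensorflow.keras.models import Sequential

num_classes = 2  # hypothetical: e.g. urban vs rural

base = ResNet50(include_top=False,       # drop the original 1000-class head
                pooling='avg',
                input_shape=(224, 224, 3),
                weights=None)            # use weights='imagenet' for real transfer
base.trainable = False                   # freeze the generic early layers

model = Sequential([
    base,
    Dense(num_classes, activation='softmax'),  # the only part that gets trained
])
model.build(input_shape=(None, 224, 224, 3))
model.compile(optimizer='sgd',
              loss='categorical_crossentropy',
              metrics=['accuracy'])
```

With the base frozen, `model.fit` only updates the new Dense layer's kernel and bias.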
Kaggle machine learning lessons are really good, I love that!
Great! Can you do a video on image similarity using a Siamese network on a custom dataset? Thanks
Great explanation. Totally loved it
Awesome
Which would be the best architecture for medical image classification for skin diseases?
BungholeNet
Explanation of details. Yes!
Is it possible to handle transfer learning problems without deep learning?
I have a question regarding the exercise for this lesson. How did the model know what images were urban and which were rural, if all the images were mixed in the same directory? By this I mean, how did we include the label for each one of them, to make the supervised learning?
Edit: I found out. You specify a bigger directory which contains subfolders, and the model interprets each of them as a different class
Too much technology for you
@trexmidnite ?
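The labeling mechanism asked about above can be demonstrated directly: Keras's `flow_from_directory` treats each subfolder of the given directory as one class. This sketch builds a tiny stand-in directory tree (the `rural`/`urban` folder names and dummy images are assumptions, just to make it runnable) and shows the classes it infers.

```python
import os
import tempfile

import numpy as np
from PIL import Image
from tensorflow.keras.applications.resnet50 import preprocess_input
from tensorflow.keras.preprocessing.image import ImageDataGenerator

# Build a tiny stand-in directory tree: one subfolder per class.
root = tempfile.mkdtemp()
for label in ('rural', 'urban'):
    os.makedirs(os.path.join(root, label))
    Image.fromarray(np.zeros((224, 224, 3), dtype=np.uint8)).save(
        os.path.join(root, label, 'example.jpg'))

data_generator = ImageDataGenerator(preprocessing_function=preprocess_input)
train_gen = data_generator.flow_from_directory(
    root,                      # directory whose subfolders name the classes
    target_size=(224, 224),    # ResNet50's expected input size
    class_mode='categorical')

print(train_gen.class_indices)  # {'rural': 0, 'urban': 1}
```

So no explicit label file is needed: the folder names are the labels, assigned indices in alphabetical order.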
Does this mean that if we are not training the first layers, then we are not training any layer except the last one (the one we just added)?
Can anyone tell me whether this makes sense: I get better results with all layers trainable = True. If so, what is the purpose of freezing?
The purpose of freezing is to keep the values of the weights. If you set all layers as trainable = True, you are going to lose the previous learning of the model.
The ResNet50 is trained on a lot of data for a lot of features. Setting all layers as trainable will make you lose that information. Also, 72 images are not enough to train a model like ResNet50! It will most definitely overfit.
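What freezing does mechanically can be shown on a tiny stand-in model rather than ResNet50 itself (the layer sizes and names below are made up for illustration): setting `trainable = False` on a layer excludes its weights from gradient updates, so `fit` cannot overwrite them.

```python
import numpy as np
from tensorflow.keras import Input
from tensorflow.keras.layers import Dense
from tensorflow.keras.models import Sequential

model = Sequential([
    Input(shape=(4,)),
    Dense(8, name='pretrained_base'),          # stands in for the pretrained layers
    Dense(2, activation='softmax', name='new_head'),
])

model.get_layer('pretrained_base').trainable = False  # freeze the "base"
model.compile(optimizer='sgd', loss='categorical_crossentropy')

n_frozen = int(sum(np.prod(w.shape) for w in model.non_trainable_weights))
n_trainable = int(sum(np.prod(w.shape) for w in model.trainable_weights))
print(n_frozen, n_trainable)  # 40 18 -> only the head's 18 weights get updated
```

With only 72 training images, restricting learning to those 18-odd head parameters (thousands, in the real ResNet50 case) is exactly what keeps the model from overfitting.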
I like this guy
Great tutorial
WTF this is my first time seeing a video with no dislike.
How do we separate the data in the training set and validation set into the two categories, i.e.c, urban and rural? There was no step shown that indicated that the model knew which pictures were urban and which were rural.
I know this is way too late but you need to create subfolders in the train and val folder and it will automatically use them
@PlayaBurger Thanks!
This man barely blinks
Great explanation 👍
Kaggle, please note: during my transfer learning process, I faced an issue with the ResNet50 file path; it wasn't able to read file paths.
Go to Kernel settings (right side of the kernel), and switch on 'Internet'. Hope this helps. Cheers :)
@aseem-pandita actually, I didn't load the resnet50 files.
😅😅😅
BiryaniNet works better
'ResNet50 is not found'
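The error discussed above comes from the fact that `weights='imagenet'` downloads the weight file on first use, which fails in an offline kernel. A hedged sketch of the two options: enable Internet in the kernel settings (as suggested above), or pass a local `.h5` weights path instead — the Kaggle dataset path in the comment below is hypothetical.

```python
from tensorflow.keras.applications import ResNet50

def load_resnet50(weights='imagenet'):
    """Load ResNet50 without the classification head.

    weights='imagenet' requires Internet on first use (it downloads the
    weight file). In an offline kernel, pass the path to a local .h5
    file instead, e.g. one attached as a Kaggle dataset (hypothetical):
    '../input/resnet50/resnet50_weights_tf_dim_ordering_tf_kernels_notop.h5'
    """
    return ResNet50(include_top=False, weights=weights)
```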
What happens if you just add those two (or more) new classes to the existing classification layer instead of replacing it completely?
We actually remove the last layers during transfer learning because they learn features that are specific to the original training set. The early layers of our network detect common features such as patterns, blobs, etc., so it makes sense to keep those and replace only the dataset-specific final layers before classification.
from keras.preprocessing.image import ImageDataGenerator
data_generator = ImageDataGenerator(preprocessing_function=preprocess_input)
The above code raises:
NameError: name 'preprocess_input' is not defined.
How to fix this??
from tensorflow.keras.applications.resnet50 import preprocess_input
(Use the public tensorflow.keras path rather than tensorflow.python.keras, which is a private module.)
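Putting the fix together: the `NameError` goes away once `preprocess_input` is imported from the ResNet50 application module before it is referenced.

```python
from tensorflow.keras.applications.resnet50 import preprocess_input
from tensorflow.keras.preprocessing.image import ImageDataGenerator

# preprocess_input is now defined, so the original line works unchanged:
data_generator = ImageDataGenerator(preprocessing_function=preprocess_input)
```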