Check out our premium machine learning course with 2 Industry projects: codebasics.io/courses/machine-learning-for-data-science-beginners-to-advanced
Man, your courses taught me to lose my ego and to live for the people. Thank you so much for this. Making these has to be a blessed deed.
By watching your whole deep learning series I have cleared a bunch of my doubts.
Your way of making any topic so simple is really amazing.
Hats off to your teaching methodology.
A heartfelt request:
Please upload an NLP series (including LSTM and Encoders/Decoders) with a good project.
I am eagerly waiting.
Yes, an NLP series is in my plans
@@codebasics sir, when will you be starting that series?
How can we test the model after training it? Please tell me.
@@codebasics how can we test the model after training it? Please tell me.
A very humble request: please add at least 5-6 more videos in this series.
With a clear example you explained transfer learning so beautifully. Loved your tutorial. Thank you so much, sir.
Wow, such a great technique to train models. Thank you, sir, for making it easy for us.
Your way of explaining complex issues in such an easy and simple way is truly amazing. Thank you!
For those who are getting this error:
Only instances of keras.Layer can be added to a Sequential model. Received: (of type )
try this:
pip install tf_keras, then import tf_keras,
and replace every 'tf.keras' with 'tf_keras'
I. LOVE. YOU
Even though this is two years old it is still very well taught. Thank you!
Getting an error while using the pretrained model:
ValueError: Only instances of `keras.Layer` can be added to a Sequential model. Received: (of type )
Please help.
Have you been able to solve this error?
That example with the flower identified as an umbrella got me thinking. I applied your model to my data. I have pictures of an object with rust and blemishes on them. The model identified a few pictures as a snake of some sort or a hyena. If I do a case study of a bunch of pictures in my data and feed them to the model, I should be able to program into it the actual appropriate identity once the model has ascertained it's a snake, hyena or whatever.
Excellent course. This is my "Go To" resource for Machine Learning and Deep Learning.
Please think of creating a series of projects that can make one grow their skills. Keep the price affordable.
First install tf_keras with pip install tf_keras, then import it and use it in place of tf.keras:

import tf_keras

IMAGE_SHAPE = (224, 224)
classifier = tf_keras.Sequential([
    hub.KerasLayer(model_link, input_shape=IMAGE_SHAPE + (3,))
])

Replace tf.keras with tf_keras everywhere. It is working for me.
Woahh!! Thank you very much for this!!
thanks!! it worked
I don't have words to thank you for this deep learning series; you have explained every single topic in simple language.
Thanks a ton :) :) Keep uploading such series :)
👍☺️
How can we test the model after training it? Please tell me.
@@codebasics sir, the code is not there on GitHub; it shows an error. Has it been moved from GitHub?
Your teaching style is outstanding
Wow! What a simple and clear illustration of Transfer Learning. Thank you!
Really helpful... you are the best teacher for deep learning, and also for machine learning... thank you so much... I'm a big fan of yours...
😊🙏
OMG, you have cleared a lot of my doubts here... superbly explained... thank you so much, sir. Please contribute more towards RNNs like this too.
Sure I will
Really great lecture. Keep helping people like that. Thank you so much.
My pleasure
@@codebasics how can we test the model after training it? Please tell me.
Amazing explanation, very easy to understand
no words to praise, the session is superb , thanks a lot for sharing your knowledge
Hi, thanks for the great explanation as always. I wanted to ask about model ensembling or stacking across different datasets: developing one single model trained on 2 separate datasets, each with its own separate set of features (which do not overlap with each other). Is it possible, and are there any examples already done? Can you please shed some light on this? Thank you so much in advance.
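Editor's note: one common answer here is feature-level fusion, which only works if both datasets describe the same samples (joined on a shared key), with each contributing a disjoint feature set. A minimal numpy sketch of the fusion step, with hypothetical shapes:

```python
import numpy as np

# Hypothetical: the SAME 4 samples described by two disjoint feature sets.
rng = np.random.default_rng(0)
feats_a = rng.random((4, 3))   # dataset A: 4 samples, 3 features
feats_b = rng.random((4, 5))   # dataset B: same 4 samples, 5 different features

# Fuse per-sample features into one matrix, then train a single model on it.
combined = np.concatenate([feats_a, feats_b], axis=1)
print(combined.shape)  # (4, 8)
```

If the two datasets cover different samples entirely, concatenation does not apply; that scenario usually calls for training separate models and ensembling their predictions instead.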
In the last layer of the trained model, though a softmax layer is being used, how are we getting some of our prediction values to be greater than 1?
Very Lucid Explanation
Wouldn't it affect the ability to recognize images other than these flowers, because now the last layer has been trained with a smaller dataset? Like recognizing a chair or a bike.
Can you show how prediction is done after transfer learning? I mean, after retraining the model with flower images, please show how to check the results. I tried but couldn't do it. Thanks a lot.
Hi sir, how can I remove the last layer without using the feature_vector model? What I mean is: how do I remove the last layer from the classification model (mobilenet_v2) and freeze the trained layers?
This is something new I learned!!!
Thanks mate... it helped.. Love from London..
Glad it helped
thanks, very easy to follow
Is it possible to do transfer learning for an ANN? Any reference available?
I could only see examples for CNNs. Is there any reason behind that?
Nice video, sir. I just want to add: you should use the softmax activation function in the last layer directly... please.
VERY GOOD EXPLANATION
Thanks for liking
Thank you so much sir. This was extremely helpful!
Glad it helped!
Can you please analyse the HAM10000 dataset? This dataset is highly imbalanced and it also contains metadata in a CSV file, so I am a bit confused.
It's a humble request.
Hello, why are there so many MobileNet model variants on TensorFlow Hub? Like mobilenet_v2_50_192, mobilenetv2_035_96, mobilenetv3_large_100_224, and so many other variants. What is the difference between these?
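Editor's note: the suffixes encode the architecture version, the width (depth) multiplier, and the input resolution; mobilenet_v2_50_192, for example, is MobileNetV2 with a 0.50 width multiplier at 192x192 input, trading accuracy for speed. A hypothetical parsing sketch of that naming convention:

```python
def parse_variant(name):
    # e.g. "mobilenet_v2_50_192" -> width multiplier 0.50, 192 px input
    stem, mult, res = name.rsplit("_", 2)
    return int(mult) / 100.0, int(res)

print(parse_variant("mobilenet_v2_50_192"))        # (0.5, 192)
print(parse_variant("mobilenetv2_035_96"))         # (0.35, 96)
print(parse_variant("mobilenetv3_large_100_224"))  # (1.0, 224)
```

Smaller multipliers and resolutions give faster, less accurate models; the 100_224 variants are the full-size reference configurations.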
@codebasics this is the greatest video on transfer learning I have ever seen; usually people load the dataset from TensorFlow directly instead of from the local disk. You load it from the local disk, which is a great thing.
I just have one question: what did you append in "y", the class name or the class ID?
You predicted using a classifier that was pre-trained (in which you got inaccurate results), but I wanted to see how we can give one input image and check the output using the new classifier.
What if I want to classify 1005 classes with freezing, what do I need to do?
Thank you, sir, for the wonderful tutorials.
I am following all the steps in this video as well as the previous episode (26), but I get an error when I train the model; since I am new, I could not figure out how to fix it:
ValueError: Data cardinality is ambiguous:
x sizes: 2752
y sizes: 918
Make sure all arrays contain the same number of samples.
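Editor's note: model.fit raises this whenever len(x) != len(y). Without seeing the code, one common cause in a data-loading loop like this episode's is using list.extend where append was meant, which splits each image into its rows and inflates the sample count. A minimal hypothetical reproduction:

```python
import numpy as np

# Hypothetical stand-in: each "image" is a small H x W x C array.
images = [np.zeros((2, 2, 3)) for _ in range(4)]

X_bad, X_good = [], []
for img in images:
    X_bad.extend(img)    # wrong: iterates the image, adding each ROW as a sample
    X_good.append(img)   # right: one list entry per image

print(len(X_bad), len(X_good))  # 8 4
```

With 4 images, the buggy list holds 8 "samples" while the labels list holds 4, which is exactly the kind of mismatch the error reports.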
From where did you download that classes list?
The model performance when using the 'feature vector' is increased, but when I try to predict([X[0], X[1], X[2]]), it performs badly. Why?
Sir, you have taught so much for free that I have not learnt in any of the paid courses.
I am trying to use ML and AI in financial instruments for trading and investing, but due to unfamiliarity with the ML and AI world I am not able to figure out where to start and what is relevant for me. Please guide me.
Sir, please make more videos on transfer learning. I think this is not sufficient to use transfer learning easily 🙏
Sir, how do I search TensorFlow Hub for trained models? Please point me to the same ones you used.
So let me see if I understand: we froze all the weights/biases/filters for all the layers except the final one. So when we say we are "training" the model with our flower dataset, we are only optimizing the weights and biases of the final layer (i.e., the input to the softmax function)?
How do you get past the hub KerasLayer part? I have been getting a ValueError.
I need more videos on transfer learning sir
Thank you so much!!! 👏
Thank you, can transfer learning be done for time series data?.
Yes, but I’d imagine it does not have advantages over LSTM or transformer
It shows flowers as umbrellas and so on. Then is using MobileNet good or bad?
Sir... can you please show a CNN on the UCI HAR dataset? It will be really helpful.
YOU ARE A LEGEND! Thanks for helping me understand how transfer learning works!
I just had one question... I understood everything but this piece of code:
model.compile(
    optimizer="adam",
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=['acc'])
Where did you get this? Do I use this with any model I use for transfer learning?
This has nothing to do with transfer learning specifically. Whenever you train a model in deep learning you need a compile step like this.
How can we test the model after training it? Please tell me.
@@codebasics how can we test the model after training it? Please tell me.
@@bagheerathan8975
model.evaluate(X_test, y_test)
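Editor's note: model.evaluate gives aggregate loss and accuracy; for per-image predictions you would call model.predict and take the argmax over the class axis. A sketch of just the argmax step, with a hypothetical predictions array standing in for model.predict output:

```python
import numpy as np

# Hypothetical model output: scores for 3 images over 5 flower classes.
preds = np.array([[0.1, 3.2, 0.5, 0.2, 0.1],
                  [2.9, 0.1, 0.3, 0.4, 0.2],
                  [0.2, 0.1, 0.2, 4.1, 0.5]])

class_ids = np.argmax(preds, axis=1)  # index of the highest score per image
print(class_ids)  # [1 0 3]
```

The resulting indices can then be mapped back through the same class-name dictionary used when building y during training.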
Do you have any tutorial where you have shown a regression problem solved using deep learning?
Why does the YOLO algorithm take hours to train for object detection even when using transfer learning?
Thank you. Helpful.
Hi sir, can we also get the PPTs you used to explain the concepts in the deep learning and machine learning tutorials?
Hey, I really liked your videos; you explain everything in quite a simple way. But can we use transfer learning on a non-image dataset? I have a dataset composed of strings that I transformed into arrays, and I want to make predictions using those arrays. Does transfer learning help in this case?
Thanks a lot
Yes, transfer learning is not restricted to only images; you can use it for other problems too.
@@codebasics ah great, thanks for your reply. Please, can you help with some documentation or instructions on how to do it? I'll be so grateful.
@@yassmingourya9975 please, did you find documentation for using transfer learning on a non-image dataset? I need to do it but can't find anything useful. Thank you.
Why didn't we use an activation function in the last dense layer?
Thanks a lot; I appreciate your way of explaining deep learning 😊
I would like to ask about image classification using transfer learning: if I have a dataset of images of size 100x100, and the image size in the pre-trained model is 300x300, a bigger size, how do I make my smaller original images fit that pre-trained model?
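Editor's note: the usual answer is to upsample the 100x100 images to the model's expected input, typically with tf.image.resize or PIL's Image.resize. As a dependency-free illustration of the idea, here is a nearest-neighbor resize in plain numpy (a sketch, not what you'd use in production):

```python
import numpy as np

def resize_nearest(img, new_h, new_w):
    # For each target pixel, pick the nearest source pixel (nearest-neighbor).
    rows = (np.arange(new_h) * img.shape[0]) // new_h
    cols = (np.arange(new_w) * img.shape[1]) // new_w
    return img[rows][:, cols]

small = np.zeros((100, 100, 3), dtype=np.uint8)  # a hypothetical 100x100 RGB image
big = resize_nearest(small, 300, 300)
print(big.shape)  # (300, 300, 3)
```

Upsampling adds no new detail, so accuracy may suffer compared to native-resolution data; picking a pretrained variant with a smaller input size (e.g. a 96 or 128 px MobileNet) is often the better fit.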
Wonderful explanation!!
Can we apply transfer learning the same way for a text sentiment model?
Nice video, keep going. So what would you suggest as a model from TF Hub to train on signals or spectrograms?
Please tell us how we can save this model after transfer learning!
Please, teacher, could you give me a master's topic proposal about clustering images?
Thank you very much.
Glad it was helpful!
Hi, great tutorial! Quick question: when training the single dense layer around minute 23:30, why is there no activation function (e.g. softmax) applied to the final, trainable dense layer?
Keras's Dense layer defaults to a linear activation (i.e. none), so the layer outputs raw logits; the softmax is effectively folded into the loss via from_logits=True in SparseCategoricalCrossentropy.
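Editor's note: the Dense layer's default activation is linear, so the model outputs raw logits, and from_logits=True makes the loss apply the softmax internally. A numpy check that cross-entropy from raw logits equals cross-entropy from softmax probabilities:

```python
import numpy as np

logits = np.array([2.0, 1.0, 0.1])  # raw outputs of a linear Dense layer
true_class = 0

# Path 1: apply softmax, then take negative log-likelihood of the true class.
probs = np.exp(logits) / np.exp(logits).sum()
loss_from_probs = -np.log(probs[true_class])

# Path 2: compute directly from logits (what from_logits=True does, more stably).
loss_from_logits = np.log(np.exp(logits).sum()) - logits[true_class]

print(np.isclose(loss_from_probs, loss_from_logits))  # True
```

The two paths are mathematically identical; the logits path is preferred numerically, which is why leaving off the softmax and setting from_logits=True is a common Keras pattern.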
Hello! I need help. Specifically, I followed your code, but when I bring that model into my project to deploy it, load_model does not work. OK, I understand it doesn't save the complete model, but I want it to save only the part I changed (the output layer), so when I load it I need custom_objects={'KerasLayer': hub.KerasLayer}. The problem is that when I use that, my model stops working after a period of time. As I understand it, loading the model this way downloads a temporary folder containing the model's structure (some .pb and .pbtxt files); this folder gets partially deleted by something after about a week, which causes an error, and to use the model again you have to delete the folder and let it download a fresh copy. I don't want my program to crash every once in a while. Do you have any way around this?
In addition, I have found a way to fix it, but it is not a good one. I use "from tensorflow.keras.applications import " to load the base models and add my classes directly to them. That works like a self-built model, so you don't have to reload the KerasLayer every time you use it (which caused my error). However, this way is quite resource-intensive. Do you have a better alternative?
Thanks for reading.
Looking forward to hearing from you soon.
You need a standing ovation :-)
Hi sir, I watch your videos every day.
But with the same dataset I am getting a loss of 1.6 and accuracy close to 0.21, with transfer learning as well as with the CNN.
So please tell me how I can improve them.
Thank you sir
Welcome
Since we have 1000 classes, why do we get (1, 1001) when executing result.shape?
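Editor's note: if I'm not mistaken, the ImageNet label file used with these TF-Hub classifiers has 1001 entries, a "background" class at index 0 plus the 1000 real classes, which is why the output is 1001-wide. A sketch with placeholder label names (the real names come from ImageNetLabels.txt):

```python
# Placeholder list mirroring the label file: "background" first, then 1000 classes.
imagenet_labels = ["background"] + [f"class_{i}" for i in range(1000)]
print(len(imagenet_labels))  # 1001

pred_index = 417  # hypothetical argmax of the (1, 1001) model output
label = imagenet_labels[pred_index]
```

The practical upshot: index the model's argmax directly into the downloaded 1001-entry label list rather than into a 1000-class list shifted by one.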
Sir, can we also use this method for the leaf disease project, as an extension?
When we do pupil detection (what I need is to localize the pupil center), are we using classification, like left eye vs right eye, or something else?
Please reply.
Are you doing object detection? Have you tried yolo etc?
Sir, the tutorial is well made. But can you please tell us how to add batch size and other preprocessing steps to this code?
Can you please suggest a trained model for an AI-based picture translation project?
Hi, is it possible to convert an image of 3 dimensions into an image of 4 dimensions?
Sir, how can we apply transfer learning to numerical data?
Hey, can you teach us how to do real-time neural network prediction using OpenCV? It is very useful in computer vision.
What version of tensorflow_hub is this?
holy fuck that might save me a lot of fucking trouble thank you so much for this video and for this code
Very informative, sir. Can you explain how to combine 2 or more pretrained models?
How and why would someone do that?
What methods are available to improve the accuracy of custom models, other than trial and error?
Try Keras Tuner.
Could this be considered a convolutional neural network?
Sir, how many more videos will there be?
Sir, is the Python project of the Grocery Application completed?
How do we know if it's working at the end, after we run the 5 epochs?
amazing🙃
thank you so much bhaiya
I'm getting a ValueError when trying to wrap the hub KerasLayer; how were you able to get past that?
How can I increase the running speed of the epochs in Jupyter Notebook for the flower dataset classification?
Buy a GPU instead.
Which classifier are you using?
Hey, how can I download MobileNet V2?
Thanks a lot sir
👍😊
Hello, how can I use transfer learning on 1-dimensional data?
model.fit(X_train_scaled, y_train, epochs=5)
ValueError: Failed to convert a NumPy array to a Tensor (Unsupported object type list).
The error occurs at this line.
model.fit needs proper numpy arrays, not lists of arrays. So try running this first:
X_train_scaled = np.array(X_train_scaled)
y_train = np.array(y_train)
Wow!!!!!!! If I had the option, I would subscribe (1000000000000000000000000000.................... I don't know how many zeros I would add) times.
Thanks for your kind words of appreciation Atiqur.
While I am running classifier.predict it says my kernel is dead.
Please let me know the solution.
Can you please add the fish dataset link
What was the need of adding the 1 and the 3 to the dimensions?
Same doubt; maybe it's an index or count?
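Editor's note: the (3,) appended to IMAGE_SHAPE is the RGB channel count, and the leading 1 in shapes like (1, 1001) is the batch dimension (one image per batch). A quick tuple sketch:

```python
IMAGE_SHAPE = (224, 224)
input_shape = IMAGE_SHAPE + (3,)    # (height, width, RGB channels)
print(input_shape)                  # (224, 224, 3)

batched_shape = (1,) + input_shape  # models expect a leading batch dimension
print(batched_shape)                # (1, 224, 224, 3)
```

That is also why single images get wrapped in an extra axis (e.g. via img[np.newaxis, ...]) before being passed to predict.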
TensorFlow Hub is not compatible with the latest TF versions.
Is there no way around it anymore?
These are the params I have been getting from the model:
Model: "sequential_1"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type) ┃ Output Shape ┃ Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ lambda_1 (Lambda) │ (None, 1280) │ 0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense) │ (None, 5) │ 6,405 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
Total params: 6,405 (25.02 KB)
Trainable params: 6,405 (25.02 KB)
Non-trainable params: 0 (0.00 B)
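Editor's note: those 6,405 trainable params check out as the single dense head mapping the 1280-dim MobileNet feature vector to 5 flower classes (a weight matrix plus a bias vector):

```python
feature_dim, num_classes = 1280, 5
params = feature_dim * num_classes + num_classes  # weights + biases of the Dense head
print(params)  # 6405
```

The frozen base shows up as the zero-parameter Lambda/KerasLayer row, which is exactly what "training only the last layer" looks like in a model summary.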