👉 Check out the blog post and other resources for this video:
🔗 deeplizard.com/learn/video/dzoh8cfnvnI
👀 Come say hey to us on OUR VLOG:
🔗 ruclips.net/user/deeplizardvlog
Just found this channel 3 days ago. I have already gone through the machine learning fundamentals and have now started the implementation playlist. The incremental approach to teaching is excellent, and the playlist is extremely well ordered. Great job on the effort you guys put in (even including the link to the corresponding video whenever you mention a concept). Keep it up. You've gained a subscriber, and I'm looking forward to the upcoming videos as well.
Thank you very much for this video! It really helped me to see how setting aside a validation set can be done this way, as well as the need to shuffle the data beforehand!
NICE!!! so useful!!! thanks
I have gone through this video/blog several times to fully understand the on-the-fly shuffling and validation process. Great playlist item. Thanks.
Hi Mary, I'm a huge fan of you both. You two helped me a lot in understanding various concepts in AI. I'm so happy to finally see you both!!!
Hi :) I appreciate your work; it's really helpful. Keep doing what you are doing. Thanks.
Hi, your teaching skills are great, and most importantly, your easy explanations of difficult topics make them even better. Thanks for all the effort.
How can I access all 43 videos?
I can see only 21 videos here, and the blog post also lists only 21 videos with text.
You are most welcome, Shalini. The second half of the original TensorFlow course has now been separated out into its own course on neural network deployment, linked below.
deeplizard.com/learn/playlist/PLZbbT5o_s2xrwRnXk_yCPtnqqo4_u2YGL2
@@deeplizard Alright... thanks a ton, and again, your videos are the best 😍
Great lesson!
thank you so much
Amazing content. Thank you!
I've just been doing two scikit-learn splits. I forget why, but a lot of Stack Exchange people were complaining about how validation_data was implemented.
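For reference, the two-split approach usually looks something like this: a minimal sketch using scikit-learn's train_test_split, with hypothetical X/y arrays standing in for a real dataset.

```python
import numpy as np
from sklearn.model_selection import train_test_split

# Hypothetical data standing in for a real dataset
X = np.random.rand(1000, 1)
y = np.random.randint(0, 2, size=1000)

# First split: hold out 20% of the data as the test set
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, shuffle=True, random_state=42)

# Second split: carve a validation set out of the remaining training data
X_train, X_val, y_train, y_val = train_test_split(
    X_train, y_train, test_size=0.1, shuffle=True, random_state=42)

# The validation set can then be passed to Keras explicitly:
# model.fit(X_train, y_train, validation_data=(X_val, y_val), ...)
```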
Do you guys have any videos on distributional reinforcement learning?
Not at the moment
Please start a video series on BERT, XLNet, and other SOTA NLP models!
Hi, I am a little confused about the validation set. Do we need to manually create a validation set from the training set by removing, say, 10% of the training data and putting it into a validation dataset before training begins? Or does Keras automatically do it for us after we declare a validation_split in model.fit? Or do both have to be done before training begins?
You can do it either way. In this episode, we focused on the latter way of doing it (Keras splitting the validation set out for us). In the corresponding blog, there is further discussion for how to do it the first way (manually creating a validation set prior to training).
deeplizard.com/learn/video/dzoh8cfnvnI
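For anyone skimming the replies, here is a minimal sketch of both options. It assumes a compiled model and the scaled_train_samples/train_labels arrays used in this series; the scaled_valid_samples/valid_labels names for the manually created validation set are hypothetical.

```python
# Option 1: let Keras split the validation set out for us.
# Note: validation_split takes the LAST fraction of the data as given
# (before fit's own shuffling), so shuffle the data beforehand.
model.fit(x=scaled_train_samples, y=train_labels,
          validation_split=0.1, batch_size=10, epochs=30, verbose=2)

# Option 2: create the validation set manually before training
# and pass it in explicitly via validation_data.
model.fit(x=scaled_train_samples, y=train_labels,
          validation_data=(scaled_valid_samples, valid_labels),
          batch_size=10, epochs=30, verbose=2)
```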
@@deeplizard Yes, thanks. I saw it. I really enjoy watching your videos and have greatly benefited from them. Thanks so much for sharing all this knowledge.
From Singapore.
Can you also start a computer vision playlist covering GANs and Transformers?
I love you so much
Sorry, nerd question: what is the hex value after "History at..." in the output? Is that a memory address? If so, I'm just curious as to why Keras would include such low-level info.
Ha, I've never looked into it :D
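For the curious: that hex value isn't something Keras adds deliberately. model.fit returns a History object, and since the class doesn't define its own string representation, printing it falls back to Python's default repr, which includes the object's memory address in CPython. A small illustration, assuming a compiled model and training data as above:

```python
history = model.fit(x=scaled_train_samples, y=train_labels,
                    validation_split=0.1, epochs=5, verbose=2)

print(history)          # default repr, e.g. <...History object at 0x7f...>
print(history.history)  # the useful part: dict of per-epoch loss/metric values
```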
WOW
k-fold cross-validation? Maybe in a future vid?
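Not covered in the series yet, but roughly, k-fold cross-validation trains and evaluates the model k times, each time holding out a different fold as the validation set. A rough sketch using scikit-learn's KFold around a Keras model; build_model is a hypothetical helper that returns a freshly compiled model (with metrics=['accuracy']), and the data arrays are the ones from this series.

```python
import numpy as np
from sklearn.model_selection import KFold

kfold = KFold(n_splits=5, shuffle=True, random_state=42)
val_accuracies = []

for train_idx, val_idx in kfold.split(scaled_train_samples):
    # Re-create the model so each fold trains from fresh weights
    model = build_model()
    model.fit(scaled_train_samples[train_idx], train_labels[train_idx],
              epochs=30, verbose=0)
    # evaluate returns [loss, accuracy] when compiled with metrics=['accuracy']
    loss, acc = model.evaluate(scaled_train_samples[val_idx],
                               train_labels[val_idx], verbose=0)
    val_accuracies.append(acc)

print(f"Mean validation accuracy: {np.mean(val_accuracies):.3f}")
```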
just four in joy
She is very very attractive
So, um, whenever I run the model, it basically starts at 97% accuracy. Why is that? My guess was that it's because the model is still defined in the Jupyter notebook, but I restarted the kernel and it still happens.
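One common cause of this (though a kernel restart should normally rule it out): calling model.fit again on the same model object continues training from the already-learned weights rather than starting over. Rebuilding and recompiling the model gives a fresh start; a sketch with a hypothetical architecture along the lines of this series:

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

def build_model():
    # A fresh model has newly initialized (random) weights;
    # re-running fit() on an existing model resumes from its current weights.
    model = Sequential([
        Dense(units=16, input_shape=(1,), activation='relu'),
        Dense(units=2, activation='softmax'),
    ])
    model.compile(optimizer='adam',
                  loss='sparse_categorical_crossentropy',
                  metrics=['accuracy'])
    return model

model = build_model()  # call again whenever a clean training run is needed
```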