Thanks for watching everyone! I hope you enjoy learning from the examples in this course :)
What are the prerequisites for this video?
Excellent session! Thank you for covering every topic and showing practical implementation of LSTM.
Hi, I am very excited for this video, you are a very good teacher.
@@mfaiz6 My personal opinion, but I would say you should have some level of knowledge of working with Python: be somewhat comfortable looping and iterating through data structures like dictionaries, lists, and arrays, writing functions for basic tasks, and printing/writing to the console. You should also have basic familiarity with numpy arrays and pandas dataframes. From there, you can learn the specific things you need by searching for whatever you don't know via Google or DDG as you go!
Damn, you're so cool.
the way she explained backprop is so mind blowing! loved it
This is exactly what I was searching yesterday! You're amazing! Thanks for this tutorial. :)
20 minutes in and I am all in. I teach students ML and Data Science, and I keep studying the same myself. The young lady in the video covered all the necessary basics, and did it so well I might end up suggesting this video to my students on multiple occasions. And yeah, at the end of this video, I am going to her channel and subscribing. Keep up the good work!
⭐ Course Contents ⭐
⌨ (0:00:00) Introduction
⌨ (0:00:34) Colab intro (importing wine dataset)
⌨ (0:07:48) What is machine learning?
⌨ (0:14:00) Features (inputs)
⌨ (0:20:22) Outputs (predictions)
⌨ (0:25:05) Anatomy of a dataset
⌨ (0:30:22) Assessing performance
⌨ (0:35:01) Neural nets
⌨ (0:48:50) Tensorflow
⌨ (0:50:45) Colab (feedforward network using diabetes dataset)
⌨ (1:21:15) Recurrent neural networks
⌨ (1:26:20) Colab (text classification networks using wine dataset)
Course created by Kylie Ying
A reinforcement learning course please, please, please, I really need it! You're so amazing at simplifying things and making them understandable.
[04:39] Just to be clear, `NaN` is not a "none-type value" indicating that "no value [was] recorded [there]"; that would be `undefined`. It stands for "not a number" and is the result returned from trying to do an operation that can only be done on an Int/Float (or something that will be coerced into an Int/Float) on a value that isn't an Int/Float; e.g., `4 * "dog"` in JS will return `NaN`. It means you tried to do something with a number that is irrational to do with a number. Another JS example: zero divided by zero.
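For anyone following along in Python rather than JS, the same semantics can be poked at directly (a small sketch, numpy used only for the 0/0 case). Worth adding that pandas does, by convention, also use NaN as its missing-value marker, which is likely what the video meant:

```python
import math
import numpy as np

nan = float("nan")
assert math.isnan(nan)
assert nan != nan  # NaN compares unequal to everything, itself included

# Undefined float operations produce NaN (numpy warns instead of raising):
with np.errstate(invalid="ignore"):
    z = np.float64(0.0) / np.float64(0.0)
assert math.isnan(z)
```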
It is really good. I am halfway through and it keeps you engaged and learning at the same time. Great job Kylie.
finally!! i have finally understood everything after a month of struggling to do so. thank you sooo much
great content.
explained in layman terms without wasting time 👌🏻
I find your tutorial very interesting, very clear, and very convincing. My question: is there also a tutorial that shows the practical application of the model you created? I would like to learn more about how this model can be practically used for evaluating and analysing new data.
Your way of explaining is so good. This was the first video I watched on neural networks and I am already in love with it.
Tutorials that go from start to finish from data to model *and* explain the surrounding concepts and theory.. those are good.
Maybe I should start including code too.. 🤔
That was so well-explained and practical! Looking forward to more of these on other types of machine learning models! Thank you!
You are so awesome! this is I am searching for! it is really help a lot! Thank you all you hard work and precious time!
This was a great video. My only questions from it would be:
1) How would you set these projects up outside of colab?
2) How do we utilize the model?
There are some conceptual errors in the tutorial. Scaling the data before splitting means the training dataset is informed by data from the test set, which it is not supposed to know about. Random oversampling prior to the split might also overestimate the performance of the model on the test dataset because of data duplication/leakage. In general, it's best to keep the test data separate before augmenting the training data.
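Agreed. A leakage-free ordering can be sketched like this (pure numpy on toy data, not the course's actual code; the array sizes and the hand-rolled scaler/oversampler are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the diabetes data: 100 rows, 3 features, exactly 20% positives.
X = rng.normal(size=(100, 3))
y = (np.arange(100) % 5 == 0).astype(int)

# 1) Split FIRST, so the test rows are never touched again until evaluation.
idx = rng.permutation(len(X))
train_idx, test_idx = idx[:80], idx[80:]
X_train, X_test = X[train_idx], X[test_idx]
y_train, y_test = y[train_idx], y[test_idx]

# 2) Fit scaling statistics on the training set only...
mean, std = X_train.mean(axis=0), X_train.std(axis=0)
X_train = (X_train - mean) / std
X_test = (X_test - mean) / std  # ...then reuse the SAME stats on the test set.

# 3) Oversample the minority class within the training set only.
minority = np.flatnonzero(y_train == 1)
n_extra = (y_train == 0).sum() - len(minority)
extra = rng.choice(minority, size=n_extra)
X_train = np.concatenate([X_train, X_train[extra]])
y_train = np.concatenate([y_train, y_train[extra]])
```

This mirrors what `train_test_split`, a `StandardScaler` fitted on the training set, and imbalanced-learn's `RandomOverSampler` applied to the training set only would do in the sklearn ecosystem.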
@21:04 when Kylie was explaining multiclass and binary classification with the example of the hotdog, I immediately remembered Jian Yang's app from Silicon Valley. I really liked that you put in a small clip of it.
Haha classic!!
Thanks so much Kylie, good coding tutorial and excellent, sharp run through ML theory!
Thanks again.
Colab is new to me.
With it, I don't need to deal with Python environment issues any more!!
An amazingly good tool.
The Silicon Valley insertion was really cool.
I like the last tutorial. I got 85% accuracy with logistic regression, so I wonder whether model selection is more important than just using neural networks.
Really great video, great explanation of concepts in very easy/ layman terms. Well done!
Thank you once again Kylie!
If you have an error with the input shape when you evaluate the data, just do this instead of what she did:
hub_layer = hub.KerasLayer(embedding, input_shape=[], dtype=tf.string, trainable=True)
Thank you so much, this video really made ML easier for me to understand than ever since I started learning this topic.
Great course.
I never worked on machine learning, but I can easily follow and understand what is going on. Thanks for the crystal clear and great explanation. @KylieYYing.
Excellent tutorial. I have two questions: 1. Can I use open-source large language models in your text classification code for analyzing a wine review dataset? 2. If yes, please suggest where and how I should change the code.
Not hotdog :D, this part is still going round in my mind, and it's the funny part that helped me grasp what binary classification is.
thanks amazing teacher
Thanks Kylie!!! Awesome content.
Thanks a lot for this awesome video. It helped me a lot in my college project
Thanks Kylie for explaining the concepts in the different neural network architectures very clearly. The code part was also very interesting, since I got to know for the first time about the imbalanced-learn library and about the Dropout layer for dealing with overfitting! Besides, I guess we ran model.evaluate before training the model to show that the base case of randomly choosing between two labels yields an accuracy of 0.5 (the probability of random selection between two classes)?
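That's my reading too. A quick numpy-only sanity check (not the course code; the label count of 1000 is arbitrary) of why a random guesser on balanced binary labels sits near 0.5 accuracy:

```python
import numpy as np

rng = np.random.default_rng(0)

# 1000 balanced binary labels, and a "model" that guesses uniformly at random.
y_true = rng.integers(0, 2, size=1000)
y_pred = rng.integers(0, 2, size=1000)

accuracy = (y_true == y_pred).mean()
print(round(accuracy, 3))  # lands close to 0.5 on any seed
```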
it's learningggggg !!!! TENSORFLOW! 🔥🔥💕💕
I saw the thumbnail that was Kylie, so I gave it a Like already.
At 1:12:25, feature scaling should be done after splitting into training & testing data in order to avoid information leakage.
I want to be as smart as "Kylie Ying" when I grow up. LMAO! 🤣🤣🤣
Same. :)
You are a great teacher
Code squad. Love it. 😊
Thank you so much Kylie!
Very good video, I'll start practicing with this. Watched till 13:00.
Thank you so much for your brilliant tutorials and courses, Kylie (please do more!!!)! Could you please recommend some books on the mathematics of machine learning (and books that you found useful when you dived into the subject)?
I enjoyed your tutorial. Keep it up, girl, you ROCK 💪
Thank you for making this! Please make it a series if you can
Superb teaching!!!
Guys this is pure diamond 💎💎💎
You are great sister. You have helped me a lot with this tutorial. 😍
Your analogies are awesome, very easy to understand, thanks.
This is interesting to watch. Thank you!
Nice video, you really sparked my interest in ML and I'm looking forward to future content! Keep it going!
You teach really well, I am seriously impressed, I mean it.
A great one, I love your mode of teaching, simple
Sharing your knowledge is invaluable. Thank you 1000 times.
Well explained. Thanks
OMG Kylie is here wow new machine learning course 😍
Thank you very much for your tutorial!
Great, amazing and charming work, thank you.
Hi, I am very excited for your new amazing video, thanks, you are a very good teacher.
Thank you for the excellent overview!!!!
Hi Kylie.... Big fan of your work... Quick question: in your NN model, why did you not specify any input shape or number of input nodes?
You are amazing! Thank you very much.
Love that intro 😂 😂
very clearly explained
great job
Oh man, was fasting today and the example at around 20:00 with the hot dog, pizza, and ice cream had me dying😅
Was saved by the Silicon Valley clip😂
Thank you
This tutorial can be called "Neural networks crash course with practice problem". Thank you!
Very informative thank you
Hi, great tutorial, but I think you have a mistake: you are leaking information from train to test. Both scaling and resampling must be done on the training set and then on the test set separately, not on the whole dataset 🙃
Thank you for a well-crafted tutorial. My question is about what you did with the imbalanced dataset: creating artificial or synthetic data and using that as a basis for the ML model seems questionable, to say the least. It feels like we are introducing a lie into the model for the sake of an artificially equal outcome and using that for prediction. I would be grateful if you could elaborate on that, or anybody else for that matter.
Great course!
you are awesome ! Very very clear explanation
Hope to see a next course about machine learning using Python and TensorFlow. And I want to ask: what are the applications of this course in daily life? Thank you.
Great lesson, love to see more of your work.
Love it
Just grateful, thank you.
Amazing thanks :) glad to see a girl on your channel doing a tutorial for NLP !
Nice tutorial btw
Really awesome work!
Great tutorial
Informative tutorial.
I am good, the tutorial was straightforward.
I love these videos, keep making them.
The hotdog / not hotdog had me dying😅
Great video!!
Thanks kylie
This is a really good video. Watching.
We need Javascript TF tutorial as well. Thank you.
Really awesome, miss!
I think you could have used an « else » here :) 0:05
Great video !
thx 4 vid !~
Thanks
Thanks @Kylie for such wonderful tutorials - how original and thorough, I really learned A LOT!
Anyway, I have a quick question: after completing evaluation with test cases, is it possible (like in other ML projects) to pass real-life data and get an answer?
Like, we built the model with 'description' and 'variety'; given a 'description', can we predict the likely 'variety'?
Did anyone else follow along and encounter an error while creating the model? It says, "Only instances of 'keras.Layer' can be added to Sequential model...
Thank you
Yes, same error message, and I was just trying to follow and run the code this week. Is it due to the latest version of Keras? What's the solution? Any updates from Kylie? Thanks.
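I can't be certain without the full traceback, but this error matches a known incompatibility: recent TensorFlow releases ship Keras 3, which no longer accepts `hub.KerasLayer` inside a `Sequential` model. One documented workaround is switching TensorFlow back to legacy Keras 2 via the `tf-keras` package (assumption: you can `pip install tf-keras` in your Colab/environment first):

```python
import os

# Must be set BEFORE `import tensorflow` runs anywhere in the session;
# with the tf-keras package installed, TensorFlow then falls back to
# Keras 2, which still accepts hub.KerasLayer in Sequential models.
os.environ["TF_USE_LEGACY_KERAS"] = "1"
```

Restart the runtime after installing `tf-keras`, then run this cell before any TensorFlow import.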
1st example: when I tried this the first time I got almost the same accuracy, but when I restarted the kernel of the notebook and ran everything again I got an initial accuracy of 65% instead of 35%, and that accuracy varies between 60 and 70% in the next steps and finally drops to about 60% when evaluated on the test data (on multiple runs the best it got was 66%, but the average is much lower)...
Is the notebook saving the model and updating on re-run causing overfitting or is it normal?
I believe the code randomly creates your training, validation, and test sets so the percentages of accuracy will be different between models (when you restart the notebook) because the data points used for the different sets will be different.
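Right, and if you want runs to be comparable across restarts, you can pin the shuffle with a seed. A minimal sketch (numpy only; the 60/20/20 fractions and the function name are my assumptions, not the course's exact code):

```python
import numpy as np

def split_indices(n, seed=42, train_frac=0.6, val_frac=0.2):
    """Shuffle row indices; a fixed seed reproduces the same split every run."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n)
    n_train = int(train_frac * n)
    n_val = int(val_frac * n)
    return np.split(idx, [n_train, n_train + n_val])

train_a, val_a, test_a = split_indices(100)
train_b, val_b, test_b = split_indices(100)
# Same seed -> identical train/val/test splits across notebook restarts.
```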
Thanks a million
thank youuuuuuuuuuuu
Can we have a tutorial on custom plugin development in Java using Eclipse, from scratch?
Thanks in advance.
Great work, thanks, it's so simplified. Just WOW.
1:36:40 Is it wise to set trainable=True in the embedding layer imported from the hub? Isn't the whole point that it is pre-trained?
Hey @Kylie Ying, in the diabetes model you have 16 neurons in the first layer; would it be a better option to use 8, i.e. the length of the feature vector? Thanks.
Thank you. and Thank you.
I was expecting something like: tf.keras.layers.Input(shape=(8,))
YEEAHHH KYLIE YING LADS AND GENTS!!