Compiling how I fixed this.
1) In the bottom right hand corner open up the notebook. Change the environment from "pin to the version" to "Always use latest environment".
2) ddg_images has been deprecated -
from duckduckgo_search import DDGS

def search_images(keywords, max_images=30):
    print(f"Searching for {keywords}")
    return L(DDGS().images(keywords, max_results=max_images)).itemgot('image')
Use this function instead. Like for visibility.
Thanks for this. I was stuck for a while.
this worked!! thank you so much ❤
I've run into the same problem, but it doesn't seem that API/package is working anymore either.
Please, why is it showing "name 'L' is not defined"? I wrote exactly your procedure.
@@mustyyunus9150 Here's the full working snippet:
from duckduckgo_search import DDGS
from fastcore.all import *

def search_images(keywords, max_images=30):
    print(f"Searching for {keywords}")
    return L(DDGS().images(keywords, max_results=max_images)).itemgot('image')
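For anyone wondering what the snippet actually returns: DDGS().images yields a list of dicts, and fastcore's L(...).itemgot('image') just plucks the 'image' URL out of each one. A plain-Python sketch of that last step, using made-up sample results (the keys match what duckduckgo_search returns, but the URLs here are invented):

```python
# Hypothetical sample of what DDGS().images() returns: a list of dicts,
# each with a "title", an "image" URL, and other metadata.
results = [
    {"title": "A bird", "image": "https://example.com/bird1.jpg"},
    {"title": "Another bird", "image": "https://example.com/bird2.jpg"},
]

# L(results).itemgot('image') is roughly equivalent to this comprehension:
urls = [r["image"] for r in results]
print(urls)  # → ['https://example.com/bird1.jpg', 'https://example.com/bird2.jpg']
```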
Sir, I remember back in the day I wanted a bachelor's in data science and started reading your books. Now I have been admitted to a graduate program. Thank you, you are doing a lot for this field.
Congrats man.
ChatGPT has recommended this channel.
yess me too!
haha like me man
Sir, you taught me 20 years ago when I was studying at QUT. Great to see you are still teaching - you have a great talent for it! All the best, greetings from Poland!
Insane this knowledge is out there for free. Thank you so much Jeremy, and everyone that made this possible!
As a middle-aged hardware engineer, I went to an ML workshop at work which started with, "So is everyone familiar with matrices?" All the graduates nodded. So fastai is a tool that looks hugely beneficial. A cloud-based Jupyter notebook is a big no-no for industry security though, so I'm running in PyCharm, which isn't so straightforward but works so far. Many thanks for this development.
you can run a jupyter notebook completely locally with no cloud connection, and it's actually pretty easy
Hello there... have you been able to get the first lesson set up in PyCharm? If so, could you please assist me with getting it set up? I am having issues with the download_images() section (in the for loop)... I keep getting the following error no matter what I try: "OSError: [Errno 22] Invalid argument:"
@@boblagoon7489 It is very easy to set up Jupyter Notebook in VS Code.
Hats off to you Jeremy, and everyone at Fastai. Over the years your course has improved and improved, and today it is truly a well oiled machine. Keep it free, keep it democratic.
You are the best. Thank you for this course! Hope you update your book in the future so that we all can keep up with the latest topics in this field.
You, sir, are a saint. My ADHD rarely lets me truly focus on a video lecture, but you had me dialed in. Thank you. I am looking forward to the rest of the course videos.
Every time you say "not a bird" I think "not a hotdog" lol. Love the content.
why?
@@vyrsh0 Silicon Valley (tv show) reference
🤣
This is pure gold. Thanks Jeremy for putting so much effort into giving a comprehensive education to the world on such an important topic.
Bro, I want to learn DL and just completed ML, so is this a good resource?
From the bottom of my ad-agency just-get-it-out-the-door developer's heart, thank you sir, for your pragmatism and amazing instructional style. This is the course I needed to connect my world to AI. You're changing lives, my friend!
Let's goooo!!
You are a god for doing this for free jeremy. Thank you so much.
Happy 1st Birthday!
Amazing as always. It's the third time I've done the course, and I learn new stuff every time! Thanks a lot for this invaluable resource!
Hey, I saw this course on freeCodeCamp, another one on the website, and now this, and they're all different. Can I start here?
@@bunchiochi starting here is probably fine if it’s like other years
@@zanedurante3709 do i need to know ml before doing this course ?
Oh I am super happy that you are doing this, I loved the course 2 years ago and I have benefitted hugely. I am helping to educate others and will definitely be enjoying this course with you.
Your teaching style is absolutely amazing! I love this 🙂 :-)
You are too generous to put such great content in YT for free!
No teacher has gotten me this interested in a course after only the first lesson. It was packed with so much information, but presented in such a good way that it felt like I was reading a children's book.
Freedom for deep learning: Unlocked. Thank you sir.
OMG, I've been so excited for this!
Thank you so much for all of the brilliant work you do Jeremy!
Absolutely brilliant playlist!
Thank you so much for all your hard work in putting together this new version of the course! I'm really excited to see what's in store!! Thank you again for all your hard work, it's truly appreciated!
I did the v2 course, now using these to teach my students. Thank you so much..
Just finished the first homework. Thank you!
Hi Jeremy! Thanks for the great book. I like your approach of seeing the big picture.
I enjoyed the book, the course, and I'm sure I'll enjoy this too.
Love it! First time I've ever understood what ML is, at a surface level of course. Thank you!
Excellent explanations and pedagogy. Many thanks for it!
Great introduction, easy to understand. Looking forward to completing the series.
best playlist for absolute beginners!
Again, thank you for making these available !
I'm quite excited about this course - thanks Jeremy for doing this!
Weekend plan sorted, binging all the videos.
You're a star, Jeremy. Many thanks!
Great videos and notebooks!
Just a quick remark on 43:30: I use "map" quite often myself, but the combination with "unlink" seems a bit weird, since it does not return anything; it causes a side effect instead.
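For context, the notebook's `failed.map(Path.unlink)` deletes each broken image purely for the side effect; the collection of None return values is simply discarded. A minimal standard-library sketch of the same idea (throwaway temp files standing in for the failed downloads):

```python
from pathlib import Path
import tempfile

# Create a couple of throwaway files standing in for "failed" downloads
tmp = Path(tempfile.mkdtemp())
failed = [tmp / "bad1.jpg", tmp / "bad2.jpg"]
for p in failed:
    p.touch()

# fastai's failed.map(Path.unlink) runs unlink on each path for its side
# effect (deleting the file); an explicit loop spells out the same thing:
for p in failed:
    p.unlink()

print(all(not p.exists() for p in failed))  # → True
```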
Thanks Jeremy and the entire team for this great course.
So excited for the new course.
Thanks Jeremy!
This is amazing, sir. I have benefited a lot from it.
The wait is over, superb!
Thanks for uploading, looks incredible.
I'm very excited for this!
So excited !! Thank you Jeremy, for all the good work !
Thank you very much. Always enjoying your classes ...
Great lecture! Thanks!
Amazing! I will follow this series
Thanks for the course, learned a lot.
Not even a year later and DALL-E 2 is now the butt of a joke when compared to Midjourney. You're going to have to update these videos quarterly!
Give this man a trophy
Thank you so much for sharing these tutorials!
Excellent
Just for folks that are stuck on where to turn on internet access: it's top right, under "Notebook → Notebook Options".
What about Deep Learning from the Foundations 2022?
It would be interesting to delve into the innards of the frameworks.
Thanks a lot!
You are best!
part II topic for sure.
Amazing video, thank you so much for this lesson! 😄🙏💯
Jeremy please don't stop the course mid-way like the 2020 deep learning for coders. I am trying to learn the second part of the book but it's much more difficult without your guidance. Please finish the whole book.
Thanks so much. I love fastai
Around 53:15 you explained that the fine_tune method teaches the model the difference between datasets. From the docs I understand that the argument you pass to the method is the number of epochs. What is an epoch? Is it an attempt?
An epoch is one complete pass through the whole training dataset, not a single step; each epoch typically involves many gradient descent update steps (one per batch).
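To make the epoch-versus-step distinction concrete, a small sketch with made-up numbers (the dataset size and batch size here are illustrative, not from the lesson):

```python
import math

# With 1000 training examples and a batch size of 64, one epoch is one full
# pass over the data, i.e. ceil(1000 / 64) gradient update steps.
n_examples, batch_size, n_epochs = 1000, 64, 3

steps_per_epoch = math.ceil(n_examples / batch_size)
total_steps = steps_per_epoch * n_epochs
print(steps_per_epoch, total_steps)  # → 16 48
```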
Great course! Thank you.
Great lecture, thank you
you can use slido for participantes
This is brilliant!
I noticed that in my case path.mkdir results in an error saying that such a directory doesn't exist. I looked up the API for mkdir in the utility-functions section of the fastai library, and it seems the API has changed since the recording of this video? Anyway, instead of path.mkdir I used mkdir(path) and everything works like a charm.
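One thing worth checking before concluding the API changed: Python's own pathlib raises FileNotFoundError from mkdir when a parent directory is missing, unless you pass parents=True. A standard-library-only sketch (the directory names are made up):

```python
from pathlib import Path
import tempfile

base = Path(tempfile.mkdtemp())
path = base / "bird_or_not" / "bird"  # nested path whose parent doesn't exist yet

# path.mkdir() alone would raise FileNotFoundError because "bird_or_not" is
# missing; parents=True creates intermediate directories, and exist_ok=True
# makes re-running the cell safe.
path.mkdir(parents=True, exist_ok=True)
print(path.is_dir())  # → True
```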
Thank you so much, sir.
Very minor correction. That XKCD was actually from the end of 2014 (September 24, 2014)
Just incredible
Thank you for sharing good videos!
Thank you Jeremy.
Excelente! Gracias Jeremy
It would be nice if there was an option to change the bright white background to a softer darker color that's easier on the eyes.
One thing that bothered me a little bit: Howard says roughly "we can learn anything if our model can represent the function." But there isn't actually a promise that you can hill climb into a good set of weights just because the model is capable of representing a good function. A lot of the work on improving architectures is about improving learnability, not representability. For example, in theory shallow networks of arbitrary width are universal approximators, but in practice we have gotten better learning performance by making networks deep.
If you get this error on dataloaders - "ValueError: This DataLoader does not contain any batches" - you are testing with only a few images; the batch size is by default bs=64. If you change that value to a lower one, it will work, e.g. dls = db.dataloaders(path, bs=5)
Thanks for sharing!
I was doing the 2020 course; should I switch now to 2022?
I found it interesting that you didn't go into detail on the tabular section of the presentation. I believe that is the only section where we don't have pre-trained models to assist. The example you showed was only able to achieve a 0.6 loss on the training data.
wow. so much insight!
I want to learn all this using PyTorch. I haven't gone through the other lectures; can someone clarify whether fastai is used throughout this course?
We appreciate all you have done for this field, Jeremy. I would be interested to know if you feel neural nets are now in the 'trough of disillusionment' on the Gartner Hype Cycle?
This comment aged well didn't it
In the video Jeremy mentions that for tabular data we normally won't have a foundational or base model to fine-tune, and that's why fastai uses fit_one_cycle. For the use case of creating a recommendation system, wouldn't it make sense to say that my first version of the model becomes my foundational or base model, and as I get new data from users I could fine-tune that model? That would save me costs and effort, same as foundational models do, or am I missing something? I could keep iterating this same way, and then fine-tuning tabular data starting from a pretrained model would make sense. I'm sure I might be missing something, but I can't think of what it is.
Link doesn't work at 51:10
Thank you!
Do you have any other videos with more material? I completed the book, but I feel I need more feedback or something to advance.
starting now. :)
Can anyone please tell me the prerequisites of this course?
"The only prerequisite is that you know how to code (a year of experience is enough), preferably in Python, and that you have at least followed a high school math course." - from the preface of the textbook this course is based on.
none
Path('[filename]') - is this an API of fastai or the Python runtime? Is it saving the file or reading it? If saving, what folder does it save to?
Anyone else wish they could merge these courses into their brains, instead of pushing all the data through their feebly equipped attention spans and comprehension algorithms?
Where can I get the Meta Learning book from?
thank you!
why is this not in trending?
What basic concepts do I have to know to understand this course?
Calculus, linear algebra, and proficiency in Python programming.
Hi Jeremy, will the 2020 version of the course be archived? I just finished lesson 8 a few days ago and I still have the 2020 version on one of my tabs. Will the 2020 videos remain on the channel publicly? I imagine the jupyter notebook contents will no longer match. - Jack
Watching in 2025. Can't believe how far we have come. A few days ago R1 launched, which explains its thinking.
I am trying to replicate this but I am getting issues with the DataBlock.
Also, in this example where path = Path('bird_or_not'), is this folder created or you are supposed to create it manually?
I wonder if anyone can help me with this. I'm following along with the code in the notebook but using PyCharm. I'm just at the beginning but getting an error saying the Image class doesn't contain a definition for to_thumb. I've used all the same imports as the file and installed the required packages.
Anyone else not able to get this running? It fails at the grab 200 images section. I tried for a couple of days and got nowhere, even with the documentation, I gave up and just copied everything verbatim, just to get something running, and even that didn't work.
23:39 I heard that as "a lot of meth" and I had a good laugh
Hi, thanks for the great video. I went through the bird and forest examples; they work great. I have a question regarding weights (at about 1:15:50). Generally, as more inputs are fed into the model, the weights are adjusted so that the loss gets closer to 0. But how can we prove the new weights won't make things worse for the previous inputs? How can we ensure that the model will converge toward loss=0 for all inputs? I still can't understand how the adjusted weights after the 1000th image are still good for the 1st image. Thanks.
Hey Mike! You probably have this answer by now, but the loss function is generally computed across all inputs (or a statistically representative sample of them) before updating the weights, rather than going one-by-one.
@@danielquandt9953 Thanks, that helps explain it. I understood it wrong all this time ;)
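The explanation above can be sketched with a toy single-weight model: because the loss (and hence the gradient) is averaged over a batch of examples, each update balances all of them rather than optimizing the latest image at the expense of earlier ones. Everything here is made up for illustration; it is not fastai code.

```python
# Toy model: predict y = w * x. Data generated with true w = 2, so gradient
# descent on the batch-averaged mean-squared error should drive w toward 2.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]

w = 0.0
lr = 0.02
for step in range(200):
    # Gradient of the MSE loss averaged over the WHOLE batch - one update
    # accounts for every example at once, not just the most recent one.
    grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
    w -= lr * grad

print(round(w, 3))  # → 2.0
```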
How much Python do I need for this course? Is up to OOP okay?
Amazing!
When I try to get the images off DDG I am getting an HTTPError, not JSON. Has anyone else experienced this? And if so, how did you fix it?
I've shared the fix.
after so long!