Lesson 8 - Practical Deep Learning for Coders 2022
- Published: 10 Jul 2024
00:00 - Neural net from scratch
04:46 - Parameters in PyTorch
07:42 - Embedding from scratch
12:21 - Embedding interpretation
18:06 - Collab filtering in fastai
22:11 - Embedding distance
24:22 - Collab filtering with DL
30:25 - Embeddings for NLP
34:56 - Embeddings for tabular
44:33 - Convolutions
57:07 - Optimizing convolutions
58:00 - Pooling
1:05:12 - Convolutions as matrix products
1:08:21 - Dropout
1:14:27 - Activation functions
1:20:41 - Jeremy AMA
1:20:57 - How do you stay motivated?
1:23:38 - Skew towards big expensive models
1:26:25 - How do you homeschool children?
1:28:26 - Walk-through as a separate course
1:29:59 - How do you turn a model into a business?
1:32:46 - Jeremy's productivity hacks
1:36:03 - Final words
Transcript thanks to fmussari and bencoman from forums.fast.ai
Timestamps based on notes by Daniel from forums.fast.ai
I can safely say that Jeremy, and by extension, Fast AI have helped me power through some of the most difficult times in my life. The end result was my complete pivot towards a new field and I have never been happier or more driven. Thank you doesn't even cut it.
Amazing to hear! :D
I finished part 1 and part 2 is already up. What a great world we live in, ladies and gentlemen.
Thank you, Jeremy and Fast AI team! I'm very grateful for being able to go through this course.
What a course! I've never come across such a hands-on AI course before. I'm not technically a "coder", but the notebooks linked in the lectures were just perfect to experiment with and learn, bit by bit. Thank you, Jeremy. Looking forward to joining Part 2 of this course live!!
You're a massive inspiration and role model Jeremy - thanks for the excellent course
Great 😍 ... a pleasure to learn from a Deep Learning O'Sensei.
To quote Master Egami: "Not every sensei is a master and not every master is a sensei",
but you definitely are!! Many thanks for sharing your knowledge and experience and opening the path to others.
Thank you for putting this together! Looking forward to the next part.
I am so grateful for this course. This is the first time I've truly learned in years.
Thank you FastAi team for an excellent course! It gave me belief that these things are possible to learn, even at an older age and without going through huge amounts of math.
Thanks a lot, Jeremy for your efforts and work :)
Thank you for so much, Jeremy! Your work is doing a lot to help all of us who come behind you. Thanks for sharing your knowledge, your experience and your passion with us; it's priceless.
Thank you! I really enjoyed this course. Lots of hands-on practice with clear and succinct explanations. Eagerly waiting for part 2 now...
Great to hear!
Just finished the course!! thank you so much :)
Thanks @jeremy. Loved the course.
Thank you very much for this course! Not just the content, but the way it is presented helped me massively to understand the field better! I'm looking forward to part 2. Also thanks for your character insights (i.e., do things differently, persistence) that you presented at the end.
You're very welcome!
Thanks Jeremy for this amazing content. You're an incredible pedagogue.
Glad you think so!
Thank you so much!
I have tried paid courses before and I always got stuck at the math mumbo jumbo. This course is orders of magnitude better than anything else I have tried. I would gladly pay for this one, as I gladly paid for your book!
This course is the farthest I have ever gotten with fastai. Still a lot to do, but really hoping you'd do Part 2 :)
Thank you Jeremy. This is by far the best content on the Internet.
Thanks, Jeremy, this was an amazing course! Very helpful. I am at the start of my career as a Data Scientist, will share on forums my achievements!
Glad it was helpful!
AMA session timestamps:
1:20:57 - How do you stay motivated?
1:23:38 - Skew towards big expensive models and huge amounts of data
1:26:25 - How do you homeschool children in science and math?
1:28:26 - Walk-through as a separate course and coding sessions
1:29:59 - How do you turn a model into a business?
1:32:46 - Jeremy's productivity hacks
1:36:03 - Final words
Thank you Jeremy and the team behind this. Very grateful that you give this information for free.
Thanks a lot!
Okay, I'll watch LA Confidential now. Will watch the lecture later.
Thank you Jeremy, I completed the last version of Part 1 too and this is a definite improvement. This has been both inspiring and useful. What more can you ask for from a course?
Are there any plans to deep dive into time-series data?
I'm no expert, but time series is basically just tabular, especially if you are splitting a continuous time series into sequences. I have just done a project to that effect and tabular_learner worked great. Additionally, all the work on the Titanic data would involve very similar steps, just without any cat variables and with all cont variables.
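To make the "time series as tabular" idea concrete, here is a toy sketch (not from the lecture, just an illustration): each window of past values becomes a row of continuous features, and the next value becomes the target, which is exactly the shape a tabular model such as tabular_learner expects.

```python
import numpy as np

def make_lag_features(series, n_lags):
    """Split a continuous series into overlapping windows: each row
    holds n_lags past values (continuous features for a tabular model)
    and the target is the value that follows the window."""
    X = np.stack([series[i:i + n_lags] for i in range(len(series) - n_lags)])
    y = series[n_lags:]
    return X, y

series = np.arange(10, dtype=float)       # toy series 0..9
X, y = make_lag_features(series, n_lags=3)
print(X.shape, y.shape)                   # (7, 3) (7,)
print(X[0], y[0])                         # [0. 1. 2.] 3.0
```

From here, X and y could go into a DataFrame and be fed to any tabular learner with all-continuous columns.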
1:00:00 And can these simple convolution steps handle the thing we want to detect being rotated or scaled in any way in the picture?
Is there going to be an update for the Computational Linear Algebra course?
Where can I find and download the conv Excel file?
29:45 I can agree that anime people watch wayy too much anime!
Could it be better for educational purposes if libraries were not imported with "import *"? For example, just looking at cont_cat_split, it's not clear to me whether it is a self-made function or a fastai function. fastai.cont_cat_split() would be clear, making it easier to build an understanding of what kinds of services the different libraries offer. This is a small detail. All in all, another great video. Thanks!
Type "cont_cat_split" in any cell and hit shift-enter, and it'll tell you where it's from. Jupyter best practices are different to regular editor practices, since you're interacting with the interpreter directly. So there's no need to scroll up to the top of the file, find the symbol in the imports, see what it says, and scroll back to where you were -- you can always see directly exactly what's in every symbol, and where it's from!
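Outside a notebook, Python's standard inspect module answers the same "where does this symbol come from?" question. A minimal sketch, using json.dumps as a stand-in for any name pulled in by a star import (cont_cat_split itself lives in fastai and isn't assumed here):

```python
import inspect
import json

# Where is this symbol defined?
print(json.dumps.__module__)              # json
print(inspect.getsourcefile(json.dumps))  # path to json/__init__.py
print(inspect.signature(json.dumps))      # its call signature
```

In Jupyter, `cont_cat_split?` (one trailing question mark) shows the same information inline, and `??` shows the full source.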
I confirm about the anime, I've watched enough anime for the time to be equivalent to 5 years without doing anything else.
The concept of dropout seems very counter-intuitive: we improved learning outcome by removing information from the system! I would imagine there is some trade-off where it speeds up learning by sacrificing the highest attainable accuracy.
Interestingly enough (as I learned recently) one can keep dropout enabled during inference to model uncertainty of predictions. I've never thought about it in this way before.
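A toy sketch of both ideas above (not from the lecture; a hypothetical NumPy stand-in for what frameworks implement): "inverted" dropout zeroes activations during training but rescales the survivors, so on average no information is lost, and repeating stochastic passes at inference time (Monte Carlo dropout) gives a spread of outputs that can serve as an uncertainty estimate.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(x, p=0.5, training=True):
    """Inverted dropout: during training, zero each activation with
    probability p and scale the survivors by 1/(1-p), so the expected
    activation matches inference (where dropout is a no-op)."""
    if not training:
        return x
    mask = rng.random(x.shape) >= p
    return x * mask / (1 - p)

x = np.ones(100_000)
out = dropout(x, p=0.5)
print(out.mean())  # close to 1.0: rescaling preserves the expectation

# Monte Carlo dropout: keep dropout active at "inference" and run
# the same input several times; the outputs differ, and their spread
# is a crude measure of the model's uncertainty.
samples = [dropout(np.ones(10), p=0.5).sum() for _ in range(5)]
print(samples)  # varies from pass to pass
```

So dropout removes information from any single pass, but not (in expectation) from the training signal as a whole, which is why it regularizes rather than simply handicapping the network.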
I have to disagree. Robocop 3 was bad, but amongst the worst 5?
29:34 NOOOOOOOOOO................... A lot of people watch anime! I am one of them, LoL.