Deep Learning State of the Art (2019) - MIT

  • Published: 24 Jan 2025

Comments • 75

  • @lexfridman
    @lexfridman  6 years ago +96

    I'm excited about recent developments in NLP, deep RL, speeding up training/inference, big GANs, powerful DL frameworks, and real-world application of DL in driving 370k+ Tesla HW2 cars! What else would you like to see covered?

    • @firstprinciples9854
      @firstprinciples9854 6 years ago +8

      Healthcare

    • @ryersonHAB
      @ryersonHAB 6 years ago +2

      Hey Lex! Could you do a lecture that reviews state of the art in privacy-preserving methods?
      Would be cool to get your perspective on potential applications of federated learning, homomorphic encryption, and blockchain technologies towards models that enable secure, value-oriented distributed learning over decentralized data.
      Would be awesome to have Andrew Trask as a guest.
      www.openmined.org
      Eg. github.com/OpenMined/PySyft/tree/master/examples/tutorials
      ruclips.net/channel/UCsONJ5EfiYYfShT4VL7Bt9g

    • @nceevij
      @nceevij 6 years ago +5

      Lex Fridman Healthcare

    • @kiegh
      @kiegh 6 years ago +1

      Any in-depth walkthroughs of how to transfer the learning in these breakthroughs to apply to our own datasets? Maybe this isn't the right thread to ask.

    • @hummerspike
      @hummerspike 6 years ago

      Yes! There are several programs at each step of the operating chain, for databases, preloaded libraries, modeling, etc., that accomplish the same operation but with different strengths... Maybe what we could use is a walkthrough of the pieces of a typical operating chain. Python vs. Java is one piece for the operating system; a Python NLP library vs. the Stanford NLP library is one for preloaded algorithm sets in language processing; etc.
      I'm looking at this as an executive who knows machine learning is as decentralized an opportunity as the dot-com boom, because someone will make a tool with the technique. Learning about spaCy and gensim was informative, but these are data-science problems to be solved by researchers; I only expect to incorporate these tools into my Frankenstein monster. I could still use a treasure map. I can't afford to start a data department just yet, but I'd bet big my aim is attainable now or within two years.
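The federated-learning idea raised in this thread, training where data never leaves the client and only model updates are shared, can be sketched in a few lines. This is a minimal pure-Python illustration of federated averaging (FedAvg); the toy linear model and the client datasets are invented for the example:

```python
# Minimal federated averaging (FedAvg) sketch: each client trains locally
# on its private data; only the weight values are sent to the server,
# which averages them into the next global model.

def local_update(w, data, lr=0.1):
    # One local pass of SGD on a toy linear model y = w*x (squared loss).
    for x, y in data:
        grad = 2 * (w * x - y) * x
        w -= lr * grad
    return w

def fed_avg(client_weights):
    # Server step: simple average of the clients' models.
    return sum(client_weights) / len(client_weights)

# Two clients whose private data never leaves them (both roughly y = 2x).
clients = [[(1.0, 2.0), (2.0, 4.0)], [(1.0, 2.2), (3.0, 6.0)]]
w = 0.0
for _ in range(20):  # communication rounds
    w = fed_avg([local_update(w, data) for data in clients])
# w ends up near 2.0, the slope shared by both clients' data
```

Real systems (e.g. the PySyft tutorials linked above) add encryption and secure aggregation on top of this basic loop; the averaging idea stays the same.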

  • @derasor
    @derasor 5 years ago +15

    It could be very insightful to go back to this talk in a couple of years' time and see how these ideas developed.

  • @nebimertaydin3187
    @nebimertaydin3187 6 years ago +4

    This is the most valuable thing that I saw in 2019.

  • @wtaylorjr2001
    @wtaylorjr2001 6 years ago +12

    This is fantastic. I am trying to create a storytelling system using LSTMs and a corpus of self-written works. Thank you for this, Mr. Fridman.
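A storytelling system like the one described above would stack many LSTM cells; the single-cell step at their core can be sketched in pure Python. The scalar sizes and hand-picked weights here are toy values chosen for illustration, not a trained model:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def lstm_step(x, h_prev, c_prev, W):
    # One LSTM cell step for scalar input/state, showing the four gates.
    # W maps gate name -> (w_x, w_h, bias); all weights are toy values.
    f = sigmoid(W["f"][0] * x + W["f"][1] * h_prev + W["f"][2])   # forget gate
    i = sigmoid(W["i"][0] * x + W["i"][1] * h_prev + W["i"][2])   # input gate
    g = math.tanh(W["g"][0] * x + W["g"][1] * h_prev + W["g"][2]) # candidate
    o = sigmoid(W["o"][0] * x + W["o"][1] * h_prev + W["o"][2])   # output gate
    c = f * c_prev + i * g  # new cell state: keep some memory, write some new
    h = o * math.tanh(c)    # new hidden state, exposed to the next layer
    return h, c

W = {k: (0.5, 0.5, 0.0) for k in "figo"}
h, c = 0.0, 0.0
for x in [1.0, -1.0, 0.5]:  # a tiny input sequence
    h, c = lstm_step(x, h, c, W)
```

A real character-level model would use vector-valued states and learned weights in a framework such as PyTorch or TensorFlow, but the gate structure is exactly this.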

  • @SaidakbarP
    @SaidakbarP 6 years ago +2

    Lex's summary is so true:
    Stochastic gradient descent and backpropagation are still the backbone of current state-of-the-art AI techniques. Therefore, we need new innovations to see leaps in the field.
    Thank you Lex!
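The point in the comment above, that SGD is still the workhorse, can be illustrated with a minimal training loop: fitting a toy 1-D linear model is nothing more than repeated single-sample gradient steps. The data and model here are made up for the example:

```python
import random

# Fit y = w*x to noise-free toy data with plain stochastic gradient descent.
data = [(x, 3.0 * x) for x in range(1, 6)]  # true slope is 3.0
w, lr = 0.0, 0.01

random.seed(0)
for step in range(1000):
    x, y = random.choice(data)   # "stochastic": one sample at a time
    grad = 2 * (w * x - y) * x   # d/dw of the squared error (w*x - y)^2
    w -= lr * grad               # gradient step
# w ends up very close to the true slope 3.0
```

Backpropagation is just the chain-rule bookkeeping that produces `grad` for deep networks; the update rule itself has barely changed.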

  • @pratikkala7243
    @pratikkala7243 5 years ago +4

    This is the most valuable thing that I saw in 2019.
    This guy has the same voice as the actor who plays Jesse Pinkman on Breaking Bad.

    • @kushxmen
      @kushxmen 5 years ago

      And he looks like him in a way

  • @JousefM
    @JousefM 6 years ago +6

    Thanks a ton Lex! You're one of the guys who brought me from Mech. Engineering to AI :)

    • @amritaanshnarain7524
      @amritaanshnarain7524 1 year ago

      and of course the money

    • @JousefM
      @JousefM 1 year ago +1

      @@amritaanshnarain7524 I don't get it. Is it bad to earn money?

  • @josephakindiraneverthings2988
    @josephakindiraneverthings2988 8 months ago

    Awesome, thanks!
    The intro is invaluable and very helpful.

  • @alaaalatif4841
    @alaaalatif4841 6 years ago +2

    Great review of recent advances in deep learning! Would be great to see a similar review of current (immediate) challenges, e.g. limited numerical extrapolation abilities, multi-task learning, etc.

  • @JeDi___
    @JeDi___ 6 years ago +2

    Thanks for the lecture. Also, well edited (erasing pauses). Funny to see you transform to Men in Blue compared to when you started the lecture two years ago. Looking good

  • @vobjects
    @vobjects 6 years ago +15

    So funny how Lex refers to the fast.ai folks as renegade researchers :-D

    • @BrianMPrime
      @BrianMPrime 6 years ago +1

      You can sort of get that vibe from the lectures. One cool moment was when Jeremy demonstrated the 3-minute DawnBench result in the middle of a lecture.

    • @vobjects
      @vobjects 6 years ago +1

      @manideep lanka 27:30

    • @kawingchan
      @kawingchan 6 years ago

      They also mentioned academics “don’t give a shit” about structured data in deep learning, and talk like they really are renegade outsiders. Btw, “don’t give a shit” was the literal phrase used. 😀

  • @tobedecided8886
    @tobedecided8886 9 months ago

    40:40 Is it just me, or is Lex slowly getting more excited as he talks about OpenAI & DOTA 2?
    Anyway, great work and thank you

  • @alienkishorekumar
    @alienkishorekumar 6 years ago +23

    Didn't understand much, but I'm excited. Moving forward.

  • @eliasxu9342
    @eliasxu9342 6 years ago

    Very happy to see deep learning flourishing. I hope I can unlock the full potential of DL in the field of computational advertising.

  • @katanaking90210
    @katanaking90210 6 years ago +1

    I was absolutely shocked by how brilliant these approaches to deep learning were. I'm absolutely excited to see what we can come up with next.

  • @megadebtification
    @megadebtification 5 years ago

    Thank you Lex!! Always great to learn from you.

  • @ung-k4c
    @ung-k4c 6 years ago +5

    Amazing content Lex, always very informative to help filter the firehose of research papers.

  • @ohyoyshozebulo
    @ohyoyshozebulo 6 years ago +4

    It would be great if you added the links from the presentation to the description!

    • @s.f.3605
      @s.f.3605 5 years ago +1

      If you go to deeplearning.mit.edu/ and click on the slides links (e.g. www.dropbox.com/s/v3rq3895r05xick/deep_learning_state_of_the_art.pdf?dl=0), you get the slides and can easily click on the links in them ^_^

  • @dmeister2909
    @dmeister2909 6 years ago +5

    Thanks Lex!

  • @josy26
    @josy26 5 years ago

    Awesome Lex, are you planning on doing the same for 2020?

  • @igor156
    @igor156 5 years ago

    Awesome video. Are there good links to videos on other developments not mentioned, e.g. in healthcare and agriculture?

  • @lonwulf0
    @lonwulf0 6 years ago +1

    Super talk, keep up the good job.

  • @johnniefujita
    @johnniefujita 6 years ago

    AdaNet is very interesting! That is very good material on progressive learning for data-scarce situations! Also multiple classification in rounds... and one on confusion for fine-grained classification without augmentation!

  • @movax20h
    @movax20h 6 years ago

    So, is AutoAugment about augmenting with the worst possible inputs? Like you try augmentation ops that are hardest to recognize correctly, thus forcing the network to learn them better?
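For reference on the question above: AutoAugment searches for augmentation policies that maximize validation accuracy, rather than worst-case difficulty, and a learned policy is just a list of (operation, probability, magnitude) triples. Applying such a policy can be sketched like this (the "image" is a toy list of numbers and the operations are invented for illustration):

```python
import random

# Toy "image": a list of pixel values. Each policy entry is
# (operation, probability of applying it, magnitude).
def brighten(img, m):
    return [p + m for p in img]

def contrast(img, m):
    mean = sum(img) / len(img)
    return [mean + (p - mean) * m for p in img]

# A policy found by search would look like this list of triples.
policy = [(brighten, 1.0, 2.0), (contrast, 1.0, 0.5)]

def apply_policy(img, policy, rng):
    # Apply each op with its learned probability and magnitude.
    for op, prob, magnitude in policy:
        if rng.random() < prob:
            img = op(img, magnitude)
    return img

rng = random.Random(0)
augmented = apply_policy([0.0, 4.0], policy, rng)
```

With both probabilities set to 1.0 the sketch is deterministic: the toy image `[0.0, 4.0]` becomes `[3.0, 5.0]`.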

  • @swapnilmeshram9991
    @swapnilmeshram9991 6 years ago

    thanks lex

  • @raphaelrehman
    @raphaelrehman 6 years ago +1

    Is there a way to solve the need for so much data?

    • @StraightCrossing
      @StraightCrossing 6 years ago

      Not really, if you're starting from scratch. Maybe someday someone will release a pretrained general-purpose model. This sort of idea seems to be something the DeepMind team is chasing down. They're very interested in general intelligence.
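The pretrained-model idea in the reply above amounts to transfer learning: freeze the learned feature extractor and train only a small head on your own (smaller) dataset. A pure-Python sketch under toy assumptions; the "pretrained" features and the dataset are invented for the example:

```python
# Pretend this fixed feature extractor came from pretraining on a huge
# dataset; we freeze it and train only a small linear "head" on our data.
def pretrained_features(x):
    return [x, x * x]  # frozen: never updated below

def head(feats, w):
    return sum(f * wi for f, wi in zip(feats, w))  # trainable linear layer

data = [(1.0, 3.0), (2.0, 8.0), (3.0, 15.0)]  # targets follow y = 2x + x^2
w, lr = [0.0, 0.0], 0.005

for _ in range(3000):
    for x, y in data:
        feats = pretrained_features(x)       # frozen forward pass
        err = head(feats, w) - y
        # gradient step on the head weights only
        w = [wi - lr * 2 * err * f for wi, f in zip(w, feats)]
```

The head converges to roughly `w = [2.0, 1.0]`, matching the targets, while the frozen extractor is never touched; that is why far less data is needed than training from scratch.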

  • @louiswang538
    @louiswang538 6 years ago +1

    awesome !!!

  • @acedemiczoomer-je1de
    @acedemiczoomer-je1de 1 year ago

    lex is my favourite mossad agent 😂

  • @gto433
    @gto433 6 years ago

    Does anyone know what technology amazon textract uses?

  • @muhammadadeel1150
    @muhammadadeel1150 6 years ago

    Would you please make a detailed video lecture on computer vision and algorithms for real-time tracking applications?

  • @muhammadusman7217
    @muhammadusman7217 6 years ago

    🤓👍🏻 thanks

  • @Sergio-pq3ri
    @Sergio-pq3ri 5 years ago +2

    This guy has the same voice as the actor who plays Jesse Pinkman on Breaking Bad.

  • @severinbratus7584
    @severinbratus7584 6 years ago

    Magnificent!

  • @SuperMaDBrothers
    @SuperMaDBrothers 2 years ago

    This is a survey class.

  • @이인섭-r2x
    @이인섭-r2x 6 years ago

    31:53 Baam!!!!!!

  • @vallikenhaven5843
    @vallikenhaven5843 6 years ago

    This is attempting computational expression! It inhibits natural inherent algorithmic processing. Human is an emotional cognitive being. Whether it be locution or calculation, induction/deduction is based on emotional cognitive aptitude, therewith conflict with the attempt of mimicking sterile computation and natural anatomization. I have watched from afar and Chomsky is aware who I am... ( ^V^ ) If human continues to use technology without the comprehension that tech creates an ersatz structure, then humans will fall deeper and deeper into a form of psychosis. Love to ALL

  • @kparag01
    @kparag01 6 years ago

    No nonsense AI from Lex

  • @Mrbeastifed
    @Mrbeastifed 6 years ago

    If this lecture is over my head and I need a little more knowledge of the fundamental concepts, where should I look for that?

    • @charliethompson3366
      @charliethompson3366 6 years ago

      I really like this series; it helped me understand: ruclips.net/video/aircAruvnKk/видео.html

  • @kuqiu5003
    @kuqiu5003 5 years ago

    I only completed watching this video because I think he is charming.

  • @sui-chan.wa.kyou.mo.chiisai
    @sui-chan.wa.kyou.mo.chiisai 6 years ago

    I think he speaks faster than before

  • @shyamalchandra
    @shyamalchandra 6 years ago +1

    I see a hand-waving review of acronyms and no step-by-step tracing of algorithms. No substance. NeXT!

    • @CharlesVanNoland
      @CharlesVanNoland 5 years ago +1

      Considering the material that needed to be covered and the time available, especially for something meant to be an overview rather than a how-to, I'm not sure why you'd expect it to get into the nitty-gritty that is already covered elsewhere.

  • @muhammadusman7217
    @muhammadusman7217 6 years ago +1

    First
