Encoder-decoder architecture: Overview

  • Published: 16 Sep 2024

Comments • 29

  • @googlecloudtech
    @googlecloudtech  1 year ago +3

    Subscribe to Google Cloud Tech → goo.gle/GoogleCloudTech

  • @DogBlessGeesus
    @DogBlessGeesus 1 year ago +23

    I came to the comments to see if I was the only one struggling with this explanation. This was a very difficult video to follow. Does anyone have a better way of summing it up?

    • @Udayanverma
      @Udayanverma 10 months ago +1

      I am also looking for a proper course.

    • @jeffmathew6238
      @jeffmathew6238 10 months ago +1

      I think Prof. Ghassemi Lectures and Tutorials gives a proper explanation... I watched his Lecture 6 playlist and found it helpful... hope it helps you too.

  • @gkay007
    @gkay007 10 months ago +10

    You need to use an encoder-decoder to understand this course.

    • @AshkaCambell
      @AshkaCambell 9 months ago

      RTS TEXTS AND FEEDS INSTEAD OF SMS
      /FEEL/

  • @ThisIsAnInactiveChannel
    @ThisIsAnInactiveChannel 1 year ago +32

    I think there's definitely a simpler way to explain how this architecture works; this one is too hard to understand, tbh.

  • @abdulanzil5224
    @abdulanzil5224 1 year ago +7

    I understood nothing. What is happening? A simple transformer block itself contains an encoder and a decoder, so why is the explanation built around RNNs? Then at the end it says RNNs are replaced by transformers. That makes me even more confused!

    • @opplli4942
      @opplli4942 1 year ago +1

      RNNs were replaced by transformers, which work on the attention-mechanism concept. If you want to know more about transformers, follow the learning path suggested in the video.
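
      Concretely, attention is a weighted average: each position scores how relevant every other position is and mixes their vectors accordingly. Here is a minimal, illustrative sketch in Python with NumPy (not code from the video; the names and shapes are made up for the example, and a real transformer computes Q, K, and V from learned projections of the token embeddings):

      import numpy as np

      def attention(Q, K, V):
          # Scaled dot-product attention: each query row mixes the value
          # rows in V, weighted by how well it matches each key row in K.
          scores = Q @ K.T / np.sqrt(K.shape[-1])
          weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
          weights = weights / weights.sum(axis=-1, keepdims=True)   # row-wise softmax
          return weights @ V

      # Self-attention over three 4-dimensional token vectors:
      x = np.random.randn(3, 4)
      print(attention(x, x, x).shape)   # (3, 4): each token now "sees" the others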

  • @debadri
    @debadri 1 year ago +11

    Does anyone have a link to an easier explanation? I could not decode this lesson.

    • @danish5326
      @danish5326 11 months ago

      ruclips.net/video/L8HKweZIOmg/видео.html

    • @AshkaCambell
      @AshkaCambell 9 months ago

      Try Machine Learning AI on your own created audio

    • @debadri
      @debadri 9 months ago

      @@AshkaCambell You didn't have to display your stupidity in full view.

  • @vq8gef32
    @vq8gef32 6 months ago +1

    Amazing, thank you!

  • @SuperShawermaXS
    @SuperShawermaXS 10 months ago +6

    Damn, sounds like Chamber from the game Valorant.

  • @aashishrulz
    @aashishrulz 8 months ago

    The enunciation and the audio are so suppressed.

  • @hlodwig0_049
    @hlodwig0_049 11 months ago +2

    "shifted to the left"...? 5:01

  • @akheelkale2937
    @akheelkale2937 1 year ago +6

    What am I meant to understand from this?

  • @2NormalHuman
    @2NormalHuman 9 months ago +1

    I think this video omits too many details, to the point where I don't see what the point of recording it was. The video doesn't even show how the data is embedded and fed into the encoder, or how the vector is produced.

    • @kisame3151
      @kisame3151 8 months ago

      The goal of an encoder-decoder architecture is to "compress" high-dimensional information, be it an image or a natural-language sentence, into a dense, lower-dimensional representation.
      To minimize the loss during training, the encoder is forced to learn to represent the input data in that lower-dimensional space without losing so much information that the decoder can no longer recover the input. You can essentially see this as the encoder learning to "summarize" its input.
      The encoder learns to encode natural language into vector-space representations the same way the decoder learns to decode it: through training and backpropagation. Showing this process in any meaningful way is difficult because it happens inside a deep neural network built from RNN blocks. If you don't know what RNNs are, I recommend learning about that architecture first.
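
      For readers who want something concrete, here is a minimal sketch of that idea in Python with PyTorch. It is illustrative only, not the video's code: the GRU layers, sizes, and names are assumptions for the example. The encoder compresses a token sequence into a single hidden vector, and the decoder has to generate the target sequence from that vector alone.

      import torch
      import torch.nn as nn

      class Encoder(nn.Module):
          def __init__(self, vocab, emb, hid):
              super().__init__()
              self.embed = nn.Embedding(vocab, emb)
              self.rnn = nn.GRU(emb, hid, batch_first=True)

          def forward(self, src):                    # src: (batch, src_len) token ids
              _, hidden = self.rnn(self.embed(src))
              return hidden                          # (1, batch, hid): the "summary" vector

      class Decoder(nn.Module):
          def __init__(self, vocab, emb, hid):
              super().__init__()
              self.embed = nn.Embedding(vocab, emb)
              self.rnn = nn.GRU(emb, hid, batch_first=True)
              self.out = nn.Linear(hid, vocab)

          def forward(self, tgt, hidden):            # hidden is the encoder's summary
              output, hidden = self.rnn(self.embed(tgt), hidden)
              return self.out(output), hidden        # logits over the vocabulary

      # Toy training step on one random batch of "sentences":
      enc, dec = Encoder(100, 32, 64), Decoder(100, 32, 64)
      src = torch.randint(0, 100, (8, 10))           # 8 source sequences, 10 tokens each
      tgt = torch.randint(0, 100, (8, 12))           # 8 target sequences, 12 tokens each
      logits, _ = dec(tgt[:, :-1], enc(src))         # decoder input: target shifted by one
      loss = nn.CrossEntropyLoss()(logits.reshape(-1, 100), tgt[:, 1:].reshape(-1))
      loss.backward()                                # backpropagation trains both halves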

  • @Neil1701
    @Neil1701 1 year ago +7

    Far too complicated and difficult to understand.

    • @AshkaCambell
      @AshkaCambell 9 months ago

      Supply chains:
      Are you the consumer, the user, or the creator?

  • @mtare8942
    @mtare8942 22 days ago +1

    Really, Google. We can do better than this 🙄

  • @prose_mozaic
    @prose_mozaic 1 year ago +2

    *frantic notation intensifies*

  • @abhisheksharmac3p0
    @abhisheksharmac3p0 2 months ago

    Worst explanation.

  • @AshkaCambell
    @AshkaCambell 9 months ago

    /feel/