torch.nn.Embedding - How embedding weights are updated in Backpropagation

  • Published: 25 Dec 2024

Comments •

  • @mariamzomorodi2249
    @mariamzomorodi2249 1 year ago

    Excellent! Thank you so much! The best explanation ever for the embedding layer. We can't find it anywhere else on the web!

  • @王天宁-y8y
    @王天宁-y8y 10 months ago

    Thanks for the video! It really did me a big favor.

  • @GoogleAccount-kc6rt
    @GoogleAccount-kc6rt 1 year ago

    Very good, man.
    Keep posting ☺️

  • @user-wr4yl7tx3w
    @user-wr4yl7tx3w 1 year ago

    this is a good series.

  • @user-wr4yl7tx3w
    @user-wr4yl7tx3w 1 year ago

    Is an embedding layer the same as a dense layer?

    • @machinelearningwithpytorch
      @machinelearningwithpytorch  1 year ago +1

      A linear layer is the same thing as a dense layer. But technically speaking, the embedding layer is different from the dense (linear) layer: it performs an index lookup into its weight matrix rather than a full matrix multiplication, so during backpropagation only the rows that were looked up receive gradients.
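
      The distinction can be sketched in a few lines of PyTorch. This is a minimal illustration (not from the video itself): an `nn.Embedding` lookup gives the same result as multiplying a one-hot matrix by the weight of a bias-free linear layer, but after backpropagation only the looked-up rows of the embedding weight have non-zero gradients.

      ```python
      import torch
      import torch.nn as nn
      import torch.nn.functional as F

      torch.manual_seed(0)
      emb = nn.Embedding(num_embeddings=5, embedding_dim=3)

      idx = torch.tensor([1, 3])   # look up rows 1 and 3
      out = emb(idx)

      # The lookup equals a one-hot matrix multiplied by the weight matrix,
      # i.e. a bias-free linear layer applied to one-hot inputs.
      one_hot = F.one_hot(idx, num_classes=5).float()
      assert torch.allclose(out, one_hot @ emb.weight)

      # Backpropagate: only rows 1 and 3 of emb.weight get gradients;
      # rows 0, 2, and 4 were never looked up, so their gradient is zero.
      out.sum().backward()
      grad_per_row = emb.weight.grad.abs().sum(dim=1)
      print(grad_per_row)
      ```

      This sparse-update behavior is why embedding layers scale to huge vocabularies: each training step touches only the rows for the indices in the batch (and `nn.Embedding` even supports `sparse=True` to return a sparse gradient tensor).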