Transformer-Based Time Series with PyTorch (10.3)

  • Published: 17 Oct 2024
  • Science

Comments • 18

  • @yuvrajpatra8301 • 8 months ago • +9

    The video is great and well explained. Could you also tell me how I can implement this for a use case where I have multiple features in my dataframe and one regression output variable y (6 inputs, one output)?
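
One way to handle the multi-feature question above (a minimal sketch, not code from the video; the class name `MultivariateTransformer` and all hyperparameters are illustrative): project the 6 input features to `d_model` with a linear layer before the encoder, and keep a single-unit linear head for the regression target.

```python
import torch
import torch.nn as nn

# Sketch: multivariate input (6 features) -> one regression output.
# Names and hyperparameters are illustrative, not the video's exact code.
class MultivariateTransformer(nn.Module):
    def __init__(self, input_size=6, d_model=64, nhead=4, num_layers=2, dropout=0.1):
        super().__init__()
        self.input_proj = nn.Linear(input_size, d_model)      # 6 features -> d_model
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=nhead, dropout=dropout, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=num_layers)
        self.head = nn.Linear(d_model, 1)                     # single regression target

    def forward(self, x):            # x: (batch, seq_len, 6)
        x = self.input_proj(x)       # (batch, seq_len, d_model)
        x = self.encoder(x)          # (batch, seq_len, d_model)
        return self.head(x[:, -1])   # predict from the last time step: (batch, 1)

model = MultivariateTransformer()
print(model(torch.randn(8, 30, 6)).shape)   # torch.Size([8, 1])
```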

  • @chunziWang-rb2kd • 7 months ago • +1

    Why is the decoder layer an nn.Linear? Would it be better to use nn.TransformerDecoderLayer, and how would I use it?
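
On the decoder question above: the model in the video is an encoder-only setup, so the final `nn.Linear` is just a projection head that maps the encoder output to the forecast; a full `nn.TransformerDecoder` is only needed for a sequence-to-sequence (encoder-decoder) setup with cross-attention. A rough sketch of that alternative (shapes and hyperparameters below are illustrative, not the video's code):

```python
import torch
import torch.nn as nn

# Sketch of an encoder-decoder alternative to the video's linear head.
# Shapes and hyperparameters are illustrative only.
d_model, nhead = 64, 4

encoder_layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead, batch_first=True)
encoder = nn.TransformerEncoder(encoder_layer, num_layers=2)

decoder_layer = nn.TransformerDecoderLayer(d_model=d_model, nhead=nhead, batch_first=True)
decoder = nn.TransformerDecoder(decoder_layer, num_layers=2)

memory = encoder(torch.randn(8, 30, d_model))                 # encoded input window
tgt = torch.randn(8, 5, d_model)                              # embedded decoder inputs (e.g. shifted targets)
tgt_mask = nn.Transformer.generate_square_subsequent_mask(5)  # causal mask for the decoder
out = decoder(tgt, memory, tgt_mask=tgt_mask)                 # (8, 5, d_model)
forecast = nn.Linear(d_model, 1)(out)                         # (8, 5, 1): a 5-step forecast
```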

  • @shanks9758 • 4 months ago

    Your explanation is very simple. Wow, great job, man, all respect.

  • @hoseinhabibi2385 • 3 months ago

    Thanks for the explanation. What if we use an LSTM along with the transformer (attention mechanism)? Would it be helpful, or would it just make the model more complex?
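
On the LSTM question above: a hybrid can help when the recurrent part captures local dynamics and attention mixes information across the whole window, but it does add parameters and complexity, so it is worth benchmarking against each model on its own. A minimal hybrid sketch (illustrative names, not from the video):

```python
import torch
import torch.nn as nn

# Sketch of an LSTM + self-attention hybrid; all names are illustrative.
class LSTMAttention(nn.Module):
    def __init__(self, input_size=1, hidden_size=64, nhead=4):
        super().__init__()
        self.lstm = nn.LSTM(input_size, hidden_size, batch_first=True)
        self.attn = nn.MultiheadAttention(hidden_size, nhead, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x):                 # x: (batch, seq_len, input_size)
        h, _ = self.lstm(x)               # (batch, seq_len, hidden_size)
        h, _ = self.attn(h, h, h)         # self-attention over the LSTM states
        return self.head(h[:, -1])        # predict from the last position

model = LSTMAttention()
print(model(torch.randn(8, 30, 1)).shape)   # torch.Size([8, 1])
```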

  • @georgevlachodimitropoulos5169 • 11 months ago • +1

    Hi, one question. When the model triggers early stopping, the validation loss hasn't decreased at all (this is also shown in the video). Is this model really learning anything, or is it just for demonstration? Would any hyperparameter tuning make a difference?

    • @SaschaRobitzki • 8 months ago

      I made a similar observation along these lines. The model doesn't seem to learn much...

  • @josephomalley6652 • 11 months ago • +2

    You're the best, thank you!

  • @matthiaswiedemann3819 • 11 months ago

    Do you plan to add up- and downcycling like in the MetNet-3 model as well?

  • @zoe.tsekas • 11 months ago • +1

    Amazing, just what I needed, thanks! ❤

  • @BooklyCrashCourse • 6 months ago

    You are the man!!!

  • @ccc_ccc789 • 11 months ago • +1

    Thanks for sharing this!

  • @Dmitrii-q6p • 7 months ago

    Can transformers work with irregular time series?
    It would be great to get some info about irregular time series; Google points me to CNNs, but I still need to test that.

  • @SaschaRobitzki • 8 months ago • +1

    At least in PyTorch 2.2 I got a warning from the line `self.transformer_encoder = nn.TransformerEncoder(encoder_layers, num_layers)` in `TransformerModel`. Setting `enable_nested_tensor=True` in the TransformerEncoder fixed that.
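
For anyone hitting the same warning: `enable_nested_tensor` is a constructor argument of `nn.TransformerEncoder`, so it can be set explicitly when the encoder stack is built. Which value is appropriate depends on the PyTorch version and the encoder layer settings, so treat the snippet below as a template only:

```python
import torch.nn as nn

# Passing enable_nested_tensor explicitly when building the encoder stack.
# The right value depends on your PyTorch version and layer configuration
# (e.g. norm_first), so this is a template, not a universal fix.
encoder_layers = nn.TransformerEncoderLayer(d_model=64, nhead=4, batch_first=True)
transformer_encoder = nn.TransformerEncoder(
    encoder_layers,
    num_layers=2,
    enable_nested_tensor=True,   # set explicitly, per the comment above
)
```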

  • @bothainah.r9414 • 5 months ago

    Is this model considered a hybrid model?

  • @EvelynGolden-y9s • 6 months ago

    Still a little bit confused about why we use just a linear layer as the decoder?

  • @honestkariwo6163 • 11 months ago • +1

    Thank you

  • @nauseouscustody1440 • 6 months ago

    ❤ Thank you!