The video is great and well explained. Could you also tell me how to implement this for a use case where I have multiple features in my dataframe and one regression output variable y (6 inputs, one output)?
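In case it helps, here is a minimal sketch of one way to handle that, assuming the video's model embeds inputs with a linear layer; `MultiFeatureTransformer` and all the layer sizes below are hypothetical placeholders, not the video's exact code:

```python
import torch
import torch.nn as nn

class MultiFeatureTransformer(nn.Module):
    """Hypothetical variant: 6 input features per step, one regression output."""
    def __init__(self, n_features=6, d_model=64, nhead=4, num_layers=2):
        super().__init__()
        # Project each 6-dimensional time step into the model dimension.
        # (The video's positional encoding would still go after this projection.)
        self.input_proj = nn.Linear(n_features, d_model)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers)
        self.head = nn.Linear(d_model, 1)   # single scalar target y

    def forward(self, x):                   # x: (batch, seq_len, 6)
        z = self.encoder(self.input_proj(x))
        return self.head(z[:, -1, :])       # predict from the last time step

model = MultiFeatureTransformer()
print(model(torch.randn(8, 30, 6)).shape)   # torch.Size([8, 1])
```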
Can transformers work with irregular time series? It would be great to get some info about irregular time series; Google points me to CNNs, but I have yet to test that.
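One common workaround (an assumption on my part, not something the video covers) is to make the irregular sampling explicit: append the time gap since the previous observation as an extra input feature and feed the result to a regular transformer encoder. A tiny sketch, with made-up timestamps and values:

```python
import torch

timestamps = torch.tensor([0.0, 0.7, 1.1, 3.5, 3.9])    # irregularly spaced times
values = torch.tensor([1.2, 0.9, 1.4, 2.0, 1.8])

deltas = torch.diff(timestamps, prepend=timestamps[:1])  # gap to previous point
x = torch.stack([values, deltas], dim=-1)                # (seq_len, 2) features
x = x.unsqueeze(0)                                       # add a batch dimension
print(x.shape)  # torch.Size([1, 5, 2]) -> feed to an encoder with n_features=2
```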
At least in PyTorch 2.2, I got a warning from the line `self.transformer_encoder = nn.TransformerEncoder(encoder_layers, num_layers)` in `TransformerModel`. Setting `enable_nested_tensor=False` in the `TransformerEncoder` constructor fixed that.
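For reference, a sketch of that fix (the `d_model`/`nhead`/`num_layers` values below are placeholders, not the video's settings):

```python
import torch.nn as nn

encoder_layers = nn.TransformerEncoderLayer(d_model=64, nhead=4)
# enable_nested_tensor defaults to True; with batch_first=False the encoder
# can't actually use the nested-tensor fast path and PyTorch 2.x warns about
# it, so passing False silences the warning without changing behaviour.
transformer_encoder = nn.TransformerEncoder(
    encoder_layers, num_layers=2, enable_nested_tensor=False)
```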
Why is the decoder layer an `nn.Linear`? Would it be better to use `nn.TransformerDecoderLayer`, and how would one use it?
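For what it's worth: a full `nn.TransformerDecoder` attends to the encoder output (`memory`) while processing a target sequence (`tgt`), which mainly pays off for sequence-to-sequence generation; for a single-step forecast, a linear head on the encoder output is enough. A minimal sketch of the decoder API, with all sizes as placeholders:

```python
import torch
import torch.nn as nn

d_model, nhead = 64, 4
decoder_layer = nn.TransformerDecoderLayer(d_model=d_model, nhead=nhead)
decoder = nn.TransformerDecoder(decoder_layer, num_layers=2)

memory = torch.randn(20, 8, d_model)  # (src_len, batch, d_model) from an encoder
tgt = torch.randn(5, 8, d_model)      # target sequence being generated
out = decoder(tgt, memory)            # cross-attends to memory at every layer
print(out.shape)                      # torch.Size([5, 8, 64])
```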
Your explanation is very simple and clear. Wow, great job, man, all respect!
Thanks for the explanation. What if we use an LSTM along with the transformer (attention mechanism)? Would it be helpful, or would it just make the model more complex?
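Here is a rough sketch of one possible LSTM-plus-attention hybrid (an illustration of the idea, not a recommendation, and not from the video): an LSTM encodes local order, then a transformer encoder layer adds global self-attention on top. All sizes are placeholders.

```python
import torch
import torch.nn as nn

class LSTMTransformer(nn.Module):
    def __init__(self, n_features=1, d_model=64, nhead=4):
        super().__init__()
        self.lstm = nn.LSTM(n_features, d_model, batch_first=True)
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=1)
        self.head = nn.Linear(d_model, 1)

    def forward(self, x):              # x: (batch, seq_len, n_features)
        h, _ = self.lstm(x)            # (batch, seq_len, d_model)
        z = self.encoder(h)            # self-attention over the LSTM states
        return self.head(z[:, -1, :])  # one-step forecast

model = LSTMTransformer()
print(model(torch.randn(2, 30, 1)).shape)  # torch.Size([2, 1])
```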
Hi, one question: when the model triggers early stopping, the validation loss hasn't decreased at all (this is also shown in the video). Is this model really learning anything, or is it just for demonstration? Would any hyperparameter tuning make a difference?
I made a similar observation along these lines. The model doesn't seem to learn much...
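For anyone wondering what the early-stopping check actually does, here is a tiny self-contained sketch of stopping on validation loss; the model and data are toy placeholders, not the video's. If the validation loss is flat from the start, the patience counter just runs out quickly, which is consistent with what the video shows:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
model = nn.Linear(4, 1)                       # toy stand-in model
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
x_tr, y_tr = torch.randn(64, 4), torch.randn(64, 1)   # toy train split
x_va, y_va = torch.randn(32, 4), torch.randn(32, 1)   # toy validation split

best_val, patience, bad_epochs = float("inf"), 5, 0
for epoch in range(200):
    opt.zero_grad()
    loss_fn(model(x_tr), y_tr).backward()
    opt.step()
    with torch.no_grad():
        val = loss_fn(model(x_va), y_va).item()
    if val < best_val - 1e-4:   # improvement beyond a small tolerance
        best_val, bad_epochs = val, 0
    else:
        bad_epochs += 1
        if bad_epochs >= patience:
            break               # flat validation loss for `patience` epochs
```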
You're the best, thank you!
Do you plan to add up- and downsampling like in the MetNet-3 model as well?
amazing, just what I needed, thanks! ❤
You are the man!!!
Thanks for sharing this!
Is this model considered a hybrid model?
I'm still a little confused about why just a linear layer is used as the decoder.
Thank you
❤ Thank you!