Informer: Time series Transformer - EXPLAINED!
- Published: 26 May 2024
- Let's talk about a time series transformer: Informer.
ABOUT ME
⭕ Subscribe: ruclips.net/user/CodeEmporiu...
📚 Medium Blog: / dataemporium
💻 Github: github.com/ajhalthor
👔 LinkedIn: / ajay-halthor-477974bb
RESOURCES
[1] Main paper that introduced the Informer: arxiv.org/pdf/2012.07436
PLAYLISTS FROM MY CHANNEL
⭕ Deep Learning 101: • Deep Learning 101
⭕ Natural Language Processing 101: • Natural Language Proce...
⭕ Reinforcement Learning 101: • Reinforcement Learning...
⭕ Transformers from Scratch: • Natural Language Proce...
⭕ ChatGPT Playlist: • ChatGPT
MATH COURSES (7 day free trial)
📕 Mathematics for Machine Learning: imp.i384100.net/MathML
📕 Calculus: imp.i384100.net/Calculus
📕 Statistics for Data Science: imp.i384100.net/AdvancedStati...
📕 Bayesian Statistics: imp.i384100.net/BayesianStati...
📕 Linear Algebra: imp.i384100.net/LinearAlgebra
📕 Probability: imp.i384100.net/Probability
OTHER RELATED COURSES (7 day free trial)
📕 ⭐ Deep Learning Specialization: imp.i384100.net/Deep-Learning
📕 Python for Everybody: imp.i384100.net/python
📕 MLOps Course: imp.i384100.net/MLOps
📕 Natural Language Processing (NLP): imp.i384100.net/NLP
📕 Machine Learning in Production: imp.i384100.net/MLProduction
📕 Data Science Specialization: imp.i384100.net/DataScience
📕 Tensorflow: imp.i384100.net/Tensorflow
Thank you for explaining papers related to time series. Would love to see more of your videos on time series!!
Coming up soon
As always, great video, looking forward to next video on the code...
This is interesting. Eagerly looking forward to next episodes ❤
It is a subject I have been waiting for. Super, sir! Finally, you directed us to it.
Yep!
Good video! Well explained. In real life though a particular time series will correlate with itself and depend on other time series. Any way to take this into account to improve predictions?
Beautiful !
I think the answers are: D, B, D
And I'll do more research, because I don't understand how the network is able to adjust the output according to the input
Thank you sir
Ding ding ding. You got full points in quiz time!
And yea ~ glad this sparked more curiosity in you for further research
@@CodeEmporium Got only one correct: the last one, about learning and computational complexity :)
The video I was just looking for
Super glad! Thanks for watching
So it makes the process faster through ProbSparse attention, distillation, and generative inference, but does it also improve accuracy?
According to the "Experiments" section of the paper, this architecture does appear to have the best performance compared to several baselines (including other Transformer architectures)
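To make the speed-up discussed above concrete, here is a minimal NumPy sketch of the ProbSparse attention idea from the Informer paper: score each query by a max-minus-mean measure on a sampled subset of keys, compute full attention only for the top-scoring queries, and let the rest fall back to the mean of V. This is an illustrative simplification, not the paper's reference implementation; the function name, shapes, and the `factor` parameter are assumptions for the sketch.

```python
import numpy as np

def probsparse_attention(Q, K, V, factor=5):
    """Sketch of ProbSparse self-attention (single head).

    Q: (L_q, d), K: (L_k, d), V: (L_k, d).
    Only the top-u "active" queries attend in full; "lazy" queries
    receive the mean of V, which keeps cost near O(L log L).
    """
    L_q, d = Q.shape
    L_k = K.shape[0]

    # Estimate each query's sparsity score on a random sample of keys.
    u_k = min(L_k, int(factor * np.ceil(np.log(L_k))))
    idx = np.random.choice(L_k, u_k, replace=False)
    sample_scores = Q @ K[idx].T / np.sqrt(d)            # (L_q, u_k)
    M = sample_scores.max(axis=1) - sample_scores.mean(axis=1)

    # Select the top-u queries by score; the rest stay "lazy".
    u = min(L_q, int(factor * np.ceil(np.log(L_q))))
    top = np.argsort(M)[-u:]

    # Lazy queries: mean of V. Active queries: full softmax attention.
    out = np.tile(V.mean(axis=0), (L_q, 1))              # (L_q, d)
    scores = Q[top] @ K.T / np.sqrt(d)                   # (u, L_k)
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    out[top] = weights @ V
    return out
```

Note that this sketch only covers ProbSparse attention; the distillation (pooling between encoder layers) and one-shot generative decoding are separate mechanisms in the full architecture.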
studies, fitness, trading
Would historical nutritional data count?
❤
Can you please blow up the Llama/Llama 2 architecture and code for us? Eagerly waiting for your LLM videos.
Yep! That’s definitely a future playlist idea
@@CodeEmporium Awesome. Thanks
I was just thinking about it and you just made it.. hope you are not reading my mind 😄
I just might be :)
OK, it's all interesting. But how can I use it when time-series data arrive in real time? I can't batch-process, only one point at a time. I tried some kind of buffering to collect several items and then process them all together, but I didn't succeed, because I couldn't incorporate it into the common neural network libraries
During real-time inference, the model will typically be deployed as part of a service: we get a request, pass it through as a batch of size 1, get an output, and return the response.
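The batch-size-1 pattern described above can be sketched as a small rolling-window wrapper: keep a buffer of recent observations, and once it is long enough, slice the last `window` points into a `(1, window, n_features)` array for the model. The function and parameter names here are illustrative, not from any specific library, and `model` stands for any callable (e.g. a trained network's forward pass).

```python
import numpy as np

def predict_one(model, new_point, buffer, window=96):
    """Streaming inference sketch: one observation in, one forecast out.

    new_point: list of feature values for the latest timestep.
    buffer:    a plain list holding the observation history.
    Returns None until `window` observations have accumulated.
    """
    buffer.append(new_point)
    if len(buffer) < window:
        return None                                  # not enough history yet
    x = np.asarray(buffer[-window:])[None, ...]      # shape (1, window, n_features)
    return model(x)
```

Each incoming observation is passed individually, and the model always sees a batch of size 1, which is how most serving setups handle streaming data.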
How can someone get in touch with you?
Honestly I can't think of any context where I use historical data to inform my decisions other than financial.
Yea. Finance does seem like the bigger and obvious one to me too
Answer: D ?
For quiz 1, yes - it was all of them :)
Provide answers to your quizzes at the end. It's really irritating to see questions unanswered. How would someone verify it. Also, please stop saying "Quiz time"