Pedram Jahangiry
  • 226 videos
  • 281,685 views
Module 4- part 3- Timeseries SARIMA model in Python (Pycaret)
Relevant playlists:
Deep Forecasting Concepts, simply explained: ruclips.net/p/PL2GWo47BFyUPW_lptTNwpKNrpEQvUZerR
Machine Learning Codes and Concepts: ruclips.net/p/PL2GWo47BFyUNeLIH127rVovSqKFm1rk07&si=lCPyHenEQYBCJzQ_
Deep Learning Concepts, simply explained: ruclips.net/p/PL2GWo47BFyUO6Fiy2mJCxR8sUrBEfT6BM
Instructor: Pedram Jahangiry
All of the slides and notebooks used in this series are available on my GitHub page, so you can follow along and experiment with the code on your own.
github.com/PJalgotrader
Lecture Outline:
0:00 where to find the materials
2:23 AR(1) and MA(1) intuition
10:18 SARIMA(p,d,q)(P,D,Q)m
42:07 AutoARIMA
1:01:33 Prediction intervals for SARIMA model
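As a rough companion to the outline above, here is a minimal sketch of the kind of PyCaret time-series workflow this module demonstrates. It assumes the PyCaret 3.x time-series API; the bundled "airline" dataset, the model IDs, and the interval settings are illustrative choices, not taken from the video itself.

```python
# Minimal sketch (assumes pycaret>=3.0 with the time_series module installed)
from pycaret.datasets import get_data
from pycaret.time_series import TSForecastingExperiment

y = get_data("airline")                    # monthly sample series shipped with PyCaret

exp = TSForecastingExperiment()
exp.setup(data=y, fh=12, session_id=123)   # hold out a 12-step forecast horizon

sarima = exp.create_model("arima")         # (S)ARIMA with the orders chosen at setup
auto = exp.create_model("auto_arima")      # stepwise search over (p,d,q)(P,D,Q)m

# Point forecasts with prediction intervals (coverage = interval level, e.g. 90%)
preds = exp.predict_model(auto, return_pred_int=True, coverage=0.9)
print(preds.head())
```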
116 views

Videos

Module 4- part 2- SARIMA(X) models for timeseries forecasting
119 views · 21 days ago
Module 4- part 1- ARIMA models (pre-reqs: ACF, PACF, weak vs strong stationarity, differencing)
179 views · 21 days ago
Module 3- part 3- ETS timeseries models in Python (Pycaret)
222 views · 1 month ago
Module 3- part 2- ETS (Error, Trend, Seasonality) timeseries models
133 views · 1 month ago
Module 3- part 1- Exponential smoothing methods (SES, Holt-linear, Holt-Winter, Damped)
160 views · 1 month ago
Module 3- part 0- Introduction to exponential smoothing methods vs ETS models
129 views · 1 month ago
Module 2 -Part 2- Setting up Deep Forecasting environment, basic Python timeseries
355 views · 5 months ago
Module 2 -Part 1- Setting up Deep Forecasting environment, platforms and python packages
239 views · 5 months ago
Module 1- Part 4- Demystifying timeseries data and modeling (Can we beat WallStreet?)
321 views · 5 months ago
Module 1- Part 3: Demystifying timeseries data and modeling (classical vs ML vs DL modeling)
425 views · 6 months ago
Module 1- Part 2- Demystifying timeseries data and modeling (forecasting strategies)
357 views · 6 months ago
Module 1- Part 1- Demystifying timeseries data and modeling (Basics)
624 views · 6 months ago
Your One-Stop Resource Guide: Where to Find Course Materials for All My Courses
486 views · 6 months ago
Looking for materials for all my courses? You're in the right place! This video guides you on how to access all the course resources you'll need. Check out my GitHub for downloadable materials, follow my RUclips channel for updates and additional learning content, and stay connected through Twitter for the latest news. GitHub: github.com/PJalgotrader RUclips: www.youtube.com/@UCNDElcuuyX-2pSatV...
We Did It! Pedram Jahangiry Wins Teacher of the Year 2024 at Utah State University
450 views · 6 months ago
Welcome to the Deep Forecasting course (Advanced Timeseries with Econometrics, ML and DL)
1.6K views · 6 months ago
Module 12 - Python Part 2: Mastering Clustering with PyCaret - K-Modes & K-Prototypes Unveiled
524 views · 9 months ago
Module 12- Python part1: Mastering Clustering techniques using Sklearn (Kmeans, Hierarchical)
645 views · 9 months ago
Module 12- Mastering Clustering in ML: K-Means, K-Modes, K-Prototypes & Hierarchical Methods
1K views · 9 months ago
Module 11- Python: Mastering PCA & Kernel PCA in Python using Sklearn and pca packages
1.2K views · 10 months ago
Module 11- Theory: Eigenvalues, Eigenvectors and Principal Component Analysis (PCA and Kernel PCA)
1.4K views · 10 months ago
Module 10- Theory 4: Timeseries challenges in machine learning (Cross validation and Bootstrapping)
413 views · 10 months ago
Module 10- Python 3: Mastering Machine Learning Boosting algorithms in (Scikit Learn and Pycaret)
603 views · 10 months ago
Module 10- Theory 3: Advanced ML boosting techniques: XGboost, Catboost, LightGBM
1K views · 10 months ago
Module 10- Theory 2: Machine Learning Boosting techniques: AdaBoost, GBM and XGboost
453 views · 10 months ago
Module 10- Python 2: Master Bagging & Random Forest CLASSIFICATION in Python with Sklearn & PyCaret
259 views · 11 months ago
Module 10- Python 1: Master Bagging & Random Forest REGRESSION in Python with Sklearn & PyCaret
448 views · 11 months ago
Module 10- Theory 1: Mastering Bagging and Random Forest in Machine Learning
548 views · 11 months ago
Module 9- Python: Mastering Decision Trees: A Comprehensive Guide with Sklearn and PyCaret
396 views · 11 months ago
Module 9- Theory: Decision Trees (CART) Explained- Everything You Need to Master Them
471 views · 11 months ago

Comments

  • @didierleprince6106
    @didierleprince6106 1 day ago

    A big thank you (:

  • @Bruno0095
    @Bruno0095 6 days ago

    Your content is great. Thanks a lot!

  • @robertkabachun4656
    @robertkabachun4656 8 days ago

    Incredible video! Thank you very much!

  • @DesertRat44
    @DesertRat44 12 days ago

    God bless you Dr. Pedram and thank you for the knowledge as always.

    • @pedramjahangiry
      @pedramjahangiry 11 days ago

      That’s very kind of you! Glad you are enjoying the content.

  • @hackerborabora7212
    @hackerborabora7212 12 days ago

    Always the best ❤❤❤ from the best. Please, can you include KAN models in the course?

    • @pedramjahangiry
      @pedramjahangiry 12 days ago

      Thanks for the feedback! I love the enthusiasm. Creating new content with the quality I need typically takes weeks, if not months! But for sure I will add KAN to my to-do list.

    • @hackerborabora7212
      @hackerborabora7212 12 days ago

      @pedramjahangiry thank you for the high-quality content

  • @musesafayigatalo
    @musesafayigatalo 16 days ago

    Great, very useful tutorial! Thank you, Doc. I want to ask a question: do you have any experience with context-sensitive spelling error detection and correction using a deep learning approach? If you do, please share it with me.

    • @pedramjahangiry
      @pedramjahangiry 6 days ago

      You're welcome! I don't have a specific video on that topic, but context-sensitive spelling error detection can be addressed using sequence models like LSTMs or transformers (e.g., BERT). You could also explore seq2seq models for generating corrected text.

    • @musesafayigatalo
      @musesafayigatalo 6 days ago

      @pedramjahangiry Thank you, Doc! If you have time, could you please prepare a video on this topic? I'd like to use it as a reference for my related MSc thesis.
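A simple baseline for the context-sensitive spelling correction discussed in this thread (a sketch, not material from the course): mask the suspect word and let a pretrained masked language model rank candidates from a confusion set. The model name, the example sentence, and the candidate list below are illustrative assumptions.

```python
# Sketch only: rank candidate corrections with a masked language model (Hugging Face transformers).
from transformers import pipeline

unmasker = pipeline("fill-mask", model="bert-base-uncased")

sentence = "I would like to [MASK] a new phone."   # suppose the suspect word in the text was "by"
candidates = ["buy", "by", "bye"]                  # hypothetical confusion set

# Score each candidate in context and keep the highest-scoring one
for result in unmasker(sentence, targets=candidates):
    print(result["token_str"], round(result["score"], 4))
```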

  • @razinust2579
    @razinust2579 20 days ago

    Very detailed, everything is clear.

  • @cassioandrade6233
    @cassioandrade6233 24 days ago

    Excellent videos, thank you very much. As Cora Coralina said: "Happy is he who transfers what he knows and learns what he teaches."

  • @asnakewbewketubelete4108
    @asnakewbewketubelete4108 26 days ago

    This is extremely helpful! Many thanks!

  • @stefaniebabiolakis2724
    @stefaniebabiolakis2724 1 month ago

    Thank you for such an insightful and helpful video tutorial! Do you have any tips on how to set a custom probability threshold for XGBoost in PyCaret (i.e. metrics, plots, tuning, and predictions)? I have approximately 30% of my dataset in the positive class, so would expect my XGBoost to perform best with a threshold close to 0.3, not the default 0.5.

    • @pedramjahangiry
      @pedramjahangiry 29 days ago

      You should be able to do it by adjusting the "probability_threshold" argument inside the predict_model function. After you have created your XGBoost model, do something like this: xgb = create_model('xgboost'); predictions = predict_model(xgb, data=data, probability_threshold=0.3). Always check the source code to see how you can play with these parameters inside each function: github.com/pycaret/pycaret/blob/master/pycaret/classification/functional.py
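Spelled out as a runnable sketch of the flow described in the reply (the "credit" sample dataset and its "default" target are assumptions used for illustration; `probability_threshold` in `predict_model` is the mechanism named above):

```python
# Sketch: custom classification threshold in PyCaret (functional classification API).
from pycaret.datasets import get_data
from pycaret.classification import setup, create_model, predict_model

data = get_data("credit")                      # sample binary-classification dataset bundled with PyCaret
s = setup(data=data, target="default", session_id=123)

xgb = create_model("xgboost")                  # requires the xgboost package to be installed

# Label a row as positive when P(positive) > 0.3 instead of the default 0.5
preds = predict_model(xgb, probability_threshold=0.3)
```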

  • @reginemartschiske7395
    @reginemartschiske7395 1 month ago

    Great, congratulations 🙇‍♀ But I'm not surprised 🙂 You are a great teacher!!

  • @smitkumbhani5034
    @smitkumbhani5034 1 month ago

    Sir, please upload the next module in deep learning.

    • @pedramjahangiry
      @pedramjahangiry 29 days ago

      It is on my radar. It may take some time to have it ready, though.

  • @baharehraee2827
    @baharehraee2827 1 month ago

    Congratulations! We are so proud of you!

  • @ИсидораАлександрова

    Lopez Kevin Williams James Thomas Eric

  • @nikiforosstaveris2484
    @nikiforosstaveris2484 1 month ago

    Must be one of the very few tutorials on YT where the tutor's voice is actually pleasant to follow.

  • @Hamidhdr-975
    @Hamidhdr-975 1 month ago

    It's been a while since I've searched for good ML training, and your content is great, especially your GitHub. Thank you, Pedram.

  • @ehlooga
    @ehlooga 1 month ago

    just THANK YOU -- awesome !!

  • @iniedeteyo4349
    @iniedeteyo4349 1 month ago

    I am surprised you don't have more likes for this course. It's very well explained. I was searching for the best explanation on YouTube, stumbled upon it, and I am going to stick with it.

  • @xiaojiewen8552
    @xiaojiewen8552 1 month ago

    Does it mean OLS is more straightforward than gradient descent? At least I have not seen the necessity of using gradient descent to estimate linear regression.

    • @pedramjahangiry
      @pedramjahangiry 1 month ago

      For OLS, because there is a simple closed-form solution, all the Python packages use that instead of doing gradient descent. But for many other ML models, Python packages use numerical solvers (because there is no closed-form solution) such as GD, BFGS, etc.
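To make the closed-form vs. numerical-solver point concrete, here is a small NumPy sketch (synthetic data, illustrative learning rate) showing that gradient descent converges to the same coefficients that the closed-form OLS solution returns directly:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = X @ np.array([2.0, -1.0, 0.5]) + rng.normal(scale=0.1, size=200)

# Closed-form OLS: beta = (X'X)^{-1} X'y, computed via a least-squares solver
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# Gradient descent on the same mean-squared-error objective
beta_gd = np.zeros(3)
lr = 0.05
for _ in range(2000):
    grad = (2 / len(y)) * X.T @ (X @ beta_gd - y)
    beta_gd -= lr * grad

print(beta_ols, beta_gd)  # both should be close to [2, -1, 0.5]
```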

  • @xiaojiewen8552
    @xiaojiewen8552 1 month ago

    Thank you for the video. Besides the disadvantages mentioned in others' comments, feature scaling may also affect interpretation or interpretability, especially when it comes to very complicated ML models.

  • @xiaojiewen8552
    @xiaojiewen8552 1 month ago

    Thank you for your videos. Your channel is my favorite place to learn ML. I feel comfortable with the overall content of the lecture, but at the same time, there are some points that raise questions in my mind. It might be the general feeling when learning something new.

    • @pedramjahangiry
      @pedramjahangiry 1 month ago

      Thanks for the feedback. Feel free to share your questions below each video.

  • @brunos.dossantos954
    @brunos.dossantos954 1 month ago

    Good work Pedram! It is hard to find meaningful material about clustering methods. Thank you!👋

  • @walsoftai
    @walsoftai 2 months ago

    Machine Learning is different from general programming in this way: Machine Learning involves designing algorithms that can learn from data and improve over time, whereas general programming requires explicit instructions for every step. In Machine Learning, the focus is on enabling the system to recognize patterns and make predictions based on the data it processes.

    There are three main types of Machine Learning:
    1. Supervised Learning: Both input and output are provided, meaning the labels are known. The algorithm learns to map inputs to outputs based on this labeled training data.
    2. Unsupervised Learning: Only the input is given. The machine must detect meaningful patterns or structures within the data without any labeled outcomes to guide it.
    3. Reinforcement Learning: Both input and output are provided but not at the same time. The system interacts with the environment, receiving feedback through trial and error, and learns to make decisions that maximize cumulative rewards.

    Classification of the following according to the types of Machine Learning:
    1. Both input and output are given, meaning the labels are known. - Type: Supervised Learning
    2. Only the input is given. The machine should detect meaningful patterns. - Type: Unsupervised Learning
    3. Both input and output are given, but not at the same time. There might be a delay. The machine needs to explore the environment through trial and error until it discovers a meaningful pattern. - Type: Reinforcement Learning

    ai.walsoftcomputers.com/

  • @walsoftai
    @walsoftai 2 months ago

    In real-world problems where the emphasis is either on prediction or inference, the choice between Machine Learning (ML) and Statistical Learning (SL) can depend on the specific goals:

    Problem emphasizing prediction (less inference):
    - Approach: Machine Learning (ML)
    - Reason: ML is generally more suited for problems where the primary goal is to make accurate predictions or classifications based on large datasets, often without needing to understand the underlying relationships between variables in detail. ML models like neural networks, ensemble methods, and gradient boosting are designed to optimize predictive performance and can handle complex, high-dimensional data.

    Problem emphasizing inference (less prediction):
    - Approach: Statistical Learning (SL)
    - Reason: SL focuses on understanding and interpreting the relationships between variables, often through simpler, more interpretable models. Techniques like linear regression, logistic regression, and hypothesis testing are used to infer relationships and test hypotheses about data. These methods are designed to provide insights into the underlying processes and can be used to draw conclusions about causal relationships or other aspects of the data structure.

    In summary:
    - Use ML for problems where accurate prediction is crucial and interpretability is less of a concern.
    - Use SL for problems where understanding relationships and making inferences is more important than achieving the highest predictive accuracy.

    ai.walsoftcomputers.com/

  • @hackerborabora7212
    @hackerborabora7212 2 months ago

    Please, can I use the vprom autoencoder with an MCMC or HMC model to catch daily & weekly forecasts? I'm trying to use a combination of economic reports and daily financial data. I'm a self-learner; I think it's a kind of art mix, or am I wrong?

    • @pedramjahangiry
      @pedramjahangiry 2 months ago

      Hi, Combining complex models like autoencoders with MCMC or HMC for predicting daily and weekly stock prices sounds interesting, but none of these methods have been proven to consistently work for short-term financial forecasting. The stock market is very unpredictable and full of random noise, which even advanced models struggle to handle well. As a retail investor, I highly encourage you not to waste time trying to beat the market with such models. The chances of success are low, and it’s better to focus on long-term, proven investment strategies. Good luck, and feel free to ask if you have more questions!

    • @hackerborabora7212
      @hackerborabora7212 2 months ago

      @pedramjahangiry Yes, you are absolutely right, the market is hard to predict. I'm trying to put my knowledge about data and new releases into my ideas 💡 It is only a quest for sustenance. I am a Muslim, and Islam denies the idea that anyone can predict anything, because sustenance comes from God.

  • @hackerborabora7212
    @hackerborabora7212 2 months ago

    I'm waiting for more videos 😞 where are you?

    • @pedramjahangiry
      @pedramjahangiry 2 months ago

      I guess the "life happens" policy applies! I have had some challenges along the way, but I promise I haven't forgotten you guys.

  • @PJ-nc4jh
    @PJ-nc4jh 2 months ago

    Question of the day: 1. Supervised Learning 2. Unsupervised Learning 3. Reinforcement Learning (Just what I am guessing based on the options)

  • @hamzaehsankhan
    @hamzaehsankhan 3 months ago

    When using timeseries_dataset_from_array to create the datasets, train_dataset, test_dataset and val_dataset all had uniform tensors except for the last batch of each, which was partial. I checked the samples and targets like this:

        for samples, targets in train_dataset:
            if samples.shape != (batch_size, sequence_length, 14):
                print(samples.shape)
                print(targets.shape)

    (and the same loop for test_dataset and val_dataset), which printed the partial batches:

        (103, 120, 14) (103,)
        (118, 120, 14) (118,)
        (206, 120, 14) (206,)

    This gave the following error during model.fit(train_dataset, epochs=10, validation_data=val_dataset, ...) after model.compile(optimizer="rmsprop", loss="mse", metrics=["mae"]):

        InvalidArgumentError: Only one input size may be -1, not both 0 and 1
        [[{{node functional_9_1/flatten_10_1/Reshape}}]] [Op:__inference_one_step_on_iterator_41264]

    Although I remove the partial batches, I still get the error. I do not get the same error when fitting the dataset with the CNN.

    • @pedramjahangiry
      @pedramjahangiry 6 days ago

      this is due to an update in the Keras package. I updated the code on GitHub. Thanks for your feedback.

    • @hamzaehsankhan
      @hamzaehsankhan 6 days ago

      @pedramjahangiry Thank you!
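For readers hitting the same shape error, one possible workaround is sketched below. It assumes the uneven final batch is the culprit (this is not necessarily the fix that was applied on GitHub): re-batch each tf.data dataset with drop_remainder=True so every batch has a uniform shape before calling model.fit().

```python
import tensorflow as tf

batch_size = 256  # illustrative; use the same value passed to timeseries_dataset_from_array

def drop_partial_batches(ds: tf.data.Dataset) -> tf.data.Dataset:
    """Re-batch so the trailing partial batch is dropped and all batches share one shape."""
    return ds.unbatch().batch(batch_size, drop_remainder=True)

# train_dataset / val_dataset / test_dataset are the datasets from the comment above
train_dataset = drop_partial_batches(train_dataset)
val_dataset = drop_partial_batches(val_dataset)
test_dataset = drop_partial_batches(test_dataset)
```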

  • @Sneha-g2b
    @Sneha-g2b 3 months ago

    Excellent explanations and with clarity. Waiting for all the modules!!

  • @NACHIKETKISHORGORE
    @NACHIKETKISHORGORE 3 months ago

    great explanation sir

  • @alek_lind
    @alek_lind 3 months ago

    I discovered your videos yesterday, and I have to admit that they are exceptionally good. I have long been wanting to advance my knowledge and skills in time series forecasting but haven't been able to find a course that seemed worth my time. However, I have been binging the Deep Forecasting course - it just perfectly matches my current background knowledge and programming skills, it covers the methods and models that I am interested in mastering, and everything is explained really well. I cannot wait for the coming modules!

  • @r0cketRacoon
    @r0cketRacoon 3 months ago

    Is SVM good for regression? I mean in general, compared to KNN, Random Forest, XGBoost?

    • @pedramjahangiry
      @pedramjahangiry 3 months ago

      Of course, it depends on the dimension of the features and the patterns in the data. But, "generally speaking", I would rank them like this: XGBoost > Random Forest > SVR > KNN.
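A quick way to sanity-check that ranking on your own data is a small cross-validation loop. The sketch below uses scikit-learn only (GradientBoostingRegressor stands in for XGBoost) and the California housing data purely as an example:

```python
from sklearn.datasets import fetch_california_housing
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

X, y = fetch_california_housing(return_X_y=True)
X, y = X[:5000], y[:5000]  # subsample to keep the comparison fast

models = {
    "boosting": GradientBoostingRegressor(random_state=0),
    "random_forest": RandomForestRegressor(random_state=0),
    "svr": make_pipeline(StandardScaler(), SVR()),              # SVR needs scaled features
    "knn": make_pipeline(StandardScaler(), KNeighborsRegressor()),
}

for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=3, scoring="r2")
    print(f"{name}: mean R^2 = {scores.mean():.3f}")
```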

  • @codewithbrogs3809
    @codewithbrogs3809 3 months ago

    No GBDTs?

  • @rahilnecefov2018
    @rahilnecefov2018 3 months ago

    Is there any chance to get the presentations? I can't find these materials in your GitHub account.

    • @pedramjahangiry
      @pedramjahangiry 3 months ago

      I just added the slides here: github.com/PJalgotrader/Machine_Learning-USU/tree/main/Lectures%20and%20codes/miscellaneous

  • @rahilnecefov2018
    @rahilnecefov2018 3 months ago

    Oh my dear god, these are the greatest ML videos I have ever seen in my life. Usually I can't understand the concepts or I get bored, but I can watch these videos every day, all day long. Thanks, dear Pedram <3

  • @RELAXISLANDS
    @RELAXISLANDS 3 months ago

    Perfect videos, so many people are missing this valuable information.

  • @RELAXISLANDS
    @RELAXISLANDS 3 months ago

    golden explanation

  • @Sneha-g2b
    @Sneha-g2b 3 months ago

    The concepts are so well explained. I love the way each phase of the process is taught in a clear way. I'm better able to grasp the concepts of machine learning now with clarity. Thanks for your efforts!

  • @Mohamedezzeldin-k8h
    @Mohamedezzeldin-k8h 3 months ago

    Should I learn statistical learning or machine learning first, or does it depend?

    • @pedramjahangiry
      @pedramjahangiry 3 months ago

      I would highly encourage you to learn them side by side. It may slow you down, but it is worth the investment. It is good to start with the fundamentals of regression analysis. After all, machine learning is an extension of statistical learning, I believe. Good luck!

  • @hamzasanialiyu4181
    @hamzasanialiyu4181 4 months ago

    Excellent videos. Highly recommended.

  • @foobar24
    @foobar24 4 months ago

    the end of the video 😂 the lectures are just awesome 👏

  • @RELAXISLANDS
    @RELAXISLANDS 4 months ago

    Thanks, man, for your great explanation.

  • @RELAXISLANDS
    @RELAXISLANDS 4 months ago

    Perfect videos, hope they find the proper audience. You are a source of pride, compatriot.

  • @borisljevar3126
    @borisljevar3126 4 months ago

    Thanks for making this video. I enjoyed watching it. I'm looking forward to the video on time series forecasting.

  • @borisljevar3126
    @borisljevar3126 4 months ago

    This is my personal experience (not such a good one) with the PyCaret library.

    *First Disappointment:* After installing it with `pip install pycaret` on my Linux machine, I started Python from the command line and ran `import pycaret`. This is the message I received:

    ```
    RuntimeError: ('PyCaret only supports Python 3.9, 3.10, 3.11. Your actual Python version: ', sys.version_info(major=3, minor=12, micro=4, releaselevel='final', serial=0), 'Please DOWNGRADE your Python version.')
    ```

    Alright, I can live with that. After creating a virtual environment with a downgraded Python version and playing with PyCaret for some time, I was quite impressed by the rich capabilities, until I tried to analyze the performance of a trained model.

    *Second Disappointment:* My script contains the following line:

    ```python
    experiment1.plot_model(tuned_gbr, plot='learning')  # Learning Curve
    ```

    This line does work and I get to see the plot, but when I close the plot by clicking on the "X" mark or pressing Ctrl+W, the script crashes with the following output:

    ```
    File "/usr/lib64/python3.11/tkinter/__init__.py", line 1732, in __setitem__
        self.configure({key: value})
    File "/usr/lib64/python3.11/tkinter/__init__.py", line 1721, in configure
        return self._configure('configure', cnf, kw)
    File "/usr/lib64/python3.11/tkinter/__init__.py", line 1711, in _configure
        self.tk.call(_flatten((self._w, cmd)) + self._options(cnf))
    _tkinter.TclError: invalid command name ".!navigationtoolbar2tk.!button2"
    ```

    Actually, any plot type will crash in the same way except for the 'pipeline' plot, which does continue executing the remainder of the script after closing. This is very unfortunate, as I was looking forward to the amazing PyCaret plotting and analysis capabilities, which are now completely useless as they crash the program. Maybe it all works in Jupyter Notebook and Google Colab, but I don't use those platforms. If you can't run a simple script, it ends there for me.

    *Conclusion:* PyCaret is an excellent resource for learning. I'll try to adopt its concepts, pipelines, and methodologies, but I'll have to write the code myself, as this thing unfortunately doesn't work for me. I hope the developers will continue improving it and maybe it will become better with time. For now, it's learning material, which I'll definitely examine in more detail. Just by going through the PyCaret documentation, I am fascinated by how much there is to learn.
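One hedged suggestion for the script crash described above (an assumption, not a confirmed fix): plot_model accepts a save=True argument that writes the figure to disk instead of relying on the interactive matplotlib/Tk backend, which lets a plain script avoid the GUI window entirely.

```python
# Sketch: save the learning-curve plot to a file instead of displaying it interactively.
# `experiment1` and `tuned_gbr` follow the names used in the comment above.
experiment1.plot_model(tuned_gbr, plot="learning", save=True)  # writes a PNG to the working directory
```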

  • @amrdel2730
    @amrdel2730 4 months ago

    Exactly what I needed

  • @borisljevar3126
    @borisljevar3126 4 months ago

    7:03 topics that will be covered in the video

    • @pedramjahangiry
      @pedramjahangiry 4 months ago

      Thank you, I just updated the timeline.

  • @FxZizo
    @FxZizo 4 months ago

    Hi, many thanks for sharing great information in this area.

  • @Anis-f2t
    @Anis-f2t 4 months ago

    Hi Pedram, could we have access to the PDFs of the sessions?

    • @pedramjahangiry
      @pedramjahangiry 4 months ago

      Of course, all the slides are available on my GitHub account. Please check out the link in the description.

  • @dimpleraghu2744
    @dimpleraghu2744 4 months ago

    Hi Pedram, I really need your help. I'm working on a project that detects seasonality using the PyCaret setup, and it's all working great except that the setup function shows seasonality present even with the most random data, e.g. just data of random values ranging from 20-25. Is there a way to fix this?

    • @pedramjahangiry
      @pedramjahangiry 4 months ago

      The PyCaret setup might sometimes detect false seasonality in random data. To fix this, try manually inspecting your data and using statistical tests to verify seasonality. You can also adjust the setup parameters or set seasonality=False manually. Hope this helps!
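One way to follow the "use statistical tests" advice above is to check the autocorrelation at the candidate seasonal lag before trusting the automatic detection; on genuinely random data it should be near zero. A minimal sketch (the seasonal period of 12 is a hypothetical choice):

```python
import numpy as np
from statsmodels.tsa.stattools import acf

rng = np.random.default_rng(42)
series = rng.uniform(20, 25, size=120)   # random values between 20 and 25, as in the question

candidate_sp = 12                        # e.g. monthly data; hypothetical seasonal period
acf_values = acf(series, nlags=candidate_sp)
print(f"ACF at lag {candidate_sp}: {acf_values[candidate_sp]:.3f}")  # near zero -> no real seasonality
```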