13.3.1 L1-regularized Logistic Regression as Embedded Feature Selection (L13: Feature Selection)

  • Published: 7 Sep 2024
  • Sebastian's books: sebastianrasch...
    Without going into the nitty-gritty details behind logistic regression, this lecture explains how and why an L1 penalty, a modification of the loss function, can be considered an embedded feature selection method.
    Slides: sebastianrasch...
    Code: github.com/ras...
    Links to the logistic regression videos I referenced:
    sebastianrasch...
    -------
    This video is part of my Introduction to Machine Learning course.
    Next video: • 13.3.2 Decision Trees ...
    The complete playlist: • Intro to Machine Learn...
    A handy overview page with links to the materials: sebastianrasch...
    -------
    If you want to be notified about future videos, please consider subscribing to my channel: / sebastianraschka
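The idea described in the video can be sketched with scikit-learn. This is an illustrative example, not the lecture's own code; the synthetic dataset and the choice of `C` are made up for demonstration:

```python
# Sketch of L1-regularized logistic regression as embedded feature
# selection (illustrative; dataset and hyperparameters are invented).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Synthetic data: 10 features, only 2 of them actually informative.
X, y = make_classification(n_samples=200, n_features=10,
                           n_informative=2, n_redundant=0,
                           random_state=1)

# The L1 penalty drives the weights of uninformative features to
# exactly zero, so fitting the model selects features as a side effect.
clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
clf.fit(X, y)

coef = clf.coef_.ravel()
selected = np.flatnonzero(coef)  # indices of features that survived
print("nonzero-coefficient features:", selected)
```

Decreasing `C` strengthens the penalty and typically zeroes out more coefficients; the surviving nonzero weights indicate which features the regularized model keeps.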

Comments • 5

  • @AyushSharma-jm6ki · 1 year ago

    @sebastian Amazing video, thanks for sharing. I am getting a deeper understanding of these topics with your videos.

  • @amrelsayeh4446 · 2 months ago

    @sebastian At 13:20, why does the solution between the global minimum and the penalty minimum lie somewhere where one of the weights is zero? In other words, why should it lie at a corner of the penalty function rather than just anywhere on the line between the global minimum and the penalty minimum?

  • @arunthiru6729 · 1 year ago +3

    @sebastian I think using logistic regression directly for feature selection based on the respective weights/coefficients assumes that all dimensions/features are independent. I understand this is not the correct way to do this. Please advise.

    • @SebastianRaschka · 1 year ago +1

      Yes, this assumption is correct. ML is full of trade-offs 😅. If you cannot make this assumption, I recommend the sequential feature selection approach.

    • @wayne7936 · 4 months ago

      Thanks for pointing this out!
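The sequential feature selection approach mentioned in the reply above can be sketched with scikit-learn's `SequentialFeatureSelector`. This is an illustrative sketch, not the author's own code; the dataset and parameters are invented:

```python
# Sketch of sequential (wrapper-style) feature selection, which
# evaluates feature subsets jointly rather than ranking features by
# individual L1 coefficients (illustrative; parameters are invented).
from sklearn.datasets import make_classification
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=200, n_features=10,
                           n_informative=3, n_redundant=0,
                           random_state=0)

# Greedy forward selection: features are added one at a time based on
# cross-validated model performance, so interactions between features
# influence which subset is chosen.
sfs = SequentialFeatureSelector(LogisticRegression(max_iter=1000),
                                n_features_to_select=3,
                                direction="forward", cv=5)
sfs.fit(X, y)
print("selected feature mask:", sfs.get_support())
```

Because each candidate feature is scored in the context of the features already selected, this wrapper method does not rely on the independence assumption discussed in the thread, at the cost of many more model fits.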