SVM10 The Kernel Trick (Part1: Basis Expansion)

  • Published: 4 Nov 2024

Comments • 17

  • @andriy123
    @andriy123 3 years ago +5

    Finally I found a good explanation of kernels in SVMs that helps... thanks a lot

  • @xkhokokox
    @xkhokokox 3 years ago +2

    Probably the first time I really understand this. Thanks a lot, sir!

  • @SumitSharma-pu6yi
    @SumitSharma-pu6yi 2 years ago

    Very clear explanation. Please create more content. Couldn't thank you enough.

  • @lilaberkani4376
    @lilaberkani4376 3 years ago +5

    I can't thank you enough for this great explanation! You saved me a lot of time!

  • @bhavanarebba9718
    @bhavanarebba9718 1 year ago

    Great video, but I did not understand the part where d2 = 1.5 d1 + 0.5 d1^2. Is this just a random expression that you chose, or am I missing something? Your response will be much appreciated! Thanks in advance! :)

    • @zardouayassir7359
      @zardouayassir7359  1 year ago

      The original feature space is x, which has d1 dimensions. The transformed (expanded) feature space is phi(x). The transformation phi can take several forms. In this video, I considered a quadratic transformation. In this case, the number of dimensions of phi(x) is d2, and the relation between d2 and d1 is d2 = 1.5 d1 + 0.5 d1^2. But do not forget that this relation holds only if the transformation phi is quadratic. Depending on the nature of the transformation phi, the relation between d1 and d2 can be derived mathematically. Hope this helps.
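
      A quick sketch of that count, assuming phi(x) keeps the d1 original features plus every degree-2 monomial (the d1 squares and the pairwise cross terms):

      \[
      d_2 = \underbrace{d_1}_{x_i} + \underbrace{d_1}_{x_i^2} + \underbrace{\tfrac{d_1(d_1-1)}{2}}_{x_i x_j,\; i<j} = 1.5\, d_1 + 0.5\, d_1^2 .
      \]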

  • @Tyokok
    @Tyokok 3 years ago +1

    Thanks for the video! Can you please point me to the derivation of 1.5 d1 + 0.5 d1^2? Thank you!

  • @dharmairwanda1319
    @dharmairwanda1319 3 years ago

    Hi, I just tried a 3D modelling software that uses radial basis functions. Is it the same as the kernel you explain here? I have no statistics background, I'm just trying to get a brief understanding.

  • @kelgamel2203
    @kelgamel2203 3 years ago

    Hello, first of all thank you so much for this! Can you tell me where the 5149 additions come from? I don't understand how you arrived at that number.

    • @zardouayassir7359
      @zardouayassir7359  3 years ago +3

      Hi, Kamal! It's 5149 because, in the dot product of two vectors, the number of additions = number of multiplications - 1 = 5150 - 1 = 5149. If that's not clear, consider two vectors a and b such that a = [a1, a2, a3, a4] and b = [b1, b2, b3, b4]. Their dot product is a.b = a1×b1 + a2×b2 + a3×b3 + a4×b4. The number of multiplications is 4 (equal to the dimension of a and b), and the number of additions is 4 - 1 = 3.
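
      For reference, a sketch of the arithmetic: for two vectors in \(\mathbb{R}^n\), the dot product

      \[
      x \cdot z = \sum_{i=1}^{n} x_i z_i
      \]

      needs n multiplications and n - 1 additions. Here n = d2 = 5150 (consistent with the quadratic expansion above if the example uses d1 = 100, since 1.5(100) + 0.5(100)^2 = 5150), so there are 5150 - 1 = 5149 additions.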

  • @lahcenbouhlal574
    @lahcenbouhlal574 3 years ago

    Dear professor,
    I am writing this message to request your help with some further explanation concerning the article you published: (LSSVM based initialization approach for parameter estimation of dynamical systems), [doi:10.1088/1742-6596/490/1/012004].
    My best regards.

  • @sohamgoswami6244
    @sohamgoswami6244 3 years ago

    Thanks a lot, sir.

  • @travel6142
    @travel6142 2 years ago

    What is the alpha term (alpha_i)? I missed this part.

    • @zardouayassir7359
      @zardouayassir7359  2 years ago

      It's the Lagrangian coefficient (Lagrange multiplier), a variable that, once optimized, makes it easy to compute the parameters w and b.
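
      For reference, in the standard SVM dual (a sketch of the usual formulation, not necessarily the exact notation used in the video), the optimized alpha_i give the parameters directly:

      \[
      w = \sum_i \alpha_i \, y_i \, \phi(x_i), \qquad b = y_s - w \cdot \phi(x_s) \ \text{ for a margin support vector } x_s,
      \]

      and only the points with alpha_i > 0 (the support vectors) contribute to w.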