Ali Ghodsi, Lec 12: Soft margin Support Vector Machine (svm)

  • Published: 11 Sep 2024

Comments • 7

  • @mostafanakhaei2487
    @mostafanakhaei2487 4 years ago +2

    Best Lecture ever. I am a fan of your lectures and sometimes I watch them several times.

  • @Ardneevar81
    @Ardneevar81 6 years ago

    One of the best videos I have come across on SOFT margin.

  • @utkarshkulshrestha2026
    @utkarshkulshrestha2026 5 years ago

    Very nicely explained. Subtle mathematics and sufficient explanation about it.

  • @vandinhtran4338
    @vandinhtran4338 8 years ago

    Great lecture

  • @TSITSOS007
    @TSITSOS007 8 years ago +1

    Very nice video! You helped me very much, thank you. " ξ " is not pronounced "zeta" but like this: translate.google.gr/#el/en/%CE%BE. From a Greek fan of your videos :D

  • @pranavchat141
    @pranavchat141 3 years ago

    @7:58 How do we compute beta0? In the non-kernel version we get beta0 by plugging the solved beta(transpose) into the margin equation y_i(beta(trans) * x_i + beta0) = 1 for a support vector. Any ideas?

    • @alexandros27.
      @alexandros27. 3 years ago +1

      Beta0 is a single scalar value, so we can only calculate it directly in the non-kernel version. It remains the same even when the data are projected to higher dimensions; since beta0 is just one number, the kernel cannot change its dimension.
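
      [Editor's note] The standard answer to the question above: beta0 can in fact be recovered in the kernel version, using the KKT conditions. For any support vector x_k with 0 < alpha_k < C (i.e., on the margin), y_k(sum_i alpha_i y_i K(x_i, x_k) + beta0) = 1, so beta0 = y_k − sum_i alpha_i y_i K(x_i, x_k); in practice one averages over all such margin support vectors. A minimal sketch, where the function name and the toy data are illustrative and `alpha` is assumed to come from an already-solved soft-margin dual:

      ```python
      import numpy as np

      def intercept_from_dual(alpha, y, X, kernel, C, tol=1e-8):
          """Recover beta0 from a solved dual via the KKT conditions:
          for any support vector x_k with 0 < alpha_k < C,
          beta0 = y_k - sum_i alpha_i * y_i * K(x_i, x_k).
          Averaged over all margin support vectors for stability."""
          K = np.array([[kernel(xi, xj) for xj in X] for xi in X])
          on_margin = (alpha > tol) & (alpha < C - tol)
          b_vals = y[on_margin] - (alpha * y) @ K[:, on_margin]
          return b_vals.mean()

      # Toy separable data: x = +1 labeled +1, x = -1 labeled -1.
      # The optimal linear SVM here is beta = 1, beta0 = 0, with
      # dual solution alpha = [0.5, 0.5] (worked out by hand).
      alpha = np.array([0.5, 0.5])
      y = np.array([1.0, -1.0])
      X = np.array([[1.0], [-1.0]])
      linear = lambda a, b: float(a @ b)
      b0 = intercept_from_dual(alpha, y, X, linear, C=1.0)
      print(b0)  # → 0.0
      ```

      Note that only kernel evaluations K(x_i, x_k) appear, never the projected feature vectors themselves, which is exactly why this works in the kernel version.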