Derivatives Of Activation Functions (C1W3L08)

  • Published: 19 Nov 2024

Comments • 15

  • @eleora8995 5 years ago +5

    Thank you, professor, for your act of kindness.

  • @godsgrace9475 2 months ago

    7 years on and still relevant.

  • @saanvisharma2081 5 years ago +8

    God bless this man!!!

    • @rp88imxoimxo27 4 years ago +2

      There are no gods except Andrew Ng.

  • @GurpreetSingh-th1di 6 years ago +15

    Thanks for giving this content away for free.

  • @wajdilas 6 years ago +2

    Very informative, thank you.

  • @ugurkanates9223 5 years ago

    Thanks for the amazing course, for free.

  • @sandipansarkar9211 3 years ago

    Nice explanation.

  • @aniketbhand1824 4 months ago

    For ReLU's derivative, should it be z for z >= 0, or 1 for z >= 0?
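
    One way to resolve this, as a hedged note: the z >= 0 branch of g(z) = max(0, z) is itself z, but its slope there is 1, so the derivative is 1 for z > 0 and 0 for z < 0 (at z = 0 it is undefined; implementations conventionally pick 0 or 1). A minimal sketch in Java; the class and method names are illustrative, not from the course:

        public class Relu {
            // Activation: g(z) = max(0, z)
            static double relu(double z) {
                return Math.max(0.0, z);
            }

            // Derivative: 1 for z > 0, 0 for z < 0; the value at z == 0 is a convention.
            static double reluDerivative(double z) {
                return z > 0 ? 1.0 : 0.0;
            }

            public static void main(String[] args) {
                System.out.println(relu(3.0) + " " + reluDerivative(3.0));   // 3.0 1.0
                System.out.println(relu(-2.0) + " " + reluDerivative(-2.0)); // 0.0 0.0
            }
        }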

  • @anugrahtriramadhan9300 3 years ago

    God bless

  • @yusuferoglu9287 5 years ago +1

    Is there anyone who can help me with implementing the derivative of the softmax activation function in a pure Java neural network?
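
    One common approach, sketched here under stated assumptions: when softmax is paired with cross-entropy loss, the gradient with respect to the logits simplifies to (softmax output - one-hot target), so the full Jacobian never has to be formed. A self-contained sketch in plain Java; the class and method names are illustrative, not from any library:

        import java.util.Arrays;

        public class Softmax {
            // Numerically stable softmax: subtract the max logit before exponentiating.
            static double[] softmax(double[] z) {
                double max = Arrays.stream(z).max().orElse(0.0);
                double[] s = new double[z.length];
                double sum = 0.0;
                for (int i = 0; i < z.length; i++) {
                    s[i] = Math.exp(z[i] - max);
                    sum += s[i];
                }
                for (int i = 0; i < z.length; i++) s[i] /= sum;
                return s;
            }

            // Gradient of cross-entropy loss w.r.t. the logits: softmax(z) - oneHotTarget.
            static double[] softmaxCrossEntropyGrad(double[] z, double[] oneHotTarget) {
                double[] s = softmax(z);
                double[] grad = new double[z.length];
                for (int i = 0; i < z.length; i++) grad[i] = s[i] - oneHotTarget[i];
                return grad;
            }

            public static void main(String[] args) {
                double[] logits = {2.0, 1.0, 0.1};
                double[] target = {1.0, 0.0, 0.0};
                System.out.println(Arrays.toString(softmaxCrossEntropyGrad(logits, target)));
            }
        }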

  • @sayantanmazumdar9371 3 years ago

    sigmoid(x) * (1 - sigmoid(x)) gives you the derivative, but this formula does the same:
    (e^(-x)) * ((1 + e^(-x))^(-2))
    I think both should work.
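
    Both forms are indeed the same function; since 1 - sigma(x) = e^(-x) / (1 + e^(-x)), one line of algebra connects them:

        \sigma(x) = \frac{1}{1 + e^{-x}}, \qquad
        \sigma'(x) = \frac{e^{-x}}{(1 + e^{-x})^{2}}
                   = \frac{1}{1 + e^{-x}} \cdot \frac{e^{-x}}{1 + e^{-x}}
                   = \sigma(x)\,\bigl(1 - \sigma(x)\bigr).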

  • @raymondchang9481 1 year ago

    What a god.