Thank you professor for your act of kindness
7 years and still relevant.
God bless this man!!!
There are no gods except Andrew Ng
Thanks for giving this content away for free
Very informative, thank you.
thanks for amazing course for free
nice explanation
In ReLU's derivative, should it be 0 for z < 0 and 1 for z >= 0?
God bless
Is there anyone who can help me with the derivative of the softmax activation function, implemented in a pure Java neural network?
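A possible starting point for the question above: the softmax Jacobian has the closed form dS_i/dz_j = S_i * (δ_ij - S_j), which ports directly to any language, Java included. Below is a minimal Python sketch of that formula (function names `softmax` and `softmax_jacobian` are my own, not from the lecture), checked against finite differences.

```python
import math

def softmax(z):
    # Subtract the max for numerical stability before exponentiating.
    m = max(z)
    exps = [math.exp(v - m) for v in z]
    s = sum(exps)
    return [e / s for e in exps]

def softmax_jacobian(z):
    # dS_i/dz_j = S_i * (delta_ij - S_j)
    s = softmax(z)
    n = len(s)
    return [[s[i] * ((1.0 if i == j else 0.0) - s[j]) for j in range(n)]
            for i in range(n)]
```

Each row of the Jacobian sums to zero, since softmax outputs always sum to one; that is a quick sanity check for a hand-rolled implementation.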
sigmoid(x)*(1-sigmoid(x)) gives you the derivative, but this formula does the same: (e^(-x))*((1+e^(-x))^(-2)). I think both should work.
what a god