Deep Neural Network Python from scratch | L layer Model | No Tensorflow

  • Published: 27 Dec 2024

Comments • 46

  • @hamafel7559
    @hamafel7559 1 year ago +5

    Dude.... your videos are FABULOUS!!
    Keep going!! We need you!!

  • @Felloggs
    @Felloggs 1 year ago +3

    Bro, your content and you as well are AWESOME.
    Liked and subbed, keep it up!

  • @matteomanzi8498
    @matteomanzi8498 3 months ago +2

    It's a pleasure to learn from your lessons!

    • @MachineLearningWithJay
      @MachineLearningWithJay 2 months ago

      Hi, glad they are helpful! It’s a pleasure to create these videos!

  • @lokendrakumar6212
    @lokendrakumar6212 9 months ago

    Bro, this is the simplest explanation among all the ones I have seen.

  • @matthiasblumrich6802
    @matthiasblumrich6802 10 months ago +1

    You and Josh Starmer (StatQuest) totally demystify DNNs. Thanks!!!

  • @lecturesfromleeds614
    @lecturesfromleeds614 10 months ago

    Straightforward and to the point! Good video.

  • @engineeringdecrypted9052
    @engineeringdecrypted9052 3 months ago +1

    Thanks bro! You are simply a great teacher.

  • @threetime-ne2dc
    @threetime-ne2dc 1 year ago

    Thank you, it's the best and clearest video I have watched 😍 You're an extremely handsome man!!!

  • @divyagarh
    @divyagarh 10 months ago +1

    Good stuff. No videos for a year? Please keep uploading. Thank you.

  • @subhambasuroychowdhury9698
    @subhambasuroychowdhury9698 2 years ago +1

    Awesome Video man!!!

  • @ikrameounadi7075
    @ikrameounadi7075 2 years ago +1

    Hey, great playlists! I can now say I can really understand deep learning. Can you please make a video explaining the perceptron algorithm and its complexity, along with kernels?!

  • @mustafatuncer4780
    @mustafatuncer4780 1 year ago +1

    I really liked it. I would also like you to create a video about LSTMs and Transformers (from scratch).

  • @confuseartist831
    @confuseartist831 2 years ago +1

    you're back....damnnnnn💥💥

  • @svk0071
    @svk0071 7 months ago

    Very helpful. Please make a playlist for GANs and Transformers like the one you made for CNNs.

  • @prajwalsatannavar4576
    @prajwalsatannavar4576 1 year ago +1

    Too good, man.... Thank you so much!

  • @ashraf_isb
    @ashraf_isb 1 year ago +1

    Thank you so much sir, very helpful 🙂

  • @ajay0909
    @ajay0909 2 years ago +2

    Very decent explanation. Would you like to do the same for CNNs?

  • @soumyadeepsarkar2119
    @soumyadeepsarkar2119 11 months ago

    Thanks a lot bro, your videos really helped me.

  • @zonunmawiazadeng5018
    @zonunmawiazadeng5018 1 year ago

    Hey, I implemented backpropagation for a CNN. According to your video, dL/dZ = dL/dA dot product dA/dZ. In my implementation, dL/dA and dA/dZ both have the shape (training size, image height, image width, channel size). If this is correct, how should we take the dot product?
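
    A minimal sketch of the step being asked about (illustrative names and shapes, not the video's code, and assuming a ReLU-style activation): when dL/dA and dA/dZ have the same shape, the chain rule combines them with an elementwise (Hadamard) product rather than np.dot.

    import numpy as np

    def relu_derivative(Z):
        # dA/dZ for ReLU; same shape as Z
        return (Z > 0).astype(float)

    # Hypothetical shapes: (batch, height, width, channels)
    Z = np.random.randn(4, 8, 8, 3)
    dL_dA = np.random.randn(4, 8, 8, 3)

    dL_dZ = dL_dA * relu_derivative(Z)   # elementwise product, shape preserved
    print(dL_dZ.shape)                   # (4, 8, 8, 3)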

  • @ongyinren3secv96
    @ongyinren3secv96 2 years ago

    I had a problem with the output in Spyder: it only shows one iteration, which is
    iter:1 cost: 0.697567606727616 train_acc:0.65 test_acc:0.3
    ==
    May I know what the error could possibly be?
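
    A minimal, self-contained sketch of a loop that reports the cost at every iteration (a toy single-neuron model, not the video's network): if the print statement sits outside the for-loop, or behind a guard such as "if i % 100 == 0", only one line will show up in the console.

    import numpy as np

    np.random.seed(0)
    X = np.random.randn(2, 100)                  # 2 features, 100 examples
    Y = (X[0:1] + X[1:2] > 0).astype(float)      # toy labels, shape (1, 100)
    W = np.zeros((1, 2))
    b = 0.0

    for i in range(1, 6):
        Z = np.dot(W, X) + b
        A = 1 / (1 + np.exp(-Z))                 # sigmoid
        cost = -np.mean(Y * np.log(A) + (1 - Y) * np.log(1 - A))
        dZ = A - Y
        W -= 0.1 * np.dot(dZ, X.T) / X.shape[1]
        b -= 0.1 * np.mean(dZ)
        print(f"iter:{i} cost: {cost}")          # printed inside the loop -> one line per iteration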

  • @rodrigoyepez8123
    @rodrigoyepez8123 2 years ago

    I keep getting this error (ValueError: shapes (1,1) and (10,1000) not aligned: 1 (dim 1) != 10 (dim 0)) on this line: grads["dZ" + str(l)] = np.dot(parameters['W' + str(l+1)].T, grads["dZ" + str(l+1)]) * derivative_relu(forward_parameters['A' + str(l)]). Does anyone know why?
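
    A shape check for that backward-pass line, using illustrative sizes rather than the asker's actual network: W[l+1] has shape (n_{l+1}, n_l) and dZ[l+1] has shape (n_{l+1}, m), so np.dot(W[l+1].T, dZ[l+1]) comes out as (n_l, m). A (1, 1) operand in the error usually means one of the cached weight matrices or gradients was created with the wrong dimensions for its layer.

    import numpy as np

    m, n_l, n_next = 1000, 10, 1            # examples, units in layer l, units in layer l+1
    W_next = np.random.randn(n_next, n_l)   # W[l+1]: shape (1, 10)
    dZ_next = np.random.randn(n_next, m)    # dZ[l+1]: shape (1, 1000)
    A_l = np.random.randn(n_l, m)           # A[l]:    shape (10, 1000)

    # (A_l > 0) stands in for derivative_relu(A_l); the product is elementwise
    dZ_l = np.dot(W_next.T, dZ_next) * (A_l > 0)
    print(dZ_l.shape)                       # (10, 1000)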

  • @ИсмоилОдинаев-й2я

    Hi, Mr. Jay Patel!
    Thanks a lot for such an explanation!
    Why don't you use the derivative of the sigmoid function in the output layer (AL) during the backward pass?
    Can we state that the weights of the last layer (WL) learn without taking the backward pass of the output error (AL - Y) through the sigmoid into account?
    If yes, why don't you and other guys use it?
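
    A short worked step on this point (the standard sigmoid plus binary cross-entropy result, not anything specific to the video): with A = sigmoid(Z) and L = -(Y*log(A) + (1-Y)*log(1-A)), the chain rule gives dL/dA = (A - Y) / (A*(1-A)) and dA/dZ = A*(1-A), so dL/dZ = dL/dA * dA/dZ = A - Y. The sigmoid derivative is therefore already folded into AL - Y rather than skipped. A quick numerical check with illustrative numbers:

    import numpy as np

    Z = np.array([[0.5, -1.2, 2.0]])
    Y = np.array([[1.0, 0.0, 1.0]])
    A = 1 / (1 + np.exp(-Z))                    # sigmoid output

    dL_dA = (A - Y) / (A * (1 - A))             # derivative of binary cross-entropy w.r.t. A
    dA_dZ = A * (1 - A)                         # derivative of sigmoid w.r.t. Z
    print(np.allclose(dL_dA * dA_dZ, A - Y))    # True: dL/dZ simplifies to A - Y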

  • @uchindamiphiri1381
    @uchindamiphiri1381 1 year ago

    Is this video suitable for beginners? If not, recommend what I should watch before jumping into this.

  • @shubhamraj310
    @shubhamraj310 2 years ago +1

    keep it up✌️💯

  • @chandut1296
    @chandut1296 1 year ago +1

    Hi bro, try to do videos on pre-trained Hugging Face Transformers.

  • @yulupi552
    @yulupi552 1 year ago +1

    Great content, would you like to do the same for RNNs?

  • @tungvuson4682
    @tungvuson4682 1 year ago

    Hi, what is the name of your dataset?

  • @niladrichakraborti5443
    @niladrichakraborti5443 2 years ago

    Please make a video on COVID-19 detection using chest X-rays.

  • @jatingupta3443
    @jatingupta3443 8 months ago

    Thanks bro

  • @algorithmo134
    @algorithmo134 6 months ago

    You made a small mistake while typing the code for the derivative_tanh(x) function.
    ✔ The correct code would be:
    def derivative_tanh(x):
        return 1 - np.power(np.tanh(x), 2)

  • @pranavgangapurkar195
    @pranavgangapurkar195 1 year ago +1

    You're an SVNIT passout, na?

  • @TheCodingChamelion
    @TheCodingChamelion 5 months ago +1

    Copied from Coursera.

    • @MachineLearningWithJay
      @MachineLearningWithJay 5 months ago

      Haha.. yeah, I learned from that only. It's a very good source, tbh.
      I made a video on YouTube because they don't have any video of it.