Badri Adhikari
  • 153 videos
  • 826,946 views

Videos

Time series decomposition analysis - trend, seasonality, and noise
2.7K views · 2 years ago
A time series is often split into three components: trend, seasonality, and random fluctuation (noise). This video demonstrates how to split time series data into these components.
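A minimal sketch of this split in Python, assuming statsmodels and pandas are installed; the file name, column name, and monthly period are illustrative assumptions, not from the video:

    import pandas as pd
    from statsmodels.tsa.seasonal import seasonal_decompose

    # Hypothetical monthly series; replace with your own data.
    data = pd.read_csv("passengers.csv", index_col=0, parse_dates=True)

    # Additive model: observed = trend + seasonal + residual (noise).
    result = seasonal_decompose(data["passengers"], model="additive", period=12)
    print(result.trend.head())
    print(result.seasonal.head())
    print(result.resid.head())
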
How to draw effective concept maps?
1.7K views · 2 years ago
Many ideas for drawing an effective concept map. Read more at: badriadhikari.github.io/concept-map
NVIDIA DLI Deep Learning Certificate and University Ambassador - my experience
2.5K views · 2 years ago
[DL] Some pitfalls to avoid when designing your own convolutional neural network
639 views · 2 years ago
Deep Learning with Python - Some pitfalls to avoid when designing your own convolutional neural network
[DL] The convolution operation
624 views · 2 years ago
Deep Learning with Python - The convolution operation
How does convolution work?
1.1K views · 2 years ago
Deep Learning with Python - How does convolution work?
The importance of feature engineering in machine learning and deep learning
495 views · 2 years ago
[DL] What is deep transfer learning?
1.3K views · 3 years ago
[DL] Deep transfer learning - using VGG16 convolutional base
3.9K views · 3 years ago
[DL] Residual networks in deep learning
10K views · 3 years ago
[DL] Why do residual networks work?
941 views · 3 years ago
[DL] The purpose of "1 by 1" convolutions
11K views · 3 years ago
[DL] How to choose a pretrained model?
3.2K views · 3 years ago
[DL] Training very deep models using residual connections
453 views · 3 years ago
[DL] Training a very deep CNN model
11K views · 3 years ago
REALDIST: Real-valued protein distance prediction
385 views · 3 years ago
[DL] LeNet-5 in TensorFlow Keras
2.5K views · 3 years ago
[DL] Using the VGG16 pretrained model
2K views · 3 years ago
[DL] How to prevent overfitting?
765 views · 4 years ago
[DL] L1 and L2 regularization
511 views · 4 years ago
[DL] Regularization using Dropout
463 views · 4 years ago
[DL] Regularization using Batch Normalization
891 views · 4 years ago
[DL] Overfitting (variance) vs. underfitting (bias)
284 views · 4 years ago
[DL] Evaluating machine learning models: Measuring generalization
377 views · 4 years ago
[DL] Deep learning workflow recipe: From data to deep learning model
397 views · 4 years ago
[DL] How to debug a deep learning development pipeline?
376 views · 4 years ago
[DL] How to train deeper convolutional neural networks?
240 views · 4 years ago
[DL] Goals of deep learning
120 views · 4 years ago
[DL] Limitations of deep learning
515 views · 4 years ago

Comments

  • @abdelhakimkhabir · 1 hour ago

    nice video btw

  • @oishibarmon7767 · 2 days ago

    Thank you

  • @christinali6805 · 11 days ago

    Thank you very much!

  • @fritzhopper5145 · 17 days ago

    Thank you very much! Very helpful examples too. I have a question though, is the bias always 1?

  • @abhinavkant · 20 days ago

    Thanks

  • @brandoncazares8452 · 22 days ago

    Good explanation!

  • @brandoncazares8452 · 23 days ago

    This is well-explained, thank you sir!

  • @Nirex_ · 23 days ago

    Very well explained, thanks.

  • @arashrahmani8372 · 25 days ago

    Thank you so much. Your explanation was clear and helped me a lot.

  • @ibrahimcetin153 · 28 days ago

    I think this is the best explanation I have ever seen.

  • @zFake · 1 month ago

    Thank you

  • @sideahsin · 1 month ago

    did u fart 13:14

  • @wijzeuil · 1 month ago

    Very good, this is by far the best explanation of this matter that I have come across. I want to use statsmodels.tsa.seasonal in Python and I was having a hard time understanding what a seasonal value of x really means. With this content I understand it much better!

  • @PhD.YouTuber · 1 month ago

    You are doing an amazing job explaining, with the clear understanding and knowledge you have. Thanks

  • @BonangLigar1 · 2 months ago

    Regarding the minimum of delivering the workshop twice a year: who arranges the workshop, us/the university or NVIDIA? And could you tell how you arranged to deliver them? Thanks

  • @yugrajsingh5370 · 2 months ago

    Worst explanation of my life

  • @omarmujahid1816 · 2 months ago

    Great Video, Thank You!

  • @rachel_yeah · 3 months ago

    Thank you for the video :) helped me wrap my head around it

  • @CesarAugusto-wm5mp · 3 months ago

    You da best 🎉

  • @CesarAugusto-wm5mp · 3 months ago

    Thank you very much 🎉

  • @ThePantafernando · 3 months ago

    Is it just me who thinks those APIs are really full of bad smells? For starters, why would someone write two ways to do the same thing? That makes things more complicated for no real gain in the end. That way of chaining functions in functional style really hurts when compared to classical method chaining.

  • @shwethetwai6135 · 4 months ago

    Great explanation!

  • @charlesvictorio8101 · 4 months ago

    In the else-if statement there is a negative sign missing; it should be e^(-dE / T). This term represents the probability of accepting the new state, so it makes sense that e^(-dE / T) approaches 0 as dE increases and T decreases.
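
    A minimal sketch of this acceptance rule in Python; the function name and the always-accept-improvements branch are illustrative assumptions, not from the video:

        import math
        import random

        def accept(dE, T):
            # Improvements (dE < 0) are always accepted; worse states are
            # accepted with probability e^(-dE / T), which shrinks as dE
            # grows or as the temperature T cools.
            return dE < 0 or random.random() < math.exp(-dE / T)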

  • @ayoubennaoui6107 · 4 months ago

    Great explanation, very well put thank you

  • @chitraprabhu1215 · 4 months ago

    I like your teaching style, sir. It is 100 times better than my class teacher's.

  • @aaravkumar9886 · 4 months ago

    Amazing explanation! I saw Andrew Ng's course but was having trouble understanding why exactly residual connections work; now it is crystal clear. Thanks a lot!

  • @obaldalmeida6308 · 4 months ago

    Respected sir, your course is the best AI course I have stumbled upon and I am truly grateful to learn from you. But I would like to know where I could get access to the rest of the videos, as there are many videos and chapters missing from this playlist. Thank you :)

  • @mehdiboudour · 5 months ago

    I've been through 5 videos so far, and it finally clicked in my mind thanks to this proof. Thank you so much.

  • @ابتسامةحياة-س5د · 5 months ago

    😢 Your explanation seems good, but I only understand Arabic 😔

  • @thecatman9o9 · 5 months ago

    Just finished this playlist and I can say it's a gem on YouTube.

  • @gauravbhasin2625 · 5 months ago

    useless video

  • @gauravbhasin2625 · 5 months ago

    you did not explain the basic architectural difference....

  • @joshuapauley1947 · 5 months ago

    As many comments note--this is a great quick explanation of the use of 1 x 1 convolutions in multiple contexts. I had watched 2 other 15+ minute videos and understood the math behind it. But the application and how it can be useful was missing in those concept videos. THANK YOU, Badri!

  • @xsilverx1198 · 6 months ago

    This is amazing. Thank you so much.

  • @arpitakar3384 · 6 months ago

    Data or GitHub link, sir?

    • @wijzeuil · 1 month ago

      docs.google.com/document/d/1aUpWp4b9sjeLOayPaSDeQtf_aRD289MvWd3kflmcZtk/edit?tab=t.0

  • @sashayakubov6924 · 6 months ago

    this is cool, no one covered this from such a perspective , thank you! I wish you would show the things that work as kernels in our visual process. I wonder: do we have copies of many kernels for convolution? Because obviously it's unlikely the kernel "moves", like when we perform convolution on a computer. So there must be hundreds of little neural kernels that are exact copies of each other!!!

  • @NigatuAsfewu · 6 months ago

    Exercise: Apply uninformed search strategies to identify the optimal path.

  • @bushraw66 · 6 months ago

    Thank you so much, very well explained.

  • @iitmaamoun · 6 months ago

    def fully_connected(input):
        x = tf.keras.layers.Dropout(0.2)(input)
        x = tf.keras.layers.Dense(512, activation='relu',
                                  kernel_regularizer='l1_l2',
                                  activity_regularizer='l1')(x)
        x = tf.keras.layers.Dense(256)(x)
        x = tf.keras.layers.BatchNormalization()(x)
        x = tf.keras.layers.Activation("relu")(x)
        output = tf.keras.layers.Dense(512, activation='softmax')(x)
        return tf.keras.Model(input, output)
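
    A hedged usage sketch for the snippet above; the import and the input shape are assumptions added for completeness, not part of the original comment:

        import tensorflow as tf

        inputs = tf.keras.Input(shape=(784,))  # hypothetical input shape
        model = fully_connected(inputs)
        model.summary()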

  • @ryanp9441 · 7 months ago

    this is soooooooo gooood. Highly recommend to anyone who wants to understand the # of parameters in CNN

  • @6957-c5k · 7 months ago

    Learnt something. Thank you professor

  • @tomiokascuteness1909 · 7 months ago

    I have a question: in the last example, where C has leaf nodes x and y, can we prune the subtrees rooted at x and y?

  • @sorieran8695 · 7 months ago

    Finally someone who can explain it simply. Thank you very much.

  • @tsukuruuu · 7 months ago

    For a convolution layer: number of params = (filter width × filter height × input channels + 1) × number of filters.
    For a fully connected layer: number of params = (current layer neurons c × previous layer neurons p) + c.
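
    A quick numeric check of these formulas in Python; the layer sizes below are illustrative, not from the video:

        # Conv layer: 32 filters of size 3x3 over a 3-channel input,
        # with one bias per filter.
        conv_params = (3 * 3 * 3 + 1) * 32  # = 896

        # Fully connected layer: 128 neurons fed by 256 neurons,
        # with one bias per neuron.
        fc_params = 128 * 256 + 128  # = 32896

        print(conv_params, fc_params)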

  • @aniruddhtiwari7378 · 7 months ago

    JAY NEPAL

  • @larayassine790 · 7 months ago

    Life saver! May Allah give you health and strength, sir. Great explanation!!!!

  • @arunkumargopu · 7 months ago

    Good Job Professor

  • @innercircletradertevision · 8 months ago

    This can be used in trading

  • @Nitish_shrivas · 8 months ago

    great explanation