- Videos: 153
- Views: 826,946
Badri Adhikari
Joined Jan 11, 2012
Tracing the simulated annealing algorithm to optimize the weights of an AND gate artificial neuron
Views: 900
Videos
Time series decomposition analysis - trend, series, seasonality, and noise
2.7K views · 2 years ago
A time series is often split into three components: trend, seasonality, and random fluctuation. This video demonstrates how to split time series data into trend, seasonality, and noise.
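A minimal sketch of this decomposition in Python, using statsmodels' seasonal_decompose (the same API a commenter below mentions); the file name, column names, and monthly period of 12 are placeholder assumptions:

import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

# Load a monthly series (placeholder file and column names)
series = pd.read_csv('monthly_data.csv', index_col='month', parse_dates=True)['value']

# Additive decomposition: series = trend + seasonal + residual (noise)
result = seasonal_decompose(series, model='additive', period=12)
print(result.trend.head())
print(result.seasonal.head())
print(result.resid.head())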
How to draw effective concept maps?
1.7K views · 2 years ago
Many ideas for drawing an effective concept map. Read more at: badriadhikari.github.io/concept-map
NVIDIA DLI Deep Learning Certificate and University Ambassador - my experience
2.5K views · 2 years ago
[DL] Some pitfalls to avoid when designing your own convolutional neural network
639 views · 2 years ago
Deep Learning with Python - Some pitfalls to avoid when designing your own convolutional neural network
[DL] The convolution operation
624 views · 2 years ago
Deep Learning with Python - The convolution operation
How does convolution work?
1.1K views · 2 years ago
Deep Learning with Python - How does convolution work?
The importance of feature engineering in machine learning and deep learning
495 views · 2 years ago
[DL] What is deep transfer learning?
1.3K views · 3 years ago
[DL] Deep transfer learning - using VGG16 convolutional base
3.9K views · 3 years ago
[DL] Residual networks in deep learning
10K views · 3 years ago
[DL] The purpose of "1 by 1" convolutions
11K views · 3 years ago
[DL] How to choose a pretrained model?
3.2K views · 3 years ago
[DL] Training very deep models using residual connections
453 views · 3 years ago
[DL] Training a very deep CNN model
11K views · 3 years ago
REALDIST: Real-valued protein distance prediction
385 views · 3 years ago
DL Regularization using Batch Normalization
891 views · 4 years ago
[DL] Overfitting variance vs Underfitting bias
284 views · 4 years ago
[DL] Evaluating machine learning models Measuring generalization
377 views · 4 years ago
[DL] Deep learning workflow Recipe: From data to deep learning model
397 views · 4 years ago
[DL] How to debug a deep learning development pipeline?
376 views · 4 years ago
[DL] How to train deeper convolutional neural networks?
240 views · 4 years ago
nice video btw
Thank you
Thank you very much!
Thank you very much! Very helpful examples too. I have a question though, is the bias always 1?
Thanks
Good explanation!
This is well-explained, thank you sir!
Very well explained, thanks.
Thank you so much. Your explanation was clear and helped me a lot.
I think it is the best explanation I have ever seen.
Thank you
did u fart 13:14
Very good, this is by far the best explanation of this matter that I have come across. I want to use statsmodels.tsa.seasonal in Python and I was having a hard time understanding what a seasonal value of x really means. With this content I understand it much better!
You are doing amazing explanation with the clear understanding and knowledge you have. Thanks
Regarding the minimum of delivering the workshop twice a year: who arranged the workshops, us/the university or NVIDIA? And could you tell how you arranged to deliver them? Thanks.
Worst explanation of my life
Great Video, Thank You!
Thank you for the video :) helped me wrap my head around it
You da best 🎉
Thank you very much 🎉
Is it just me who thinks those APIs are really full of bad smells? For starters, why would someone write two ways to do the same thing? That makes things more complicated for no real gain in the end. That way of chaining functions in functional-programming style really hurts my eyes when compared to classical method chaining.
Great explanation!
In the else-if statement, there is a negative sign missing; it should be e^(-dE / T). This term represents the probability of accepting the new state, so it makes sense that e^(-dE / T) approaches 0 as dE increases and T decreases.
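For reference, a minimal Python sketch of that corrected acceptance rule; the function name and the example values are illustrative:

import math
import random

def accept_move(dE, T):
    # Moves that lower the energy are always accepted
    if dE <= 0:
        return True
    # Uphill moves are accepted with probability e^(-dE / T),
    # which approaches 0 as dE increases or T decreases
    return random.random() < math.exp(-dE / T)

print(accept_move(dE=1.0, T=10.0))  # usually True at high temperature
print(accept_move(dE=1.0, T=0.01))  # essentially always False when cooled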
Great explanation, very well put thank you
I like your teaching style, sir; it is 100 times better than my class teacher's.
Amazing explanation! Saw Andrew Ng's course but was having trouble understanding why exactly do residual connections work, now it is crystal clear. Thanks a lot!
Respected sir, your course is the best AI course I have stumbled upon and I am truly grateful to learn from you. But I would like to know where I could get access to the rest of the videos, as there are many videos and chapters missing from this playlist. Thank you :)
I've been through 5 videos so far, and it finally clicked in my mind thanks to this proof. Thank you so much.
😢 Your explanation seems good, but I only understand Arabic 😔
just finished this playlist and I can say it's a gem on YouTube
useless video
you did not explain the basic architectural difference....
As many comments note, this is a great quick explanation of the use of 1 x 1 convolutions in multiple contexts. I had watched two other 15+ minute videos and understood the math behind it, but the application and how it can be useful was missing in those concept videos. THANK YOU, Badri!
This is amazing. Thank you so much.
data or github link sir
docs.google.com/document/d/1aUpWp4b9sjeLOayPaSDeQtf_aRD289MvWd3kflmcZtk/edit?tab=t.0
this is cool, no one covered this from such a perspective , thank you! I wish you would show the things that work as kernels in our visual process. I wonder: do we have copies of many kernels for convolution? Because obviously it's unlikely the kernel "moves", like when we perform convolution on a computer. So there must be hundreds of little neural kernels that are exact copies of each other!!!
Exercise: Apply Uninformed Search Strategies to identify optimal path
Thank you so much, very well explained.
import tensorflow as tf

def fully_connected(inputs):
    # Dropout for regularization on the incoming tensor
    x = tf.keras.layers.Dropout(0.2)(inputs)
    # Dense layer with L1/L2 kernel regularization and L1 activity regularization
    x = tf.keras.layers.Dense(512, activation='relu', kernel_regularizer='l1_l2', activity_regularizer='l1')(x)
    # Dense -> BatchNorm -> ReLU: the activation is applied after normalization
    x = tf.keras.layers.Dense(256)(x)
    x = tf.keras.layers.BatchNormalization()(x)
    x = tf.keras.layers.Activation('relu')(x)
    output = tf.keras.layers.Dense(512, activation='softmax')(x)
    return tf.keras.Model(inputs, output)
this is soooooooo gooood. Highly recommend to anyone who wants to understand the # of parameters in CNN
Learnt something. Thank you professor
I have a question: in the last example, where C has leaf nodes x and y, can we prune the subtrees rooted at x and y?
Finally someone who can explain it simply. Thank you very much.
For a convolution layer: number of params = (filter width × filter height × input channels + 1) × number of filters. For a fully connected layer: number of params = (current layer neurons c × previous layer neurons p) + c.
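A quick Python sketch that applies both formulas; the layer sizes in the examples are illustrative:

def conv_params(filter_w, filter_h, in_channels, num_filters):
    # Each filter holds filter_w * filter_h * in_channels weights plus 1 bias
    return (filter_w * filter_h * in_channels + 1) * num_filters

def dense_params(current, previous):
    # current * previous weights plus one bias per current-layer neuron
    return current * previous + current

print(conv_params(3, 3, 3, 32))  # 896: 32 filters of 3x3 over an RGB input
print(dense_params(128, 256))    # 32896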
JAY NEPAL
Life saver! May Allah give you health and strength, sir. Great explanation!!!!
Good Job Professor
This can be used in trading
great explanation