My go-to tutor for all algorithm related courses!
Thank you 😊
Simply the best explanation of asymptotic notations I have ever come across.
May Allah bless you, the best algorithms professor I have ever seen! May Allah give you whatever you ask for!
Thank you very much! This is so far the BEST explanation I've come across, cheers!
Sir, this is the best explanation I've seen on this topic to date! many thanks and respect from Pakistan.
Hey bro, how is he taking n^2 for the matrix part? :(( Help
Awesome video! I must have watched around 20 videos trying to understand this concept, but this is the place where I finally understood it crystal clear. Thank you so much for this video!
Vishal Upadhayay
It took me a year to find this video. Perfectly explained.
This breakdown is amazing! Thanks!
This was very useful to me, sir. It would also help if you linked the video for any topic you don't explain in detail. Thank you.
Respect from Kolkata, keep up the good work!
Quick revision of all the previous complexity lectures. Thanks a lot, sir.
Huge thanks from Poland
The video is very well explained. Many thanks, sir.
Awesome, sir; it's very useful for me. Respect from Pakistan.
Awesome, man!!
Even today, in my M.Tech, I understand asymptotic notation after watching these videos. Thank you; you saved my year ;)
Really really really simplified. Awesome explanation. Great work. Thanks for the explanation
A really simple, smooth way to learn asymptotic notations. Thanks for your effort!
Awesome video! It helped me understand asymptotic functions a lot. However, there is one part I did not quite understand: at 21:30 you said log n * log n * log n * ... = n log n. I am wondering whether it should be (log n)^n instead?
Yes, but I guess sir considered it to be log(n^n) (which would be n log n) and not (log n)^n.
omg thank u !
Our teacher's explanations are terrible, but you saved us.
Very nice explanation, sir.
Very good
Very nice and easy. Thanks for this video.
Seriously, best video.
What do the terms lower bound and upper bound denote ?
Thank you so much, boss.
Now I know about the notations. Thank you, sir.
Sir, why did you not write big-O of n^n and omega of n^n for the problem with time complexity n!?
Thank you, it is really simplified
Well explained. I have watched many other videos on algorithms done by you, and they are extremely helpful. Thank you.
Very helpful and easy to understand
I bow at your feet, sir!! Great teacher.
Thanks a lot. It is a very simple explanation. Very well done. Five stars.
Thank you
BEST VIDEOS!!!!
Sir, you are the best; it is so well explained that I don't have words to describe it. One small request: the examples you prove are always in terms of n. If you proved the same thing with a concrete number instead, for example 5! instead of n! at the end, it would be even more understandable, since many people have trouble relating to n. Anyway, I have no right to complain, since this is the best tutorial I have ever found on asymptotic notation, and I am thankful for it.
What's the big deal here?
edit: no no, sorry, what you wrote is actually correct.
Very very thankful to you
Very much appreciated. Thank you, sir.
thank you, this was quite helpful!
Great explanation; there's only one little mistake. At 14:30 you somehow turned n^2 into 2^n and got the wrong result in the end.
I mean, it's "technically" not wrong, but I suppose you were trying to prove that O(n) is equal to OMEGA(n), which it is, but your result says otherwise.
The explanation is absolutely correct. You missed the point there. He didn't *turn* n^2 into 2^n; he *replaced* it to demonstrate the upper bound!
It's easy to understand... thank you.
well explained
Upload videos on recurrence relations.
Say please.
21:25
lg(n)*lg(n)*...*lg(n) = (lg(n))^n
It is equal to n log n only if you read it as log(n^n); by the logarithmic property, log(n^n) = n log n. (log n)^n is a different quantity.
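A quick numeric check makes the distinction concrete (a Python sketch of my own, not from the video): for n = 8, log2(n) = 3, so (log n)^n = 3^8 = 6561, while n * log n = 24, and log(n^n) equals the latter.

```python
import math

n = 8
lg = math.log2(n)                    # log2(8) = 3.0

power = lg ** n                      # (log n)^n  -> 3^8 = 6561
product = n * lg                     # n * log n  -> 24
log_of_power = math.log2(n ** n)     # log2(n^n) = n * log2(n) -> 24

print(power, product, log_of_power)  # 6561.0 24.0 24.0
```

So (log n)^n grows far faster than n log n; the two are only confused when log(n^n) is misread as (log n)^n.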
Sir, I want to know how the log values work, i.e., how you introduce the log... I'm very confused about logs.
For binary search, the problem size is halved every time. Say we start with problem size n; in the second iteration it becomes n/2, in the third iteration n/4, and so on until, after some number of steps, it becomes 1. So n/x = 1 for some x. But we know x is a power of 2, so n/2^k = 1.
==> n = 2^k.
Take log (base 2) on both sides:
log n = log 2^k
log n = k * log 2 (because log a^m = m * log a)
log n = k * 1 (because log 2 = 1 when the log is base 2)
log n = k.
Now, going back to the original equation n/2^k = 1:
In the first step, we have size n (n/2^0).
In the second step, we have size n/2 (n/2^1).
In the third step, we have size n/4 (n/2^2).
After k steps, we have size 1 (n/2^k).
Now, we know k = log n.
Therefore, after log n steps the problem size is 1.
That is the reason we say binary search takes log n time.
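The derivation above can be verified with a short sketch (the helper name is my own, just for illustration): repeatedly halve n and count the steps; for powers of 2 the count equals log2(n) exactly.

```python
import math

def halving_steps(n):
    """Count how many times a size-n search space is halved before it reaches 1."""
    steps = 0
    while n > 1:
        n //= 2
        steps += 1
    return steps

# For n a power of 2, the step count matches log2(n).
for n in (2, 16, 1024):
    print(n, halving_steps(n), int(math.log2(n)))
```

This is exactly the k from the equation n/2^k = 1 above.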
Sir, can you please help me solve the recurrence relation T(n) = n*T(n/2) + a*n^2?
well done sir, thanks.
Always the best
Excellent !
How is log(n) * log(n) * ... (n times) = n log(n)?
I think it should be (log(n))^n.
Please explain B-trees and B+ trees.
Easy to understand.........
Understood sir
Good work.
Thank you.
I believe s = s + A[i]; counts as n + 2 and not n.
thanks
Really big thanks!
Please give lectures on: brute force, greedy method, branch and bound, backtracking, dynamic programming, asymptotic analysis (best, worst, average cases) of time and space, upper and lower bounds, basic concepts of the complexity classes P, NP, NP-hard, and NP-complete, graph and tree algorithms, depth-first and breadth-first traversal, and tractable and intractable problems.
Mishraji, check out the channel properly; all of these topics have already been covered in great detail!
Sir, please take board lectures.
Sir, improve your log math.
I didn't get why we add 1. For a loop like:
for i = 1 to n do ...
why is it not just n? Why do we also have to add 1?
Help please.
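For anyone stuck on the same question: the condition of `for i = 1 to n` is evaluated n+1 times, because n tests succeed and one final test fails before the loop exits, and that failing test is the extra 1. A small sketch of my own to illustrate (not from the video):

```python
def condition_checks(n):
    """Count how often the condition of `for i = 1 to n` is evaluated."""
    checks = 0
    i = 1
    while True:
        checks += 1        # every pass through here is one test of i <= n
        if not (i <= n):
            break          # the failing test is the extra "+1"
        i += 1
    return checks

print(condition_checks(5))  # 6 = 5 successful tests + 1 failing test
```

The loop body runs n times, but the condition itself is checked n+1 times.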
Excellent !