Guys, if you liked this video & want many more such tech educational videos on this channel then please support me by subscribing to this channel & also share it with your friends too ✌
please make a tutorial on visual basic
I guess I'm randomly asking, but does anybody know a tool to get back into an Instagram account?
I was stupid and lost my password. I would love any help you can offer me.
How can we come up with the value of g(n), like n or n^2?
Never stop making videos. This legit prepared me for my exam 100 times better than my professor did. I got an A on the exam because of you. Thank you so much!
That's amazing to know Jacob ✌️ super happy for you and your amazing results too. Would be a great help if you could share our channel & videos with your friends too 😊
Yes, that is true, I can't understand anything from my professor either. This guy is pure gold, I learned almost everything from him. Never stop uploading man, you are gifted! Thank you for everything
@@georgikarastoychev1241 How can I become a software engineer after 12th commerce?
It's been 3yrs and this saved my life
I have not even watched the video yet and I already know this is the best video I have ever seen. I legitimately screamed in joy when I realized this was a Simple Snippets video.
7mins into the video, I understood Big Oh better. Well played.
the most thoroughly and easily explained tutorial I have ever seen. Thank you a bunch!
Thank you very much. After trying hard to grasp what my lecturer explained with no success, your explanation finally got through to me. Keep up the good work!
You are 3x better at explaining this than my college professor at ASU. You should be making the absurd tuition money she does
Hehehe, what's the full form of ASU? Which institute is this?
I wish I made that kinda money but surely in time I will earn a lot too. Right now my only goal is to provide high quality education to everyone 😇
@@SimpleSnippets Most likely Arizona State University, I feel the same way
I am also a student of discrete mathematics at ASU who is finally getting a clear explanation. Thank You!
👍
That's great to know Fredrick 😊
I'm glad I watched this after several videos. Thank you so much
You are amazing!
Straight to the point!
Nice editing!
I truly appreciate it :)
Thanks buddy 🤟 glad you liked it 😊
nd u look cute :p
Watching from Kenya, Africa. I'm already a teacher now because of this tutorial
Thank you, teacher. we stand with you
Thank you so much for this! Honestly saving my exams by explaining it so clearly I finally understand :')
Hey bro, you saved my master's course, your explanation is awesome. God bless you, regards from México
Great to hear!
video starts at 1:17
Thank you for the video, I finally understand the concept because of you, thank you again !
Thank you so much for this video!! Thanks to you, in 30 min I understood perfectly what my professor didn't explain properly in 10 hours :))
Glad it helped!
your explanation is the best!!! Thank you a lot!
This is a life saver man, thank you!!
I love it - my professor should learn from you
This covers theory quite well unlike other videos
Thanks, this will surely help me out in my midterm
Thank you so much . I really appreciate your works
Glad you like them!
Quite exemplary and to the point.
Thanks for your work.
Well explained
literally youtube king
best video ever found ❤
Thank you, man. Really helped me
Thank you, excellent explanation
Carry on bro..... your explanation was awesome
I'm even using this tutorial to prepare for an exam this morning and it's so helpful
Thanks for this video bro...
Most welcome Shreyas, please do share the videos & our channel with your friends too. That's the biggest help and support you can give back to this channel! 😇
Excellent explanation.
Your explanation is excellent!
Best video on notations 💪
I like your DS lectures. Now I will complete your DS course, thank you. I am a final-year student
Glad you liked it! Please support me by sharing the videos and our channel with your friends too. That's the biggest help and support you can provide 😇
@@SimpleSnippets Do you upload your videos on edyoda learning?
@@amrutachavan7686 Yes, I have uploaded some videos on the edyoda platform 😊
As always amazing video and very nice explanation. Thank you so much!
Glad you liked it!
Excellent video...
I've been waiting for this type of lecture.. 🤩🤩🤩🤩
thanks bro really you are a legend
Thank you! Note: small error on Big theta slide, description says "Big Omega"
Within the first 100 seconds this video explained Big-O better than my $200 textbook and my professor… combined.
Haha thank you for this feedback. Would be great if you can transfer that 200 dollars to me 🤣
Just kidding. Don't need donations. I'm happy that this video helped you 😊
Nice explanation, but cases (best, worst, and average) and asymptotic notations are two independent concepts; for example, the best case of linear search can also be written as O(1).
You are a life saver bro
Very good lecture
Incredibly helpful video ~ thank you
Best explanation ever ❤️❤️❤️
superb explanation
hi, just a little suggestion: it's better to say f(n) is O(g(n)) or f(n) belongs to O(g(n)) instead of saying f(n) = O(g(n))
Have you ever heard of the dialect of English that comes from India? Indian English, bro 😁 I am serious
Best explanation.... you are amazing 😃
Thank you Vaishnavi 😊
thanks a lot for such an amazing explanation :)
You rock! Thank you for sharing your knowledge
It was amazing. Thank you.
Its a great lecture
Thank you very much. You are a hero!
u r amazing. Thank u soooo much
Thanks for explanation, nice video !
Finally understood it. Thank you so much.
I have a doubt. If we need to find the closest fit to the best-case time like you said, then shouldn't Big-Omega(n) have the constant 2 instead of 1?
2n < 2n + 3 always,
and 2n is a closer fit than 1n. Please tell me if I'm wrong, with a reason
With Big Omega you don't actually need to say which constant you use; whether it's 2n or 2000n it's still just Ω(n). You just have to find any constant c such that c*g(n) stays below f(n) after n0 and you're set
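A quick numeric check of the point above (a Python sketch; the function names are just for illustration): for f(n) = 2n + 3 and g(n) = n, any constant c with c*g(n) ≤ f(n) past some n0 satisfies the Ω definition, so both c = 1 and c = 2 work, while c = 3 eventually fails.

```python
# Check the Big-Omega condition f(n) >= c * g(n) for all n in [n0, upto],
# for f(n) = 2n + 3 and g(n) = n, with different constants c.

def f(n):
    return 2 * n + 3  # cost function from the video's example

def g(n):
    return n  # candidate lower-bound function

def satisfies_omega(c, n0, upto=1000):
    # True if f(n) >= c * g(n) holds for every n in [n0, upto).
    return all(f(n) >= c * g(n) for n in range(n0, upto))

print(satisfies_omega(c=1, n0=1))  # True: 2n + 3 >= 1n always
print(satisfies_omega(c=2, n0=1))  # True: 2n + 3 >= 2n always
print(satisfies_omega(c=3, n0=1))  # False: 2n + 3 < 3n once n > 3
```

Any valid c gives the same conclusion Ω(n), which is why the particular constant is never written down.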
@@sangodan3031 You're right mate. Thanks.
Excellent job, man!
THANK YOU VERY MUCH SIR
Fantastic.. please upload more videos for clearing concepts
Thanks man for making such an awesome content
Keep up with the good work, thanks.
Superb sir nice explanation
Glad you liked it! Please support me by sharing the videos and our channel with your friends too. That's the biggest help and support you can provide 😇
keep up the great work!!
when f(n) = 2n + 3
Big Omega is Ω(n)
Big Theta is θ(n)
But for the linear search algorithm f(n) would also be like f(n) = a*n + b, where a and b are some constants
Then why Big Omega is Ω(1) in this case?
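One way to see it is to count comparisons directly: f(n) = a*n + b describes the worst case (scanning the whole array), but on a best-case input the loop stops after a single comparison no matter how large n is, which is why the best case is Ω(1). A small Python sketch (illustrative names, not from the video):

```python
def linear_search(arr, target):
    # Returns (index, comparisons) so we can count the work done.
    comparisons = 0
    for i, x in enumerate(arr):
        comparisons += 1
        if x == target:
            return i, comparisons
    return -1, comparisons

# Best case: target is the first element -> 1 comparison, independent of n.
# Worst case: target is the last element -> n comparisons.
for n in (10, 1000, 100000):
    arr = list(range(n))
    _, best = linear_search(arr, 0)
    _, worst = linear_search(arr, n - 1)
    print(n, best, worst)  # best stays 1, worst grows with n
```

So the cost is not a single function of n: it ranges from constant to linear depending on the input, which is exactly why the bounds Ω(1) and O(n) differ.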
I don't understand where these constant values come from. How do you determine whether c should be 1 or 2 or whatever? Do you just pick a random number, or is there some logic behind it?
You just drop the constants because what matters is the growth rate, not the constant factor in front of it. So O(2n), O(3n), O(4n), etc. can have their constants dropped to O(n), because they all describe the same linear growth.
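To make the "drop the constants" idea concrete, here is a small Python sketch (illustrative, not from the video): divide each cost function by g(n) = n and watch what the ratio does as n grows.

```python
# Compare cost functions against g(n) = n via the ratio f(n) / n.

def ratio(f, n):
    return f(n) / n

two_n_plus_3 = lambda n: 2 * n + 3
n_squared = lambda n: n * n

for n in (10, 100, 1000, 10000):
    print(n, ratio(two_n_plus_3, n), ratio(n_squared, n))

# (2n + 3)/n settles toward the constant 2, so 2n + 3 grows linearly: O(n).
# n^2 / n = n keeps growing without bound, so n^2 is NOT O(n).
```

A bounded ratio means "same growth class", and that holds no matter which constant multiplies n, which is exactly why 2n, 3n, and 2000n all collapse to O(n).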
thanks a lot, you are the best 😍
Decent tutorial! Thank you
Glad it was helpful!
I can tell this is a good teaching video. Just wished the accent was a little less heavy for those who are not Indian. It's very difficult to understand and it's frustrating because I know how intelligent Indians are but can't seem to find videos where they don't speak with such heavy accents. Oh and I'm also Asian btw before anyone wants to pull the race card.
Thank you very much.
Equating O(.) notation with worst-case, Ω(.) notation with best-case and Θ(.) with average-case is incorrect. The O/Ω/Θ notations and "caseness" (worst/best/average) are independent concepts.
It's a common misconception and I see nobody has pointed it out yet in the comments so I will explain why it's wrong.
Let's start by saying that your mathematical definitions of the O/Ω/Θ notations are generally correct.
I would only highlight the fact that these notations are not exclusive to computer science or algorithms; they just describe the upper/lower/tight asymptotic bounds on the growth rate of a given function.
Ok, so the first minor inaccuracy: at 13:51, when you found that f(n) is O(n) and f(n) is O(n^2), you said that "when we try to find the O(.) notation we have to find the closest one which matches the f(n)". Well, no, we don't have to. We have shown that both O(n) and O(n^2) satisfy the mathematical definition, and thus both are true. The reason we prefer O(n) to O(n^2) is just that it gives us more information (it's a tighter bound).
Now the big problem. At 24:10 you decided to analyse the time complexity of the linear search algorithm.
So now, it's true that it's Ω(1) and it's O(n); however, it's NOT Θ(n). There is actually no function g(n) such that the time complexity is Θ(g(n)).
That is because Ω(1) is the tightest lower bound (for example, it's not Ω(log(n))) and O(n) is the tightest upper bound (for example, it's not O(log(n))). So you can see there is no g(n) which satisfies the condition c1*g(n) <= f(n) <= c2*g(n) over all inputs. However, once we fix a specific case, a tight bound does exist:
In the worst case we have: Ω(n) and O(n) => Θ(n)
In the best case we have: Ω(1) and O(1) => Θ(1)
In the average case we have:
Ω(n) and O(n) => Θ(n) // here we could say it's n/2, but we omit the constants
So it's worst case Θ(n), best case Θ(1) and average case Θ(n). See that I used Θ(.) notation for each worst/best/average case. And the benefit of using Θ(.) for all cases is that it shows the tight bound. That is, for example, when we say it's worst case Θ(n), it means that it is not worst case Θ(1) and it is not worst case Θ(n^2).
When we use O(.) notation to describe the worst case, we can indeed say that it's O(n), but it's also true that it's O(n^2).
So using Θ(.) gives us more information (it "forces" us to give the tight bound).
This means that we should generally use Θ(.) notation as it gives us the most information. The problem, however, is that if we want to look at the general case of the algorithm, the Θ(.) simply might not exist. So in that circumstance the best we can do is say that in the general case this algorithm is O(n) and Ω(1).
The only algorithms for which we can describe the general-case complexity using Θ(.) notation are ones for which the worst case Θ(.) is the same as the best case Θ(.). For example, the problem of finding the minimum value in an n-element array is worst case Θ(n), best case Θ(n) and average case Θ(n). So we can say that this algorithm has (in general) Θ(n) time complexity.
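The find-minimum example can be checked by counting comparisons; a small Python sketch (illustrative names, assuming a straightforward scan):

```python
import random

def find_min(arr):
    # Always scans the whole array: exactly len(arr) - 1 comparisons,
    # regardless of how the values are arranged.
    comparisons = 0
    best = arr[0]
    for x in arr[1:]:
        comparisons += 1
        if x < best:
            best = x
    return best, comparisons

n = 1000
for trial in range(3):
    arr = random.sample(range(10 * n), n)
    m, comps = find_min(arr)
    assert m == min(arr) and comps == n - 1
print("find_min always uses n - 1 comparisons")
```

Because the comparison count is the same for every input of size n, best, worst, and average cases coincide, and the algorithm is Θ(n) overall, unlike linear search, where the count depends on the input.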
great video!!!
Very well done
Thank you very much!
I don't understand one thing in this equation 2n+3
Omg, the same doubt flashed through my mind as soon as he explained it. Can someone please explain this?
@@ganashree8342 Yeah, I understand what Big O and the others are; for a simple f(n) it is easy, but the problem for me comes when f(n) is more complex. I have a lot of issues finding c1, c2, and n0.
Do you have implementations of these concepts? Thank you for your help. It is very clear and simple, way better than my university teachings.
Thanks so much man
thanks for this video, even thanks for this playlist dude....:)
Thanks, it helped a lot👍
Thanks a lot 🙏 Sir. Can you show some questions on this topic?
Great content. Easy to follow and to the point. Wonderful!
Congratulations on 100k subs. Love your videos. You make learning fun. Thank you
Thanks a million!
Excellent video
Thank you very much!
You are great!
Sir, is f(n) a different algorithm for the same problem, since you took different equations for f(n) and g(n)? Is f(n) like a reference that we compare with g(n) to find the best, worst, and average case?
Don't get confused. To clarify: if f(n) = n^3+n^2+1, then g(n) is some derived portion of f(n), the part that dominates your algorithm's cost. Therefore here, g(n) can be n^3, i.e. g(n) = n^3, or g(n) = n^3+n, or g(n) = n^3+5, etc. Both f(n) and g(n) belong to the same algorithm.
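A quick numeric way to see why the dominant term is the "derived portion": for f(n) = n^3 + n^2 + 1, the ratio f(n)/n^3 tends to 1 as n grows, so the lower-order terms stop mattering. A small Python sketch (illustrative):

```python
# f(n) = n^3 + n^2 + 1: the dominant term n^3 is what g(n) captures.

def f(n):
    return n ** 3 + n ** 2 + 1

for n in (10, 100, 1000):
    print(n, f(n) / n ** 3)
# ratios: 1.101, 1.010001, 1.001000001 -> tends to 1, so f(n) is Θ(n^3)
```

That is why g(n) = n^3 (or n^3 plus any smaller terms) all describe the same asymptotic behavior of this f(n).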
@@Kucchuu I had also the same problem but i can't understand where the g(n) comes from can you explane. you saying derived portion what is derived portion
cool video, thx
thnx for the help brother
awesome video!
Thank you 😊 please do Subscribe 👍
In Big O notation, what is the constant c? You took c as 5 in the example, so how do we choose it, and what is its role? I'm not understanding 😶
nice video sir
Best❤🔥
Awesome !!!
Thanks bro
Amazing
Sir, at the end of the video you gave examples like O(1), O(n), O(n/2) or O(n), and we understand those.
But does the same process apply for O(log n) and O(n log n)? Any example for O(log n) or O(n log n)? 🤔
Thank you, sir, for this type of awesome 😍😍😍 video...
As always an awesome explanation. Hope you reply ❤
These time complexities show up in recursive algorithms like merge sort :)
Also thank you very much for the compliments. ✌
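For a concrete O(log n) example: binary search halves the search interval each step, so the number of iterations grows like log2(n); merge sort, which splits recursively and merges in linear time, is the classic O(n log n) case. A small Python sketch (illustrative, assuming a sorted array):

```python
def binary_search(arr, target):
    # Returns (index, steps); arr must be sorted ascending.
    lo, hi, steps = 0, len(arr) - 1, 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2       # halve the interval each iteration
        if arr[mid] == target:
            return mid, steps
        elif arr[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, steps

for n in (1000, 1000000):
    arr = list(range(n))
    _, steps = binary_search(arr, n - 1)
    print(n, steps)  # 10 steps for 1000 elements, 20 for a million
```

Going from a thousand elements to a million only doubles the step count, which is exactly the logarithmic growth that O(log n) describes.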
Best 🎉
very good
great vid
Bhaiya, where can I practice DSA questions? Because on GeeksforGeeks and InterviewBit they only have solutions but no explanations (video explanations)
Thank you ily
Thank yooou!
Hi bro, in the first line of the definition at 20:22, I think it should be Big Theta notation