I just noticed that when analysing time complexity, he color codes each operation by type:
Teal = Simple actions
Red = Loops
Purple = Recursive calls
That makes it so much easier to follow!
The quality of these videos is amazing. The aspect ratio is perfect, the voice is clear, and I love how they use different colors for functions, recursion, and loops. It's crazy to think that this quality is from almost 10 years ago
WAYWAYWAYWAY better than my prof's lecture
this is the best that I have seen on Merge sort complexity
Best part, the videos are in the cinematic aspect ratio, so they look awesome on ultra wide monitors!
Wow, this is a 2013 video. I'm watching it in 2020 to sit for my campus placements. Great explanation👏💕
Best computer science videos on YouTube. Kudos to you sir
The simple thing is: there will be log n levels to divide the array down to size 1, and at each level the merge function is applied, which itself takes O(n) time. Therefore multiply n by the total number of levels, which is log n. The overall time complexity therefore boils down to O(n log n).
Array of size 8: how many levels down to 1? 4 > 2 > 1, which is 3 levels; log of 8 base 2 is 3.
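Writing the same arithmetic out (just a sketch, with c standing for the per-element merge cost at each level):

\text{levels} = \log_2 n, \qquad \text{work per level} \approx c\,n, \qquad \text{total} \approx c\,n\log_2 n
\text{e.g. } n = 8:\ \log_2 8 = 3 \text{ levels} \;\Rightarrow\; \text{about } 3 \cdot 8c \text{ operations}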
absolutely amazing channel! you are a GREAT teacher, thanks a lot, keep up the hard work
EDIT: 12/15/17
@8:13
For those who cannot understand why T(n) = 2*T(n/2)+c'*n = 2{ 2*T(n/4)+c'*n/2 } + c'*n, you may refer to my explanation below, and discussion is welcome. I didn't get it at first either, but I understood it after grabbing a discrete math book. The key idea is repeated substitution (the kind of step you can verify with mathematical induction).
Now, T(n) = 2*T(n/2)+c'*n is a general form: it holds for any input size. Substituting n/2 for n gives T(n/2) = 2*T(n/2/2) + c'*(n/2) = 2*T(n/4) + c'*n/2. Plugging that back in: T(n) = 2*T(n/2)+c'*n = 2*(2*T(n/4) + c'*n/2) + c'*n = 4T(n/4) + 2c'*n. That's how it comes; repeat the same substitution for n/4, n/8, n/16, etc., and we get the final result.
-- Below is my original post, and it does have some errors. If we expand 2{ 2*T(n/4)+c'*n/2 } + c'*n, it would be 4T(n/4) + 2c'n, rather than 2{T(n/2) + c'*n/2 } + c'*n. Apparently, I didn't quite get what T(n/4) is, and mistakenly multiplied the coefficient 2 into T(n/4), turning 2*T(n/4) into T(n/2). So you can ignore everything below --
I still cannot understand why T(n) = 2*T(n/2)+c'*n = 2{ 2*T(n/4)+c'*n/2 } + c'*n... like, if we expand 2{ 2*T(n/4)+c'*n/2 } + c'*n, it would be
2{T(n/2) + c'*n/2 } + c'*n =
2*T(n/2) + c'*n + c'*n =
2*T(n/2) + 2c'*n
compared to original
2*T(n/2)+c'*n
There seems to be an extra c'*n...
Although someone mentions it's recursive I still haven't got the point...
Why...?
THANK YOU SO MUCH FOR THIS DETAILED EXPLANATION! i struggled until I found this post! GREATLY APPRECIATED!
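Just to add a quick numerical sanity check of the expansion in the post above (a rough sketch only: I'm picking c' = 1 and T(1) = 1 as illustrative values, and the class/method names here are made up). Computing T(n) straight from the recurrence matches n*T(1) + c'*n*log2(n) exactly for powers of two:

public class RecurrenceCheck {
    // T(n) = 2*T(n/2) + c'*n with T(1) = 1 and c' = 1 (illustrative values only)
    static long T(long n) {
        if (n <= 1) return 1;          // base case: one element, constant work
        return 2 * T(n / 2) + n;       // two half-size subproblems, plus the merge
    }
    public static void main(String[] args) {
        for (long n = 2; n <= (1L << 20); n *= 2) {
            long log2n = 63 - Long.numberOfLeadingZeros(n);          // log2(n) for powers of two
            long closedForm = n + n * log2n;                          // n*T(1) + c'*n*log2(n)
            System.out.println("n=" + n + "  T(n)=" + T(n) + "  n + n*log2(n)=" + closedForm);
        }
    }
}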
Thanks a lot for being the Base Case of my search for a merge sort tutorial with a detailed explanation. And of course, better than my prof's lecture. :)
Thank you so much! I needed to understand merge sort for a college activity, and this was the clearest and best-explained method I've found :D
Translations: "into" = "times" or "multiplied by"; "by" = "divided by"; "upon" = "over" or "divided by"
Thank you for the cultural translation. :)
haha. Cool !
Hey man, do you know what great work you are doing for others? People like you make the world beautiful :)
Unfortunately he is dead.
this guy is actually a god. perfect way of explaining hard concepts
I'm not a programmer and I randomly watched this video and understood every single thing. So simple and clear.
can you please explain to me what he taught?
Presented really well visually & orally. Great Work...!!!
At 16:38, I think at level 1 it is n/2 (not n), because we are using only n/2 space; we haven't even entered the right recursion subtree yet. The same goes for the next level: n/4, then n/8, and so on. So the total space would be n/2 + n/4 + n/8 + ... + 1 (or we may sum to infinity for simplification), and not n + n/2 + n/4 + ... + 1. But anyway, it would still be O(n).
Correct me if I'm wrong.
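Either way the sum stays linear; writing it out as a geometric series (a sketch, assuming n is a power of 2):

\frac{n}{2} + \frac{n}{4} + \cdots + 1 = n - 1 < n, \qquad n + \frac{n}{2} + \frac{n}{4} + \cdots + 1 = 2n - 1 < 2n

so both ways of counting give O(n) extra space.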
Hi Shanmuk,
What is it that you do not understand? I can try explaining. Well, maths and programming go together. But, you don't always do such complex calculations to figure out time complexity. Recursion is tricky. I am sure you know enough maths to be able to program, that's why you are here. If you are not getting anything in this tutorial, ask specific questions.
I am having a hard time following the time complexity calculation. Maybe I don't know logarithms well, and that could be the reason. How come T(n/2) becomes 2T(n/4) + c'(n/2)? Where does the c' come from? And I do not understand how 2^(log2 n) * T(1) becomes nc. Maybe it's just me, but I could not follow. But thanks for doing this work, I appreciate it. If possible, please try to do another video. Thanks
brilliant teaching!! I wish my Professors could teach like this!
WHOEVER HAS DISLIKED YOUR VIDEO MUST BE A UNIVERSITY PROFESSOR, JEALOUS OF YOUR TEACHING EFFICIENCY.
My respects for #HumbleFool
He is not humblefool. He is Animesh Nayan.
@@AyushGupta-mm4jg he is Harsha Suryanarayana from IIIT Allahabad
#humblefool
@@RAHUL-gf3ft He is Animesh Nayan; if you want to hear the voice of Harsha Suryanarayana, listen to the Euclid's GCD Algorithm video on mycodeschool.
@@RAHUL-gf3ft this iiitA people just💥🦸♀
@@navyasri5077 Proud to be an IIITian
Best explanation I found on youtube till date :)
bro you are a great mathematician
One of the best video explanations of time as well as space complexity!!
5:47 I don't understand. It should be n/2 right?
thanks for the awesome lectures... please upload the rest of the lectures, like heap sort, etc.
Here for space complexity and absolutely worth it alhamdulilah
Sorry, I don't understand: when calculating T(n) at 08:52, why is T(n) = 2*T(n/2)+c'*n = 2{ 2*T(n/4)+c'*n/2 } + c'*n? Why do we need to add c'*n at the end? And it also adds c'*n in the following steps afterwards.
+apo tessar Because it is a recursive function. So for example at 8:52, on the second line he has 4*T(n/4). He got there by looking at what exactly T(n/2) was equal to: plugging n/2 into the general formula gives 2T(n/2/2) + c'*n/2, which is the same as 2T(n/4) + c'*n/2. Then he multiplied by the outer 2 which was there before and got 4T(n/4) + c'n. That second c'n comes from the substitution, and the other c'n stays there as before, so at the end he added them and got 2c'n. This gets done k times, so recursively he gets k*c'n. Maybe it's a bad explanation, but it's hard to explain in words ))
I still cannot understand it... like, if we expand 2{ 2*T(n/4)+c'*n/2 } + c'*n, it would be
2{T(n/2) + c'*n/2 } + c'*n =
2*T(n/2) + c'*n + c'*n =
2*T(n/2) + 2c'*n
compared to original
2*T(n/2)+c'*n
There seems to be an extra c'*n...
Why...?
+Hang Chen what are you talking about? When you expand the expression, of course it won't be the same as the original, because you are using T(n/4) to evaluate it instead of T(n/2).
He is expanding the whole recursive formula to give you an idea of the time complexity. It's easier to understand what is going on when you follow the original formula.
ruclips.net/video/g1AwUYauqgg/видео.html
Please, can anyone explain to me what n is and where we get it from in the O(n log n) space complexity, for the case where we do not clear the extra memory?
Why is n/2^k = 1 at 9:29?
And also, why at 10:06, 2^log2n = n?
We can't determine T(n/2^k) directly. But we need to get rid of T(n/2^k), and we know that T(1) = 1. Hence, we replace it with T(1). How is T(1) equal to 1? Think of merge sort on 1 element: it is sorted in 1 unit of time. And answering your second question: log of n base 2 returns the power x such that 2^x equals n. If we raise 2 to that same power x again, we get n back. Try some examples and you'll get it. Example: base is 2, n = 8. Evaluation: 2 ^ (log 8) = 2 ^ 3 = 8.
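The same thing in symbols (with T(1) = c, a constant, as in the video):

\frac{n}{2^k} = 1 \;\Rightarrow\; 2^k = n \;\Rightarrow\; k = \log_2 n, \qquad 2^{\log_2 n}\,T(1) = n \cdot c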
I guess the Big-O notation is just more famous!
Big O gives an upper bound (the one usually quoted for the worst case), and ideally the tightest such bound.
Theta notation is for a function bounded between two curves, from below and from above.
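For reference, the standard definitions (this is also where the n0 that appears in the video comes from):

f(n) = O(g(n)) \iff \exists\, c, n_0 > 0 \text{ such that } f(n) \le c\,g(n) \text{ for all } n \ge n_0
f(n) = \Theta(g(n)) \iff \exists\, c_1, c_2, n_0 > 0 \text{ such that } c_1\,g(n) \le f(n) \le c_2\,g(n) \text{ for all } n \ge n_0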
mycodeschool at 9:13 are you using plug & chug method?
Yes, we are reducing the expression at each step, so k starts from 1 and increases until we reach T(1). You can either reach me here or write to mycodeschool [AT] gmail [DOT] com
A fascinating explanation, as expected. Just as a remark, we could also use the Master Theorem and get the complexity easily.
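For anyone curious, here is a sketch of that Master Theorem route (case 2):

T(n) = 2T(n/2) + \Theta(n), \quad a = 2,\ b = 2,\ n^{\log_b a} = n,\ f(n) = \Theta(n) = \Theta(n^{\log_b a})
\;\Rightarrow\; T(n) = \Theta(n \log n)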
far better than my teacher's lecture!
hey.. thanks for the great work. Will be good to have videos for Bucket Sort, Radix Sort and Searching Problems (Sequential Search, Binary Search etc)
Still You are the best Bro....Just watch your videos to revise topics :)
Your videos are so interesting, always forget to like. Thank you Sir for such clear explanation and hope for more videos
At 6:56 why is the complexity of Merge() C3.n + C4? Why is it not C3.n?
This is a better explanation of it than any I could have given, thanks a lot :)
liked the space complexity explanation :)
Such a wonderful explanation! Thank you so much.
Incredible explanation, thank you so much!
SIR PLEASE UPLOAD VIDEOS ON HASHTABLES
Nice explanation, but at 8:31, why do we have the extra C'n outside? The expression inside the curly braces has already balanced the original equation.
Because the outside C'n is from the original T(n) equation. The stuff inside the curly braces are the terms from the expansion of T(n/2).
If you look up some info about "recurrence relations", the expansion will make more sense. Hope that helps!
@@dee5392 thanks a lot for the clarification...finally understood 😊
@@arjundixit5913 you're welcome, glad I could help! have a great day :)
Wow!!!! Awesome explanation. Didn't know about Theta complexity; everywhere I looked it was Big O only. And many don't bother about space complexity. Thanks a lot. Please upload videos on heap sort also. Also sir, if you could do some difficult time complexity questions on sorting algorithms, it would be very helpful for entrance exams (GATE etc.). Thank you sir!
This great programmer is no longer with us.
fossiiita.github.io/humblefoolcup/humblefool/humblefool.html
can you please make a video on greedy algorithms and dynamic programming
this guy is dead
what the fuck?
Yes that's true
fossiiita.github.io/humblefoolcup/humblefool/humblefool.html
It's sad as fuck ....
Manas Chakrabarti wow smh
Why is the memory consumed on the stack not taken into account, like in the space complexity analysis of Fibonacci, modular exponentiation, etc.?
Awesome explanation... I was just looking for an easy explanation, and you ended my search ;-)
at 8:32. Where does the additional c'n come from
I think this explanation was good, but it skipped over mentioning that it was done by substituting T(n/2), T(n/4), etc. into the equation once we know the components 2T(n/2), 2T(n/4), etc.
I might as well also add that the reason mergesort(left) and mergesort(right) are each T(n/2) is that we assume the recursive call completes and carry its completion time through to this next part, hence the T(n/2) rather than simply n/2.
Imagine if this great coder with great vision had not died in 2014; it was a very unfortunate incident for every code learner around the world.
Thank you! I needed a video to explain the formulas
thank you for your help! I really enjoy watching your videos :)
Great video, thank you for your clear explanation.
I would probably have joined a course if you had provided one... Nobody can ever teach these basics in such an interactive way as you're providing us.
Thanks for the great explanation sir, I got a clear picture of sorting today. But I am confused: when you said to clear the extra memory, since these arrays are stored in the stack/static section, they are deleted automatically. Sir, are you talking about the case where they are stored in the heap section?
thanks for giving us this. super helpful and clear. gj
wait why do we reduce it to T(1)? 9:20
because in merge sort we keep dividing the array until only 1 element is present, and T(1) is the base case whose cost we know.
is this information really necessary outside of college?
Yes, it is! If you're aiming to work at top organizations, you are expected to know how to analyze your code's complexity and optimize it if possible.
Your videos are so good. Thanks a lot !!
Ms. Gülsüm, how are you? Seven years have passed since then; I wonder what work you are doing now :d
@@adarozer Seeing this comment brought back nostalgia :p I'm in project management at a software company -.- Why did you end up here, are you doing mathematics?
@@xc0437 No, I was looking at this video while studying for a midterm; I'm still a student, studying computer engineering. Good luck with your work.
@@adarozer How nice, I wish you a successful school and work life.
+mycodeschool hi, for the second step, I don't understand how you managed to get from c'n to 2c'n. Isn't it supposed to be c'n(1+1/2)?
We have to figure out the running time of the two recursive calls on n/2 elements. Each of these two recursive calls is itself twice the running time of mergeSort on an (n/4)-element subarray (because we halve n/2 again), plus c·n/2 to merge; that's where you get the extra factor.
Great way of explanation...loved it❤❤❤
You are the best, man
Hi, shouldn't the space complexity be O(n log n)? When the left half is completely decomposed, it occupies (n/2)*log n space (log n steps), and the right half takes n/2 space. So the total space is (n/2)*log n + n/2, which is equal to n log n space.
How ? Extra space required is just n; left subarray plus right subarray
how to calculate the time complexity of creating left array and right-array
what to return on the base case of mergesort function?
Can someone explain to me this part
T(n)=2T(n/2)+C′.n
T(n)=2{2T(n/4)+C′.n/2}+C′.n
Why do we multiply by 2 here? And how does C′·n become 2C′·n?
Also, what is meant by the constant C here?
nC + C′·n·log n = Θ(n log n)
+ Praveen M
It is like this:
first you have this equation:
T(n) = 2T(n/2) + C′·n
Now replace n with n/2 in that equation, which gives
T(n/2) = 2T(n/4) + C′·n/2
Then substitute this back into the first equation T(n) = 2T(n/2) + C′·n; that is where the extra factor of 2 and the second C′·n come from.
Hope you got it now. :-)
Thank u brother
Can you give a link to the logarithm laws required to solve these recurrences?
THE BEST EXPLANATION ONE COULD EVER GET.. :) HaTS oFf
Just had a small doubt, if someone can explain (maybe a dumb one :/). When calculating T(n), the array is divided into 2 parts, so k is incremented at every step. But I am confused by the calculation: how does 2T become 4T and cn become 2cn?
It's done recursively; basically just keep plugging the expansion of T(n/2) back into T(n).
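One substitution step written out (same c' as in the video):

T(n/2) = 2\,T(n/4) + c'\frac{n}{2}
\;\Rightarrow\; T(n) = 2\left[\,2\,T(n/4) + c'\frac{n}{2}\,\right] + c'n = 4\,T(n/4) + 2c'n

so the 2T(n/2) becomes 4T(n/4) and the c'n becomes 2c'n.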
Thank you so much for this video. I have a question. I usually get confused when calculating the complexity of recursive functions. In this case, why is it log n (I know that's the correct answer, but why)? We have two for loops with complexity O(n), which is greater than log n. Why don't we conclude that the complexity of merge sort is simply O(n)? Also, since we are going to visit all the nodes in the tree, the complexity should be O(n), right? It'd be very helpful if you could explain this to me.
The recursion forms a binary tree (each call always splits into two), and the height of that tree is log n; the O(n) merge work happens at each of those log n levels, which is where the n log n comes from.
public class MergeSort {
    public static void main(String[] args) {
        int[] input = {9, 8, 7, 6, 5, 4, 3, 2};
        domergesort(input);
        for (int i : input) System.out.println(i + " ");
    }
    static void domergesort(int[] input) { // sorts input in place: T(n)
        int n = input.length;
        if (n < 2) return; // base case: 0 or 1 element is already sorted
        int[] left = java.util.Arrays.copyOfRange(input, 0, n / 2);
        int[] right = java.util.Arrays.copyOfRange(input, n / 2, n);
        domergesort(left);  // T(n/2)
        domergesort(right); // T(n/2)
        merge(left, right, input); // c'*n
    }
    static void merge(int[] left, int[] right, int[] out) {
        int i = 0, j = 0, k = 0;
        while (i < left.length && j < right.length) out[k++] = (left[i] <= right[j]) ? left[i++] : right[j++];
        while (i < left.length) out[k++] = left[i++];
        while (j < right.length) out[k++] = right[j++];
    }
}
Could you tell me the time complexity? Because I am getting O(n^2), since both comparing and moving are present.
Unfortunately none of the time complexity maths makes any sense to me. Having said that, the algorithm implementation was definitely a great help! I'm hoping that just knowing it is O(n log n) time complexity is good enough.
No, not always. To be a computer scientist/engineer you have to understand your code deep down.
what a nice way of teaching.. great man!!
please upload count and radix sort algo
thanks
This great programmer is no longer with us. fossiiita.github.io/humblefoolcup/humblefool/humblefool.html
@@shashanksahu1971 The narrator "Animesh Nayan" is alive and working at Google. His friend and co-founder Harsha died in a car accident. Most of the videos are done by Animesh.
Actually, I was confused about deriving 2^k T(n/2^k) + k·c'n, but after watching the video again I figured out that as we go deeper into the levels, the term k increases.
Correct me if i am wrong..
Once again Thank u,
can i reach u here again if i got any doubt ?
Thanks a lot for this nice video, but I don't understand how you get n/2^k = 1; this is the only step that confuses me. Thanks!
to eventually reach T(1)... as n = 1 is the base case whose cost we already know
k=log n; 2^(log n) = n; n/n=1; T(n/n) = c;
For space complexity it is clear.
At level L1 there are n/2^1 elements, at level L2 there are n/2^2 elements, and so on. We know that at the last level there is 1 element in the array, and we write that as n/2^k; then n/2^k = 1.
excellent explanation, thanks for the video
Hi,
Nice explanation.
Could you please share the code with us? Actually, on the internet mergesort() usually takes several parameters, but in your case it takes only one, and I like that very much.
Munni Saik There is a link to the implementation in the previous video, but anyway, this is the code: gist.github.com/mycodeschool/9678029
Very helpful!
Nice explanation, but the noise in the background is a bit distracting.
what is the need to calculate the time complexity again?
Good explanation.
Thanks a lot :) It is really helpful
thanks for the playlists
very nice explanation!!!
This person is awesome
Thanks a lot man!
Hi, what is n0 in your video?
You are really awesome sir
the proof is amazing
Hello!
I'm still stuck on this algorithm. I just can't figure out how to write the "merge" method. Can anyone help me out?
There's a video on Merge Sort by Geeksforgeeks, watch it.
You are my hero.
Thanks! this is perfect
rip brother
nice video, thanks very much.. it's really helpful
sir, can you explain heap sort in a video as soon as possible...
Please help, I'm getting an infinite loop at merge(Larr).
thank you thank you thank you so much you are a life saver !
Great videos, thank you so much