Visualizations: ruclips.net/video/Uq6URzo9q6g/видео.html
I hope you enjoyed learning about algorithms! And for returning viewers, I hope you enjoy the trip down memory lane!
Hello, Kuvina Saydaki!
Great video. I'm gonna play with my C128 ASM just for the fun of it, trying to implement some of them and programming the VDP
did musicombo watch this video
I optimized Proportion Extend Sort with this: sort ¼ of the list, then choose the median. (FOR UNDER 32 ELEMENTS ONLY.)
genuinely love the way you've adapted this into a worthwhile viewing experience rather than just a compilation, the little titles are so cute, and the new bits of voiceover make this feel like it was always supposed to be one huge video.
I was going to make that Adam Sandler joke but I understand why you are here
kuvina i am rooting for you
mood
also omg is ðat patricia taxxon ( 'o')
i loved your love rap explanation in rhythm heaven iceberg megamix ( ^u^)b
patricia ily
Omg hii you're my favorite autistic furry youtuber yippee! /genuine
Helped me realize I'm autistic myself
@@RadioactiveBluePlatypus oh oh hi /gen
our favorite enby buddy
My favorite sorting algorithm of all time was an entry in a slow-sorting competition, titled "bureaucratic sort". It is not merely spectacularly time-inefficient; it wastes tremendous amounts of space as well: generate all possible lists that can be created with the elements of the original list (every permutation of every set in the power set of the original list), then compare each generated list to the original list to see if it might be a sorted version of the original list, then, if it qualifies, check if it is sorted. The comparisons of lists and the checks to see whether a list is sorted are, naturally, done as slowly as possible (O(n) for a pair of lists with lengths n and m, with n
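Rough Python sketch of a bureaucratic sort along those lines, for anyone who wants to suffer with it (the names and details are my own guess at the entry, not its actual code; it assumes hashable elements and only ever finishes for tiny lists):

from collections import Counter
from itertools import combinations, permutations

def is_sorted(lst):
    # deliberately checked one element at a time
    return all(lst[i] <= lst[i + 1] for i in range(len(lst) - 1))

def might_be_sorted_version(candidate, original):
    # a candidate only qualifies if it uses exactly the original elements
    return Counter(candidate) == Counter(original)

def bureaucratic_sort(lst):
    # every permutation of every set in the power set of the original list
    for size in range(len(lst) + 1):
        for subset in combinations(lst, size):
            for candidate in permutations(subset):
                if might_be_sorted_version(candidate, lst) and is_sorted(candidate):
                    return list(candidate)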
Have I watched each of the individual videos before? Yes. Will I watch this compilation? Absolutely.
There's also a visualization-only companion!
@@wyattstevens8574 Watched that too!
Here's a favorite joke algorithm of mine: Intelligent Design sort.
It works like this: First, observe that the probability of the array being in the exact order that it's in by chance is 1/(n!); this is so unlikely that we must conclude that the array was put in that order by an intelligent Sorter, who must have sorted the elements by some metric beyond our mortal comprehension. This means that any change we might make to the array would actually make it _less_ sorted, which would be against the Sorter's plan. Therefore, the algorithm is complete. This has O(1) time complexity.
WAIT WHA-
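For anyone who wants it in code, a faithful Python sketch of the joke as described above:

def intelligent_design_sort(lst):
    # The odds of lst being in this exact order by chance are 1/(n!),
    # so it must already have been sorted by an intelligent Sorter.
    # Any change would only make it less sorted, so do nothing: O(1).
    return lst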
The idea of sorting networks is really reminding me of how factorio balancers work
That was my first thought too
I would not be surprised if there's Factorio builds that contain otherwise-unknown algorithms that beat any documented method, whether sorting or some other interesting task
Two hours of high quality and well-thought-out content? Am I dreaming??
I'm a huge fan of all of the icons! They are all very clean and well designed!
Great work on all the visuals and research in the series!!
Very nice video. Regarding the bonus section at the end -- you'll no doubt be pleased to hear that the latest SIGBOVIK conference introduced bogoceptionsort! Bogosort may accidentally sort very small lists correctly in only a few iterations. To prevent this, bogoceptionsort first shuffles the *order of the lines of code* that make up the bogosort implementation, then attempts to run it, then checks to see if the list is sorted. This effectively pads the number of elements in the list, making it perform extremely poorly for even lists of size, like, five.
A cool optimisation would be to calculate the chance to order the lines correctly, and to reject a correct solution with that probability. Hope this helps to sort your 5 items in less time!
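I haven't seen the SIGBOVIK code itself, but a rough Python sketch of the idea (shuffle the lines of the bogosort source, try to run whatever comes out, then check the list) could look like this:

import random

BOGOSORT_LINES = [
    "def bogosort(lst):",
    "    while sorted(lst) != lst:",
    "        random.shuffle(lst)",
    "    return lst",
]

def bogoception_sort(lst):
    while True:
        lines = BOGOSORT_LINES[:]
        random.shuffle(lines)                  # shuffle the code, not (yet) the data
        scope = {"random": random}
        try:
            exec("\n".join(lines), scope)      # most orderings won't even parse
            result = scope["bogosort"](list(lst))
        except Exception:
            continue                           # broken program: reshuffle and retry
        if result == sorted(result):
            return result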
does anyone else ever get annoyed at Quick Sort being called Quick Sort, like that just feels unfair to the rest of the sorts. why isn't it called like "Partition Sort" or something
and like it Clearly has weaknesses. it is horrible on an already sorted list.
Pivot sort is more descriptive I think
I think it is due to the fact that it is one of the fastest algorithms known, so they just called it quick sort and got it done with.
Rectangle sort because the sub-lists are rectangular
You'll have to look at computer history to get an answer. Long story short Tony Hoare (pronounced "hor" - he's British) invented it because his insertion sort implementation wasn't fast enough for some software he created. And it was quite a bit faster than insertion sort, hence the name. And the rest is history.
Edit: this was back in 1959, which is an important detail, since not all well-known sorting algorithms had been invented yet.
This series of videos inspired me to create a sorting algorithms visualization that runs on my CASIO graphical calculator, I implemented 16 different algorithms and it was really fun. Thank you.
Great video, very helpful and interesting.
you are interesting too, implemented 16 algorithms on a CASIO
20:38 there is an among us hidden in the purple bar
yeah i know
Didn't notice that!
Really something among us
went looking for this comment
There goes my plan to make a sorting algorithm explanation. I can just redirect people here now.
There goes the ideas, being used by others
Just checked out your channel because of this comment. Did subscribe.
I don’t understand any of how block sort works but I’m glad computers do
gotta admit, 80% of block sort flew over my head after sqrt, but i loved this entire video anyway, thank you so much
Forever proud of actually using bogosort back in uni and getting it accepted
Please elaborate
I need the story. Please.
sorting algorithm i made (and probably many others too)
so, i started with bogo, but then tweaked the randomiser function.
it was originally picking 2 random values and swapping them.
i changed that "swapping" to "comparing" them (and only swapping if they're out of order).
i don't know what to call it, but it does work quite well as a sorting algorithm.
I think it's called either bubble bogo sort or exchange bogo sort
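A minimal Python sketch of that variant (I'm guessing at the detail that the two values only get swapped when they're out of order, since that's what makes it converge):

import random

def exchange_bogosort(lst):
    # keep picking two random positions and swapping them only if they
    # are out of order, until the whole list is sorted
    while any(lst[k] > lst[k + 1] for k in range(len(lst) - 1)):
        i, j = random.randrange(len(lst)), random.randrange(len(lst))
        if i > j:
            i, j = j, i
        if lst[i] > lst[j]:
            lst[i], lst[j] = lst[j], lst[i]
    return lst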
bubble sort and shaker sort are definitely the most intuitive for me, as i've unknowingly been doing smth very similar my whole life for real-world situations!
I think insertion sort is more intuitive than bubble sort. Bubble is easier to code, but it's harder to convince yourself that it works
i use radix lsd base 2 sort irl
Once again, your explanation for Grailsort makes me smile ❤
Musicombo
kuvina, patricia taxxon and jan misali should all collaborate sometime
Someone should make a paranoid sort algorithm: like bubble sort, but it swaps items a random number of times just to make sure they're actually swapped, and it should have a save function it spams just in case it crashes. You could also make it randomly mess up or start over completely, maybe even go through twice and compare the two finished sorts to see if it got the same outcome before deciding whether it's sorted or not.
Brilliant.
genuinely I love this so much. I do not know enough math to keep up with your descriptions 100% of the way, but what I can parse is genuinely very interesting. I love sorting algorithms, and I love learning more about how they work, even if I can never fully understand it. Thank you so much for this video! I was enraptured all the way through.
Ahhh I lied I was actually still watching - near the beginning - when I wrote this but by god I am still enraptured. I'm going to start commenting on the little things I'm enjoying as I go along, because there are many, and I couldn't stop myself at just the one comment. First of all: I love your explanation of the use cases for these algorithms. Or, well, I'm currently just in merge sort, I'm unsure if you keep doing it down the line, but still! it's cool to know the pros and cons of each sort, and why one algorithm would be used over another, as in your city name sorting example.
20:38 >:0!!!!!
A variation on quantum bogo sort (without the universe destruction):
Step 1) go through the entire list to see if it’s sorted, also counting what n is in the process
Step 2) with n! parallel processors and n! auxiliary arrays, distribute each element evenly into each open spot in each array, which guarantees that each array is distinct*
Step 3) because each auxiliary array is necessarily distinct, and we have n! of them, exactly one must be sorted. Simply use all our parallel processors to comb through them simultaneously to find the sorted list.
Boom, the fastest average-case sorting algorithm possible (time complexity of n). The only issue would be the space and processing it requires…
*if the list doesn’t contain strictly distinct values, there will be multiple auxiliary arrays which are sorted, but still only one that is sorted stably
We can make this algorithm stable by taking the first auxiliary array (which is necessarily just a copy of the original list) and using it as a "stable" memory storage to help find the one true stably sorted list
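Without n! processors handy, here's a sequential Python sketch of the same idea (permuting indices instead of values keeps every candidate distinct and makes the stability check from the footnote easy; obviously only sane for tiny lists):

from itertools import permutations

def all_orders_sort(lst):
    # conceptually each of the n! candidate orders goes to its own processor;
    # here we just scan them one after another
    for order in permutations(range(len(lst))):
        values = [lst[i] for i in order]
        in_order = all(values[k] <= values[k + 1] for k in range(len(values) - 1))
        # stability: among equal values, the original indices must stay ascending
        stable = all(values[k] < values[k + 1] or order[k] < order[k + 1]
                     for k in range(len(values) - 1))
        if in_order and stable:
            return values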
Hey, just to letcha know: you are more than welcome to join The Studio so you can stay updated on Holy Grailsort's progress (once we come out of hiatus, which is hopefully soon)! ❤❤
Idk if this would make it faster, but you could try picking the first and last element and move them inwards, swapping elements that are out of order
That would reverse a descending array and insertion sort would finish the sort, and it would also get rid of lots of patterns
@bitonic589 That would break stability, unfortunately, but it's still a clever idea! You would have to implement it like Timsort does, but block merge sorts don't work off of pre-existing runs of sorted data.
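A quick Python sketch of that pre-pass idea, followed by a plain insertion sort (just my reading of the suggestion; as the reply says, the pre-pass breaks stability):

def inward_prepass_then_insertion(lst):
    # pre-pass: pair the ends and move inwards, swapping out-of-order pairs;
    # a fully descending list becomes fully ascending after this
    i, j = 0, len(lst) - 1
    while i < j:
        if lst[i] > lst[j]:
            lst[i], lst[j] = lst[j], lst[i]
        i += 1
        j -= 1
    # then insertion sort finishes the job
    for k in range(1, len(lst)):
        item = lst[k]
        m = k - 1
        while m >= 0 and lst[m] > item:
            lst[m + 1] = lst[m]
            m -= 1
        lst[m + 1] = item
    return lst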
I’m learning math and science for college majors at 10:30pm. I feel proud.
I don’t even know how many times I’ve rewatched this video by now
1:52:30 Quantum bogosort is actually implementable, but would be O(2^n) in all cases, since you need to spend time creating those 2^n “worlds” to destruct.
There is another interesting sorting algorithm, which is the “differentiable sorting” algorithm. It takes in a list and returns the permutation required to make it sorted, but the entire algorithm can be differentiated (needed in ML and for incremental computation).
actually O(n!)
Holy crap. I've been studying computers for years, and always had a soft spot for quicksort, and yet, this is the first I've ever seen the sort-in-place strategy you detailed. I always thought each round would require copying all elements less than the pivot to a new list, and all elements greater to another new list, essentially requiring O(nlogn) memory.
i haven't watched the video fully yet, but what amazed me now is the in-place implementation of Quicksort. I'd usually make auxiliary arrays around the pivot point, write the compared values there and then sort the auxiliary arrays.
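For reference, one common in-place partition scheme (Lomuto, last element as pivot) in Python; not necessarily the exact scheme the video uses, but it shows how quicksort can get by with swaps plus only the recursion stack:

def quicksort_in_place(lst, lo=0, hi=None):
    if hi is None:
        hi = len(lst) - 1
    if lo >= hi:
        return lst
    pivot = lst[hi]                 # last element as pivot, for simplicity
    boundary = lo                   # everything left of boundary is <= pivot
    for k in range(lo, hi):
        if lst[k] <= pivot:
            lst[k], lst[boundary] = lst[boundary], lst[k]
            boundary += 1
    lst[boundary], lst[hi] = lst[hi], lst[boundary]   # pivot lands in its final spot
    quicksort_in_place(lst, lo, boundary - 1)
    quicksort_in_place(lst, boundary + 1, hi)
    return lst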
I came looking for one of those “every __ explained” videos but i got something much better
I think I came up with a new sorting algorithm
The way it works is by comparing the farthest-left two pieces, and if they are not correctly sorted it deletes both of them, then moves one to the right, and repeats until it is sorted
but don't you lose data from doing so?
This is similar to stalin sort
I cannot believe the amount of work and attention to detail plus the succinct, concise, and sensible quick-tutorial on asymptotic notations. In fact, I happen to be learning about it in grad-level CS algo class rn. Your video has helped me immensely and in total contrast to the quest for faster algorithm, I hope your channel grows in astronomical Big O! ❤
Stumbled across this awesome video and liked it 5 minutes in. It’s great, but I would suggest adding a touch more emotion in to it. Great video!
Great work, congratulations. Watching it just once is certainly not enough, but my level of understanding has increased again.
I once needed to sort a list, but didn't know any sorting algorithms, so I accidentally wrote bubble sort.
22:10 I may have mentioned this in the original video, but radix sort *can* be used on strings (as long as characters have a fixed-size representation). It's most efficient with fixed-size strings, but can even be used on variable length strings.
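A small Python illustration of LSD radix sort on fixed-width strings (assumes byte-range characters and equal lengths; variable-length strings need padding or an MSD variant, and this isn't the video's code):

def radix_sort_strings(words, width):
    # stable bucket pass on each character position, last position first
    for pos in range(width - 1, -1, -1):
        buckets = [[] for _ in range(256)]        # one bucket per byte value
        for w in words:
            buckets[ord(w[pos])].append(w)
        words = [w for bucket in buckets for w in bucket]
    return words

print(radix_sort_strings(["dog", "cat", "bat"], 3))   # ['bat', 'cat', 'dog']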
bitonic sort visualizes the swaps that are needed to make a belt balancer in the video game Factorio lol
Fantastic Work !!
very impressive
I'm so glad you included my favorite sorting algorithm, miracle sort!
One flaw with Quantum Bogo Sort is that you can't use a traditional RNG function because they're deterministic. You have to use an RNG function that is dependent on true randomness
i will not need this information. but it begs to be watched
i am trying to make a sorting visualizer in python by using your terminal and using pygame for the sounds. i didn't understand many sorting algorithms but this helped me understand some of the algorithms. i also included one of your sorting algorithms (baiai sort) inside. thank you for the explanation and peace.
I'm pretty sure I said this on the original video, but when we got to the sorting networks and bitonic, my mind goes to Factorio belt management theory.
Thank you so much for this in-depth video. My only knowledge/exposure to sorting algorithms before this were those meme videos where sorting algorithms make funny sounds. Now I have come away confused yet mystified, and with favorite sorting algorithms being Pancake Sort and Power Sort.
very enjoyable, thank you. shell sort is indeed a favorite.
20:38
Quite suspicious indeed
49:56 this feels so much like a meme template and i love it
I am less than a minute into the video and I need you to know that I love you
Now I can understand the things
Tbh you really don’t need to care about space complexity TOO much, because if you count the memory needed to store the original array, all algorithms in this video would become O(n) space complexity, thus merge sort is good enough
you should do a longer video about joke algorithms (especially more obscure ones like hanoi sort), theyre very fun
each algorithm has a little icon !? very cute i love it
Minor typo - 1:05:15 says O(nlgon) instead of O(nlogn) in the magenta rectangle
Honestly quite incredible
Great work. Thanks
new Kuvina video! I already love it
1. Quicksort can include smarter pivot-selection techniques to guarantee O(n*log(n)) time in the worst case.
2. Shellsort can be O(n*log(n)^2) if you choose the sequence of gaps more carefully.
Additional details in replies.
Explanation for 1: there is an algorithm called "median of medians." It is an O(n) algorithm that finds some value in the list that is greater than (or equal to) at least 30% of the others in the list, and also less than (or equal to) another 30% of them. By using it to choose pivots, we will always shrink the list by a constant factor on each step, guaranteeing logarithmically-many recursive steps.
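A compact Python sketch of just that median-of-medians pivot-picking step (groups of 5; assumes a non-empty list, and sorting each tiny group is constant work per group). Quicksort would then partition around the returned value instead of a random pivot:

def median_of_medians(lst):
    # returns a value guaranteed to be >= roughly 30% of the elements
    # and <= roughly another 30% of them
    if len(lst) <= 5:
        return sorted(lst)[len(lst) // 2]
    groups = [lst[i:i + 5] for i in range(0, len(lst), 5)]
    medians = [sorted(g)[len(g) // 2] for g in groups]
    return median_of_medians(medians)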
Explanation for 2: if we choose the sequence of 3-smooth numbers, we never swap an element more than once on a given iteration. Since there are O(log(n)^2) 3-smooth numbers less than n, we perform that many linear-time iterations.
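And a sketch of Shellsort driven by that 3-smooth (Pratt) gap sequence, in Python:

def pratt_gaps(n):
    # all 3-smooth numbers (2^a * 3^b) below n, largest first
    gaps, p2 = [], 1
    while p2 < n:
        p3 = p2
        while p3 < n:
            gaps.append(p3)
            p3 *= 3
        p2 *= 2
    return sorted(gaps, reverse=True)

def shellsort_pratt(lst):
    for gap in pratt_gaps(len(lst)):
        # gapped insertion pass; with Pratt gaps there are O(log(n)^2) passes
        for i in range(gap, len(lst)):
            item, j = lst[i], i
            while j >= gap and lst[j - gap] > item:
                lst[j] = lst[j - gap]
                j -= gap
            lst[j] = item
    return lst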
I have a good joke sorting algorithm, Increment sort: basically, it compares adjacent pieces right to left, like reverse bubble sort, but if the left is greater than or equal to the right, it decrements the left by 1 and increments the right by 1. Not recommended for few unique values.
you make the best videos!
ive already seen all 4 videos, is there anything new in this one?
not really I just redid some audio and visuals to make it easier to watch, and added segues between the sections
How much sorting algorithm do you want?
Me: *_Yes_*
I like weave sort!
pairwise bogo sorting network: given a list X of size n, generate a new list P containing all ascending pairs of integers from 0 to n-1. shuffle P and use it to compare every pair of numbers in X, swapping them if necessary. if X isn't sorted throw your computer in the ocean or something idk
update: i made it and it's every bit as horrible as i had hoped
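In case anyone else wants to try it before consigning their computer to the ocean, a direct Python sketch of that description:

import random
from itertools import combinations

def pairwise_bogo_network(x):
    # all ascending index pairs, applied in a random order as compare-swaps
    pairs = list(combinations(range(len(x)), 2))
    random.shuffle(pairs)
    for i, j in pairs:              # i < j by construction
        if x[i] > x[j]:
            x[i], x[j] = x[j], x[i]
    if x != sorted(x):
        raise RuntimeError("throw your computer in the ocean")
    return x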
37:04 This is also the same algorithm used by Earthbound.
1:05:13 Typo! "and building it is O(nlgon)"
I'm impressed by how many people have noticed that. But I guess it shows people are really paying attention to the video!
Baiai sort can also be called Odd Even Insertion (because it's also "odd even"-ish).
Why does nobody get rid of the parts like “the rest are in part 2!”
do you really need a temp variable to swap values? I thought:
{
    a = 1;
    b = 2;
    a = a + b;   /* a now holds the sum, 3 */
    b = a - b;   /* b = 3 - 2 = 1, the old a */
    a = a - b;   /* a = 3 - 1 = 2, the old b */
}
now a = 2, b = 1
That's a cool method! Although I think it only works on numbers and has a negligible effect on performance, so usually we just stick with the general method.
yay new kuvina video :3
as pictures
Auxillerlilly
uhh...yeah?
You can improve gnome sort: when you turn back, put the index you turned back from into a variable. When the piece is at its correct destination, you can just jump back to the saved index.
At that point that’s insertion sort and you might as well use that instead.
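Here's one way that saved-index gnome sort could look in Python (my interpretation of the suggestion; as the reply points out, it ends up behaving a lot like insertion sort):

def gnome_sort_with_resume(lst):
    i, saved = 0, None              # saved = where we were before walking back
    while i < len(lst) - 1:
        if lst[i] <= lst[i + 1]:
            # either keep scanning forward, or jump straight back to the saved spot
            i = saved if saved is not None else i + 1
            saved = None
        else:
            lst[i], lst[i + 1] = lst[i + 1], lst[i]
            if saved is None:
                saved = i + 1       # remember where to resume after the walk back
            i = max(i - 1, 0)
    return lst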
29:22 - Is it a mistake that there's a 3 instead of a 4, or is it just a joke?
Wasn't there an algorithm that can solve any NP problem in its minimal time complexity by randomly generating algorithms and checking if their answer is correct?
It's just generalized bogosort, but it would have been worth a mention.
Are there different considerations based on properties of the data, like numerous pieces of data with the same values? In such a data set, is there anything of note happening when a secondary sort method is used? (Like sorting files by album title, and secondarily by name or track number?)
What about if the data is already partly sorted instead of random?
That's where adaptive algorithms come in, which are covered in part 3 !
For "A-then-B" you can just sort by A but break ties by B, or you can sort by B, then stable sort by A, or you can use a recursive procedure more like MSD radix sort.
@@NXTangl does anybody know how windows explorer does it?
@@LeoStaley Probably stable sort.
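Concretely, with the album/track example (hypothetical data; Python's built-in sort is stable, which is what makes the two-pass version work):

tracks = [
    {"album": "B", "title": "x", "track": 2},
    {"album": "A", "title": "y", "track": 1},
    {"album": "B", "title": "z", "track": 1},
]

# Option 1: one sort with a composite key (primary, then secondary)
one_pass = sorted(tracks, key=lambda t: (t["album"], t["track"]))

# Option 2: two passes -- secondary key first, then a stable sort on the primary key
two_pass = sorted(tracks, key=lambda t: t["track"])
two_pass = sorted(two_pass, key=lambda t: t["album"])

assert one_pass == two_pass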
Me at 12AM:
4:59
"...Yeah."
yes.
I think I saw somewhere that the time complexity of Shell sort is O(n (log n)^2), which is roughly n^1.2, but I couldn't tell you why that's the time complexity
O(n log^2 n) is a smaller time complexity than n^1.2 or even n^1.00001
this is so good
Radix sort could work on any elements drawn from a finite set of possible values.
Fun fact, Bill Gates published a really neat paper on pancake sort! I wish I was smart enough to understand it. I'd watch a video of someone explaining that paper online.
You're the best
cool now i can watch this when binging it the 581st time
Isn't MSD radix sort faster than LSD because you can cut short and not examine every digit?
Was hoping to see an explanation of shatter sort 😢
There are basically no explanations of it online
this is fucking cinema
identity crisis sort should randomly start off with quick or merge
1:13:40 Like CycleSort's cycling method 😮
please make your videos darker i need to know the details of every sorting algorithm but my eyes hurt
which one is most used in practice?
1:52:40 I guess it isn't literally named after a real-world genocide perpetrator for nothing...
Go figure with how destructive it is.
I'm only at the start of the video right now, but I just want to note that ska sort doesn't seem to be included.
That's 117 minutes
Here is my case 3 for block sort: There is no g.
It explicitly stated, “if necessary, introduce gap g so A and B have no common values.” If A and B had no common values to begin with, there’s no need to introduce gap g in the first place.
radix sort is so cool
gnome sort is my favorite
When I learnt about insertion sort I came up with binary sort lol
now explain every shuffling algorithm
It's 117 minutes and 33 seconds! How dare you lie!
Can't wait to get hired at Google/Facebook/Papa Johns
What is a pivot???????????????