When choosing a sorting algorithm, you probably weigh factors like speed, efficiency, or fit for your purpose, but have you ever considered the "why not" factor? Why not have an algorithm that takes a nap every iteration, or one that snaps half your input out of existence? If the "why not" factor is what you're looking for in your next enterprise-scale project, I've got five sorting algorithms just for you.
Bogosort
Have you ever thought about combining your love of slot machines with your love of sorting lists of inputs? No? Well, too bad, because I've got the algorithm just for you: bogosort. Bogosort works by randomizing the order of the input you give it and then checking whether it's sorted. If the input is not sorted, it randomizes it again, and again, until the input is sorted. You can think of it as pulling the lever on a slot machine over and over and over again until you hit it big.
// Returns true if arr is in non-decreasing order
function isSorted(arr) {
  for (let i = 1; i < arr.length; i++) {
    if (arr[i - 1] > arr[i]) {
      return false;
    }
  }
  return true;
}

// Fisher–Yates shuffle: uniformly randomizes arr in place
function shuffle(arr) {
  for (let i = arr.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1));
    [arr[i], arr[j]] = [arr[j], arr[i]];
  }
  return arr;
}

// Keep shuffling until we get lucky
function bogosort(arr) {
  while (!isSorted(arr)) {
    arr = shuffle(arr);
  }
  return arr;
}
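A quick usage sketch, assuming you keep the input tiny (larger inputs require a spare eternity):

// With 3 elements there are only 3! = 6 orderings to gamble through
console.log(bogosort([3, 1, 2])); // [1, 2, 3], eventually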
Time Complexity
Now I know what you’re thinking, time complexity isn’t necessary here since it’ll run perfectly every time! I completely agree, but for the sake of our engineering curiosity let’s see what kind of numbers we’ll be putting up with bogosort.
Average Case Scenario:
O(n!) AKA 😱 Sort. How is this the case? With n distinct elements there are n! possible orderings and only one of them is sorted, so on average we expect about n! shuffles before we hit the jackpot. Each randomization is a reset button: previous shuffles have no effect on future ones. (Pedants will note that each shuffle-and-check costs O(n), making the full bill O(n × n!), but this is more of a statistics problem, and luckily we're computer scientists, so we don't have to worry about that.)
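If you'd like to watch the factorial blow-up for yourself, here's a rough instrumentation sketch of my own (the averageShuffles helper is hypothetical, not part of bogosort proper) that averages the shuffle count over many trials:

// Sketch: empirically estimate the average number of shuffles bogosort
// needs for arrays of length n (reuses isSorted and shuffle from above)
function averageShuffles(n, trials = 1000) {
  let total = 0;
  for (let t = 0; t < trials; t++) {
    // Fresh random array of n (almost surely distinct) values
    let arr = Array.from({ length: n }, () => Math.random());
    while (!isSorted(arr)) {
      arr = shuffle(arr);
      total++;
    }
  }
  return total / trials;
}

console.log(averageShuffles(4)); // expect about 4! = 24 shuffles per trial
console.log(averageShuffles(5)); // expect about 5! = 120 — see the trend?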
Worst Case Scenario:
O(∞) AKA Heat Death. In the worst case we might need to leave our servers running overnight, or until the heat death of the universe. In my mind, this is what Isaac Asimov had in mind when he wrote The Last Question. Because every shuffle is independent, there is no guarantee we ever produce a sorted array, but let's not think about that nightmare 👀.
Best Case Scenario:
O(n) AKA The Promised Land. If our prayers are answered and the input arrives already sorted (or the very first shuffle lands one), all we pay for is the single O(n) pass that confirms it. Our first loop will be the only one we need! Now, how can you argue with those odds 👀.
Thanos Sort
This one I found while reading one of the most trusted sources on the internet, the YouTube comment section, so if you're a Marvel fan you'll be happy to use this one in production. Thanos sort works exactly how you think it does, unless you thought wrong: you randomly snap half of the input out of existence until what remains is sorted.
// Snap away half of the remaining elements until what's left is sorted
function thanosSort(arr) {
  while (!isSorted(arr)) {
    // Perfectly balanced: remove random elements until half remain
    const survivors = Math.ceil(arr.length / 2);
    while (arr.length > survivors) {
      const randomIndex = Math.floor(Math.random() * arr.length);
      arr.splice(randomIndex, 1);
    }
  }
  return arr;
}
Time Complexity
O(log n) snaps AKA Perfectly Balanced. Since each snap, for the good of the universe, halves the input, at most log₂(n) snaps leave you with a single, trivially sorted survivor, so there is no permutation-chasing here at all. The real drawback is that you will be losing data (50% per snap). However, that's a small price to pay to keep your inputs perfectly sorted, as all things should be.
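Here's a rough demo of that "small price"; your survivors will vary from run to run:

console.log(thanosSort([5, 3, 8, 1, 9, 2, 7, 4]));
// e.g. [1, 9] — sorted, but half the universe is gone (twice)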
Sleep Sort
We always hear about the importance of using multithreading to increase performance in our applications. Well, lucky for you, I'm here to teach you another multithreading algorithm that is guaranteed (due to further review by my lawyers, I cannot make any more guarantees) to impress your boss. Sleep sort works by creating a thread (or, in JavaScript's case, a timer) for each element of a given input. Each thread is then put to sleep for a duration proportional to its element's value. When a thread wakes up, it inserts its value into the next available slot of a new array. So an array of [10, 5, 7, 8] will create four threads: the first sleeping for 10 milliseconds, the second for 5, the third for 7, and the fourth for 8. The 5-millisecond thread wakes up first, so it inserts into the new array first, and so on and so forth.
// Each value schedules its own wake-up call; smaller values wake first
function sleepSort(arr) {
  return new Promise(resolve => {
    const sorted = [];
    const delay = 10; // milliseconds of sleep per unit of value
    arr.forEach(num => {
      setTimeout(() => {
        sorted.push(num);
        // Once the last timer fires, the new array is complete
        if (sorted.length === arr.length) {
          resolve(sorted);
        }
      }, num * delay);
    });
  });
}
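Since this version hands back a Promise, usage looks like:

sleepSort([10, 5, 7, 8]).then(sorted => {
  console.log("Sorted array:", sorted); // [5, 7, 8, 10]
});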
Time Complexity
The time complexity here depends on the delay multiplier (10 milliseconds per unit in my example), the largest value in the input, and the processing power of your computer, rather than on the number of elements. However, one teeny tiny downside, if you can even call it that, is that a single large value can stall an otherwise trivial input. For example, the input [9999999, 1] will take roughly 28 hours to finish, since the 9999999 thread won't wake up until then. Also, another super teeny tiny flaw is that you can't sort negative inputs, because setTimeout can't fire in the past. However, this is a small price to pay for a multithreading solution 👀.
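If you absolutely must sleep sort negative numbers (please don't), one workaround is to shift everything up by the minimum value first. This is a sketch of my own, not part of canonical sleep sort, and it builds on the Promise-based version above:

// Sketch: shift values so the smallest becomes 0, sleep sort the shifted
// copies, then shift the result back down. Still a terrible idea.
function sleepSortWithNegatives(arr) {
  const min = Math.min(...arr);
  return sleepSort(arr.map(num => num - min))
    .then(sorted => sorted.map(num => num + min));
}

sleepSortWithNegatives([3, -2, 0, -5]).then(sorted => {
  console.log(sorted); // [-5, -2, 0, 3]
});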
Miracle Sort
This is the sorting algorithm for all those who are truly faithful. It relies on an external miracle to sort whatever input you give it: the algorithm continuously checks whether the input has been sorted by the work of some divine intervention, or until, once again, heat death engulfs what is left of humanity.
// Check forever; the actual sorting is outsourced to a higher power
function miracleSort(arr) {
  let sorted = false;
  while (!sorted) {
    sorted = true;
    for (let i = 1; i < arr.length; i++) {
      if (arr[i - 1] > arr[i]) {
        // No miracle yet; keep the faith and check again
        sorted = false;
        break;
      }
    }
  }
  return arr;
}
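Usage is a matter of faith; a small sketch:

console.log(miracleSort([1, 2, 3])); // already blessed; returns immediately
// miracleSort([3, 1, 2]); // uncomment only if you truly believe (this spins forever)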
Time Complexity
O(Miracle) or O(🙏) for the young engineers. Since this algorithm relies on some phenomenon to sort our input, the time complexity is also dependent on how quickly that external factor sorts it. I would like to think that divine intervention is an O(1) time complexity, but since god works in mysterious ways we will never know.
Slow Sort
Have you ever thought, "Hmmm, why don't we make our sorting algorithms slow on purpose?" No? You haven't? Well, this algorithm is just for you anyway. Most of us are familiar with the divide-and-conquer family of sorting algorithms: algorithms that break down (divide) inputs into smaller subsets, sort them (conquer), and then merge them back together. But what if I told you divide and conquer has a cooler twin called multiply and surrender? Slow sort falls into the latter camp. Slow sort works by dividing the input into two halves and recursively sorting each. Once the two halves are sorted, it moves the maximum element to the end of the range. But instead of merging the two halves, it then recursively slow sorts everything except that final maximum, doing the entire dance all over again.
// Sorts arr[i..j] in place, as slowly as possible (multiply and surrender)
function slowSort(arr, i, j) {
  if (i >= j) {
    return;
  }
  const m = Math.floor((i + j) / 2);
  // Recursively "sort" both halves
  slowSort(arr, i, m);
  slowSort(arr, m + 1, j);
  // Move the maximum of the range to the end
  if (arr[j] < arr[m]) {
    [arr[j], arr[m]] = [arr[m], arr[j]];
  }
  // Surrender: redo everything except the final element
  slowSort(arr, i, j - 1);
}
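Since slowSort takes a range, you call it with the first and last indices:

const nums = [5, 1, 4, 2, 3];
slowSort(nums, 0, nums.length - 1); // sorts in place
console.log(nums); // [1, 2, 3, 4, 5], in no particular hurry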
Time Complexity
As you may have already guessed, it's bad, but here's a twist: for once it isn't n!. The number of calls slow sort makes follows the recurrence T(n) = T(⌈n/2⌉) + T(⌊n/2⌋) + T(n − 1) + 1, which works out to roughly n^(log₂(n)/2): quasi-polynomial, meaning slower than any polynomial you can name, yet still shy of our beloved n😱. Unlike some of the other ones, I'm unable to think of any mental gymnastics to save the time complexity of this one; that, and it's 2 A.M. as I write this, so I'm barely conscious enough to figure out what keys I'm pressing, let alone verify the time complexity of a hypothetical thought experiment.
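For fellow insomniacs who want receipts, here's a rough sketch of my own (the calls helper is hypothetical instrumentation, not part of slow sort) that evaluates that call-count recurrence directly, so you can watch the growth without actually running slow sort:

// Sketch: count slow sort's recursive calls for a range of size n,
// i.e. C(n) = C(ceil(n/2)) + C(floor(n/2)) + C(n - 1) + 1
const memo = new Map();
function calls(n) {
  if (n <= 1) return 1;
  if (memo.has(n)) return memo.get(n);
  const c = calls(Math.ceil(n / 2)) + calls(Math.floor(n / 2)) + calls(n - 1) + 1;
  memo.set(n, c);
  return c;
}

console.log(calls(10)); // 247 calls for just 10 elements
console.log(calls(20)); // 2800 — quasi-polynomial, not factorial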
Conclusion
I have gifted each and every one of you at least one algorithm to try out in your production builds, so I am excited to read about any future promotions or mainframe meltdowns. Please use these algorithms wisely 🙏.
