[HN Gopher] Unconventional Sorting Algorithms
___________________________________________________________________
Unconventional Sorting Algorithms
Author : devev
Score : 59 points
Date : 2021-10-21 16:00 UTC (7 hours ago)
(HTM) web link (codingkaiser.blog)
(TXT) w3m dump (codingkaiser.blog)
| throw3849 wrote:
| Australia sorting algorithm:
|
| Stay at your index!!!
| NotAnOtter wrote:
| Isn't 'sleep sort' linear with respect to array size?
|
| O(n) sort achieved?
| [deleted]
| boothby wrote:
| It's perversely linear with respect to the largest element in
| the array, if that's greater than the runtime of the underlying
| heapsort. Good luck with negative entries.
|
| Response to dead comment:
|
| > Couldn't this be solved by scaling everything by the largest
| entry? Of course that requires a first pass to find the largest
| entry, but that's not as expensive?
|
| One presumes that we don't have infinite precision timers, so
| you should only scale it down so the smallest gap between
| elements is a few clock cycles (unless you're satisfied with
| breaking the fiction that sleepsort is not heapsort). This
| seems like a mildly interesting question. My intuition is that
| finding the smallest abs(a[i]-a[j]) with i != j should be just
| as hard as sorting the list on a classical computer, but it
| seems like a quantum computer might find the smallest gap
| faster.
| kruxigt wrote:
| Couldn't this be solved by scaling everything by the largest
| entry? Of course that requires a first pass to find the largest
| entry, but that's not as expensive?
| dfox wrote:
| It is if you ignore how the underlying OS implements
| multitasking and sleep() in particular. And the kernel-side
| implementation usually involves some kind of priority queue of
| sleeping threads. So taken as a whole, sleep-sort is just a
| convoluted implementation of heap-sort that outsources the
| actual sorting to the OS.
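|
| A minimal sketch of the idea in Python (threads and the 10 ms
| scale factor are arbitrary choices here; as noted above, the
| scheduler's timer queue is what actually does the sorting):
|
| import threading, time
|
| def sleep_sort(a):
|     out = []
|     lock = threading.Lock()
|     def worker(x):
|         time.sleep(x * 0.01)  # delay proportional to the value
|         with lock:
|             out.append(x)     # values wake up in sorted order
|     threads = [threading.Thread(target=worker, args=(x,)) for x in a]
|     for t in threads:
|         t.start()
|     for t in threads:
|         t.join()
|     return out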
| thehappypm wrote:
| I really like sleep sort because it trades (fundamentally)
| cycles and memory for time.
|
| In theory you could do this in hardware. Hardware timers are
| pretty trivial, you could imagine a hardware chip that
| accepts a number, then writes the value to the next free slot
| in an array when the time has elapsed. As timers go off, the
| array fills up. Obviously you need to handle collisions and
| whatnot but nothing in this screams that you need to go
| O(n^2).
| Sharlin wrote:
| You need _n_ separate programmable timers, though. If you
| only have a constant number of timers, then you're back to
| implementing something approaching a priority queue to
| schedule the events (possibly in hardware if you want)
| dmurray wrote:
| The kernel could use a different implementation of
| multitasking, e.g. keep the threads in a circular doubly
| linked list instead of a priority queue. This wouldn't be
| optimal for a general-purpose OS, but works for our custom
| O(n) sorting machine.
| scarmig wrote:
| def everett_sort(list):
|     shuffle(list)
|     if is_sorted(list):
|         return
|     else:
|         suicide()
| boothby wrote:
| I recently found a fun linear-time systolic sorting algorithm.
| I'd describe the serial implementation as unconventionally
| obvious. It's like insertion sort, without all the bother of
| swapping.
|
| def index_sort(a):
|     n = len(a)
|     output = [None] * n
|     for i in range(n):  # can run in parallel, damn the GIL
|         index = 0
|         for j in range(n):
|             if a[j] < a[i]:
|                 index += 1
|             elif a[j] == a[i] and j < i:
|                 index += 1
|         output[index] = a[i]
|     return output
| Snild wrote:
| > linear-time
|
| But... there are two nested `range(n)` for loops. Typo?
| boothby wrote:
| That's the serial implementation, which is clearly O(n^2).
| The systolic algorithm runs in linear time on specialty
| hardware (similar to a tensor core, which can do comparisons
| in a square matrix, and can perform row-sums). Or, if we
| ignore communication overheads, it can run in O(n) time with
| O(n) parallel workers.
| Sohcahtoa82 wrote:
| I like Intelligent Design sort:
|
| > The probability of the original input list being in the exact
| order it's in is 1/(n!). There is such a small likelihood of this
| that it's clearly absurd to say that this happened by chance, so
| it must have been consciously put in that order by an intelligent
| Sorter. Therefore it's safe to assume that it's already optimally
| sorted in some way that transcends our naive mortal understanding
| of "ascending order". Any attempt to change that order to conform
| to our own preconceptions would actually make it less sorted.
|
| https://www.dangermouse.net/esoteric/intelligentdesignsort.h...
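|
| In code, the whole "algorithm" is presumably a no-op, something
| like:
|
| def intelligent_design_sort(a):
|     # The list is already optimally sorted in a way that
|     # transcends our naive understanding of "ascending order".
|     return a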
| r0f1 wrote:
| What about this one? This one was mentioned a couple of weeks
| back and is really funny.
|
| https://news.ycombinator.com/item?id=28758106
| fjfaase wrote:
| Usage sort: about sorting a sequence of numbered books while you
| pick books out of the sequence when you need them and insert
| them back afterwards (moving the books in between). What is the
| best strategy for where to put the books back? For a discussion,
| see: https://www.iwriteiam.nl/Dpuzzle.html#usagesort
| axiosgunnar wrote:
| Slightly off-topic, the article was fine except for one nitpick...
|
| I wonder what the author, "hilariously" making one sorting
| function print "<val> sent to gulag" and calling it "stalin-
| sort", would think of "hitler-sort" which prints "<val> burned in
| oven"?
|
| Please don't normalize mass-murderers, even for memes.
| karaterobot wrote:
| It's an obvious joke, the point of which is that the Stalin
| Sort is destructive and doesn't actually get you what you want,
| and you'd never want to use it.
| googlryas wrote:
| I think OP gets the joke and is just saying it is in poor
| taste. Is it particularly different than their example of
| Hitler-sort burning stuff in an oven on failure? Or
| Columbine-sort, where the 15 coolest numbers get shot and
| then the rest of the numbers can get sorted.
| axiosgunnar wrote:
| If you downvote, you must write what is wrong with that
| statement.
| michaelmior wrote:
| Quantum bogosort is probably my personal favorite. Randomly
| permute the input using a quantum source of entropy. If the input
| is not sorted, destroy the universe. The only universe remaining
| is one where the input was already sorted. Runs in O(n) time.
| Sharlin wrote:
| (the problem of destroying the universe left as an exercise to
| the reader)
| wahern wrote:
| For the operation of this algorithm, is there any meaningful
| difference between destroying the universe and destroying the
| observer(s)?
| https://en.wikipedia.org/wiki/Quantum_suicide_and_immortalit...
|
| EDIT: If you allow for the objects to spontaneously sort
| themselves, the runtime is O(1).
| drdeca wrote:
| Surely O(n) for checking whether it is sorted?
| btheshoe wrote:
| (the problem of destroying which universe is also left as
| an exercise to the reader)
| contravariant wrote:
| You could just destroy all universes to ensure it's
| sorted in the remaining reality.
| michaelmior wrote:
| I'm confused. What reality is remaining if all universes
| are destroyed?
| contravariant wrote:
| Even without destroying the universe you can sort your list
| with a quantum source of entropy, just use:
|
| def cosmicraysort(list):
|     while not is_sorted(list):
|         pass
| forinti wrote:
| My contribution: Power Sort.
|
| Start with p=0 and add 2^n for each element n in the list. Then
| successively subtract the largest remaining power of 2 to get
| your integers back in order.
|
| Con: p will be very big.
|
| Pro: you don't need ifs!
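|
| A minimal sketch of this idea (assuming distinct non-negative
| integers; duplicates are a problem, as noted below):
|
| def power_sort(a):
|     p = 0
|     for x in a:
|         p += 1 << x              # add 2^x for each element x
|     out = []
|     while p:
|         hi = p.bit_length() - 1  # largest remaining power of 2
|         out.append(hi)
|         p -= 1 << hi
|     return out[::-1]             # collected descending, so reverse
|
| power_sort([3, 0, 5, 2]) == [0, 2, 3, 5]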
| Paedor wrote:
| Fun fact, when you consider the bits required to contain p,
| this is equivalent to bucket sort with a bucket size of 1.
| Snild wrote:
| Ooh, that's a fun one.
|
| Another con: you'd better hope your input contains no
| duplicates. :)
| forinti wrote:
| Easily solvable with a hash.
|
| I have thought this through (I'm ashamed to admit).
| dbaupp wrote:
| Duplicates could also be handled by using (L+1)^n, rather
| than 2^n, where L is the length of the input: even if all
| elements are identical one will end up with a unique value p
| = L * (L+1)^n.
|
| For an efficient implementation, one might want to round L+1
| up to the nearest power of 2 to get _crucial_ micro-
| optimisations based on instructions for bit scanning.
|
| (I think this ends up being a very complicated phrasing of a
| counting sort.)
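|
| A sketch of that variant (assuming non-negative integers; as the
| parenthetical says, it is counting sort in disguise):
|
| def base_power_sort(a):
|     base = len(a) + 1
|     p = 0
|     for x in a:
|         p += base ** x           # digit x of p counts occurrences of x
|     out, v = [], 0
|     while p:
|         p, count = divmod(p, base)
|         out.extend([v] * count)
|         v += 1
|     return out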
| dragontamer wrote:
| Bogosort is always a fun one, because I think it's one of the
| better introductions to NP-complete problems.
|
| If you imagine that there's no "easy" algorithm to return a
| sorted list (but sorted lists do exist), then the only
| methodology possible is to try all combinations. And you do this
| either systematically, or randomly.
|
| Random NP-complete algorithms, such as WalkSAT, are quite good in
| practice. Systematic NP-complete algorithms, such as DPLL-SAT,
| are more "obvious" in how they work, and are sufficient for
| smaller sizes.
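|
| A minimal sketch of both flavours of brute-force search over
| permutations (factorial-time, so only for tiny inputs):
|
| import random
| from itertools import permutations
|
| def is_sorted(a):
|     return all(a[i] <= a[i + 1] for i in range(len(a) - 1))
|
| def random_search_sort(a):       # bogosort: try orderings at random
|     a = list(a)
|     while not is_sorted(a):
|         random.shuffle(a)
|     return a
|
| def systematic_search_sort(a):   # enumerate orderings in order
|     return next(list(p) for p in permutations(a) if is_sorted(p))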
| DeathArrow wrote:
| for i = 1 to n do
|     for j = 1 to n do
|         if A[i] < A[j] then
|             swap A[i] and A[j]
| unnouinceput wrote:
| fastest swap implementation, assembler:
|
| PUSH A[j]; PUSH A[i]; POP A[j]; POP A[i];
| unnouinceput wrote:
| I know this article is meant as a joke, but the first variant of
| bogosort, which is actually randomsort, will be the fastest
| algorithm once we start using quantum computing.
|
| Having an array and allocating "n"=="array length" qubits for it,
| then implementing a parallel version of randomsort (no while,
| while is serialization!), will be the fastest sorting algorithm.
| t34zesrg wrote:
| NIH admits Fauci lied to congress and funded covid-19:
|
| https://www.nationalreview.com/news/nih-admits-to-funding-ga...
| nayuki wrote:
| My favorite is still slowsort.
| https://en.wikipedia.org/wiki/Slowsort
|
| > It is a reluctant algorithm based on the principle of multiply
| and surrender. It was published in 1986 by Andrei Broder and
| Jorge Stolfi in their paper Pessimal Algorithms and Simplexity
| Analysis.
|
| https://www.mipmip.org/tidbits/pasa.pdf
|
| > The slowsort algorithm is a perfect illustration of the
| multiply and surrender paradigm, which is perhaps the single most
| important paradigm in the development of reluctant algorithms.
| The basic multiply and surrender strategy consists in replacing
| the problem at hand by two or more subproblems, each slightly
| simpler than the original, and continue multiplying subproblems
| and subsubproblems recursively in this fashion as long as
| possible. At some point the subproblems will all become so simple
| that their solution can no longer be postponed, and we will have
| to surrender. Experience shows that, in most cases, by the time
| this point is reached the total work will be substantially higher
| than what could have been wasted by a more direct approach.
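|
| The algorithm itself, as a short Python sketch of the paper's
| description (recursively sort both halves, reluctantly move the
| maximum to the end, then recurse on the rest):
|
| def slowsort(a, i=0, j=None):
|     if j is None:
|         j = len(a) - 1
|     if i >= j:
|         return
|     m = (i + j) // 2
|     slowsort(a, i, m)       # multiply: two slightly simpler subproblems
|     slowsort(a, m + 1, j)
|     if a[m] > a[j]:         # surrender: put the maximum at position j
|         a[m], a[j] = a[j], a[m]
|     slowsort(a, i, j - 1)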
| cratermoon wrote:
| My favorite is SleepSort
| https://rosettacode.org/wiki/Sorting_algorithms/Sleep_sort
| fauscist1984 wrote:
| NIH admits Fauci lied about funding Wuhan gain-of-function
| experiments
|
| https://www.washingtonexaminer.com/opinion/nih-admits-fauci-...
| WJW wrote:
| Not related to sorting algorithms.
| edgyquant wrote:
| This has been posted in multiple threads today
___________________________________________________________________
(page generated 2021-10-21 23:01 UTC)