Why is quicksort better than other sorting algorithms in practice? (2013)

tambourine_man | 117 points

There are annual contests for sorting large amounts of data, e.g.:

http://sortbenchmark.org/

(I could be wrong but I don't think the winners ever used Quicksort.)

amelius | 4 years ago

There are too many different tradeoffs and dimensions to the problem to say that one sorting algorithm is the "best". E.g. quicksort without a fallback is open to a denial-of-service attack: an adversary can construct input that forces its O(n^2) worst case.

Aardwolf | 4 years ago

Oh my. Comparison sorts are still slower than counting sorts on typical architectures, especially radix sort (coupled with insertion sort for small sizes).

Or burstsort if the data size is truly huge.

As usual, StackOverflow is missing the forest for the trees.

AstralStorm | 4 years ago
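
For concreteness, a rough Haskell sketch of LSD radix sort on non-negative Ints (bucketing one byte per pass; the names are mine, and real record-setting implementations scatter into mutable buffers rather than building lists):

    import Data.Bits (shiftR, (.&.))
    import Data.List (foldl')
    import qualified Data.IntMap.Strict as IM

    -- Least-significant-digit radix sort: 8 stable passes, one per byte of a
    -- 64-bit word. No comparisons between keys are ever made.
    radixSort :: [Int] -> [Int]
    radixSort xs = foldl' pass xs [0, 8 .. 56]
      where
        pass ys shift =
          let bucketOf y = (y `shiftR` shift) .&. 0xFF
              -- flip (++) keeps each bucket in input order; LSD radix sort
              -- relies on every pass being stable.
              buckets = IM.fromListWith (flip (++)) [ (bucketOf y, [y]) | y <- ys ]
          in concat (IM.elems buckets)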

If the input doesn't fit into memory, then merge sort is usually better, because its access pattern is sequential and we don't need to jump around.

burakcosk | 4 years ago
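
A small Haskell sketch of bottom-up merge sort to illustrate that access pattern (in-memory here, but the same structure is what external sorts use when merging sorted runs from disk):

    -- Start from single-element runs and repeatedly merge pairs of runs.
    -- Every merge only walks its two inputs front to back, i.e. sequentially.
    mergeSort :: Ord a => [a] -> [a]
    mergeSort = mergeAll . map (: [])
      where
        mergeAll []    = []
        mergeAll [run] = run
        mergeAll runs  = mergeAll (mergePairs runs)

        mergePairs (a:b:rest) = merge a b : mergePairs rest
        mergePairs rest       = rest

        merge xs [] = xs
        merge [] ys = ys
        merge (x:xs) (y:ys)
          | x <= y    = x : merge xs (y:ys)
          | otherwise = y : merge (x:xs) ys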

I checked the Haskell implementation of sort a while ago. Apparently it used to be Quicksort, which has a really elegant recursive implementation in Haskell, but they found that mergesort was actually faster. I wonder if it has anything to do with how a programming language implements lists, or if people just assume QuickSort is the fastest because of threads like these and don't bother trying out alternatives.

c3534l | 4 years ago
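
The "really elegant recursive implementation" is usually written roughly like this (note that it allocates new sublists at every level rather than partitioning in place, and that GHC's actual Data.List.sort is a merge sort):

    quickSort :: Ord a => [a] -> [a]
    quickSort []     = []
    quickSort (p:xs) = quickSort [x | x <- xs, x < p]
                       ++ [p]
                       ++ quickSort [x | x <- xs, x >= p]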

*for scalar execution.

SIMD and GPGPU record-breaking parallel implementations use radix, bitonic, etc.

alecco | 4 years ago
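
For reference, a purely scalar Haskell sketch of the bitonic sort recursion (input length assumed to be a power of two; names are mine). The interesting property is not the code itself but that the compare/exchange pattern is fixed and data-independent, which is why it maps well onto SIMD lanes and GPU threads:

    bitonicSort :: Ord a => [a] -> [a]
    bitonicSort = sortDir True
      where
        -- Sort one half ascending and the other descending to form a bitonic
        -- sequence, then merge it in the requested direction.
        sortDir _  []  = []
        sortDir _  [x] = [x]
        sortDir up xs  =
          let (lo, hi) = splitAt (length xs `div` 2) xs
          in bitonicMerge up (sortDir True lo ++ sortDir False hi)

        -- Compare/exchange element i with element i + n/2, then recurse on
        -- both halves; the pairing never depends on the data values.
        bitonicMerge _  []  = []
        bitonicMerge _  [x] = [x]
        bitonicMerge up xs  =
          let (lo, hi)   = splitAt (length xs `div` 2) xs
              (lo', hi') = unzip (zipWith (minMax up) lo hi)
          in bitonicMerge up lo' ++ bitonicMerge up hi'

        minMax True  a b = (min a b, max a b)
        minMax False a b = (max a b, min a b)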

People usually choose quicksort because it is in-place and there are simple mitigations for inputs that would otherwise cause poor performance.

justincredible | 4 years ago
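
One of those simple mitigations, sketched in Haskell on lists (production versions do this on an in-place array; the names are illustrative): choose the pivot as the median of the first, middle and last elements, so already-sorted or reverse-sorted input no longer hits the quadratic case.

    import Data.List (sort)

    -- Median of the first, middle and last element; assumes at least two elements.
    medianOfThree :: Ord a => [a] -> a
    medianOfThree xs = sort [head xs, xs !! (length xs `div` 2), last xs] !! 1

    -- Three-way partition around the median-of-three pivot.
    quickSortM3 :: Ord a => [a] -> [a]
    quickSortM3 xs
      | length xs < 2 = xs
      | otherwise     =
          let p     = medianOfThree xs
              less  = [x | x <- xs, x < p]
              equal = [x | x <- xs, x == p]
              more  = [x | x <- xs, x > p]
          in quickSortM3 less ++ equal ++ quickSortM3 more

This doesn't stop a deliberate adversary (median-of-three killer inputs are known), which is why depth-limit fallbacks like the one sketched further up exist.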

Because it is available in almost every standard library?

In most cases developers don't care about the worst case, because they either have no idea what you are talking about or, if they actually understand the problem, they try to ensure they don't need to sort arbitrary amounts of user-controllable data.

lmilcin | 4 years ago