Sorting algorithm
In computer science, a sorting algorithm is an algorithm that puts elements of a list in a certain order. The
most-used orders are numerical order and lexicographical order. Efficient sorting is important for optimizing the use
of other algorithms (such as search and merge algorithms) that require sorted lists to work correctly; it is also often useful for canonicalizing data and for producing human-readable output. More formally, the output must satisfy two
conditions:
1. The output is in nondecreasing order (each element is no smaller than the previous element according to the
desired total order);
2. The output is a permutation, or reordering, of the input.
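The two conditions above can be checked directly; the following is a minimal Python sketch (the function name is illustrative, not part of any standard library):

```python
from collections import Counter

def is_valid_sort(output, original):
    """Check the two formal conditions for a correct sort."""
    # Condition 1: nondecreasing order -- each element is no
    # smaller than the previous element.
    nondecreasing = all(output[i] <= output[i + 1]
                        for i in range(len(output) - 1))
    # Condition 2: the output is a permutation of the input
    # (same elements with the same multiplicities).
    permutation = Counter(output) == Counter(original)
    return nondecreasing and permutation
```

For example, `is_valid_sort([1, 2, 2, 3], [3, 2, 1, 2])` is true, while `[1, 3, 2]` fails the ordering condition and `[1, 1]` fails the permutation condition against input `[1, 2]`.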
Since the dawn of computing, the sorting problem has attracted a great deal of research, perhaps due to the
complexity of solving it efficiently despite its simple, familiar statement. For example, bubble sort was analyzed as
early as 1956.[1] Although many consider it a solved problem, useful new sorting algorithms are still being invented
(for example, library sort was first published in 2004). Sorting algorithms are prevalent in introductory computer
science classes, where the abundance of algorithms for the problem provides a gentle introduction to a variety of
core algorithm concepts, such as big O notation, divide and conquer algorithms, data structures, randomized
algorithms, best, worst and average case analysis, time-space tradeoffs, and lower bounds.

Classification
Sorting algorithms used in computer science are often classified by:
• Computational complexity (worst, average and best behaviour) of element comparisons in terms of the size of the list (n). For typical sorting algorithms good behavior is O(n log n) and bad behavior is O(n²). (See Big O notation.) Ideal behavior for a sort is O(n), but this is not possible in the average case. Comparison-based sorting algorithms, which evaluate the elements of the list via an abstract key comparison operation, need at least Ω(n log n) comparisons for most inputs.
• Computational complexity of swaps (for "in place" algorithms).
• Memory usage (and use of other computer resources). In particular, some sorting algorithms are "in place". This means that they need only O(1) or O(log n) memory beyond the items being sorted and they don't need to create auxiliary locations for data to be temporarily stored, as in other sorting algorithms.
• Recursion. Some algorithms are either recursive or non-recursive, while others may be both (e.g., merge sort).
• Stability: stable sorting algorithms maintain the relative order of records with equal keys (i.e., values). See below for more information.
• Whether or not they are a comparison sort. A comparison sort examines the data only by comparing two elements with a comparison operator.
• General method: insertion, exchange, selection, merging, etc. Exchange sorts include bubble sort and quicksort. Selection sorts include shaker sort and heapsort.
• Adaptability: Whether or not the presortedness of the input affects the running time. Algorithms that take this into account are known to be adaptive.
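As an illustration of the "comparison sort" category above, here is a minimal insertion sort in Python that examines the data only through the `<` comparison operator (a sketch for illustration, not a reference implementation):

```python
def insertion_sort(items):
    """Sort a list in place, touching elements only via '<'."""
    for i in range(1, len(items)):
        key = items[i]
        j = i - 1
        # Shift elements larger than key one slot to the right.
        while j >= 0 and key < items[j]:
            items[j + 1] = items[j]
            j -= 1
        items[j + 1] = key
    return items
```

Insertion sort is also adaptive in the sense of the last bullet: on an already-sorted input the inner loop never executes, giving O(n) running time instead of the worst-case O(n²).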

Stability
Stable sorting algorithms maintain the relative order of records with equal keys. If all keys are different then this
distinction is not necessary. But if there are equal keys, then a sorting algorithm is stable if whenever there are two
records (let's say R and S) with the same key, and R appears before S in the original list, then R will always appear
before S in the sorted list. When equal elements are indistinguishable, such as with integers, or more generally, any
data where the entire element is the key, stability is not an issue. However, assume that the following pairs of numbers are to be sorted by their first component:
(4, 2)
(3, 7)
(3, 1)
(5, 6)
In this case, two different results are possible, one which maintains the relative order of records with equal keys, and...
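The stable outcome for the pairs above can be demonstrated in Python, whose built-in `sorted` is guaranteed to be stable: sorting by the first component keeps (3, 7) before (3, 1), preserving their original relative order:

```python
pairs = [(4, 2), (3, 7), (3, 1), (5, 6)]

# Python's built-in sort is stable: records with equal first
# components keep their original relative order.
stable = sorted(pairs, key=lambda p: p[0])
# stable == [(3, 7), (3, 1), (4, 2), (5, 6)]
```

An unstable algorithm applied to the same data would be free to produce [(3, 1), (3, 7), (4, 2), (5, 6)] instead.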