Design

Algorithm Improvement through Performance Measurement: Part 1

By Victor J. Duvanenko, September 15, 2009

Optimizing sort algorithms for today's CPUs

Sorting algorithms present some of the most interesting core challenges within computer science and engineering. A variety of sorting algorithms have been invented and studied over decades, with each algorithm providing a unique performance advantage in some situation, and some poor ones providing none at all. The web is a great resource for comparison and visualization (see [1], [2], [3], for example), and many algorithm textbooks are available.

In this multipart series I explore these algorithms, with the goal of discovering how to optimize them for the current generation of CPUs. In other words, we are going to explore some of the key performance aspects of today's CPUs. There is nothing better than tuning real code to squeeze out more performance and seeing how far it can go. This can be truly addictive, but it is one of the most enjoyable addictions in computer engineering.

So, let's dive right in and explore several of these sorting algorithms. A sorting algorithm takes an array of numbers as input and puts these numbers in order; for example, a[0] ≤ a[1] ≤ a[2] ≤ a[3] ≤ … ≤ a[N] is the goal. To illustrate: Given the number sequence 5, 3, 7, 6, 4, a sorting algorithm would return 3, 4, 5, 6, 7. In general, the items don't have to be numbers; they just need to be comparable.

Selection Sort

Selection Sort is a classic approach that is described well in many textbooks on algorithms, as well as websites. Listing One provides two implementations: The first is the simplest to understand, using the STL min_element() function to find the minimum, while the second implementation expands min_element() inline.

The algorithm works by finding the minimum value in the entire array and moving that value to the beginning of the array. Then the next smallest value is searched for, not from the beginning of the array, but from the (beginning+1) array index to the end of the array. This minimum value is placed in the array[ beginning+1 ] location. These steps continue, finding the minimum and reducing the search range by one element during each iteration of the loop, moving the minimum element to the front of the current search range. This process goes on until the search range is reduced to zero. The result is a sorted array with the minimum value at the beginning of the array and the maximum value at the end.

Another way to view Selection Sort is that one sorted element is moved to the sorted portion of the array (at the beginning of the array) and the next iteration works on the unsorted portion of the array. As the algorithm progresses, the unsorted portion of the array gets progressively smaller and the sorted portion gets progressively larger, until the sorted portion takes over the entire array.
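The loop described above can be sketched as follows. This is a minimal illustration in the spirit of Listing One, not a reproduction of it: the function name selectionSort is hypothetical, and the sketch uses the STL min_element() on a plain int array, as the first implementation in Listing One is described as doing.

```cpp
#include <algorithm>   // std::min_element
#include <cstddef>     // std::size_t
#include <utility>     // std::swap

// Selection Sort sketch: move the minimum of the unsorted range
// [i, n) to position i, shrinking the search range by one each pass.
void selectionSort(int* a, std::size_t n)
{
    if (n < 2) return;                              // nothing to sort
    for (std::size_t i = 0; i < n - 1; ++i) {
        int* minPos = std::min_element(a + i, a + n);
        std::swap(a[i], *minPos);                   // minimum to front of range
    }
}
```

Note that the size guard at the top also handles the zero- and one-element corner cases discussed below.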

Selection Sort is an O(n²) algorithm, as can be seen by summing the comparisons performed across all iterations: (n-1) + (n-2) + (n-3) + … + 3 + 2 + 1 = n(n-1)/2 comparisons.

Several nice animations of Selection Sort and other sorting algorithms in action are provided in [2], [3], and [4]. Take a look at these, if only to muse about how many creative ways there are to sort.

Selection Sort distinguishes itself by performing very few writes to the array: only O(n) writes in total, regardless of the input. Few writes can be an important attribute in some environments, such as arrays stored in flash memory, where writes are very expensive [4].

Note that our initial implementation of Selection Sort does not work properly for arrays of size zero. This is a common problem with corner cases and can be easily fixed with a single if statement at the front that checks that the size is one or more, and simply does nothing for smaller sizes. Sorting algorithms should work for arrays of size 0 and 1, and should do nothing in those cases, not even access any items within the array. The algorithm should not return an error in these two cases, since it's not wrong to have 0 or 1 elements in an array; there is just nothing to be done.

Selection Sort requires no additional storage and manipulates the array itself, swapping elements within the array; this is "in-place" behavior. Selection Sort always performs the same number of comparisons and writes to the array no matter what the data is, always doing n(n-1)/2 comparisons and 2(n-1) writes to the array. It is 2(n-1) writes because a swap takes two writes to the array, and the algorithm does one swap per iteration of the outer loop. The (n-1) term comes from the fact that when the last element is reached no writes are necessary, since it is the only element left and is the maximum value within the entire array. In fact, the algorithm does not process the last element.
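These counts can be checked empirically. The sketch below is a hypothetical instrumented variant (the names countingSelectionSort and SortCounts are made up for illustration) that tallies comparisons and array writes; for an array of 5 elements it reports 5·4/2 = 10 comparisons and 2·(5-1) = 8 writes.

```cpp
#include <cstddef>   // std::size_t
#include <utility>   // std::swap

// Tallies for verifying the n(n-1)/2 comparison and 2(n-1) write counts.
struct SortCounts { std::size_t comparisons = 0, writes = 0; };

// Selection Sort with the inner min-search written out, instrumented
// to count every comparison and every write to the array.
SortCounts countingSelectionSort(int* a, std::size_t n)
{
    SortCounts c;
    if (n < 2) return c;
    for (std::size_t i = 0; i < n - 1; ++i) {
        std::size_t minIndex = i;
        for (std::size_t j = i + 1; j < n; ++j) {
            ++c.comparisons;
            if (a[j] < a[minIndex]) minIndex = j;
        }
        std::swap(a[i], a[minIndex]);   // one swap = two array writes
        c.writes += 2;
    }
    return c;
}
```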

Insertion Sort

This O(n²) algorithm works by keeping a portion of the array always sorted and then inserting a new element into its proper place within this sorted portion. Insertion Sort is analogous to keeping a handful of sorted cards, then picking up a new card from an unsorted pile, scanning the sorted cards in hand from right to left, and inserting the new card into its proper place [5], [6]. This method is quite different from Selection Sort, which finds the minimum element within the unsorted portion of the array. Insertion Sort, on the other hand, does not look for a minimum, but looks for a point to insert one element taken from the unsorted portion of the array. This insertion point is within the already sorted portion of the array, growing the sorted portion larger by one element. Insertion Sort keeps growing the sorted portion of the array, and shrinking the unsorted portion, by one element per iteration of the loop. Listing Two provides this algorithm adapted from pseudo-code in [2].
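The card-insertion idea can be sketched as follows. This is a hypothetical stand-in for Listing Two, written from the standard textbook formulation rather than taken from the article's actual code; the function name insertionSort is assumed.

```cpp
#include <cstddef>   // std::size_t

// Insertion Sort sketch: a[0..i) is always sorted; take a[i],
// shift larger sorted elements one slot right, and drop it into place.
void insertionSort(int* a, std::size_t n)
{
    for (std::size_t i = 1; i < n; ++i) {
        int current = a[i];                 // next "card" picked up
        std::size_t j = i;
        while (j > 0 && a[j - 1] > current) {
            a[j] = a[j - 1];                // shift larger element right
            --j;
        }
        a[j] = current;                     // insert into its proper place
    }
}
```

Starting the outer loop at index 1 treats a one-element prefix as already sorted, so arrays of size 0 and 1 fall through with no work, matching the corner-case behavior discussed above.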
