private static void quicksort(int[] numbers, int startIndex, int endIndex) {
    // Base case: a range of zero or one element is already sorted.
    if (endIndex <= startIndex) {
        return;
    }
    // Partition, then recursively sort the low and high partitions.
    int high = partition(numbers, startIndex, endIndex);
    quicksort(numbers, startIndex, high);
    quicksort(numbers, high + 1, endIndex);
}
private static int partition(int[] numbers, int startIndex, int endIndex) {
    // Select the middle value as the pivot.
    int midpoint = startIndex + (endIndex - startIndex) / 2;
    int pivot = numbers[midpoint];
    int low = startIndex;
    int high = endIndex;
    boolean done = false;
    while (!done) {
        // Advance low past elements that belong in the low partition.
        while (numbers[low] < pivot) {
            low++;
        }
        // Retreat high past elements that belong in the high partition.
        while (pivot < numbers[high]) {
            high--;
        }
        if (low >= high) {
            done = true;
        } else {
            swap(numbers, low, high);
            low++;
            high--;
        }
    }
    return high;
}

private static void swap(int[] numbers, int i, int j) {
    int temp = numbers[i];
    numbers[i] = numbers[j];
    numbers[j] = temp;
}
Trace the execution of the partition algorithm (not the quicksort algorithm!) on the following array. Show the state of the array after each swap. Indicate the low and high partitions in your final answer.
\([8, 11, 15, 8, 14, 2, 9, 8]\)
Here is an example of tracing quicksort on a small array:

[9, 7, 8, 6]
[6, 7, 8, 9]  (swap 9 and 6)
      /      \
 [6, 7]    [8, 9]
  /   \     /   \
[6]  [7]  [8]  [9]
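One way to check a hand trace like the one above is to record a snapshot of the array after every swap. The sketch below duplicates the quicksort code with one change: swap appends a copy of the array to a list. The class and field names here are illustrative, not part of the assignment.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Hypothetical trace-checking variant of the quicksort code above.
public class TraceQuicksort {
    // One entry per swap: a copy of the array immediately after the exchange.
    static List<int[]> snapshots = new ArrayList<>();

    static void quicksort(int[] numbers, int startIndex, int endIndex) {
        if (endIndex <= startIndex) {
            return;
        }
        int high = partition(numbers, startIndex, endIndex);
        quicksort(numbers, startIndex, high);
        quicksort(numbers, high + 1, endIndex);
    }

    static int partition(int[] numbers, int startIndex, int endIndex) {
        int midpoint = startIndex + (endIndex - startIndex) / 2;
        int pivot = numbers[midpoint];
        int low = startIndex;
        int high = endIndex;
        boolean done = false;
        while (!done) {
            while (numbers[low] < pivot) {
                low++;
            }
            while (pivot < numbers[high]) {
                high--;
            }
            if (low >= high) {
                done = true;
            } else {
                swap(numbers, low, high);
                low++;
                high--;
            }
        }
        return high;
    }

    static void swap(int[] numbers, int i, int j) {
        int temp = numbers[i];
        numbers[i] = numbers[j];
        numbers[j] = temp;
        snapshots.add(numbers.clone()); // record state after each swap
    }

    public static void main(String[] args) {
        int[] a = {9, 7, 8, 6};
        partition(a, 0, a.length - 1);
        for (int[] snapshot : snapshots) {
            System.out.println(Arrays.toString(snapshot));
        }
        // For this input there is exactly one swap, matching the trace above:
        // [6, 7, 8, 9]
    }
}
```

Running partition on [9, 7, 8, 6] records one snapshot, [6, 7, 8, 9], matching the "(swap 9 and 6)" step in the example trace.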
Trace the execution of quicksort on the following array. In this case, the key is the number, and the superscript indicates the record.
\([1^a, 1^b, 1^c, 1^d]\)
Is quicksort stable? Justify your answer.
Write recurrences expressing both the worst-case and best-case number of comparisons performed by quicksort. Assume that the partition operation requires exactly \(n-1\) comparisons. (This is not true of the partition implementation above, but it is possible to develop an implementation that performs exactly \(n-1\) comparisons: each element is compared to the pivot exactly once, except for the pivot itself.)
Solve the worst case recurrence you developed in the previous question.
Copy and paste the quicksort code above into a code editor. Add a static variable to count the total number of comparisons. Instrument your partition method to update this counter on each call. Rather than explicitly counting instructions, just use the statement:
comparisonCount += endIndex - startIndex;
to reflect the theoretical minimum number of comparisons.
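As a sketch of what this instrumentation might look like (the class name CountingQuicksort and the sample input are illustrative assumptions, not part of the assignment):

```java
// Illustrative sketch: the quicksort code above with a comparison counter.
public class CountingQuicksort {
    static long comparisonCount = 0;

    static void quicksort(int[] numbers, int startIndex, int endIndex) {
        if (endIndex <= startIndex) {
            return;
        }
        int high = partition(numbers, startIndex, endIndex);
        quicksort(numbers, startIndex, high);
        quicksort(numbers, high + 1, endIndex);
    }

    static int partition(int[] numbers, int startIndex, int endIndex) {
        // Charge n - 1 comparisons per partition call: the theoretical minimum.
        comparisonCount += endIndex - startIndex;
        int midpoint = startIndex + (endIndex - startIndex) / 2;
        int pivot = numbers[midpoint];
        int low = startIndex;
        int high = endIndex;
        boolean done = false;
        while (!done) {
            while (numbers[low] < pivot) {
                low++;
            }
            while (pivot < numbers[high]) {
                high--;
            }
            if (low >= high) {
                done = true;
            } else {
                swap(numbers, low, high);
                low++;
                high--;
            }
        }
        return high;
    }

    static void swap(int[] numbers, int i, int j) {
        int temp = numbers[i];
        numbers[i] = numbers[j];
        numbers[j] = temp;
    }

    public static void main(String[] args) {
        int[] a = {1, 2, 3, 4};
        quicksort(a, 0, a.length - 1);
        System.out.println("comparisons: " + comparisonCount);
    }
}
```

Note that with this middle-element pivot, an already-sorted array is a best case, not a worst case; part of the exercise is finding an input that forces the unbalanced splits.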
Create a main method and a length-five input array that results in the worst-case performance. You should be able to check the resulting number of comparisons against your theoretical prediction from the previous exercise. Copy your worst-case input here:
Recall that merge sort performs exactly \(n\log_2 n - n + 1\) comparisons in the worst case, which is strictly fewer than the number of comparisons performed by quicksort in the average case: \(A(n) \approx 1.39(n+1)\log_2 n\). Despite this, quicksort is usually faster than merge sort in practice. How can this be true?