DETERMINE HOW TO DO BUBBLE AND INSERTION SORTS; DETERMINE THEIR ASYMPTOTIC COMPLEXITY.

Bubble Sort pseudocode:

    DO
        scan through all unsorted locations of the array
            IF the items in the current and next locations are out of order
                interchange the items in those locations
    WHILE one or more interchanges were done in the scan

Bubble Sort in English: Start at index zero of the array and, moving toward the last index, look for two consecutive entries that are out of order (i.e. the entry at the higher index is the smaller value). Whenever such a pair is found, swap the two entries and continue toward the last index in the same manner. Repeat the scan until one completes with no swaps.

Bubble Sort asymptotic complexity: Notice that the first scan bubbles the largest value up to index n-1, the second scan bubbles the second-largest value up to index n-2, and so on. There are therefore at most n scans, each of which iterates through up to n indices. That makes the worst-case time requirement n times n, or O(n^2). Other optimizations can be made (e.g. shrinking each scan by one index), but they will not change the worst-case bound.

Insertion Sort pseudocode:

    start with a collection of n unsorted items and zero sorted items
    WHILE the collection of unsorted items has size >= 1
        move one item from the unsorted collection to its proper place in the sorted collection
    END

Insertion Sort in English: Insertion sort uses two piles of items, one unsorted and one sorted. Initially there are n unsorted items and zero sorted items. Repeatedly take one item out of the unsorted pile and put it into its proper place in the sorted pile. After the first step there are n-1 unsorted items and 1 sorted item; after the second step, n-2 unsorted items and 2 sorted items; and so on. When no items remain in the unsorted pile, all n items have been moved, in order, into the sorted pile.

Insertion Sort asymptotic complexity: Each item must be placed into its proper position within the current sorted set.
In the worst case, finding that position requires searching through every item already in the sorted set. So the first item requires searching through zero items, the second through 1 item, ..., and the i-th item through i-1 items. The total number of items searched is therefore 0 + 1 + 2 + 3 + ... + (n-2) + (n-1), which equals (1/2)(n)(n-1). Expanding gives (1/2)(n^2 - n), from which the Big-Oh is O(n^2).

WRITE SELECTION SORT FROM LEFT TO RIGHT AND RIGHT TO LEFT; SORT UP AND SORT DOWN.

Code is appended.

WRITE NON-RECURSIVE MERGESORT (SEE DS&SD: 5.3)

Code is given on pages 118 and 119 of DS&SD.

IMPROVE QUICK SORT WITH "MEDIAN OF 3" PIVOT

The idea is that quicksort works most efficiently when the pivot is the median (i.e. the middle) value: each partition then has exactly half the size of its parent, the recursion behaves like mergesort, and the sort runs in O(n log n) time. In the other extreme (the worst case), a pivot that is the max or the min leaves one partition with nearly everything and the other empty; the sort then degrades to O(n^2), the same worst-case bound as insertion sort. Choosing the median of three values as the pivot increases the chance of an effective choice (one close to the true median) and decreases the chance of an ineffective one (one close to the max or the min).

Code for Median-of-3 Quick Sort: Use the code from Section 5 of Lecture Notes 15, with one change. Replace:

    int p = partition(x, a+1, b, x[a]);

With:

    int p;
    // x[a] is the median of the three when its two comparisons have
    // opposite signs (or one is zero).  Note that compareTo may return
    // any negative or positive int, not just -1 and 1, so test the
    // product of the signs rather than the raw product.
    if (Integer.signum(x[a].compareTo(x[a+1]))
            * Integer.signum(x[a].compareTo(x[a+2])) <= 0)
        p = partition(x, a+1, b, x[a]);
    else if (Integer.signum(x[a+1].compareTo(x[a]))
            * Integer.signum(x[a+1].compareTo(x[a+2])) <= 0)
        p = partition(x, a+1, b, x[a+1]);
    else
        p = partition(x, a+1, b, x[a+2]);
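Appended code, Bubble Sort: the DO ... WHILE pseudocode above can be sketched in Java as follows (the class and method names are my own, not from the lecture notes):

```java
import java.util.Arrays;

public class BubbleSortDemo {
    // Repeat full scans until a scan performs no interchanges,
    // exactly as in the DO ... WHILE pseudocode.
    public static void bubbleSort(int[] a) {
        boolean swapped;
        do {
            swapped = false;
            for (int i = 0; i < a.length - 1; i++) {
                if (a[i] > a[i + 1]) {      // current and next out of order?
                    int tmp = a[i];         // interchange the items
                    a[i] = a[i + 1];
                    a[i + 1] = tmp;
                    swapped = true;
                }
            }
        } while (swapped);                  // stop when a scan makes no swaps
    }

    public static void main(String[] args) {
        int[] data = {5, 1, 4, 2, 8};
        bubbleSort(data);
        System.out.println(Arrays.toString(data)); // [1, 2, 4, 5, 8]
    }
}
```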
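Appended code, Insertion Sort: a sketch of the two-pile description above, where the "sorted pile" is the front of the same array (names are my own):

```java
import java.util.Arrays;

public class InsertionSortDemo {
    // The sorted pile is a[0..i-1]; each pass moves one item from the
    // unsorted pile into its proper place in the sorted pile.
    public static void insertionSort(int[] a) {
        for (int i = 1; i < a.length; i++) {
            int item = a[i];                // next item from the unsorted pile
            int j = i - 1;
            // search backward through the sorted pile for its place,
            // shifting larger items one position to the right
            while (j >= 0 && a[j] > item) {
                a[j + 1] = a[j];
                j--;
            }
            a[j + 1] = item;                // insert into its proper place
        }
    }

    public static void main(String[] args) {
        int[] data = {7, 3, 9, 1, 5};
        insertionSort(data);
        System.out.println(Arrays.toString(data)); // [1, 3, 5, 7, 9]
    }
}
```

Note how the backward search mirrors the 0 + 1 + ... + (n-1) counting argument: inserting the i-th item scans at most i-1 sorted items.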
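The selection-sort code referred to above is not reproduced in these notes; the following is one possible sketch of the two scan directions, both sorting up (all names are my own). A sort-down version is the same code with the comparisons reversed.

```java
import java.util.Arrays;

public class SelectionSortDemo {
    // Left to right: each pass selects the smallest remaining item and
    // swaps it into the next position from the left.
    public static void sortUpLeftToRight(int[] a) {
        for (int i = 0; i < a.length - 1; i++) {
            int min = i;
            for (int j = i + 1; j < a.length; j++)
                if (a[j] < a[min]) min = j;
            int tmp = a[i]; a[i] = a[min]; a[min] = tmp;
        }
    }

    // Right to left: each pass selects the largest remaining item and
    // swaps it into the next position from the right.
    public static void sortUpRightToLeft(int[] a) {
        for (int i = a.length - 1; i > 0; i--) {
            int max = 0;
            for (int j = 1; j <= i; j++)
                if (a[j] > a[max]) max = j;
            int tmp = a[i]; a[i] = a[max]; a[max] = tmp;
        }
    }

    public static void main(String[] args) {
        int[] d1 = {4, 2, 7, 1, 3};
        int[] d2 = {4, 2, 7, 1, 3};
        sortUpLeftToRight(d1);
        sortUpRightToLeft(d2);
        System.out.println(Arrays.toString(d1)); // [1, 2, 3, 4, 7]
        System.out.println(Arrays.toString(d2)); // [1, 2, 3, 4, 7]
    }
}
```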
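The DS&SD code (pp. 118-119) is not reproduced here, so as a sketch of the non-recursive idea: a bottom-up mergesort merges sorted runs of width 1, then 2, then 4, and so on, until a single run covers the whole array (names are my own, not the textbook's).

```java
import java.util.Arrays;

public class MergeSortDemo {
    // Non-recursive (bottom-up) mergesort: double the run width each pass.
    public static void mergeSort(int[] a) {
        int n = a.length;
        int[] buf = new int[n];
        for (int width = 1; width < n; width *= 2) {
            for (int lo = 0; lo < n; lo += 2 * width) {
                int mid = Math.min(lo + width, n);
                int hi  = Math.min(lo + 2 * width, n);
                merge(a, buf, lo, mid, hi);
            }
        }
    }

    // Merge the sorted runs a[lo..mid) and a[mid..hi) using buf as scratch.
    private static void merge(int[] a, int[] buf, int lo, int mid, int hi) {
        int i = lo, j = mid;
        for (int k = lo; k < hi; k++)
            buf[k] = (i < mid && (j >= hi || a[i] <= a[j])) ? a[i++] : a[j++];
        System.arraycopy(buf, lo, a, lo, hi - lo);
    }

    public static void main(String[] args) {
        int[] data = {6, 2, 9, 4, 1, 8, 3};
        mergeSort(data);
        System.out.println(Arrays.toString(data)); // [1, 2, 3, 4, 6, 8, 9]
    }
}
```

There are log n widths and each pass touches all n items, giving the familiar O(n log n) bound without any recursion.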
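Since the Lecture Notes 15 quicksort is not reproduced here, the following self-contained sketch shows the same sign-of-product median test on plain ints, wired into a Hoare-style quicksort. It samples the first three entries of the range as the notes do; the class, method names, and partition style are my own assumptions.

```java
import java.util.Arrays;

public class Median3Demo {
    // a is the median of the three when the signs of its two
    // comparisons multiply to <= 0 (same test as in the notes).
    public static int medianOfThree(int a, int b, int c) {
        if (Integer.signum(Integer.compare(a, b))
                * Integer.signum(Integer.compare(a, c)) <= 0) return a;
        if (Integer.signum(Integer.compare(b, a))
                * Integer.signum(Integer.compare(b, c)) <= 0) return b;
        return c;
    }

    // Sorts x[lo..hi] inclusive, pivoting on the median of the
    // first three entries when the range holds at least three.
    public static void quickSort(int[] x, int lo, int hi) {
        if (lo >= hi) return;
        int pivot = (hi - lo >= 2)
                ? medianOfThree(x[lo], x[lo + 1], x[lo + 2])
                : x[lo];
        int i = lo, j = hi;
        while (i <= j) {                 // Hoare-style partition on pivot value
            while (x[i] < pivot) i++;
            while (x[j] > pivot) j--;
            if (i <= j) {
                int t = x[i]; x[i] = x[j]; x[j] = t;
                i++; j--;
            }
        }
        quickSort(x, lo, j);
        quickSort(x, i, hi);
    }

    public static void main(String[] args) {
        int[] data = {9, 4, 6, 1, 8, 2, 7, 3, 5};
        quickSort(data, 0, data.length - 1);
        System.out.println(Arrays.toString(data)); // [1, 2, 3, 4, 5, 6, 7, 8, 9]
    }
}
```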