Divide-and-Conquer and QuickSort
In the previous article, we discussed finding the complexities of algorithms. There we also met Selection Sort and Bubble Sort, two simple algorithms with an average-case complexity of O(n²). But we have a better algorithm at our disposal: QuickSort, whose average-case complexity is O(n log n). Its worst-case behavior is O(n²), but that can be avoided in most cases with a careful implementation.
QuickSort makes use of the Divide-and-Conquer strategy, which entails breaking the problem into two or more sub-problems, solving the sub-problems independently, and finally combining their solutions to solve the bigger problem. (Note: I assume you are familiar with recursion. If you are not, I suggest reading up on it first.) Merge Sort, for example, is a Divide-and-Conquer algorithm. It takes the unsorted list; if the list is of size 2 or less it is trivial to sort, otherwise the list is divided into two sub-lists, which are sorted recursively. Once the two sub-lists are sorted, the bigger sorted list is created by comparing the first element of each: whichever is smaller is moved into the new list and removed from its sub-list. This process goes on until both sub-lists are empty.
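The merging step just described can be sketched as follows. This is a minimal illustration; the function and variable names are my own choices, not from any particular listing.

```python
def merge_sort(items):
    """Divide-and-conquer sort: split, sort each half, then merge."""
    if len(items) <= 1:               # trivially sorted
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])    # recursively sort each half
    right = merge_sort(items[mid:])

    # Merge: repeatedly take the smaller of the two front elements.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])           # at most one of these is non-empty
    merged.extend(right[j:])
    return merged
```

Note that the merge loop runs until one sub-list is exhausted, after which the remainder of the other is appended wholesale, which matches the "until both sub-lists are empty" description above.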
There are many applications of the divide-and-conquer strategy. One of them we just discussed, and another is QuickSort, as I mentioned. Apart from these, there is Euclid's algorithm for finding the Greatest Common Divisor of two numbers, which can in turn be used to find the Lowest Common Multiple; I suggest you read up on it in case you are not aware of it. Also, the famous Tower of Hanoi problem is a classic example of Divide-and-Conquer. Here is the algorithm for solving Tower of Hanoi recursively using Divide-and-Conquer.
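A sketch of that recursive procedure in Python might look like this. The parameters follow the Beg, Aux and End roles used in the explanation that follows; the names hanoi and moves are illustrative choices of my own.

```python
def hanoi(n, beg, aux, end, moves):
    """Move n disks from peg `beg` to peg `end`, using `aux` as spare.

    Each move is recorded as a (source, destination) pair in `moves`.
    """
    if n == 0:
        return
    hanoi(n - 1, beg, end, aux, moves)   # park the top n-1 disks on aux
    moves.append((beg, end))             # move the bottom-most disk
    hanoi(n - 1, aux, beg, end, moves)   # bring the n-1 disks onto it
```

For example, `hanoi(3, 'A', 'B', 'C', moves)` records the seven moves needed for three disks; in general the procedure makes 2^N - 1 moves.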
Here, Beg is the tower from which the disks are to be transferred to End, using Aux as the auxiliary tower for intermediate storage. The algorithm first transfers the top N-1 disks from Beg to Aux, using End as the auxiliary. Once this is done, the bottom-most disk is moved from Beg to End. Thereafter, the N-1 disks we parked on Aux are transferred to End, using Beg as the auxiliary. So elegant, isn't it? Other, slightly more complicated examples are Karatsuba's algorithm for multiplying two very large numbers (of the order of 1000 digits) and the Cooley-Tukey algorithm for computing the Discrete Fourier Transform.
Once you get the hang of what QuickSort does, you will find it to be one of the most intuitive sorting algorithms, with the bonus of good complexity. As I said, QuickSort uses the Divide-and-Conquer paradigm. In each step it chooses a 'pivot', which can be any element in the list. The list is then divided into two sub-lists: one consisting of the elements less than the pivot (the left sub-list) and the other of the elements greater than the pivot (the right sub-list). What is left to do is sort these sub-lists recursively using the same procedure and then concatenate the left sub-list, the pivot and the right sub-list.
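As a sketch, this list-based version might be written as follows. The names are illustrative; elements equal to the pivot are kept with the left sub-list so that duplicates are handled.

```python
def quick_sort(items):
    """List-based QuickSort: pick a pivot, split into smaller and
    greater sub-lists, sort those recursively, then concatenate."""
    if len(items) <= 1:
        return items
    pivot = items[0]                          # any element will do
    rest = items[1:]
    left = [x for x in rest if x <= pivot]    # not greater than pivot
    right = [x for x in rest if x > pivot]    # greater than pivot
    return quick_sort(left) + [pivot] + quick_sort(right)
```

This version is easy to read, but every call builds fresh sub-lists, which is exactly the space overhead discussed next.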
It is easy to visualize how this works. The problem is that every level of recursion uses O(N) extra space for the new sub-lists. We can improve on that, and removing the extra space requirement speeds up execution as well. To improve the QuickSort algorithm, we use two functions. The first is Partition, which places the pivot at its correct position in the existing array, ensuring that all elements to the left of the pivot are less than it and all elements to the right are greater. The other is the QuickSort function, which picks a pivot, partitions the array around it, and then recursively sorts the two sub-arrays. Let's look at the Partition function first.
Thus, what this function does is: it first puts the pivot at the index right and maintains a position variable, FinalPosition (initially equal to left). Every time it finds a value smaller than or equal to the pivot, it places that value at index FinalPosition and increments FinalPosition. In the end, all the elements before FinalPosition are smaller than or equal to the pivot, so the pivot is finally placed at FinalPosition, and this index is returned. Let's now see how the Partition function is used to complete the QuickSort function.
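Putting the two functions together, a Python sketch consistent with the description above might look like this. Here the pivot is simply taken to be the element already at index right (equivalently, one could choose any element and swap it to that index first); final_position mirrors the FinalPosition variable described above.

```python
def partition(a, left, right):
    """Place the pivot (taken as a[right]) at its final position.

    Every element <= pivot is moved to index final_position, which is
    then incremented, so the pivot's slot ends up just past them all.
    """
    pivot = a[right]
    final_position = left
    for i in range(left, right):
        if a[i] <= pivot:
            a[i], a[final_position] = a[final_position], a[i]
            final_position += 1
    # All elements before final_position are <= pivot; put it there.
    a[final_position], a[right] = a[right], a[final_position]
    return final_position

def quick_sort(a, left, right):
    """Sort a[left..right] in place using partition."""
    if left < right:
        p = partition(a, left, right)
        quick_sort(a, left, p - 1)     # sort the left sub-array
        quick_sort(a, p + 1, right)    # sort the right sub-array
```

Notice that the recursion works entirely inside the one array, so no new lists are created at any level.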
I think it is quite apparent that we are making substantial savings in terms of space as well as time. Since we no longer build and return new lists, which can become quite unwieldy as the input grows, we are definitely making better use of both space and time.

4. Tasks for You

1. Read up about the Master Theorem.
2. Use the Master Theorem to find the average-case complexity of QuickSort.