Over the past couple of weeks, the lectures have been focusing on the topic of sorting and efficiency. Sorting is one of the fundamental processes in computer science: arranging the elements of a list in some desired order. Computer scientists have devised numerous sorting algorithms over the decades, and some are naturally more efficient than others. Since choosing an efficient algorithm can matter enormously for performance, we analyze some of the most common sorting algorithms below.
Among the most famous algorithms are bubble, selection, insertion, merge and quick sorts. Bubble sort works by comparing adjacent items and exchanging those that are out of order; each pass through the list places the next largest value in its proper position. Bubble sort is by far the least efficient of these algorithms: its running time is O(n^2) in every case, and in the worst case it must perform an exchange on every single comparison. Not far ahead is selection sort, in which each pass through the list finds the smallest remaining value and places it in its proper location. Its average performance is also O(n^2), but unlike bubble sort it makes at most one exchange per pass. Insertion sort is slightly more efficient than selection sort. It always maintains a sorted sublist in the lower indices of the list; on each pass, the next item is inserted into its correct position within this sublist, growing the sorted sublist by one element until it contains the entire list. Insertion sort shares the O(n^2) average and worst case of bubble and selection sorts; however, it is much more efficient in its best case, an already-sorted list, where its running time drops to O(n).
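The three quadratic sorts described above can be sketched in Python as follows (function names are my own; each returns a sorted copy rather than modifying its argument):

```python
def bubble_sort(items):
    """Repeatedly compare adjacent items, swapping those out of order.
    After each pass, the next largest value is in its final position."""
    items = list(items)  # work on a copy
    for end in range(len(items) - 1, 0, -1):
        for i in range(end):
            if items[i] > items[i + 1]:
                items[i], items[i + 1] = items[i + 1], items[i]
    return items


def selection_sort(items):
    """On each pass, find the smallest remaining value and swap it into
    place -- at most one exchange per pass through the list."""
    items = list(items)
    for start in range(len(items) - 1):
        smallest = min(range(start, len(items)), key=items.__getitem__)
        items[start], items[smallest] = items[smallest], items[start]
    return items


def insertion_sort(items):
    """Grow a sorted sublist at the front of the list, inserting each new
    item into its correct position. On already-sorted input the inner
    loop never runs, giving the O(n) best case mentioned above."""
    items = list(items)
    for i in range(1, len(items)):
        current = items[i]
        j = i
        while j > 0 and items[j - 1] > current:
            items[j] = items[j - 1]  # shift larger items to the right
            j -= 1
        items[j] = current
    return items
```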
Compared to bubble, selection and insertion sorts, merge and quick sorts are considerably more efficient. Merge sort is a recursive algorithm that repeatedly splits the list in half until each sublist is empty or contains a single item; it then merges pairs of sorted sublists in the correct order until the entire list has been reassembled. Quick sort instead partitions the list around a pivot value into two parts: the left (everything less than the pivot) and the right (everything greater than the pivot). It then recursively sorts each part until the parts have length less than two, at which point they are already sorted. Both algorithms have an average running time of O(n log n), so their typical performance is roughly equivalent; however, in its worst case quick sort's performance degrades to O(n^2), since an unlucky pivot choice (for example, always the greatest remaining element) splits the list into two very uneven parts.
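The two recursive sorts can be sketched like this (again, names are my own; the quick sort shown takes the first element as pivot, which is exactly the choice that triggers the O(n^2) worst case on already-sorted input):

```python
def merge_sort(items):
    """Split the list in half, sort each half recursively, then merge
    the two sorted halves back together."""
    if len(items) < 2:
        return list(items)  # empty or single-item lists are sorted
    mid = len(items) // 2
    left, right = merge_sort(items[:mid]), merge_sort(items[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]  # append whichever half remains


def quick_sort(items):
    """Partition around a pivot, then recursively sort each side."""
    if len(items) < 2:
        return list(items)
    pivot, rest = items[0], items[1:]
    left = [x for x in rest if x < pivot]
    right = [x for x in rest if x >= pivot]
    return quick_sort(left) + [pivot] + quick_sort(right)
```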
Python's built-in sorting algorithm, known as Timsort, is the most efficient in practice: it is a hybrid of merge and insertion sorts, and it takes advantage of any existing ordering in the given list, so its performance adapts to the specific input.
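In everyday code, then, the built-in sort is the one to reach for; a minimal example of both forms Python provides:

```python
data = [5, 1, 4, 2, 3]

# sorted() returns a new sorted list and leaves the original unchanged.
print(sorted(data))           # [1, 2, 3, 4, 5]

# list.sort() sorts in place; reverse=True gives descending order.
data.sort(reverse=True)
print(data)                   # [5, 4, 3, 2, 1]
```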