Sorting Techniques

Sorting is the process of arranging data in ascending or descending order. A number of sorting techniques are available; they differ in the data structures they use and in how the data is stored. Several sorting methods are explored here. The terms commonly used when discussing sorting are as follows:

  • Comparison-based sorting: In this approach, elements of the array are compared with each other to produce the sorted array (e.g. insertion sort, merge sort, bubble sort, quick sort).
  • Non-comparison-based sorting: In this approach, elements of the array are not compared with each other to produce the sorted array (e.g. bucket sort, radix sort, counting sort).
  • Internal sorting: The entire data set to be sorted fits in main memory at the same time (e.g. bubble sort, insertion sort).
  • External sorting: The entire data set to be sorted cannot be accommodated in main memory at the same time, so part of it is kept in auxiliary storage such as a hard disk or tape (e.g. external merge sort).
  • Stable sorting: A technique in which the relative order of equal elements is preserved after sorting (e.g. merge sort, insertion sort, bubble sort).
  • Unstable sorting: A technique in which the relative order of equal elements is not guaranteed to be preserved (e.g. quick sort, heap sort, selection sort).
  • Adaptive sorting: A technique whose running time is affected by the nature of the input data, i.e. whether it is already (partially) sorted (e.g. bubble sort, insertion sort, quick sort).
  • Non-adaptive sorting: A technique whose running time is not affected by the nature of the input data (e.g. selection sort, merge sort, heap sort).
  • In-place/out-of-place technique: A technique is in-place if it does not use extra memory (beyond a constant amount) to sort the array; otherwise it is out-of-place. Of the comparison-based techniques mentioned above, only merge sort is out-of-place; all of the non-comparison-based techniques mentioned above are out-of-place. A short sketch illustrating several of these properties follows this list.
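
To make some of these terms concrete, here is a minimal Python sketch of insertion sort, a comparison-based technique that is stable, adaptive, and in-place (the function name and the key-extraction parameter are illustrative choices, not part of the original article):

```python
def insertion_sort(items, key=lambda x: x):
    """Sort `items` in place in ascending order of key(item).

    - Comparison based: elements are ordered by comparing keys.
    - Stable: equal keys keep their original relative order,
      because we shift only while the previous key is strictly greater.
    - Adaptive: an already-sorted input needs only n-1 comparisons, O(n).
    - In-place: only O(1) extra memory is used.
    """
    for i in range(1, len(items)):
        current = items[i]
        j = i - 1
        # Shift larger elements one position to the right.
        while j >= 0 and key(items[j]) > key(current):
            items[j + 1] = items[j]
            j -= 1
        items[j + 1] = current
    return items


if __name__ == "__main__":
    data = [(3, "a"), (1, "b"), (3, "c"), (2, "d")]
    # Sorting by the first field keeps (3, "a") before (3, "c"): stable.
    print(insertion_sort(data, key=lambda pair: pair[0]))
```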

Complexity Analysis of Different Sorting Algorithms

| Algorithm      | Data Structure | Best (Run Time) | Average (Run Time) | Worst (Run Time) | Space Complexity              |
|----------------|----------------|-----------------|--------------------|------------------|-------------------------------|
| Quicksort      | Array          | O(n log n)      | O(n log n)         | O(n^2)           | O(log n) average / O(n) worst |
| Merge sort     | Array          | O(n log n)      | O(n log n)         | O(n log n)       | O(n)                          |
| Heapsort       | Array          | O(n log n)      | O(n log n)         | O(n log n)       | O(1)                          |
| Bubble sort    | Array          | O(n)            | O(n^2)             | O(n^2)           | O(1)                          |
| Insertion sort | Array          | O(n)            | O(n^2)             | O(n^2)           | O(1)                          |
| Selection sort | Array          | O(n^2)          | O(n^2)             | O(n^2)           | O(1)                          |
| Bucket sort    | Array          | O(n+k)          | O(n+k)             | O(n^2)           | O(n·k)                        |
| Radix sort     | Array          | O(n·k)          | O(n·k)             | O(n·k)           | O(n+k)                        |


10 thoughts on "Sorting Techniques"

  1. Is there a way to improve quick sort's time complexity (in big-O terms) by reducing the number of comparisons or movements?

  2. In quick sort we do not perform the kind of shifting or movement of elements that insertion sort uses. In insertion sort we trade off comparisons against movements to reduce the overall work, but the overall running cost is still O(n^2).
    In quick sort, the place to work is on identifying the best element to choose as the pivot, since the pivot drives the partition and the nature of the partition determines the running cost. A sketch of one common pivot heuristic is shown below.
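
    As a rough illustration of pivot selection (this sketch is illustrative, not the commenter's code), here is quick sort in Python using the median-of-three heuristic, which tends to avoid the O(n^2) behaviour that a first- or last-element pivot shows on already-sorted inputs:

    ```python
    def quick_sort(a, lo=0, hi=None):
        """In-place quick sort using a median-of-three pivot (illustrative sketch)."""
        if hi is None:
            hi = len(a) - 1
        if lo >= hi:
            return a

        # Median-of-three: take the median of the first, middle and last
        # elements as the pivot; this tends to give balanced partitions on
        # many common inputs, including already-sorted data.
        mid = (lo + hi) // 2
        pivot_index = sorted([lo, mid, hi], key=lambda i: a[i])[1]
        a[pivot_index], a[hi] = a[hi], a[pivot_index]
        pivot = a[hi]

        # Lomuto partition: move everything <= pivot to the left side.
        boundary = lo
        for i in range(lo, hi):
            if a[i] <= pivot:
                a[i], a[boundary] = a[boundary], a[i]
                boundary += 1
        a[boundary], a[hi] = a[hi], a[boundary]

        quick_sort(a, lo, boundary - 1)
        quick_sort(a, boundary + 1, hi)
        return a


    print(quick_sort([5, 3, 8, 1, 9, 2, 7]))   # [1, 2, 3, 5, 7, 8, 9]
    ```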

    1. Comparing merge sort and quick sort:
      – Quick sort is an internal algorithm based on the divide-and-conquer strategy.
      – Merge sort can also be used as an external algorithm and is likewise based on the divide-and-conquer strategy.

      FOR QuickSort – For arrays, quick sort is usually more efficient than merge sort. One of the main sources of quick sort's efficiency is locality of reference: accessing memory locations that are near each other tends to be faster than accessing locations scattered throughout memory, and the partitioning step in quick sort typically has excellent locality because it accesses consecutive array elements near the front and the back. Additionally, quick sort is an in-place sort, so it does not need to create auxiliary arrays to hold temporary values. For arrays, merge sort loses out because of its extra O(n) storage, which is why quick sort is usually the preferred choice there.

      FOR MergeSort – Linked-list cells are often scattered throughout memory, so there is no locality bonus for accessing adjacent cells, which removes one of quick sort's main performance advantages. The benefit of working in place also disappears, because merge sort's linked-list algorithm does not need any extra auxiliary storage: the merge operation can be implemented for linked lists without extra space. Quick sort can still be applied to linked lists, but merge sort is usually faster there because it splits the list evenly in half and does less work in the splitting/partitioning step. A sketch of merge sort on a linked list appears at the end of this reply.

      Quick sort is usually better than merge sort for arrays for two reasons:
      FIRSTLY, quick sort has better locality of reference than merge sort, so the memory accesses performed in quick sort are usually faster than the corresponding accesses in merge sort.
      SECONDLY, quick sort uses O(log n) extra memory in the worst case (if implemented carefully), while merge sort requires O(n) extra memory because of the overhead of merging.

      Merge sort has advantages over quick sort of its own: it tends to win for very large datasets that do not fit in main memory, while quick sort tends to win for smaller, in-memory arrays (quick sort is an internal sorting method in which the data is sorted in main memory). In the external setting:
      FIRSTLY, merge sort can be applied to files of any size.
      SECONDLY, if heap sort is used for the in-memory part of the merge, its operation can be overlapped with I/O.
      THIRDLY, reading the input during the run-creation step is sequential, so there is not much seeking.
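
    To illustrate the point about merging linked lists without auxiliary arrays, here is a minimal Python sketch of merge sort on a singly linked list (the Node class and function name are illustrative assumptions, not from the original discussion); the merge step only relinks existing nodes, so no extra arrays are allocated:

    ```python
    class Node:
        def __init__(self, value, next=None):
            self.value = value
            self.next = next


    def merge_sort_list(head):
        """Sort a singly linked list; the merge relinks nodes, no extra arrays."""
        if head is None or head.next is None:
            return head

        # Split the list in half using slow/fast pointers.
        slow, fast = head, head.next
        while fast and fast.next:
            slow = slow.next
            fast = fast.next.next
        second = slow.next
        slow.next = None

        left = merge_sort_list(head)
        right = merge_sort_list(second)

        # Merge by relinking nodes in ascending order (stable: ties keep order).
        dummy = tail = Node(None)
        while left and right:
            if left.value <= right.value:
                tail.next, left = left, left.next
            else:
                tail.next, right = right, right.next
            tail = tail.next
        tail.next = left or right
        return dummy.next


    # Build 4 -> 1 -> 3 -> 2, sort it, and print 1 2 3 4.
    node = merge_sort_list(Node(4, Node(1, Node(3, Node(2)))))
    while node:
        print(node.value, end=" ")
        node = node.next
    ```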

  3. How can the RAM model be used, as an example, to count the steps of a statement such as A[i] = B[i][j] + C in relation to sorting techniques?

    1. Count the number of steps, as described in the RAM model, for each statement in the sorting algorithm. The total number of steps gives the running cost. A small step-counting sketch appears at the end of this thread.

    1. In merge sort the sub-problems are exactly balanced at each division, so the recursion uses the minimum number of levels (about log n, giving n log n work in total). Moreover, merge sort is non-adaptive, so no particular arrangement of the input affects its performance; as a result it always runs in Θ(n log n) time.
      For an already-ascending sequence you may observe a reduction of some constant number of operations, but asymptotically the running time will still be Θ(n log n).
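
    As a rough illustration of the RAM-model idea (the particular way costs are charged here is an assumption for illustration, not part of the replies above), the sketch below counts unit-cost operations for executing A[i] = B[i][j] + C inside a loop, and then does the same kind of counting for a sorting algorithm:

    ```python
    # RAM model: every elementary operation (memory read/write, arithmetic,
    # comparison, index update) is charged one unit of cost.

    def count_steps_for_statement(n):
        """Count unit-cost steps for executing `A[i] = B[i][j] + C` n times.

        One common way to charge each execution:
          read B[i][j] (1) + read C (1) + add (1) + store into A[i] (1) = 4 steps,
        plus 1 step per iteration for the loop index update/test.
        """
        steps_per_statement = 4
        loop_overhead = 1
        return n * (steps_per_statement + loop_overhead)


    def count_steps_bubble_sort(a):
        """Instrument bubble sort: total comparisons + swaps approximate its cost."""
        a = list(a)
        comparisons = swaps = 0
        for i in range(len(a) - 1):
            for j in range(len(a) - 1 - i):
                comparisons += 1
                if a[j] > a[j + 1]:
                    a[j], a[j + 1] = a[j + 1], a[j]
                    swaps += 1
        return comparisons, swaps


    print(count_steps_for_statement(10))          # 50
    print(count_steps_bubble_sort([4, 3, 2, 1]))  # (6, 6): worst case for 4 elements
    ```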

  4. Can we use bucket sort with non-integers (e.g. 0.21, 0.29)?
    If yes, how will they be arranged in the buckets?

    1. If you are given the low and high ends of the data's range, as in your query (0.21 to 0.29), you can read only the fractional part and place each number in the bucket corresponding to those digits; for example, you can take nine buckets numbered 21 to 29. A sketch of this idea is shown below.
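
    As a rough sketch of that idea (the bucket count and function name are assumptions for illustration), here is bucket sort in Python for floats known to lie in a range [low, high); each value is mapped to a bucket by its offset from `low`, each bucket is sorted, and the buckets are concatenated:

    ```python
    def bucket_sort(values, low, high, num_buckets=10):
        """Bucket sort for floats known to lie in [low, high)."""
        buckets = [[] for _ in range(num_buckets)]
        width = (high - low) / num_buckets
        for v in values:
            # Map each value to a bucket index based on its offset from `low`.
            index = min(int((v - low) / width), num_buckets - 1)
            buckets[index].append(v)
        result = []
        for bucket in buckets:
            result.extend(sorted(bucket))   # sort each bucket, then concatenate
        return result


    data = [0.25, 0.21, 0.29, 0.23, 0.27]
    print(bucket_sort(data, low=0.21, high=0.30, num_buckets=9))
    # -> [0.21, 0.23, 0.25, 0.27, 0.29]
    ```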
