
Comparison Based Sorting VS Non-Comparison Based Sorting


 

It may come as a surprise to many of you, but it is feasible to arrange or classify things without comparing them to one another. By making assumptions about the data they will sort, some sorting algorithms actually conduct the sorting without comparing the elements at all. This method is called non-comparison sorting, and the algorithms that use it are non-comparison-based sorting algorithms. Counting sort, Radix sort, and Bucket sort are examples of non-comparison sorting algorithms. Counting sort keys directly on the values, while Radix sort examines individual digits of the keys. Because they can sort in O(n) time, these algorithms are sometimes known as linear sorting algorithms. Since they make assumptions about the data, they can skip the comparison decision tree.

 

So, you can roughly divide the sorting algorithms into two types based on how they function: comparison-based, like QuickSort or MergeSort, and non-comparison-based, like Counting Sort or Radix Sort. The comparison decision, as the name implies, is what distinguishes a comparison-based sorting algorithm from one that does not use comparisons.

 

A comparator is necessary to compare numbers or items in the first category. This comparator essentially specifies the ordering used to arrange the elements, such as numerical order or lexicographic order, also known as dictionary order. On the other hand, non-comparison-based sorting algorithms rely on integer arithmetic on the keys rather than comparisons.
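To make the distinction concrete, here is a minimal Python sketch of the comparator idea. The comparator function `numeric_desc` is an illustrative name, not from any library; it defines a descending numerical order, while the second call uses Python's default lexicographic ordering on strings.

```python
from functools import cmp_to_key

# A comparator returns a negative, zero, or positive value to define the order.
def numeric_desc(a, b):
    return b - a  # descending numerical order

data = [3, 1, 4, 1, 5]
print(sorted(data, key=cmp_to_key(numeric_desc)))  # [5, 4, 3, 1, 1]

# Lexicographic (dictionary) order on strings, using the default comparison:
print(sorted(["banana", "apple", "cherry"]))  # ['apple', 'banana', 'cherry']
```

Swapping in a different comparator changes the ordering without touching the sorting algorithm itself, which is exactly what non-comparison sorts cannot do: their order is fixed by the integer arithmetic on the keys.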

 

COMPARISON-BASED SORTING TECHNIQUES:

1.     MERGE SORT :

 

Merge sort is a sorting technique based on the divide and conquer technique.

Merge sort first divides the array into equal halves and then combines them in a sorted manner.

It keeps dividing the list into equal halves until it can be divided no further, i.e. each list contains only one element.

Then, merge sort combines the smaller sorted lists. i.e. In Merge Sort, the given unsorted array with n elements is divided into n sub-arrays, each having one element because a single element is always sorted in itself. Then, it repeatedly merges these sub-arrays, to produce new sorted sub-arrays, and in the end, one complete sorted array is produced.

It requires additional space equal to the size of the unsorted array. Hence it is not recommended for sorting very large arrays when memory is limited.
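The divide-and-merge process described above can be sketched in Python as follows; this is an illustrative implementation, not a reference one, and it returns a new sorted list rather than sorting in place.

```python
def merge_sort(arr):
    """Recursively split the array, then merge the sorted halves."""
    if len(arr) <= 1:                 # a single element is always sorted
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])      # sort the left half
    right = merge_sort(arr[mid:])     # sort the right half

    # Merge: repeatedly take the smaller front element of the two halves.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])           # append any leftovers
    merged.extend(right[j:])
    return merged

print(merge_sort([38, 27, 43, 3, 9, 82, 10]))  # [3, 9, 10, 27, 38, 43, 82]
```

Note how the extra `merged` list at each level is what gives merge sort its O(n) auxiliary space requirement.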

 




Applications of Merge sort:

1.      Merge Sort is useful for sorting linked lists in O(n log n) time.

2.    Merge sort can be implemented without extra space for linked lists.

3.      Merge sort is used for counting inversions in a list.

4.      Merge sort is used in external sorting.

 

2.     QUICK SORT

It is one of the most efficient sorting algorithms and is based on splitting an array into smaller ones (partitioning) and swapping elements (exchange) based on comparison with a selected 'pivot' element. Because of this, quicksort is also called "Partition Exchange" sort. Like merge sort, quicksort also falls into the divide-and-conquer category of problem-solving methodology.



All of this is done in place: instead of making copies of the subarrays being sorted, the method sorts each subarray within the original array. Initially, the method is invoked with the bounds of the entire array as its arguments.
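The in-place partition-and-recurse scheme can be sketched as follows; this hypothetical version uses Lomuto partitioning with the last element as the pivot, which is one common choice among several.

```python
def quick_sort(arr, lo=0, hi=None):
    """In-place quicksort using Lomuto partitioning, pivot = last element."""
    if hi is None:
        hi = len(arr) - 1          # initial call covers the whole array
    if lo >= hi:
        return arr                 # zero or one element: already sorted
    pivot = arr[hi]
    i = lo
    for j in range(lo, hi):        # move elements smaller than the pivot left
        if arr[j] < pivot:
            arr[i], arr[j] = arr[j], arr[i]
            i += 1
    arr[i], arr[hi] = arr[hi], arr[i]  # place the pivot in its final position
    quick_sort(arr, lo, i - 1)     # recurse on the left partition
    quick_sort(arr, i + 1, hi)     # recurse on the right partition
    return arr

print(quick_sort([10, 80, 30, 90, 40, 50, 70]))  # [10, 30, 40, 50, 70, 80, 90]
```

Because the swaps happen inside the original array, no extra subarray copies are made, in contrast to the merge sort sketch above.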

 

 Application:

1.      Commercial computing

2.      Numerical computing

3.      Information Search

 

Non-Comparison Based Sorting Techniques

 

3.     Counting Sort

The counting sort algorithm sorts the elements by counting the number of times each unique element occurs in the array. An auxiliary array stores the counts of the elements, and some arithmetic on these count values determines the final positions of the elements.




Algorithm

Step 1: Find the maximum value in the given array.
Step 2: Initialize an auxiliary array of size maximum value plus one.
Step 3: Store the count of each element in their respective index in the auxiliary array. For example, store the count of 5 at index 5 in the auxiliary array.
Step 4: Calculate the cumulative sum and store it in the auxiliary space.
Step 5: Temporary array is used to store the sorted elements.
Step 6: Traverse the given array from left to right.
Step 7: Store the current element at index b[current element] - 1 in the temporary array where b is the auxiliary array.
Step 8: Decrement b[current element] by 1.
Step 9: Return the temporary array.
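The steps above can be sketched in Python; this illustrative version handles non-negative integers and processes the input right to left in the placement pass, which keeps the sort stable while matching the decrement logic of steps 7 and 8.

```python
def counting_sort(arr):
    """Stable counting sort for non-negative integers."""
    if not arr:
        return []
    max_val = max(arr)                  # Step 1: find the maximum value
    counts = [0] * (max_val + 1)        # Step 2: auxiliary array b
    for x in arr:                       # Step 3: count each element
        counts[x] += 1
    for i in range(1, len(counts)):     # Step 4: cumulative sums
        counts[i] += counts[i - 1]
    output = [0] * len(arr)             # Step 5: temporary array
    for x in reversed(arr):             # Step 6 (reversed for stability)
        counts[x] -= 1                  # Step 8: decrement b[current element]
        output[counts[x]] = x           # Step 7: place at b[current element] - 1
    return output                       # Step 9

print(counting_sort([4, 2, 2, 8, 3, 3, 1]))  # [1, 2, 2, 3, 3, 4, 8]
```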

 

Applications of Counting Sort Algorithm

Counting sort is efficient if the range of input data is not much bigger than the number of objects to be sorted. Consider, for example, input values such as 10, 5, 10K, and 5K, all drawn from the range 1 to 10K.

It isn't a comparison-based sorting technique. It runs in O(n + k) time, where k is the range of the input, with auxiliary space proportional to that range.

It's frequently used as a subroutine in other sorting algorithms, such as radix sort.

Counting sort counts the occurrences of each data object in O(1) using partial hashing.

Counting sort can also be extended to handle negative inputs, by offsetting the values into a non-negative range.

 

4.     Radix Sort

The radix sort algorithm sorts the elements by processing their digits one significant place at a time, from the least significant digit to the most significant digit.

 



Algorithm

Step 1: Find the maximum value in the given array.
Step 2: Calculate the number of digits present in the maximum value.
Step 3: Traverse each significant place starting from the least significant to the most significant place.
Step 4: Sort the elements based on the digit present at the current significant place using counting sort.
Step 5: Return the sorted array.
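The steps above can be sketched in Python for non-negative integers; this illustrative version inlines a stable counting sort on the digit at each significant place, as step 4 requires.

```python
def radix_sort(arr):
    """LSD radix sort for non-negative integers."""
    if not arr:
        return arr
    max_val = max(arr)                # Step 1: find the maximum value
    exp = 1
    while max_val // exp > 0:         # Steps 2-3: one pass per digit place
        # Step 4: stable counting sort on the digit at place `exp`
        counts = [0] * 10
        for x in arr:
            counts[(x // exp) % 10] += 1
        for i in range(1, 10):        # cumulative counts give final positions
            counts[i] += counts[i - 1]
        output = [0] * len(arr)
        for x in reversed(arr):       # reverse pass keeps the sort stable
            d = (x // exp) % 10
            counts[d] -= 1
            output[counts[d]] = x
        arr = output
        exp *= 10
    return arr                        # Step 5

print(radix_sort([170, 45, 75, 90, 802, 24, 2, 66]))  # [2, 24, 45, 66, 75, 90, 170, 802]
```

Stability of the per-digit sort is essential: it is what lets each later (more significant) pass preserve the ordering established by the earlier ones.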

 

APPLICATIONS

1.      The DC3 algorithm (Kärkkäinen-Sanders-Burkhardt) for constructing a suffix array.

2.      Sorting numbers drawn from large ranges.

3.      It was used in card sorting machines that had 80 columns, and in each column, the machine could punch a hole only in 12 places.

4.      Radix sort can be applied to any data that can be sorted lexicographically, such as fixed-length strings.

 

 

Comparison Parameter | Non-Comparison Based Sorting | Comparison Based Sorting

SPEED | Faster. The limit of speed for non-comparison-based algorithms is O(n), i.e. linear time. | Comparatively slower. The limit of speed for a comparison-based sorting algorithm is O(n log n).

COMPARATOR | Does not require a comparator. | Requires a comparator.

USAGE | A non-comparison-based sorting algorithm can generally sort only integers, or data that can be mapped to integer keys. | A comparison-based sorting algorithm can be used to sort any object, provided a comparator is supplied to define the order.

MEMORY COMPLEXITY | The memory complexity for a non-comparison-based sorting algorithm is typically O(n + k), where k is the key range. | The best case for memory complexity with comparison-based sorting is O(1), for in-place sorts.

CPU COMPLEXITY | The lower bound of CPU complexity is O(n). | The lower bound of CPU complexity is O(n log n).

 

 

 

 

WHY DO WE NEED NON-COMPARISON-BASED SORTING ALGORITHMS?

 

Why do we even need non-comparison-based algorithms when comparison-based sorting algorithms work just fine, some of you may be asking. To sort n items, bubble sort, selection sort, and insertion sort all require roughly O(n^2) time.

Heap sort and merge sort take O(n log n) time even in the worst case, while quicksort takes O(n log n) on average but degrades to O(n^2) in the worst case.

Can a comparison-based sorting algorithm do better than O(n log n)? Can it sort elements in O(n) time? The answer is no.

It has been proven that any comparison-based sorting algorithm requires at least O(n log n) comparisons to sort n elements. To sort things in O(n), you therefore need a non-comparison-based sorting algorithm that enables linear-time sorting.

 
