Quick Sort Algorithm

Design and Analysis of Algorithms

Sadhil Chhabra
7 min read · Mar 22, 2021

Quick sort, also called partition-exchange sort, is a comparison-based sorting algorithm and is not a stable sort.

Working Principle: Quick sort works on the Divide and Conquer strategy. First, it partitions the list of items into two sub-lists based on a pivot element. All elements in the first sub-list are arranged to be smaller than the pivot, while all elements in the second sub-list are arranged to be larger than the pivot. The same partitioning and arranging process is performed repeatedly on the resulting sub-lists until the whole list of items is sorted.

Basic Operation: Pick an element from the array (the key value), partition the remaining elements into those greater than and those less than this key value, and recursively sort the two partitions.

Implementation details: Starting from each end of the table, we move two pointers i and j towards the middle. Whenever the element in the lower part is larger than the pivot and the element in the upper part is smaller, we exchange them. When the pointers cross, we move the pivot to that position.

General Structure of Recursive Quick Sort

SortTable(Start, …, End)
    // partitioning scans from both ends of the array;
    // J is the final location of the pivot
    J = PartitionTable(Start, …, End)
    // sort the first partition
    SortTable(Start, …, J-1)
    // sort the second partition
    SortTable(J+1, …, End)
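
Here is a minimal C++ sketch of this structure, just to map the pseudocode onto code. The names sortTable and partitionTable are placeholders for the routines described above; the complete, working program appears later in this article.

// sketch only: partitionTable is assumed to return the pivot's final index,
// as described above (see the full C++ program later in the article)
int partitionTable(int A[], int start, int end);

void sortTable(int A[], int start, int end)
{
    if (start < end)                            // more than one element remains
    {
        int j = partitionTable(A, start, end);  // j = final location of the pivot
        sortTable(A, start, j - 1);             // sort the first partition
        sortTable(A, j + 1, end);               // sort the second partition
    }
}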

This method works very well on large tables. At each step, the goal is to place one particular record in its final position within the table. In doing so, all records that precede this record have smaller keys, while all records that follow it have larger keys. This technique essentially partitions the table into two sub-tables.

In a nutshell:
1. If there are one or fewer elements in the array to be sorted, return immediately.
2. Otherwise, take the leftmost element of the array as the pivot point or key value.
3. Split the remaining elements into two parts: one with elements larger than the key value and the other with elements smaller than or equal to the key value.
4. Recursively repeat the algorithm for both parts of the original array. (A literal rendering of these four steps is sketched just below.)
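
For illustration only, here is a sketch that follows these four steps literally, using extra vectors rather than sorting in place. It is not the in-place method used in the C++ program later in the article; quickSortSketch is just an illustrative name.

#include <vector>
using namespace std;

// Sketch: an out-of-place quick sort that mirrors steps 1-4 above.
vector<int> quickSortSketch(const vector<int>& a)
{
    if (a.size() <= 1)                    // step 1: one or fewer elements
        return a;
    int pivot = a[0];                     // step 2: leftmost element is the key value
    vector<int> smaller, larger;          // step 3: split the remaining elements
    for (auto it = a.begin() + 1; it != a.end(); ++it)
    {
        if (*it <= pivot) smaller.push_back(*it);
        else              larger.push_back(*it);
    }
    vector<int> result = quickSortSketch(smaller);   // step 4: recurse on both parts
    result.push_back(pivot);
    vector<int> right = quickSortSketch(larger);
    result.insert(result.end(), right.begin(), right.end());
    return result;
}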

Explanation: Initially, the array elements are divided into two parts: the key value and the remaining array elements.

After the key element is moved into its place, the array elements are divided into three parts:
— First part, with values less than the key value.
— Second part, with the key value only.
— Third part, with values greater than the key value.

Please note that usually the first element is selected as the key or pivot value. Also note that the sort takes the fewest steps when the key value chosen is the median of the values in the list.
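
A common refinement of this idea, not used in the program below, is to approximate the median by taking the median of the first, middle, and last elements as the pivot. Here is a rough sketch of that idea; medianOfThreeToFront is an illustrative helper, not part of the original program.

#include <utility>   // std::swap
using std::swap;

// Move the median of A[low], A[mid], A[high] to A[low], so a partition
// routine that pivots on the first element picks a better key value.
void medianOfThreeToFront(int A[], int low, int high)
{
    int mid = low + (high - low) / 2;
    if (A[mid]  < A[low]) swap(A[mid],  A[low]);
    if (A[high] < A[low]) swap(A[high], A[low]);
    if (A[high] < A[mid]) swap(A[high], A[mid]);
    // now A[low] <= A[mid] <= A[high], so the median sits at mid
    swap(A[low], A[mid]);   // place the median at the front as the pivot
}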

Advantages of Quick Sort

— It works well with a large number of elements.
— It sorts in place, requiring no auxiliary array, which is one reason it is usually faster in practice than merge sort.

Disadvantages of Quick Sort

— Its worst-case performance is similar to the average performance of bubble sort, insertion sort, or selection sort.

Example of Quick Sort

Suppose A is an array with 12 integer elements as follows:

Now, we use the first element 44 as the pivot element, or pivot key.
The goal is to place this pivot element 44 in its correct position.
Beginning with the last element 66, we scan from right to left until we reach the first element less than 44, i.e. 22.
Interchange 44 and 22.
So, the array elements look like-

88 and 66, to the right of 44, are greater than 44.
Now, scanning from left to right, we stop at the first element greater than 44, i.e. 55; interchange 44 and 55.
Array elements look like-

In general:
— We scan from right to left and stop at a smaller element.
— We scan from left to right and stop at a greater element.
Now, scanning from right to left, we stop at the first element less than 44, i.e. 40.
Interchange 44 and 40.
Array elements look like-

Beginning with 40, we scan from left to right for an element greater than 44, i.e. 77; interchange 44 and 77.
Array elements look like-

Beginning with 77, we scan from right to left for an element less than 44. We do not meet such a number before meeting 44.
This means that all numbers have been scanned and compared with the pivot element 44; also, all numbers less than 44 now form a sub-list to the left of 44, and all numbers greater than 44 form a sub-list to the right of 44.
Thus, 44 is correctly placed in its final position, and the task of sorting the original list A has been reduced to the task of sorting each of the two sub-lists.
We are now left with two different sub-arrays:

Now apply the quick sort technique to these sub-arrays, where 22 happens to be the pivot element of the first sub-array and 90 the pivot element of the second sub-array.
When you apply this technique, both 22 and 90 will be placed in their final positions.
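
To see these exchanges happen in code, here is a small sketch that runs one partition step and prints the array after every exchange. The data below is illustrative only, not the 12-element array A of the example, and tracedPartition is just an instrumented version of the partition routine shown in the program that follows.

#include <iostream>
#include <utility>
#include <vector>
using namespace std;

// One partition step, printing the array after every exchange so the
// pointer movement described above is visible.
int tracedPartition(vector<int>& A, int low, int high)
{
    int t = A[low];                                  // pivot (key value)
    int i = low, j = high + 1;
    do
    {
        do { i++; } while (i <= high && A[i] < t);   // left-to-right scan
        do { j--; } while (t < A[j]);                // right-to-left scan
        if (i < j)
        {
            swap(A[i], A[j]);                        // exchange the out-of-place pair
            for (int x : A) cout << x << ' ';
            cout << '\n';
        }
    } while (i < j);
    swap(A[low], A[j]);                              // place the pivot in its final slot
    for (int x : A) cout << x << ' ';
    cout << " <- pivot " << t << " fixed at index " << j << '\n';
    return j;
}

int main()
{
    // illustrative data only
    vector<int> A = {44, 75, 23, 98, 97, 67, 31, 54};
    tracedPartition(A, 0, (int)A.size() - 1);
    return 0;
}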

Function Quick Sort

C++ Program of Quick Sort

#include <iostream>
using namespace std;

// Partition Function: place the pivot A[low] in its final position and return that index
int partition(int A[], int low, int high)
{
    int t, i, j, flag;
    t = A[low];          // pivot (key value)
    i = low;
    j = high + 1;
    do
    {
        // scan from the left for an element not smaller than the pivot
        do
            i++;
        while (i <= high && A[i] < t);
        // scan from the right for an element not larger than the pivot
        do
            j--;
        while (t < A[j]);
        // if the pointers have not crossed, exchange the out-of-place pair
        if (i < j)
        {
            flag = A[i];
            A[i] = A[j];
            A[j] = flag;
        }
    } while (i < j);
    // the pointers have crossed: move the pivot to position j
    A[low] = A[j];
    A[j] = t;
    return j;
}

// Quick-Sort Function: sort A[low..high] recursively
void quickSort(int A[], int low, int high)
{
    int j;
    if (low < high)
    {
        j = partition(A, low, high);   // j is the pivot's final position
        quickSort(A, low, j - 1);      // sort the first partition
        quickSort(A, j + 1, high);     // sort the second partition
    }
}

int main()
{
    int A[30], Numb, i;                // holds at most 30 elements
    cout << "Enter the number of elements to be sorted: ";
    cin >> Numb;
    cout << "\nEnter the elements:" << endl;
    for (i = 0; i < Numb; i++)
    {
        cin >> A[i];
    }
    quickSort(A, 0, Numb - 1);
    cout << "\nSorted Data: ";
    for (i = 0; i < Numb; i++)
    {
        cout << " " << A[i];
    }
    cout << endl;
    return 0;
}

Output (after running)
Enter the number of elements to be sorted: 10
Enter the elements: 11 2 9 13 57 25 17 1 90 3
Sorted Data: 1 2 3 9 11 13 17 25 57 90

Time Complexity of Quick Sort Algorithm

The running time of a sorting algorithm is usually measured by the number f(n) of comparisons required to sort n elements.

The worst case occurs when the list is already sorted. Then the first element requires n comparisons to recognize that it remains in the first position. Furthermore, the first sub-list will be empty, while the second sub-list will have n–1 elements. Accordingly, the second element requires n–1 comparisons to recognize that it remains in the second position, and so on. Consequently, there will be a total of:
f(n) = n + (n–1) + … + 2 + 1
= n(n+1)/2
= (n²)/2 + O(n)
= O(n²) comparisons.

Note that the average complexity of O(n log n) comes from the fact that, on average, each reduction step of the algorithm produces two sub-lists.
Accordingly:
1. Reducing the initial list places 1 element and produces two sub-lists.
2. Reducing the two sub-lists places 2 elements and produces four sub-lists.
3. Reducing the four sub-lists places 4 elements and produces eight sub-lists.
4. Reducing the eight sub-lists places 8 elements and produces sixteen sub-lists.
And so on. Since the number of placed elements doubles at each level, about log₂ n levels suffice, and each level costs on the order of n comparisons in total, which gives O(n log n).

Worst Case (Sorted Array)

The worst case for quick sort occurs when the pivot is the minimum or maximum of all the elements in the list. The partition then produces one empty sub-list and one sub-list of n–1 elements, so the number of comparisons C(n) satisfies:
C(n) = C(n–1) + n, with C(1) = 0.
Unrolling this recurrence gives:
C(n) = n + (n–1) + (n–2) + … + 2 + 1
But, as we know,
1 + 2 + 3 + 4 + … + n = n(n+1)/2 ≈ (n²)/2.
So,
C(n) = Θ(n²).
Thus, the time complexity of quick sort in the worst case is Θ(n²).
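
To make the contrast between the two cases concrete, here is a small, illustrative experiment sketch (not part of the original program). It reuses the same partition scheme as the C++ program above, with an added global counter that roughly counts key comparisons; already sorted input should land near n²/2 comparisons, while randomly shuffled input grows roughly in proportion to n log n.

#include <algorithm>
#include <cmath>
#include <iostream>
#include <numeric>
#include <random>
#include <vector>
using namespace std;

long long comparisons = 0;   // rough count of key comparisons

// Same partition scheme as the program above, instrumented with a counter.
int countingPartition(vector<int>& A, int low, int high)
{
    int t = A[low], i = low, j = high + 1;
    do {
        do { i++; comparisons++; } while (i <= high && A[i] < t);
        do { j--; comparisons++; } while (t < A[j]);
        if (i < j) swap(A[i], A[j]);
    } while (i < j);
    swap(A[low], A[j]);
    return j;
}

void countingQuickSort(vector<int>& A, int low, int high)
{
    if (low < high) {
        int j = countingPartition(A, low, high);
        countingQuickSort(A, low, j - 1);
        countingQuickSort(A, j + 1, high);
    }
}

int main()
{
    const int n = 1000;

    // Worst case: an already sorted list with the first element as pivot.
    vector<int> asc(n);
    iota(asc.begin(), asc.end(), 1);          // 1, 2, ..., n
    comparisons = 0;
    countingQuickSort(asc, 0, n - 1);
    cout << "sorted input:   " << comparisons
         << " comparisons (n*n/2 = " << (long long)n * n / 2 << ")\n";

    // Average case: the same values in random order.
    vector<int> rnd(n);
    iota(rnd.begin(), rnd.end(), 1);
    shuffle(rnd.begin(), rnd.end(), mt19937{42});
    comparisons = 0;
    countingQuickSort(rnd, 0, n - 1);
    cout << "shuffled input: " << comparisons
         << " comparisons (for reference, n*log2(n) = "
         << (long long)(n * log2(n)) << ")\n";
    return 0;
}
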
Let us see the best case now.

Best Case (split in the middle)

If the array is always partitioned at the middle, we get the best-case efficiency of the algorithm. The recurrence relation for the best-case time complexity of quick sort is:
C(n) = 2 C(n/2) + n, with C(1) = 0. …(1)
To solve equation (1) we use the Master Theorem, which states that for a recurrence of the form T(n) = a T(n/b) + f(n), where f(n) ∈ Θ(n^d):
1. If a < b^d, then T(n) = Θ(n^d).
2. If a = b^d, then T(n) = Θ(n^d log n).
3. If a > b^d, then T(n) = Θ(n^(log_b a)).
Here f(n) ∈ Θ(n¹), so d = 1; also a = 2 and b = 2.
Since a = b^d, i.e. 2 = 2¹, case 2 applies, and we get:
T(n), i.e. C(n), = Θ((n^d) log n) = Θ(n¹ log n) = Θ(n log n).
Thus, the best-case complexity of quick sort is Θ(n log n).

In a nutshell, we can say for quick sort:

Best-case time complexity: Θ(n log n)
Average-case time complexity: Θ(n log n)
Worst-case time complexity: Θ(n²)

Thank you so much!
