Quicksort algorithm in C

Quicksort Algorithm

Quicksort is a sorting algorithm based on the divide and conquer technique. The basic idea is to partition the input around a pivot and then sort the two parts, one on the left side of the pivot and the other on the right.

In this article we will learn about the quicksort algorithm, write code for it, and then see how a modified quicksort can be applied to problems. The best part of quicksort is that it removes the need for the extra space required by merge sort. However, it comes with a caveat: the input space may not always be divided into two equal halves, which can degrade the performance of the algorithm considerably. We will see when this case arises.

Quicksort algorithm analysis

We will understand the algorithm using an example. Let's say we have an array of integers A = [3,4,10,5,6,7,9,10,19,2,1]. There are three parts to the algorithm:

1. Select a pivot and place it at its correct position.
2. Once the pivot is in its place, sort the left part of the array.
3. Then sort the right part.

Steps 2 and 3 can be easily implemented as recursive calls.
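
In code, steps 2 and 3 become two recursive calls around the partition of step 1. A minimal sketch of this structure (it mirrors the full C implementation given later in the article, where end is treated as exclusive and partition() returns the final index of the pivot):

void quickSort(int A[], int start, int end){
    if(start < end){
        int p = partition(A, start, end);   /* step 1: place a pivot at index p */
        quickSort(A, start, p);             /* step 2: sort the left part       */
        quickSort(A, p + 1, end);           /* step 3: sort the right part      */
    }
}
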
Step 1: Partition
Select the first element of the input space, i.e. A[0], as the pivot: pivot = A[0]. Put the pivot into its correct position such that all the elements on the left side of the pivot are smaller than the pivot and all the elements on the right side are greater than it.

a. Start from the next element in the input space and find the first element which is greater than the pivot selected above. Let that be the ith position.
b. Start from the end of the array and find the first element which is smaller than the pivot. Let it be the jth position.
If i and j have not crossed each other, i.e. i < j, then swap the elements at the ith and jth positions, and repeat a and b.
Once i and j cross each other, swap the pivot with the element at the jth position. After this step the pivot will be at its correct position, and the array will be divided into two parts, one on the left side of the pivot and the other on the right side.

For the input array A = [3,2,19,10,6,7,10,1,4,5], the diagram below shows the pivot and the execution of the partition function.

[Figure: quicksort algorithm, partition of the array around the pivot]
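
Working through the partition on this array with pivot = A[0] = 3:

1. i scans from the left and stops at A[2] = 19, the first element greater than 3; j scans from the right and stops at A[7] = 1, the first element smaller than 3. Since i < j, the two are swapped: [3,2,1,10,6,7,10,19,4,5].
2. i resumes and stops at A[3] = 10; j resumes and stops at A[2] = 1. Now i > j, so the pointers have crossed.
3. The pivot A[0] = 3 is swapped with A[j] = A[2] = 1, giving [1,2,3,10,6,7,10,19,4,5]. The pivot is now at index 2.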

We can see that all the elements on the left side of the pivot are less than the pivot and all those on the right are greater. So our pivot is now at its correct position.

int partition(int a[], int start, int end){
    /* start and end are the indices of the first and last elements;
       the first element a[start] is taken as the pivot */
    int pivot = start;
    int i = start + 1;
    int j = end;

    while(i < j){
        /* from the left, find the first element greater than the pivot */
        while(i <= end && a[i] <= a[pivot]){
            i++;
        }
        /* from the right, find the first element not greater than the pivot */
        while(j > start && a[j] > a[pivot]){
            j--;
        }
        /* if i and j have not crossed yet, swap the two elements */
        if(i < j)
            swap(&a[i], &a[j]);
    }
    /* place the pivot at its correct position */
    if(a[j] <= a[pivot]){
        swap(&a[pivot], &a[j]);
        return j;
    }
    return pivot;
}

Step 2: Sort
Apply the quicksort algorithm on the left part of the array. One can repeat step 1 recursively until a subarray has only one or two elements left.

Step 3: Sort
This is the same as step 2, applied to the right part of the array.

Quicksort implementation in C

#include<stdlib.h>
#include<stdio.h>

void swap(int *a, int *b){
    int temp = *a;
    *a = *b;
    *b = temp;
}

int partition(int A[], int start, int end){
    /* end is exclusive; the first element A[start] is taken as the pivot */
    int pivot = start;
    int j = start;   /* A[start+1..j] holds the elements smaller than the pivot */
    int i;

    for(i = start + 1; i < end; i++){
        /* move each element smaller than the pivot into the region right after it */
        if(A[i] < A[pivot])
            swap(&A[i], &A[++j]);
    }
    /* put the pivot between the smaller and the greater elements */
    swap(&A[pivot], &A[j]);

    return j;
}

void quickSort(int A[], int start, int end){
    /* end is exclusive */
    if(start < end){
        /* place one element (the pivot) at its correct position... */
        int part = partition(A, start, end);
        /* ...then sort the parts to its left and to its right */
        quickSort(A, start, part);
        quickSort(A, part + 1, end);
    }
}

/* Driver program */

int main(){
    int a[] = {0,1,1,1,0,0,0,3,3,3,4,4};
    int n = sizeof(a)/sizeof(a[0]);
    int i;

    quickSort(a, 0, n);

    /* print the sorted array */
    for(i = 0; i < n; i++)
        printf("%d ", a[i]);
    printf("\n");

    return 0;
}
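
A couple of notes on this implementation: end is treated as exclusive throughout, which is why the driver calls quickSort(a, 0, n) rather than quickSort(a, 0, n-1). Also, partition here uses a single forward scan (often called the Lomuto scheme) rather than the two-pointer scan described earlier; both place the pivot at its correct position and return its index.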

Thanks to Prince Mishra for the Python implementation of quicksort.

# Classic divide and conquer
# partition and combine

def partition(arr, start, end):
    pivot = start

    # all elements smaller than pivot should be to its left
    # all elements greater than pivot should be to its right

    i = start + 1
    j = end

    while i < j:
        while i<end and arr[i] <= arr[pivot]:
            # find first element greater than pivot
            i+=1
        while j>start and arr[j] > arr[pivot]:
            # find first element smaller than pivot
            j-=1
        if i < j:
            # swap if they haven't crossed
            arr[i], arr[j] = arr[j], arr[i]
    ret = pivot
    if arr[j] <= arr[pivot]:
        arr[pivot], arr[j] = arr[j], arr[pivot]
        ret = j
    return ret

def quicksort(arr, start, end):
    # terminal condition
    if start >= end:
        return

    # propagation
    partition_index = partition(arr, start, end)
    quicksort(arr, start, partition_index-1)
    quicksort(arr, partition_index+1, end)


test_arr = [8,3,4,1,8,7,4,6,6,9]
quicksort(test_arr, 0, len(test_arr)-1)
print(test_arr)

The complexity of quicksort depends on the partitions. If the partitions are balanced, quicksort performs as well as merge sort, whereas if the partitions are unbalanced, it becomes as bad as insertion sort.
The worst case behavior of quicksort is notoriously O(n^2), which occurs when the input is already sorted and the pivot is either the first or the last element of the array. In that case, the array is divided into two parts: one part is empty and the other contains the remaining n-1 elements.
Similarly, when we work on the subarray with n-1 elements, it is again divided into subarrays of size 0 and n-2. To completely sort the array it is split n-1 times, and each split requires traversing the current subarray to find the correct position of the pivot. Hence the recurrence relation is T(n) = T(n-1) + T(0) + O(n), where O(n) is the partition cost.
Solving this recurrence gives T(n) = O(n^2), so the overall worst case complexity of quicksort is O(n^2).
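
Expanding the recurrence shows where the quadratic term comes from (writing the partition cost as c*n for a constant c):

T(n) = T(n-1) + c*n
     = T(n-2) + c*(n-1) + c*n
     = ...
     = c*(1 + 2 + ... + n)
     = c*n*(n+1)/2
     = O(n^2)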

The worst case of quicksort happens when the input array is already sorted in increasing or decreasing order. Even though the worst case is O(n^2), in the average case, with the pivot picked wisely, quicksort sorts in O(n log n). The best part is that no extra space is involved, the hidden constants are very small compared to merge sort, and it works efficiently with virtual memory.

Best case partitioning happens when the two partitions are roughly equal in size, that is, of size n/2 each. In that case the recurrence relation becomes T(n) = 2T(n/2) + O(n), which solves to T(n) = O(n log n): each of the O(log n) levels of recursion does O(n) work in total.

The average case of quicksort is closer to the best case than to the worst case.

Selection of Pivot
If the array is already sorted, the worst case behavior of quicksort is O(n^2), which raises another problem: how can we select the pivot so that the two subarrays are nearly equal halves of the input array? Several solutions have been proposed.

1. Take the median of the array as the pivot. How to select the median of an unsorted array is a problem we will look into separately, but it does guarantee two halves of the same size.
2. Select the pivot randomly. A random pivot makes consistently unbalanced partitions very unlikely, so the expected running time remains O(n log n); a sketch of this approach follows below.
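
As a minimal sketch of the second option, assuming the C implementation above: the helper name randomizedPartition is ours, and it relies on rand() from <stdlib.h> (already included) plus the swap() and partition() functions defined earlier.

/* Pick a random index in [start, end), move that element to the front so it
   becomes the pivot, then reuse the existing partition function. */
int randomizedPartition(int A[], int start, int end){
    int r = start + rand() % (end - start);   /* random index in [start, end) */
    swap(&A[start], &A[r]);                   /* the random element becomes the pivot A[start] */
    return partition(A, start, end);
}

Calling randomizedPartition in place of partition inside quickSort, after seeding the generator once in main with srand(), leaves the rest of the program unchanged.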

This is a basic article explaining the concept of quicksort. In the next post we will see how we can use this algorithm to find the Kth largest or smallest element in an array of integers.

Things to ponder on

  1. Is quicksort stable? No, it is not: partitioning swaps elements across long distances, which can change the relative order of equal elements.
  2. Is it suitable for parallel execution? Yes, the left and right subarrays can be sorted on parallel machines and the results combined to form the larger array.
  3. Can we use insertion sort when we have a small number of elements in a subarray? Left as an exercise.
Please leave a comment in case you find something wrong or have an improved version. If you are interested in contributing to the website or have an interview experience to share, please contact us.