Understanding Quick Sort Algorithm in Python

Introduction

Sorting is a fundamental operation in computer science, and there are various algorithms available for this task. One of the most efficient and widely used sorting algorithms is Quick Sort. Quick Sort is known for its average-case time complexity of O(n log n), making it a preferred choice for sorting large datasets. In this article, we will explore the Quick Sort algorithm in Python and understand its implementation step-by-step.

Overview of Quick Sort

Quick Sort follows a divide-and-conquer approach to sort elements. The algorithm selects a pivot element from the array and partitions the other elements into two sub-arrays based on whether they are smaller or larger than the pivot. This process is repeated recursively for each sub-array until the entire array is sorted.

Steps to Implement Quick Sort in Python

Step 1: Choose a Pivot:

The first step in Quick Sort is to select a pivot element. There are various methods to choose a pivot, such as selecting the first, middle, or last element of the array. For simplicity, we will choose the last element as the pivot.
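
For illustration, here is a minimal sketch of a few common pivot-selection strategies; the helper names are only examples, and the rest of this article sticks with the last-element choice.

import random

def choose_last(array, low, high):
    # Strategy used in this article: always pick the last element.
    return high

def choose_middle(array, low, high):
    # Middle element: a simple defence against already-sorted input.
    return (low + high) // 2

def choose_random(array, low, high):
    # Random element: makes the O(n^2) worst case very unlikely.
    return random.randint(low, high)

Each helper returns an index; before partitioning, the chosen element can be swapped into the last position so that the partition routine described next works unchanged.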

Step 2: Partition the Array:

In this step, we partition the array around the pivot. We keep an index ‘i’ that marks the end of the region of elements smaller than the pivot, while a second index ‘j’ scans through the array from left to right. Whenever the element at ‘j’ is smaller than the pivot, we advance ‘i’ and swap the elements at ‘i’ and ‘j’. After the scan, we swap the pivot with the element at ‘i + 1’, which places the pivot in its final sorted position, with all smaller elements to its left and all larger elements to its right.
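
To make this concrete, here is what a single call to the partition function defined later in this article does to the sample array used in the implementation section (pivot = 4, the last element):

array = [10, 8, 7, 9, 2, 4]
pivot_index = partition(array, 0, len(array) - 1)
print(array)        # [2, 4, 7, 9, 10, 8]
print(pivot_index)  # 1 -- the pivot 4 is now at its final position

Everything to the left of index 1 is smaller than 4 and everything to the right is larger; only the pivot is guaranteed to be in its final place after one pass.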

Step 3: Recursively Sort Sub-arrays:

After partitioning the array, we recursively apply Quick Sort to the sub-arrays on the left and right of the pivot. This process continues until the sub-arrays contain only one element or are empty.

Step 4: Combine the Sorted Sub-arrays:

Once the recursion ends, we have sorted sub-arrays. To obtain the final sorted array, we combine the left sub-array, the pivot element, and the right sub-array.
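
In the in-place implementation shown below, this combination happens implicitly, because both halves are sorted within the same array. The idea is easiest to see in a simple out-of-place sketch (the function name is illustrative, and this version uses extra memory, so it is shown only to clarify the combine step):

def quick_sort_simple(items):
    # Base case: zero or one element is already sorted.
    if len(items) <= 1:
        return items
    pivot = items[-1]
    smaller = [x for x in items[:-1] if x < pivot]
    larger = [x for x in items[:-1] if x >= pivot]
    # Combine: sorted left part + pivot + sorted right part.
    return quick_sort_simple(smaller) + [pivot] + quick_sort_simple(larger)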

Implementation of Quick Sort in Python:


def partition(array, low, high):
    # Choose the last element as the pivot.
    pivot = array[high]
    # 'index' marks the boundary of the region of elements smaller than the pivot.
    index = low - 1
    for j in range(low, high):
        if array[j] < pivot:
            index += 1
            array[index], array[j] = array[j], array[index]

    # Place the pivot just after the last smaller element.
    array[index + 1], array[high] = array[high], array[index + 1]

    return index + 1

def quickSort(array, low, high):
    if low < high:
        # Partition the array and get the pivot's final position.
        pivot_index = partition(array, low, high)
        # Recursively sort the elements before and after the pivot.
        quickSort(array, low, pivot_index - 1)
        quickSort(array, pivot_index + 1, high)

if __name__ == "__main__":
    array = [10, 8, 7, 9, 2, 4]
    quickSort(array, 0, len(array) - 1)
    print("The sorted array is: ")
    for i in array:
        print(i, end=' ')
    print()

Output


The sorted array is: 
2 4 7 8 9 10
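
As a quick sanity check while experimenting, the result can also be compared with Python’s built-in sorted(); this snippet assumes the quickSort function defined above:

data = [10, 8, 7, 9, 2, 4]
expected = sorted(data)
quickSort(data, 0, len(data) - 1)
assert data == expected, "quickSort result does not match sorted()"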

Complexity Analysis of Quick Sort

Time Complexity

The time complexity of Quick Sort depends on the choice of pivot and on the distribution of elements in the input array. In the best and average cases, however, it runs in O(n log n) time.

  • Best Case: The best case occurs when each pivot splits the array into two sub-arrays of roughly equal size, so every partitioning step halves the problem. The recursion depth is then log n, and each level performs O(n) work in total, giving a best-case time complexity of O(n log n).
  • Average Case: The average-case time complexity is also O(n log n). Individual partitions may be imbalanced, but averaged over all input orderings the splits are balanced enough that the total work remains proportional to n log n.
  • Worst Case: The worst case occurs when the chosen pivot is always the smallest or largest element, producing highly imbalanced partitions that shrink the problem by only one element per step. This leads to a worst-case time complexity of O(n^2). With the last-element pivot used above, an already sorted array triggers this behaviour; choosing the pivot at random makes the worst case very unlikely (a randomized-pivot sketch follows this list).
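
A common safeguard is to choose the pivot at random before partitioning. Here is a minimal sketch, assuming the partition and quickSort routines from the implementation above (the function names here are only illustrative):

import random

def randomized_partition(array, low, high):
    # Swap a randomly chosen element into the last position,
    # then reuse the ordinary last-element partition.
    rand_index = random.randint(low, high)
    array[rand_index], array[high] = array[high], array[rand_index]
    return partition(array, low, high)

def randomized_quick_sort(array, low, high):
    if low < high:
        pivot_index = randomized_partition(array, low, high)
        randomized_quick_sort(array, low, pivot_index - 1)
        randomized_quick_sort(array, pivot_index + 1, high)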

Overall, Quick Sort’s O(n log n) average and best-case time complexity makes it one of the fastest general-purpose sorting algorithms available.

Space Complexity

The space complexity of Quick Sort primarily depends on the implementation method used.

  • Recursive Implementation: The recursive implementation above uses O(log n) extra space on average for the call stack that keeps track of the recursive calls. In the worst case, the recursion depth, and therefore the stack space, can grow to O(n).
  • In-place Partitioning: Quick Sort sorts the array in place, so it needs no auxiliary arrays; apart from the recursion stack, the extra memory it uses is O(1). The recursion depth itself can be kept at O(log n) by always recursing into the smaller partition first and looping over the larger one (see the sketch below).
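
A minimal sketch of that optimization, assuming the partition function above (the function name is illustrative): recurse only into the smaller half and loop over the larger one, so the call stack never grows deeper than O(log n).

def quick_sort_small_stack(array, low, high):
    while low < high:
        pivot_index = partition(array, low, high)
        # Recurse into the smaller partition, then loop on the larger one.
        if pivot_index - low < high - pivot_index:
            quick_sort_small_stack(array, low, pivot_index - 1)
            low = pivot_index + 1
        else:
            quick_sort_small_stack(array, pivot_index + 1, high)
            high = pivot_index - 1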

Note: Although Quick Sort is usually faster in practice than other O(n log n) algorithms such as Merge Sort, it is not a stable sort and its worst case is O(n^2). If stability or a guaranteed worst-case bound is a critical consideration, an alternative sorting algorithm such as Merge Sort may be more appropriate.

Conclusion

Quick Sort is a highly efficient sorting algorithm with an average-case time complexity of O(n log n). It follows the divide-and-conquer approach, making it suitable for sorting large datasets. Python provides a concise and readable syntax to implement Quick Sort. By understanding the steps involved in Quick Sort and implementing it correctly, you can efficiently sort arrays and improve the performance of your programs that involve sorting operations.
